Rosa's Benchmark

Average Accuracy: 99.6% → 99.7%
Time: 8.7s
Precision: binary64
Cost: 13184

Initial program and Herbie's output:

Math:
\[0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot x\right) \]
\[\mathsf{fma}\left(0.954929658551372, x, {x}^{3} \cdot -0.12900613773279798\right) \]

FPCore:
(FPCore (x)
 :precision binary64
 (- (* 0.954929658551372 x) (* 0.12900613773279798 (* (* x x) x))))
(FPCore (x)
 :precision binary64
 (fma 0.954929658551372 x (* (pow x 3.0) -0.12900613773279798)))

C:
double code(double x) {
	return (0.954929658551372 * x) - (0.12900613773279798 * ((x * x) * x));
}
double code(double x) {
	return fma(0.954929658551372, x, (pow(x, 3.0) * -0.12900613773279798));
}
Julia:
function code(x)
	return Float64(Float64(0.954929658551372 * x) - Float64(0.12900613773279798 * Float64(Float64(x * x) * x)))
end
function code(x)
	return fma(0.954929658551372, x, Float64((x ^ 3.0) * -0.12900613773279798))
end

Wolfram:
code[x_] := N[(N[(0.954929658551372 * x), $MachinePrecision] - N[(0.12900613773279798 * N[(N[(x * x), $MachinePrecision] * x), $MachinePrecision]), $MachinePrecision]), $MachinePrecision]
code[x_] := N[(0.954929658551372 * x + N[(N[Power[x, 3.0], $MachinePrecision] * -0.12900613773279798), $MachinePrecision]), $MachinePrecision]

TeX:
0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot x\right)
\mathsf{fma}\left(0.954929658551372, x, {x}^{3} \cdot -0.12900613773279798\right)

Error

Derivation

  1. Initial program 99.6%

    \[0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot x\right) \]
  2. Simplified 99.7%

    \[\leadsto \color{blue}{\mathsf{fma}\left(0.954929658551372, x, {x}^{3} \cdot -0.12900613773279798\right)} \]
    Proof

    [Start] 99.6%

    \[ 0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot x\right) \]

    fma-neg [=>] 99.6%

    \[ \color{blue}{\mathsf{fma}\left(0.954929658551372, x, -0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot x\right)\right)} \]

    distribute-lft-neg-in [=>] 99.6%

    \[ \mathsf{fma}\left(0.954929658551372, x, \color{blue}{\left(-0.12900613773279798\right) \cdot \left(\left(x \cdot x\right) \cdot x\right)}\right) \]

    *-commutative [=>] 99.6%

    \[ \mathsf{fma}\left(0.954929658551372, x, \color{blue}{\left(\left(x \cdot x\right) \cdot x\right) \cdot \left(-0.12900613773279798\right)}\right) \]

    unpow3 [<=] 99.7%

    \[ \mathsf{fma}\left(0.954929658551372, x, \color{blue}{{x}^{3}} \cdot \left(-0.12900613773279798\right)\right) \]

    metadata-eval [=>] 99.7%

    \[ \mathsf{fma}\left(0.954929658551372, x, {x}^{3} \cdot \color{blue}{-0.12900613773279798}\right) \]
  3. Final simplification 99.7%

    \[\leadsto \mathsf{fma}\left(0.954929658551372, x, {x}^{3} \cdot -0.12900613773279798\right) \]

Alternatives

Alternative 1
Accuracy 99.7%
Cost 6848
\[x \cdot \mathsf{fma}\left(x \cdot x, -0.12900613773279798, 0.954929658551372\right) \]
Alternative 2
Accuracy 98.1%
Cost 713
\[\begin{array}{l} \mathbf{if}\;x \leq -2.7 \lor \neg \left(x \leq 2.8\right):\\ \;\;\;\;x \cdot \left(-0.12900613773279798 \cdot \left(x \cdot x\right)\right)\\ \mathbf{else}:\\ \;\;\;\;0.954929658551372 \cdot x\\ \end{array} \]
Alternative 3
Accuracy 99.7%
Cost 576
\[x \cdot \left(0.954929658551372 + -0.12900613773279798 \cdot \left(x \cdot x\right)\right) \]
Alternative 4
Accuracy 4.4%
Cost 192
\[x \cdot -0.954929658551372 \]
Alternative 5
Accuracy 75.0%
Cost 192
\[0.954929658551372 \cdot x \]


Reproduce

herbie shell --seed 2023133 
(FPCore (x)
  :name "Rosa's Benchmark"
  :precision binary64
  (- (* 0.954929658551372 x) (* 0.12900613773279798 (* (* x x) x))))