Numeric.SpecFunctions:logGammaCorrection from math-functions-0.1.5.2

Average Accuracy: 100.0% → 100.0%
Time: 974.0ms
Precision: binary64
Cost: 6720


\[\left(x \cdot x\right) \cdot 2 - 1 \]
\[\mathsf{fma}\left(x, x \cdot 2, -1\right) \]
(FPCore (x) :precision binary64 (- (* (* x x) 2.0) 1.0))
(FPCore (x) :precision binary64 (fma x (* x 2.0) -1.0))
double code(double x) {
	return ((x * x) * 2.0) - 1.0;
}
double code(double x) {
	return fma(x, (x * 2.0), -1.0);
}
function code(x)
	return Float64(Float64(Float64(x * x) * 2.0) - 1.0)
end
function code(x)
	return fma(x, Float64(x * 2.0), -1.0)
end
code[x_] := N[(N[(N[(x * x), $MachinePrecision] * 2.0), $MachinePrecision] - 1.0), $MachinePrecision]
code[x_] := N[(x * N[(x * 2.0), $MachinePrecision] + -1.0), $MachinePrecision]
\left(x \cdot x\right) \cdot 2 - 1
\mathsf{fma}\left(x, x \cdot 2, -1\right)

Error

Derivation

  1. Initial program 100.0%

    \[\left(x \cdot x\right) \cdot 2 - 1 \]
  2. Simplified 100.0%

    \[\leadsto \color{blue}{\mathsf{fma}\left(x, x \cdot 2, -1\right)} \]
    Proof

    [Start] 100.0

    \[ \left(x \cdot x\right) \cdot 2 - 1 \]

    associate-*l* [=>] 100.0

    \[ \color{blue}{x \cdot \left(x \cdot 2\right)} - 1 \]

    fma-neg [=>] 100.0

    \[ \color{blue}{\mathsf{fma}\left(x, x \cdot 2, -1\right)} \]

    metadata-eval [=>] 100.0

    \[ \mathsf{fma}\left(x, x \cdot 2, \color{blue}{-1}\right) \]
  3. Final simplification 100.0%

    \[\leadsto \mathsf{fma}\left(x, x \cdot 2, -1\right) \]

Alternatives

Alternative 1
Accuracy: 100.0%
Cost: 448
\[-1 + 2 \cdot \left(x \cdot x\right) \]
Alternative 2
Accuracy: 66.3%
Cost: 64
\[-1 \]

Reproduce

herbie shell --seed 2023140 
(FPCore (x)
  :name "Numeric.SpecFunctions:logGammaCorrection from math-functions-0.1.5.2"
  :precision binary64
  (- (* (* x x) 2.0) 1.0))