Average Error: 0.3 → 0.3
Time: 2.2s
Precision: binary64
\[0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot x\right) \]
\[\mathsf{fma}\left(-0.12900613773279798, {x}^{3}, x \cdot 0.954929658551372\right) \]
(FPCore (x)
 :precision binary64
 (- (* 0.954929658551372 x) (* 0.12900613773279798 (* (* x x) x))))
(FPCore (x)
 :precision binary64
 (fma -0.12900613773279798 (pow x 3.0) (* x 0.954929658551372)))
double code(double x) {
	return (0.954929658551372 * x) - (0.12900613773279798 * ((x * x) * x));
}
double code(double x) {
	return fma(-0.12900613773279798, pow(x, 3.0), (x * 0.954929658551372));
}

Error

[Plot: bits error versus x]

Derivation

  1. Initial program, error 0.3

    \[0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot x\right) \]
  2. Applied fma-neg_binary64, error 0.3

    \[\leadsto \color{blue}{\mathsf{fma}\left(0.954929658551372, x, -0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot x\right)\right)} \]
  3. Simplified, error 0.3

    \[\leadsto \mathsf{fma}\left(0.954929658551372, x, \color{blue}{{x}^{3} \cdot -0.12900613773279798}\right) \]
  4. Taylor expanded in x around 0, error 0.3

    \[\leadsto \color{blue}{0.954929658551372 \cdot x - 0.12900613773279798 \cdot {x}^{3}} \]
  5. Simplified, error 0.3

    \[\leadsto \color{blue}{\mathsf{fma}\left(-0.12900613773279798, {x}^{3}, x \cdot 0.954929658551372\right)} \]
  6. Final simplification, error 0.3

    \[\leadsto \mathsf{fma}\left(-0.12900613773279798, {x}^{3}, x \cdot 0.954929658551372\right) \]
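The fma rewrites in steps 2 and 5 rely on the single-rounding semantics of fused multiply-add (a general property of IEEE 754 fma, noted here for context, not stated by the report itself):

\[\mathsf{fma}\left(a, b, c\right) = \operatorname{fl}\left(a \cdot b + c\right)\]

where \(\operatorname{fl}\) denotes one rounding to binary64. The product \(a \cdot b\) is computed exactly before the single rounding, so the rewrite never increases the rounding error of the corresponding multiply-then-add, which is why each step keeps the average error at 0.3 bits.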

Reproduce

herbie shell --seed 2022067 
(FPCore (x)
  :name "Rosa's Benchmark"
  :precision binary64
  (- (* 0.954929658551372 x) (* 0.12900613773279798 (* (* x x) x))))