Average Error: 0.1 → 0.1
Time: 13.3s
Precision: 64
\[0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
\[\mathsf{fma}\left(-{x}^{3}, 0.12900613773279798, {x}^{3} \cdot 0.12900613773279798\right) + \mathsf{fma}\left(0.954929658551372, x, \left(-0.12900613773279798\right) \cdot {x}^{3}\right)\]
/* Initial program */
double f(double x) {
        double r348277 = 0.954929658551372;
        double r348278 = x;
        double r348279 = r348277 * r348278;
        double r348280 = 0.12900613773279798;
        double r348281 = r348278 * r348278;
        double r348282 = r348281 * r348278;
        double r348283 = r348280 * r348282;
        double r348284 = r348279 - r348283;
        return r348284;
}

/* Improved program (pow and fma require <math.h>) */
double f(double x) {
        double r348285 = x;
        double r348286 = 3.0;
        double r348287 = pow(r348285, r348286);
        double r348288 = -r348287;
        double r348289 = 0.12900613773279798;
        double r348290 = r348287 * r348289;
        double r348291 = fma(r348288, r348289, r348290);
        double r348292 = 0.954929658551372;
        double r348293 = -r348289;
        double r348294 = r348293 * r348287;
        double r348295 = fma(r348292, r348285, r348294);
        double r348296 = r348291 + r348295;
        return r348296;
}

Error

[Plot: bits of error versus x]

Derivation

  1. Initial program (error 0.1)

    \[0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
  2. Using strategy rm
  3. Applied pow1 (error 0.1)

    \[\leadsto 0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot \color{blue}{{x}^{1}}\right)\]
  4. Applied pow1 (error 0.1)

    \[\leadsto 0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(x \cdot \color{blue}{{x}^{1}}\right) \cdot {x}^{1}\right)\]
  5. Applied pow1 (error 0.1)

    \[\leadsto 0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(\color{blue}{{x}^{1}} \cdot {x}^{1}\right) \cdot {x}^{1}\right)\]
  6. Applied pow-prod-up (error 0.1)

    \[\leadsto 0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\color{blue}{{x}^{\left(1 + 1\right)}} \cdot {x}^{1}\right)\]
  7. Applied pow-prod-up (error 0.1)

    \[\leadsto 0.954929658551372 \cdot x - 0.12900613773279798 \cdot \color{blue}{{x}^{\left(\left(1 + 1\right) + 1\right)}}\]
  8. Simplified (error 0.1)

    \[\leadsto 0.954929658551372 \cdot x - 0.12900613773279798 \cdot {x}^{\color{blue}{3}}\]
  9. Using strategy rm
  10. Applied prod-diff (error 0.1)

    \[\leadsto \color{blue}{\mathsf{fma}\left(0.954929658551372, x, -{x}^{3} \cdot 0.12900613773279798\right) + \mathsf{fma}\left(-{x}^{3}, 0.12900613773279798, {x}^{3} \cdot 0.12900613773279798\right)}\]
  11. Final simplification (error 0.1)

    \[\leadsto \mathsf{fma}\left(-{x}^{3}, 0.12900613773279798, {x}^{3} \cdot 0.12900613773279798\right) + \mathsf{fma}\left(0.954929658551372, x, \left(-0.12900613773279798\right) \cdot {x}^{3}\right)\]

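A note on the final expression (our gloss, not part of Herbie's report): the first fma term is zero in exact arithmetic, but in floating point it recovers the rounding error of the product \(x^3 \cdot c\), where \(c = 0.12900613773279798\). Writing \(\mathrm{fl}(\cdot)\) for a rounded result,

\[\mathsf{fma}\left(-x^3, c, \mathrm{fl}(x^3 \cdot c)\right) = \mathrm{fl}(x^3 \cdot c) - x^3 \cdot c,\]

exactly, because fma rounds only once and the error of a rounded product is representable at the same precision (the standard two-product error-free transformation, assuming no overflow or underflow). Adding this residual to the second fma term compensates for the rounding of \(x^3 \cdot c\).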
Reproduce

herbie shell --seed 2019155 +o rules:numerics
(FPCore (x)
  :name "Rosa's Benchmark"
  (- (* 0.954929658551372 x) (* 0.12900613773279798 (* (* x x) x))))