Average Error: 0.1 → 0.1
Time: 25.3s
Precision: 64
Input:

\[0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]

Output:

\[\mathsf{fma}\left(-{x}^{3}, 0.12900613773279798, {x}^{3} \cdot 0.12900613773279798\right) + \mathsf{fma}\left(0.954929658551372, x, \left(-0.12900613773279798\right) \cdot {x}^{3}\right)\]
/* Input program */
double f(double x) {
        return 0.954929658551372 * x - 0.12900613773279798 * ((x * x) * x);
}

/* Herbie's output; requires <math.h> for pow and fma */
double f(double x) {
        double x3 = pow(x, 3.0);
        return fma(-x3, 0.12900613773279798, x3 * 0.12900613773279798)
             + fma(0.954929658551372, x, (-0.12900613773279798) * x3);
}

Error

(Plot omitted: bits of error versus x)

Derivation

  1. Initial program (error 0.1)

    \[0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
  2. Using strategy rm
  3. Applied pow1 (error 0.1)

    \[\leadsto 0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\left(x \cdot x\right) \cdot \color{blue}{{x}^{1}}\right)\]
  4. Applied pow2 (error 0.1)

    \[\leadsto 0.954929658551372 \cdot x - 0.12900613773279798 \cdot \left(\color{blue}{{x}^{2}} \cdot {x}^{1}\right)\]
  5. Applied pow-prod-up (error 0.1)

    \[\leadsto 0.954929658551372 \cdot x - 0.12900613773279798 \cdot \color{blue}{{x}^{\left(2 + 1\right)}}\]
  6. Simplified (error 0.1)

    \[\leadsto 0.954929658551372 \cdot x - 0.12900613773279798 \cdot {x}^{\color{blue}{3}}\]
  7. Using strategy rm
  8. Applied prod-diff (error 0.1)

    \[\leadsto \color{blue}{\mathsf{fma}\left(0.954929658551372, x, -{x}^{3} \cdot 0.12900613773279798\right) + \mathsf{fma}\left(-{x}^{3}, 0.12900613773279798, {x}^{3} \cdot 0.12900613773279798\right)}\]
  9. Final simplification (error 0.1)

    \[\leadsto \mathsf{fma}\left(-{x}^{3}, 0.12900613773279798, {x}^{3} \cdot 0.12900613773279798\right) + \mathsf{fma}\left(0.954929658551372, x, \left(-0.12900613773279798\right) \cdot {x}^{3}\right)\]

Reproduce

herbie shell --seed 2019139 +o rules:numerics
(FPCore (x)
  :name "Rosa's Benchmark"
  (- (* 0.954929658551372 x) (* 0.12900613773279798 (* (* x x) x))))