Average Error: 0.1 → 0.1
Time: 2.3s
Precision: 64
\[0.9549296585513720181381813745247200131416 \cdot x - 0.1290061377327979819096270830414141528308 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
\[\mathsf{fma}\left(x, 0.9549296585513720181381813745247200131416, -0.1290061377327979819096270830414141528308 \cdot {x}^{3}\right)\]
double f(double x) {
        return 0.954929658551372 * x
             - 0.12900613773279798 * ((x * x) * x);
}

#include <math.h>

double f(double x) {
        return fma(x, 0.954929658551372,
                   -(0.12900613773279798 * pow(x, 3.0)));
}

Error

[Plot: bits error versus x]

Derivation

  1. Initial program 0.1

    \[0.9549296585513720181381813745247200131416 \cdot x - 0.1290061377327979819096270830414141528308 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
  2. Simplified 0.1

    \[\leadsto \color{blue}{x \cdot \left(0.9549296585513720181381813745247200131416 - 0.1290061377327979819096270830414141528308 \cdot \left(x \cdot x\right)\right)}\]
  3. Using strategy rm
  4. Applied sub-neg 0.1

    \[\leadsto x \cdot \color{blue}{\left(0.9549296585513720181381813745247200131416 + \left(-0.1290061377327979819096270830414141528308 \cdot \left(x \cdot x\right)\right)\right)}\]
  5. Applied distribute-lft-in 0.1

    \[\leadsto \color{blue}{x \cdot 0.9549296585513720181381813745247200131416 + x \cdot \left(-0.1290061377327979819096270830414141528308 \cdot \left(x \cdot x\right)\right)}\]
  6. Simplified 0.1

    \[\leadsto x \cdot 0.9549296585513720181381813745247200131416 + \color{blue}{\left(-0.1290061377327979819096270830414141528308 \cdot {x}^{3}\right)}\]
  7. Using strategy rm
  8. Applied fma-def 0.1

    \[\leadsto \color{blue}{\mathsf{fma}\left(x, 0.9549296585513720181381813745247200131416, -0.1290061377327979819096270830414141528308 \cdot {x}^{3}\right)}\]
  9. Final simplification 0.1

    \[\leadsto \mathsf{fma}\left(x, 0.9549296585513720181381813745247200131416, -0.1290061377327979819096270830414141528308 \cdot {x}^{3}\right)\]
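
The decisive rewrite above is fma-def, which fuses a product and a sum into a single operation. Schematically (this is the standard IEEE-754 fused multiply-add semantics, stated here for context rather than taken from the report):

\[a \cdot b + c \leadsto \mathsf{fma}\left(a, b, c\right)\]

The fused form rounds a·b + c once, instead of rounding the product and then the sum separately, which is why this rewrite can only reduce rounding error at that step.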

Reproduce

herbie shell --seed 2019356 +o rules:numerics
(FPCore (x)
  :name "Rosa's Benchmark"
  :precision binary64
  (- (* 0.954929658551372 x) (* 0.12900613773279798 (* (* x x) x))))