Average Error: 0.1 → 0.1
Time: 2.6s
Precision: 64
\[0.9549296585513720181381813745247200131416 \cdot x - 0.1290061377327979819096270830414141528308 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
\[0.9549296585513720181381813745247200131416 \cdot x - 0.1290061377327979819096270830414141528308 \cdot {x}^{3}\]
double f(double x) {
        double r31485 = 0.954929658551372;
        double r31486 = x;
        double r31487 = r31485 * r31486;
        double r31488 = 0.12900613773279798;
        double r31489 = r31486 * r31486;
        double r31490 = r31489 * r31486;
        double r31491 = r31488 * r31490;
        double r31492 = r31487 - r31491;
        return r31492;
}

#include <math.h>

double f(double x) {
        double r31493 = 0.954929658551372;
        double r31494 = x;
        double r31495 = r31493 * r31494;
        double r31496 = 0.12900613773279798;
        double r31497 = 3.0;
        double r31498 = pow(r31494, r31497);
        double r31499 = r31496 * r31498;
        double r31500 = r31495 - r31499;
        return r31500;
}

Error

[Plot: bits of error versus x]


Derivation

  1. Initial program (error: 0.1)

    \[0.9549296585513720181381813745247200131416 \cdot x - 0.1290061377327979819096270830414141528308 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
  2. Using strategy rm
  3. Applied pow1 (error: 0.1)

    \[\leadsto 0.9549296585513720181381813745247200131416 \cdot x - 0.1290061377327979819096270830414141528308 \cdot \left(\left(x \cdot x\right) \cdot \color{blue}{{x}^{1}}\right)\]
  4. Applied pow1 (error: 0.1)

    \[\leadsto 0.9549296585513720181381813745247200131416 \cdot x - 0.1290061377327979819096270830414141528308 \cdot \left(\left(x \cdot \color{blue}{{x}^{1}}\right) \cdot {x}^{1}\right)\]
  5. Applied pow1 (error: 0.1)

    \[\leadsto 0.9549296585513720181381813745247200131416 \cdot x - 0.1290061377327979819096270830414141528308 \cdot \left(\left(\color{blue}{{x}^{1}} \cdot {x}^{1}\right) \cdot {x}^{1}\right)\]
  6. Applied pow-prod-up (error: 0.1)

    \[\leadsto 0.9549296585513720181381813745247200131416 \cdot x - 0.1290061377327979819096270830414141528308 \cdot \left(\color{blue}{{x}^{\left(1 + 1\right)}} \cdot {x}^{1}\right)\]
  7. Applied pow-prod-up (error: 0.1)

    \[\leadsto 0.9549296585513720181381813745247200131416 \cdot x - 0.1290061377327979819096270830414141528308 \cdot \color{blue}{{x}^{\left(\left(1 + 1\right) + 1\right)}}\]
  8. Simplified (error: 0.1)

    \[\leadsto 0.9549296585513720181381813745247200131416 \cdot x - 0.1290061377327979819096270830414141528308 \cdot {x}^{\color{blue}{3}}\]
  9. Final simplification (error: 0.1)

    \[\leadsto 0.9549296585513720181381813745247200131416 \cdot x - 0.1290061377327979819096270830414141528308 \cdot {x}^{3}\]
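The rewrite chain above instantiates two power identities; as a compact summary (this restatement is ours, using the same notation as the steps):

```latex
x = x^{1}, \qquad x^{a} \cdot x^{b} = x^{a+b}
\;\Longrightarrow\;
\left(x \cdot x\right) \cdot x
= \left(x^{1} \cdot x^{1}\right) \cdot x^{1}
= x^{\left(1 + 1\right)} \cdot x^{1}
= x^{\left(\left(1 + 1\right) + 1\right)}
= x^{3}
```

Steps 3–5 apply the first identity to tag each factor as \(x^{1}\), steps 6–7 apply the second (pow-prod-up) to merge the exponents, and steps 8–9 fold the constant arithmetic to obtain \(x^{3}\).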

Reproduce

herbie shell --seed 2019318 
(FPCore (x)
  :name "Rosa's Benchmark"
  :precision binary64
  (- (* 0.95492965855137202 x) (* 0.129006137732797982 (* (* x x) x))))