Average Error: 0.1 → 0.1
Time: 2.2s
Precision: 64
Input program:

\[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]

Output program:

\[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot {\left({x}^{3}\right)}^{1}\]
Input program as C:

double f(double x) {
        double r16996 = 0.954929658551372;
        double r16997 = x;
        double r16998 = r16996 * r16997;
        double r16999 = 0.12900613773279798;
        double r17000 = r16997 * r16997;
        double r17001 = r17000 * r16997;
        double r17002 = r16999 * r17001;
        double r17003 = r16998 - r17002;
        return r17003;
}

Output program as C:

#include <math.h>

double f(double x) {
        double r17004 = 0.954929658551372;
        double r17005 = x;
        double r17006 = r17004 * r17005;
        double r17007 = 0.12900613773279798;
        double r17008 = 3.0;
        double r17009 = pow(r17005, r17008);
        double r17010 = 1.0;
        double r17011 = pow(r17009, r17010);
        double r17012 = r17007 * r17011;
        double r17013 = r17006 - r17012;
        return r17013;
}

Error

[Plot: bits error versus x]


Derivation

  1. Initial program (error 0.1)

    \[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
  2. Using strategy rm
  3. Applied pow1 (error 0.1)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot \color{blue}{{x}^{1}}\right)\]
  4. Applied pow1 (error 0.1)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot \color{blue}{{x}^{1}}\right) \cdot {x}^{1}\right)\]
  5. Applied pow1 (error 0.1)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(\color{blue}{{x}^{1}} \cdot {x}^{1}\right) \cdot {x}^{1}\right)\]
  6. Applied pow-prod-down (error 0.1)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\color{blue}{{\left(x \cdot x\right)}^{1}} \cdot {x}^{1}\right)\]
  7. Applied pow-prod-down (error 0.1)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \color{blue}{{\left(\left(x \cdot x\right) \cdot x\right)}^{1}}\]
  8. Simplified (error 0.1)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot {\color{blue}{\left({x}^{3}\right)}}^{1}\]
  9. Final simplification (error 0.1)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot {\left({x}^{3}\right)}^{1}\]
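For reference, the rewrite rules named in the steps above correspond (as I read the step labels; the rule names are taken from the derivation, not verified against Herbie's rule database) to the identities:

```latex
% pow1: raising to the first power is the identity
% (applied right-to-left above, introducing x^{1})
x^{1} = x

% pow-prod-down: a product of like powers collapses
% into a power of the product
a^{c} \cdot b^{c} = \left(a \cdot b\right)^{c}
```

Applying pow1 three times tags each factor as \(x^{1}\), two applications of pow-prod-down merge them into \(\left(\left(x \cdot x\right) \cdot x\right)^{1}\), and simplification rewrites the base to \(x^{3}\), leaving the \({}^{1}\) wrapper seen in the final program.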

Reproduce

herbie shell --seed 2020059 
(FPCore (x)
  :name "Rosa's Benchmark"
  :precision binary64
  (- (* 0.954929658551372 x) (* 0.12900613773279798 (* (* x x) x))))