Average Error: 0.2 → 0.2
Time: 2.5s
Precision: 64
\[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
\[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot {\left({x}^{3}\right)}^{1}\]
// Input program
double f(double x) {
        double r38007 = 0.954929658551372;
        double r38008 = x;
        double r38009 = r38007 * r38008;
        double r38010 = 0.12900613773279798;
        double r38011 = r38008 * r38008;
        double r38012 = r38011 * r38008;
        double r38013 = r38010 * r38012;
        double r38014 = r38009 - r38013;
        return r38014;
}

// Output program (after Herbie's rewriting)
double f(double x) {
        double r38015 = 0.954929658551372;
        double r38016 = x;
        double r38017 = r38015 * r38016;
        double r38018 = 0.12900613773279798;
        double r38019 = 3.0;
        double r38020 = pow(r38016, r38019);
        double r38021 = 1.0;
        double r38022 = pow(r38020, r38021);
        double r38023 = r38018 * r38022;
        double r38024 = r38017 - r38023;
        return r38024;
}

Error

[Plot: bits of error versus x]


Derivation

  1. Initial program (error 0.2)

    \[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
  2. Using strategy rm
  3. Applied pow1 (error 0.2)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot \color{blue}{{x}^{1}}\right)\]
  4. Applied pow1 (error 0.2)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot \color{blue}{{x}^{1}}\right) \cdot {x}^{1}\right)\]
  5. Applied pow1 (error 0.2)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(\color{blue}{{x}^{1}} \cdot {x}^{1}\right) \cdot {x}^{1}\right)\]
  6. Applied pow-prod-down (error 0.2)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\color{blue}{{\left(x \cdot x\right)}^{1}} \cdot {x}^{1}\right)\]
  7. Applied pow-prod-down (error 0.2)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \color{blue}{{\left(\left(x \cdot x\right) \cdot x\right)}^{1}}\]
  8. Simplified (error 0.2)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot {\color{blue}{\left({x}^{3}\right)}}^{1}\]
  9. Final simplification (error 0.2)

    \[\leadsto 0.95492965855137202 \cdot x - 0.129006137732797982 \cdot {\left({x}^{3}\right)}^{1}\]
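The rewrite rules named in the steps above correspond to simple algebraic identities; as a sketch (my reading of the derivation, not an excerpt from Herbie's rule database):

\[
\begin{aligned}
\texttt{pow1} &: \quad x = x^{1} \\
\texttt{pow-prod-down} &: \quad x^{a} \cdot y^{a} = \left(x \cdot y\right)^{a}
\end{aligned}
\]

Applying \(\texttt{pow1}\) three times tags each factor of \(x\) with an exponent of 1, and two applications of \(\texttt{pow-prod-down}\) then collapse the product into a single power, which simplification rewrites as \(x^{3}\).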

Reproduce

herbie shell --seed 2020064 
(FPCore (x)
  :name "Rosa's Benchmark"
  :precision binary64
  (- (* 0.954929658551372 x) (* 0.12900613773279798 (* (* x x) x))))