Average Error: 0.2 → 0.1
Time: 14.8s
Precision: 64
\[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
\[\mathsf{fma}\left(0.95492965855137202, x, -x \cdot \left(x \cdot \left(x \cdot 0.129006137732797982\right)\right)\right)\]
// Initial program: 0.954929658551372 * x - 0.12900613773279798 * x^3
double f(double x) {
    double r31897 = 0.954929658551372;
    double r31898 = x;
    double r31899 = r31897 * r31898;
    double r31900 = 0.12900613773279798;
    double r31901 = r31898 * r31898;
    double r31902 = r31901 * r31898;
    double r31903 = r31900 * r31902;
    double r31904 = r31899 - r31903;
    return r31904;
}

#include <math.h>  /* for fma */

// Rewritten program: fma(0.954929658551372, x, -x * (x * (x * 0.12900613773279798)))
double f(double x) {
    double r31905 = 0.954929658551372;
    double r31906 = x;
    double r31907 = 0.12900613773279798;
    double r31908 = r31906 * r31907;
    double r31909 = r31906 * r31908;
    double r31910 = r31906 * r31909;
    double r31911 = -r31910;
    double r31912 = fma(r31905, r31906, r31911);
    return r31912;
}

Error

[Plot: bits of error versus x]

Derivation

  1. Initial program 0.2

    \[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
  2. Simplified 0.2

    \[\leadsto \color{blue}{0.95492965855137202 \cdot x - 0.129006137732797982 \cdot {x}^{3}}\]
  3. Using strategy rm
  4. Applied fma-neg 0.1

    \[\leadsto \color{blue}{\mathsf{fma}\left(0.95492965855137202, x, -0.129006137732797982 \cdot {x}^{3}\right)}\]
  5. Simplified 0.1

    \[\leadsto \mathsf{fma}\left(0.95492965855137202, x, \color{blue}{-{x}^{3} \cdot 0.129006137732797982}\right)\]
  6. Using strategy rm
  7. Applied unpow3 0.2

    \[\leadsto \mathsf{fma}\left(0.95492965855137202, x, -\color{blue}{\left(\left(x \cdot x\right) \cdot x\right)} \cdot 0.129006137732797982\right)\]
  8. Applied associate-*l* 0.1

    \[\leadsto \mathsf{fma}\left(0.95492965855137202, x, -\color{blue}{\left(x \cdot x\right) \cdot \left(x \cdot 0.129006137732797982\right)}\right)\]
  9. Using strategy rm
  10. Applied associate-*l* 0.1

    \[\leadsto \mathsf{fma}\left(0.95492965855137202, x, -\color{blue}{x \cdot \left(x \cdot \left(x \cdot 0.129006137732797982\right)\right)}\right)\]
  11. Final simplification 0.1

    \[\leadsto \mathsf{fma}\left(0.95492965855137202, x, -x \cdot \left(x \cdot \left(x \cdot 0.129006137732797982\right)\right)\right)\]
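The decisive step is fma-neg (step 4); the remaining rewrites only reassociate the cubic term. A sketch of the rules as used above (names as reported by Herbie; the right-hand sides are inferred from the steps, not quoted from Herbie's rule database):

\[
\begin{aligned}
\textsf{fma-neg}:&\quad a \cdot b - c \;\leadsto\; \mathsf{fma}\left(a, b, -c\right)\\
\textsf{unpow3}:&\quad x^{3} \;\leadsto\; \left(x \cdot x\right) \cdot x\\
\textsf{associate-*l*}:&\quad \left(a \cdot b\right) \cdot c \;\leadsto\; a \cdot \left(b \cdot c\right)
\end{aligned}
\]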

Reproduce

herbie shell --seed 2020047 +o rules:numerics
(FPCore (x)
  :name "Rosa's Benchmark"
  :precision binary64
  (- (* 0.954929658551372 x) (* 0.12900613773279798 (* (* x x) x))))