Average Error: 0.2 → 0.1
Time: 1.5s
Precision: 64
\[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
\[\mathsf{fma}\left(0.95492965855137202, x, \left(-0.129006137732797982\right) \cdot {x}^{3}\right)\]
double f(double x) {
        /* direct evaluation: a*x - b*x^3 */
        return 0.954929658551372 * x
               - 0.12900613773279798 * ((x * x) * x);
}

#include <math.h>

double f(double x) {
        /* fused form: fma(a, x, c) computes a*x + c with a single rounding */
        return fma(0.954929658551372, x,
                   -0.12900613773279798 * pow(x, 3.0));
}

Error

[Plot: bits of error versus x]

Derivation

  1. Initial program (error 0.2)

    \[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
  2. Using strategy rm
  3. Applied fma-neg (error 0.2)

    \[\leadsto \color{blue}{\mathsf{fma}\left(0.95492965855137202, x, -0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\right)}\]
  4. Simplified (error 0.1)

    \[\leadsto \mathsf{fma}\left(0.95492965855137202, x, \color{blue}{\left(-0.129006137732797982\right) \cdot {x}^{3}}\right)\]
  5. Final simplification (error 0.1)

    \[\leadsto \mathsf{fma}\left(0.95492965855137202, x, \left(-0.129006137732797982\right) \cdot {x}^{3}\right)\]
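For reference, the fma-neg rewrite in step 3 rests on the single-rounding semantics of fused multiply-add (this identity is stated here as background; it is not part of the generated report):

```latex
\[
a \cdot x - c \;\leadsto\; \mathsf{fma}\left(a, x, -c\right),
\qquad
\mathsf{fma}\left(a, b, c\right) = \mathrm{round}\left(a \cdot b + c\right)
\]
```

Here $a = 0.95492965855137202$ and $c = 0.129006137732797982 \cdot x^3$, so the subtraction and the leading multiplication incur only one rounding instead of two.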

Reproduce

herbie shell --seed 2020027 +o rules:numerics
(FPCore (x)
  :name "Rosa's Benchmark"
  :precision binary64
  (- (* 0.954929658551372 x) (* 0.12900613773279798 (* (* x x) x))))