Average Error: 0.1 → 0.1
Time: 2.7s
Precision: 64
\[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
\[\mathsf{fma}\left(0.95492965855137202, x, \left(-0.129006137732797982\right) \cdot {x}^{3}\right)\]
double f(double x) {
        /* Input program: 0.954929658551372 * x - 0.12900613773279798 * x^3,
           with the cube computed as (x * x) * x. */
        double x_cubed = (x * x) * x;
        return 0.954929658551372 * x - 0.12900613773279798 * x_cubed;
}

#include <math.h>   /* for fma and pow */

double f(double x) {
        /* Output program: fma(0.954929658551372, x, -0.12900613773279798 * x^3),
           using a fused multiply-add for the leading term. */
        double x_cubed = pow(x, 3.0);
        return fma(0.954929658551372, x, -0.12900613773279798 * x_cubed);
}

Error

(Plot: bits of error versus x.)
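
The metric plotted here is error measured in bits. As a rough sketch of one common way to compute such a metric (an illustration of the ULP-based idea, not necessarily Herbie's exact implementation), one can count the floating-point values lying between the computed result and a reference result and take a base-2 logarithm:

#include <math.h>
#include <stdint.h>
#include <string.h>

/* Map a double onto an ordered integer scale so that the difference of two
   mapped values counts the floating-point numbers lying between them. */
static int64_t ordered(double x) {
        int64_t bits;
        memcpy(&bits, &x, sizeof bits);
        return bits >= 0 ? bits : INT64_MIN - bits;
}

/* Bits of error: log2 of one plus the ULP distance between the computed
   value and a (more accurate) reference value. */
static double bits_of_error(double computed, double reference) {
        int64_t a = ordered(computed), b = ordered(reference);
        uint64_t dist = a >= b ? (uint64_t)a - (uint64_t)b
                               : (uint64_t)b - (uint64_t)a;
        return log2(1.0 + (double)dist);
}

Herbie obtains the reference value from higher-precision arithmetic; the Average Error figure at the top of the report is, roughly, this quantity averaged over the sampled inputs.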

Derivation

  1. Initial program, error 0.1

    \[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
  2. Using strategy rm
  3. Applied fma-neg, error 0.1 (see the sketch after this derivation)

    \[\leadsto \color{blue}{\mathsf{fma}\left(0.95492965855137202, x, -0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\right)}\]
  4. Simplified, error 0.1

    \[\leadsto \mathsf{fma}\left(0.95492965855137202, x, \color{blue}{\left(-0.129006137732797982\right) \cdot {x}^{3}}\right)\]
  5. Final simplification, error 0.1

    \[\leadsto \mathsf{fma}\left(0.95492965855137202, x, \left(-0.129006137732797982\right) \cdot {x}^{3}\right)\]
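
The pivotal step is the fma-neg rewrite: in C, fma(a, x, c) computes a*x + c with a single rounding, so the subtraction a*x - y can be folded into fma(a, x, -y). A minimal illustration (the function names here are mine, not Herbie's):

#include <math.h>

/* a*x - y computed with two roundings: one for the product,
   one for the subtraction. */
double sub_direct(double a, double x, double y) {
        return a * x - y;
}

/* The fma-neg form: the product a*x is not rounded separately,
   so the whole expression incurs a single rounding. */
double sub_fma(double a, double x, double y) {
        return fma(a, x, -y);
}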

Reproduce

herbie shell --seed 2020035 +o rules:numerics
(FPCore (x)
  :name "Rosa's Benchmark"
  :precision binary64
  (- (* 0.954929658551372 x) (* 0.12900613773279798 (* (* x x) x))))
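
Assuming a matching Herbie version, the FPCore program above can be fed to the herbie shell command on standard input; the --seed flag fixes the random input sampling, so the run should be repeatable.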