Average Error: 0.1 → 0.1
Time: 2.4s
Precision: 64
\[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
\[0.95492965855137202 \cdot x - 1 \cdot \left({x}^{3} \cdot 0.129006137732797982\right)\]
double code(double x) {
	return ((0.954929658551372 * x) - (0.12900613773279798 * ((x * x) * x)));
}
#include <math.h>

double code(double x) {
	return ((0.954929658551372 * x) - (1.0 * (pow(x, 3.0) * 0.12900613773279798)));
}

Error

[Plot: bits of error versus x]


Derivation

  1. Initial program (error 0.1)

    \[0.95492965855137202 \cdot x - 0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
  2. Using strategy rm
  3. Applied *-un-lft-identity (error 0.1)

    \[\leadsto 0.95492965855137202 \cdot x - \color{blue}{\left(1 \cdot 0.129006137732797982\right)} \cdot \left(\left(x \cdot x\right) \cdot x\right)\]
  4. Applied associate-*l* (error 0.1)

    \[\leadsto 0.95492965855137202 \cdot x - \color{blue}{1 \cdot \left(0.129006137732797982 \cdot \left(\left(x \cdot x\right) \cdot x\right)\right)}\]
  5. Simplified (error 0.1)

    \[\leadsto 0.95492965855137202 \cdot x - 1 \cdot \color{blue}{\left({x}^{3} \cdot 0.129006137732797982\right)}\]
  6. Final simplification (error 0.1)

    \[\leadsto 0.95492965855137202 \cdot x - 1 \cdot \left({x}^{3} \cdot 0.129006137732797982\right)\]
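Every step above applies an exact algebraic identity, so the final expression is mathematically equal to the initial program; writing \(c = 0.129006137732797982\), the chain amounts to:

\[
c \cdot \left(\left(x \cdot x\right) \cdot x\right)
= \left(1 \cdot c\right) \cdot \left(\left(x \cdot x\right) \cdot x\right)
= 1 \cdot \left(c \cdot x^{3}\right)
= 1 \cdot \left(x^{3} \cdot c\right)
\]

which is why the measured error is unchanged at 0.1 bits.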

Reproduce

herbie shell --seed 2020057 
(FPCore (x)
  :name "Rosa's Benchmark"
  :precision binary64
  (- (* 0.954929658551372 x) (* 0.12900613773279798 (* (* x x) x))))