| Alternative 1 | |
|---|---|
| Accuracy | 100.0% |
| Cost | 6848 |
\[\mathsf{fma}\left(2 \cdot x, \varepsilon, \varepsilon \cdot \varepsilon\right)
\]
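
The rewritten form is algebraically exact: expanding the square cancels the dominant \(x^2\) term, leaving only the small residual,

\[
\left(x + \varepsilon\right)^2 - x^2 = x^2 + 2x\varepsilon + \varepsilon^2 - x^2 = 2x\varepsilon + \varepsilon^2,
\]

which \(\mathsf{fma}\left(2 \cdot x, \varepsilon, \varepsilon \cdot \varepsilon\right)\) evaluates with a single rounding. This avoids subtracting two nearly equal squares, the cancellation that destroys accuracy when \(\lvert\varepsilon\rvert \ll \lvert x \rvert\).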

Initial program (FPCore):

```
(FPCore (x eps) :precision binary64 (- (pow (+ x eps) 2.0) (pow x 2.0)))
```

Alternative 1 (FPCore):

```
(FPCore (x eps) :precision binary64 (fma (* 2.0 x) eps (* eps eps)))
```

Initial program (C):

```c
double code(double x, double eps) {
    return pow((x + eps), 2.0) - pow(x, 2.0);
}
```

Alternative 1 (C):

```c
double code(double x, double eps) {
    return fma((2.0 * x), eps, (eps * eps));
}
```

Initial program (Julia):

```julia
function code(x, eps)
    return Float64((Float64(x + eps) ^ 2.0) - (x ^ 2.0))
end
```

Alternative 1 (Julia):

```julia
function code(x, eps)
    return fma(Float64(2.0 * x), eps, Float64(eps * eps))
end
```

Initial program (Mathematica):

```mathematica
code[x_, eps_] := N[(N[Power[N[(x + eps), $MachinePrecision], 2.0], $MachinePrecision] - N[Power[x, 2.0], $MachinePrecision]), $MachinePrecision]
```

Alternative 1 (Mathematica):

```mathematica
code[x_, eps_] := N[(N[(2.0 * x), $MachinePrecision] * eps + N[(eps * eps), $MachinePrecision]), $MachinePrecision]
```

Initial program (TeX): \[ {\left(x + \varepsilon\right)}^{2} - {x}^{2} \]

Alternative 1 (TeX): \[ \mathsf{fma}\left(2 \cdot x, \varepsilon, \varepsilon \cdot \varepsilon\right) \]
Herbie found 5 alternatives:

| Alternative | Accuracy | Cost |
|---|---|---|
| 1 | 100.0% | 6848 |
| 2 | 90.6% | 585 |
| 3 | 90.5% | 584 |
| 4 | 100.0% | 448 |
| 5 | 72.6% | 192 |
Initial program: 74.7% accurate.

Simplified to 100.0% accurate via the following rewrites:
| Rule | Accuracy | Expression |
|---|---|---|
| Start | 74.7% | \( {\left(x + \varepsilon\right)}^{2} - {x}^{2} \) |
| unpow2 | 74.7% | \( \color{blue}{\left(x + \varepsilon\right) \cdot \left(x + \varepsilon\right)} - {x}^{2} \) |
| unpow2 | 74.7% | \( \left(x + \varepsilon\right) \cdot \left(x + \varepsilon\right) - \color{blue}{x \cdot x} \) |
| difference-of-squares | 74.7% | \( \color{blue}{\left(\left(x + \varepsilon\right) + x\right) \cdot \left(\left(x + \varepsilon\right) - x\right)} \) |
| *-commutative | 74.7% | \( \color{blue}{\left(\left(x + \varepsilon\right) - x\right) \cdot \left(\left(x + \varepsilon\right) + x\right)} \) |
| +-commutative | 74.7% | \( \left(\color{blue}{\left(\varepsilon + x\right)} - x\right) \cdot \left(\left(x + \varepsilon\right) + x\right) \) |
| associate--l+ | 100.0% | \( \color{blue}{\left(\varepsilon + \left(x - x\right)\right)} \cdot \left(\left(x + \varepsilon\right) + x\right) \) |
| +-inverses | 100.0% | \( \left(\varepsilon + \color{blue}{0}\right) \cdot \left(\left(x + \varepsilon\right) + x\right) \) |
| +-rgt-identity | 100.0% | \( \color{blue}{\varepsilon} \cdot \left(\left(x + \varepsilon\right) + x\right) \) |
| +-commutative | 100.0% | \( \varepsilon \cdot \color{blue}{\left(x + \left(x + \varepsilon\right)\right)} \) |
| associate-+r+ | 100.0% | \( \varepsilon \cdot \color{blue}{\left(\left(x + x\right) + \varepsilon\right)} \) |
| count-2 | 100.0% | \( \varepsilon \cdot \left(\color{blue}{2 \cdot x} + \varepsilon\right) \) |
| fma-def | 100.0% | \( \varepsilon \cdot \color{blue}{\mathsf{fma}\left(2, x, \varepsilon\right)} \) |
Applied egg-rr: 100.0% accurate, via the following rewrites:
| Rule | Accuracy | Expression |
|---|---|---|
| Start | 100.0% | \( \varepsilon \cdot \mathsf{fma}\left(2, x, \varepsilon\right) \) |
| fma-udef | 100.0% | \( \varepsilon \cdot \color{blue}{\left(2 \cdot x + \varepsilon\right)} \) |
| distribute-rgt-in | 99.9% | \( \color{blue}{\left(2 \cdot x\right) \cdot \varepsilon + \varepsilon \cdot \varepsilon} \) |
| fma-def | 100.0% | \( \color{blue}{\mathsf{fma}\left(2 \cdot x, \varepsilon, \varepsilon \cdot \varepsilon\right)} \) |
Final simplification: 100.0% accurate.
| Alternative 1 | |
|---|---|
| Accuracy | 100.0% |
| Cost | 6848 |
| Alternative 2 | |
|---|---|
| Accuracy | 90.6% |
| Cost | 585 |
| Alternative 3 | |
|---|---|
| Accuracy | 90.5% |
| Cost | 584 |
| Alternative 4 | |
|---|---|
| Accuracy | 100.0% |
| Cost | 448 |
| Alternative 5 | |
|---|---|
| Accuracy | 72.6% |
| Cost | 192 |
Reproduce with:

```
herbie shell --seed 2023178
```

```
(FPCore (x eps)
  :name "ENA, Section 1.4, Exercise 4b, n=2"
  :precision binary64
  :pre (and (and (<= -1000000000.0 x) (<= x 1000000000.0)) (and (<= -1.0 eps) (<= eps 1.0)))
  (- (pow (+ x eps) 2.0) (pow x 2.0)))
```