Data.Colour.SRGB:transferFunction from colour-2.3.3

Average Accuracy: 100.0% → 100.0%
Time: 2.8s
Precision: binary64
Cost: 6720


Initial program:

\[\left(x + 1\right) \cdot y - x \]

(FPCore (x y) :precision binary64 (- (* (+ x 1.0) y) x))

C:
double code(double x, double y) {
	return ((x + 1.0) * y) - x;
}

Julia:
function code(x, y)
	return Float64(Float64(Float64(x + 1.0) * y) - x)
end

Mathematica:
code[x_, y_] := N[(N[(N[(x + 1.0), $MachinePrecision] * y), $MachinePrecision] - x), $MachinePrecision]

Improved program:

\[\mathsf{fma}\left(x, y, y - x\right) \]

(FPCore (x y) :precision binary64 (fma x y (- y x)))

C:
double code(double x, double y) {
	return fma(x, y, (y - x));
}

Julia:
function code(x, y)
	return fma(x, y, Float64(y - x))
end

Mathematica:
code[x_, y_] := N[(x * y + N[(y - x), $MachinePrecision]), $MachinePrecision]
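Both programs compute the same real-valued expression; the rewrite fuses the multiply and add into a single rounding. As a quick sanity check (not part of Herbie's output), the sketch below compares both forms against exact rational arithmetic in Python; since a portable double-precision `fma` is only reachable from the Python standard library in recent versions, it emulates one with `fractions.Fraction`, which computes the exact product-plus-sum and then rounds once:

```python
from fractions import Fraction

def fma(x, y, z):
    # Emulate a fused multiply-add: compute x*y + z exactly as a
    # rational number, then round once to the nearest binary64 value.
    return float(Fraction(x) * Fraction(y) + Fraction(z))

def naive(x, y):
    # Initial program: ((x + 1) * y) - x, three rounded operations.
    return ((x + 1.0) * y) - x

def improved(x, y):
    # Herbie's output: fma(x, y, y - x), two rounded operations.
    return fma(x, y, y - x)

def exact(x, y):
    # Reference: evaluate (x + 1)*y - x in exact rational arithmetic,
    # rounding only once at the very end.
    xf, yf = Fraction(x), Fraction(y)
    return float((xf + 1) * yf - xf)

for x, y in [(3.0, 0.75), (1e-9, 0.5), (1e8, 0.25), (-2.5, 1e-3)]:
    print(x, y, naive(x, y), improved(x, y), exact(x, y))
```

At these sample points both forms stay within a few ulps of the correctly rounded reference, consistent with the 100% accuracy reported for both.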

Error

Derivation

  1. Initial program 100.0%

    \[\left(x + 1\right) \cdot y - x \]
  2. Simplified 100.0%

    \[\leadsto \color{blue}{\mathsf{fma}\left(x, y, y - x\right)} \]
    Proof

    [Start] 100.0%

    \[ \left(x + 1\right) \cdot y - x \]

    sub-neg [=>] 100.0%

    \[ \color{blue}{\left(x + 1\right) \cdot y + \left(-x\right)} \]

    *-commutative [=>] 100.0%

    \[ \color{blue}{y \cdot \left(x + 1\right)} + \left(-x\right) \]

    distribute-lft-in [=>] 100.0%

    \[ \color{blue}{\left(y \cdot x + y \cdot 1\right)} + \left(-x\right) \]

    associate-+l+ [=>] 100.0%

    \[ \color{blue}{y \cdot x + \left(y \cdot 1 + \left(-x\right)\right)} \]

    *-commutative [=>] 100.0%

    \[ \color{blue}{x \cdot y} + \left(y \cdot 1 + \left(-x\right)\right) \]

    *-lft-identity [<=] 100.0%

    \[ x \cdot \color{blue}{\left(1 \cdot y\right)} + \left(y \cdot 1 + \left(-x\right)\right) \]

    *-commutative [<=] 100.0%

    \[ x \cdot \color{blue}{\left(y \cdot 1\right)} + \left(y \cdot 1 + \left(-x\right)\right) \]

    fma-def [=>] 100.0%

    \[ \color{blue}{\mathsf{fma}\left(x, y \cdot 1, y \cdot 1 + \left(-x\right)\right)} \]

    *-commutative [=>] 100.0%

    \[ \mathsf{fma}\left(x, \color{blue}{1 \cdot y}, y \cdot 1 + \left(-x\right)\right) \]

    *-lft-identity [=>] 100.0%

    \[ \mathsf{fma}\left(x, \color{blue}{y}, y \cdot 1 + \left(-x\right)\right) \]

    sub-neg [<=] 100.0%

    \[ \mathsf{fma}\left(x, y, \color{blue}{y \cdot 1 - x}\right) \]

    *-commutative [=>] 100.0%

    \[ \mathsf{fma}\left(x, y, \color{blue}{1 \cdot y} - x\right) \]

    *-lft-identity [=>] 100.0%

    \[ \mathsf{fma}\left(x, y, \color{blue}{y} - x\right) \]
  3. Final simplification 100.0%

    \[\leadsto \mathsf{fma}\left(x, y, y - x\right) \]
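Every rule in the proof above (sub-neg, *-commutative, distribute-lft-in, associate-+l+, *-lft-identity, fma-def) is an exact identity over the reals, which is why the accuracy stays at 100.0% at each step. As an informal spot check (not part of Herbie's verification), the sketch below evaluates each intermediate form at random rational points with exact arithmetic and confirms they all agree; here `fma(a, b, c)` is simply `a*b + c` over the rationals:

```python
from fractions import Fraction
import random

def fma(a, b, c):
    # Over the reals/rationals, fma is exactly a*b + c.
    return a * b + c

# One lambda per intermediate form in the derivation, in proof order.
steps = [
    lambda x, y: (x + 1) * y - x,               # Start
    lambda x, y: (x + 1) * y + (-x),            # sub-neg
    lambda x, y: y * (x + 1) + (-x),            # *-commutative
    lambda x, y: (y * x + y * 1) + (-x),        # distribute-lft-in
    lambda x, y: y * x + (y * 1 + (-x)),        # associate-+l+
    lambda x, y: x * y + (y * 1 + (-x)),        # *-commutative
    lambda x, y: x * (1 * y) + (y * 1 + (-x)),  # *-lft-identity
    lambda x, y: x * (y * 1) + (y * 1 + (-x)),  # *-commutative
    lambda x, y: fma(x, y * 1, y * 1 + (-x)),   # fma-def
    lambda x, y: fma(x, 1 * y, y * 1 + (-x)),   # *-commutative
    lambda x, y: fma(x, y, y * 1 + (-x)),       # *-lft-identity
    lambda x, y: fma(x, y, y * 1 - x),          # sub-neg
    lambda x, y: fma(x, y, 1 * y - x),          # *-commutative
    lambda x, y: fma(x, y, y - x),              # *-lft-identity
]

random.seed(0)
for _ in range(100):
    x = Fraction(random.randint(-10**6, 10**6), random.randint(1, 10**3))
    y = Fraction(random.randint(-10**6, 10**6), random.randint(1, 10**3))
    # All fourteen forms must produce the same exact rational value.
    assert len({f(x, y) for f in steps}) == 1, (x, y)
```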

Alternatives

Alternative 1
Accuracy: 70.8%
Cost: 656
\[\begin{array}{l} \mathbf{if}\;y \leq -5 \cdot 10^{-25}:\\ \;\;\;\;y\\ \mathbf{elif}\;y \leq 3.3 \cdot 10^{-40}:\\ \;\;\;\;-x\\ \mathbf{elif}\;y \leq 1.15 \cdot 10^{-31}:\\ \;\;\;\;y\\ \mathbf{elif}\;y \leq 4.4 \cdot 10^{-13}:\\ \;\;\;\;-x\\ \mathbf{else}:\\ \;\;\;\;y\\ \end{array} \]
Alternative 2
Accuracy: 98.7%
Cost: 585
\[\begin{array}{l} \mathbf{if}\;y \leq -7.6 \lor \neg \left(y \leq 0.00092\right):\\ \;\;\;\;y \cdot \left(x + 1\right)\\ \mathbf{else}:\\ \;\;\;\;y - x\\ \end{array} \]
Alternative 3
Accuracy: 98.7%
Cost: 584
\[\begin{array}{l} \mathbf{if}\;y \leq -7.6:\\ \;\;\;\;y + x \cdot y\\ \mathbf{elif}\;y \leq 0.00092:\\ \;\;\;\;y - x\\ \mathbf{else}:\\ \;\;\;\;y \cdot \left(x + 1\right)\\ \end{array} \]
Alternative 4
Accuracy: 84.7%
Cost: 456
\[\begin{array}{l} \mathbf{if}\;x \leq 54000000000:\\ \;\;\;\;y - x\\ \mathbf{elif}\;x \leq 1.16 \cdot 10^{+78}:\\ \;\;\;\;x \cdot y\\ \mathbf{else}:\\ \;\;\;\;y - x\\ \end{array} \]
Alternative 5
Accuracy: 100.0%
Cost: 448
\[y + x \cdot \left(y + -1\right) \]
Alternative 6
Accuracy: 43.5%
Cost: 64
\[y \]
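Alternative 5 stands out: it keeps the 100.0% accuracy of the original at a fraction of the cost (448 vs. 6720), because \(y + x \cdot (y + -1)\) is algebraically identical to \((x + 1) \cdot y - x\), while the cheaper branching alternatives trade accuracy away (down to 43.5% for the constant-shaped \(y\)). A small sanity check (my own, not Herbie's) of that algebraic identity over exact rationals:

```python
from fractions import Fraction

def original(x, y):
    # Initial program: (x + 1) * y - x
    return (x + 1) * y - x

def alternative5(x, y):
    # Alternative 5: y + x * (y + -1); same real value, far fewer operations.
    return y + x * (y + (-1))

# Exhaustively compare on a small grid of rational points.
for num in range(-5, 6):
    for den in range(1, 4):
        x = Fraction(num, den)
        y = Fraction(num + 7, den + 1)
        assert original(x, y) == alternative5(x, y)
```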


Reproduce

herbie shell --seed 2023129 
(FPCore (x y)
  :name "Data.Colour.SRGB:transferFunction from colour-2.3.3"
  :precision binary64
  (- (* (+ x 1.0) y) x))