Logistic regression 2

Average Accuracy: 99.3% → 99.3%
Time: 9.9s
Precision: binary64
Cost: 19456


\[\log \left(1 + e^{x}\right) - x \cdot y \]
\[\mathsf{fma}\left(x, -y, \mathsf{log1p}\left(e^{x}\right)\right) \]
(FPCore (x y) :precision binary64 (- (log (+ 1.0 (exp x))) (* x y)))
(FPCore (x y) :precision binary64 (fma x (- y) (log1p (exp x))))
double code(double x, double y) {
	return log((1.0 + exp(x))) - (x * y);
}
double code(double x, double y) {
	return fma(x, -y, log1p(exp(x)));
}
function code(x, y)
	return Float64(log(Float64(1.0 + exp(x))) - Float64(x * y))
end
function code(x, y)
	return fma(x, Float64(-y), log1p(exp(x)))
end
code[x_, y_] := N[(N[Log[N[(1.0 + N[Exp[x], $MachinePrecision]), $MachinePrecision]], $MachinePrecision] - N[(x * y), $MachinePrecision]), $MachinePrecision]
code[x_, y_] := N[(x * (-y) + N[Log[1 + N[Exp[x], $MachinePrecision]], $MachinePrecision]), $MachinePrecision]

Error

Original: 99.3%
Target: 99.9%
Herbie: 99.3%

Target
\[\begin{array}{l} \mathbf{if}\;x \leq 0:\\ \;\;\;\;\log \left(1 + e^{x}\right) - x \cdot y\\ \mathbf{else}:\\ \;\;\;\;\log \left(1 + e^{-x}\right) - \left(-x\right) \cdot \left(1 - y\right)\\ \end{array} \]

Derivation

  1. Initial program 99.3%

    \[\log \left(1 + e^{x}\right) - x \cdot y \]
  2. Simplified 99.3%

    \[\leadsto \color{blue}{\mathsf{fma}\left(x, -y, \mathsf{log1p}\left(e^{x}\right)\right)} \]
    Proof

    [Start] 99.3

    \[ \log \left(1 + e^{x}\right) - x \cdot y \]

    sub-neg [=>] 99.3

    \[ \color{blue}{\log \left(1 + e^{x}\right) + \left(-x \cdot y\right)} \]

    +-commutative [=>] 99.3

    \[ \color{blue}{\left(-x \cdot y\right) + \log \left(1 + e^{x}\right)} \]

    distribute-rgt-neg-in [=>] 99.3

    \[ \color{blue}{x \cdot \left(-y\right)} + \log \left(1 + e^{x}\right) \]

    fma-def [=>] 99.3

    \[ \color{blue}{\mathsf{fma}\left(x, -y, \log \left(1 + e^{x}\right)\right)} \]

    log1p-def [=>] 99.3

    \[ \mathsf{fma}\left(x, -y, \color{blue}{\mathsf{log1p}\left(e^{x}\right)}\right) \]
  3. Final simplification 99.3%

    \[\leadsto \mathsf{fma}\left(x, -y, \mathsf{log1p}\left(e^{x}\right)\right) \]

Alternatives

Alternative 1
Accuracy: 99.3%
Cost: 13120
\[\mathsf{log1p}\left(e^{x}\right) - x \cdot y \]
Alternative 2
Accuracy: 98.6%
Cost: 6980
\[\begin{array}{l} \mathbf{if}\;x \leq -52000000:\\ \;\;\;\;y \cdot \left(-x\right)\\ \mathbf{else}:\\ \;\;\;\;x \cdot \left(0.5 - y\right) + \log 2\\ \end{array} \]
Alternative 3
Accuracy: 80.4%
Cost: 6852
\[\begin{array}{l} \mathbf{if}\;x \leq -0.00037:\\ \;\;\;\;y \cdot \left(-x\right)\\ \mathbf{else}:\\ \;\;\;\;\log 2 + x \cdot 0.5\\ \end{array} \]
Alternative 4
Accuracy: 98.1%
Cost: 6852
\[\begin{array}{l} \mathbf{if}\;x \leq -52000000:\\ \;\;\;\;y \cdot \left(-x\right)\\ \mathbf{else}:\\ \;\;\;\;\log 2 - x \cdot y\\ \end{array} \]
Alternative 5
Accuracy: 79.9%
Cost: 6596
\[\begin{array}{l} \mathbf{if}\;x \leq -7 \cdot 10^{-6}:\\ \;\;\;\;y \cdot \left(-x\right)\\ \mathbf{else}:\\ \;\;\;\;\log 2\\ \end{array} \]
Alternative 6
Accuracy: 47.3%
Cost: 256
\[y \cdot \left(-x\right) \]
Alternative 7
Accuracy: 3.5%
Cost: 192
\[x \cdot 0.5 \]
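The cheaper alternatives trade accuracy for cost by replacing log(1 + e^x) with limiting approximations: for very negative x the whole expression is close to y*(-x), and near x = 0 the expansion log(1 + e^x) ≈ log 2 + x/2 gives log(1 + e^x) - x*y ≈ x*(0.5 - y) + log 2. Alternative 2 combines both; a C sketch (function name `alt2` is ours):

```c
#include <math.h>

/* Sketch of Alternative 2: branch at the threshold from the report.
   First branch: log(1 + e^x) - x*y ~= -x*y for very negative x.
   Second branch: log(1 + e^x) ~= log 2 + x/2 near x = 0, giving
   x*(0.5 - y) + log 2. */
double alt2(double x, double y) {
    if (x <= -52000000.0)
        return y * -x;
    else
        return x * (0.5 - y) + log(2.0);
}
```

At x = 0 both the exact expression and `alt2` return log 2, which is why the linearized branch stays accurate for small |x|.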


Reproduce

herbie shell --seed 2023151 
(FPCore (x y)
  :name "Logistic regression 2"
  :precision binary64

  :herbie-target
  (if (<= x 0.0) (- (log (+ 1.0 (exp x))) (* x y)) (- (log (+ 1.0 (exp (- x)))) (* (- x) (- 1.0 y))))

  (- (log (+ 1.0 (exp x))) (* x y)))