Average Error: 0.5 → 0.5
Time: 15.5s
Precision: 64
\[\log \left(1 + e^{x}\right) - x \cdot y\]
\[\log \left(1 + e^{x}\right) + \left(-x \cdot y\right)\]
#include <math.h>

/* Original program: log(1 + e^x) - x*y */
double f(double x, double y) {
    return log(1.0 + exp(x)) - x * y;
}

#include <math.h>

/* Herbie's output: log(1 + e^x) + (-(x*y)) */
double f(double x, double y) {
    return log(1.0 + exp(x)) + (-(x * y));
}

Error

[Plots: bits error versus x; bits error versus y]


Target

Average error — Original: 0.5, Target: 0.1, Herbie: 0.5
\[\begin{array}{l} \mathbf{if}\;x \le 0.0:\\ \;\;\;\;\log \left(1 + e^{x}\right) - x \cdot y\\ \mathbf{else}:\\ \;\;\;\;\log \left(1 + e^{-x}\right) - \left(-x\right) \cdot \left(1 - y\right)\\ \end{array}\]

Derivation

  1. Initial program (error 0.5)

    \[\log \left(1 + e^{x}\right) - x \cdot y\]
  2. Using strategy rm
  3. Applied add-exp-log (error 0.5)

    \[\leadsto \color{blue}{e^{\log \left(\log \left(1 + e^{x}\right)\right)}} - x \cdot y\]
  4. Final simplification (error 0.5)

    \[\leadsto \log \left(1 + e^{x}\right) + \left(-x \cdot y\right)\]

Reproduce

herbie shell --seed 2019304 
(FPCore (x y)
  :name "Logistic regression 2"
  :precision binary64

  :herbie-target
  (if (<= x 0.0) (- (log (+ 1 (exp x))) (* x y)) (- (log (+ 1 (exp (- x)))) (* (- x) (- 1 y))))

  (- (log (+ 1 (exp x))) (* x y)))