Average Error: 0.6 → 0.5
Time: 11.9s
Precision: 64
\[\log \left(1 + e^{x}\right) - x \cdot y\]
\[e^{\mathsf{fma}\left(\frac{x}{\log 2}, 0.5, \log \left(\log 2\right)\right) + \left(x \cdot \frac{x}{\log 2}\right) \cdot \left(0.25 - \left(\frac{0.125}{\log 2} + \frac{\frac{1}{2}}{{2}^{2}}\right)\right)} - x \cdot y\]
#include <math.h>

double f(double x, double y) {
        /* Original program: log(1 + e^x) - x*y */
        return log(1.0 + exp(x)) - x * y;
}

#include <math.h>

double f(double x, double y) {
        /* Herbie's rewriting: exp of a degree-2 Taylor expansion of
           log(log(1 + e^x)) around x = 0, then subtract x*y. */
        double lg2 = log(2.0);
        double t = x / lg2;
        double e = fma(t, 0.5, log(lg2))
                 + (x * t) * (0.25 - (0.125 / lg2 + 0.5 / pow(2.0, 2.0)));
        return exp(e) - x * y;
}

Error

Bits error versus x and bits error versus y (plots). Average bits of error: Original 0.6, Target 0.1, Herbie 0.5.

Target
\[\begin{array}{l} \mathbf{if}\;x \le 0.0:\\ \;\;\;\;\log \left(1 + e^{x}\right) - x \cdot y\\ \mathbf{else}:\\ \;\;\;\;\log \left(1 + e^{-x}\right) - \left(-x\right) \cdot \left(1 - y\right)\\ \end{array}\]

Derivation

  1. Initial program 0.6

    \[\log \left(1 + e^{x}\right) - x \cdot y\]
  2. Using strategy rm
  3. Applied add-exp-log 0.6

    \[\leadsto \color{blue}{e^{\log \left(\log \left(1 + e^{x}\right)\right)}} - x \cdot y\]
  4. Taylor expanded around 0 7.9

    \[\leadsto e^{\color{blue}{\left(0.25 \cdot \frac{{x}^{2}}{\log 2} + \left(\log \left(\log 2\right) + 0.5 \cdot \frac{x}{\log 2}\right)\right) - \left(0.125 \cdot \frac{{x}^{2}}{{\left(\log 2\right)}^{2}} + \frac{1}{2} \cdot \frac{{x}^{2}}{\log 2 \cdot {2}^{2}}\right)}} - x \cdot y\]
  5. Simplified 0.5

    \[\leadsto e^{\color{blue}{\mathsf{fma}\left(\frac{x}{\log 2}, 0.5, \log \left(\log 2\right)\right) + \left(x \cdot \frac{x}{\log 2}\right) \cdot \left(0.25 - \left(\frac{0.125}{\log 2} + \frac{\frac{1}{2}}{{2}^{2}}\right)\right)}} - x \cdot y\]
  6. Final simplification 0.5

    \[\leadsto e^{\mathsf{fma}\left(\frac{x}{\log 2}, 0.5, \log \left(\log 2\right)\right) + \left(x \cdot \frac{x}{\log 2}\right) \cdot \left(0.25 - \left(\frac{0.125}{\log 2} + \frac{\frac{1}{2}}{{2}^{2}}\right)\right)} - x \cdot y\]
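The coefficients in step 5 can be checked by hand against the standard series (a verification of mine, not part of the report). Expanding around 0,

\[\log \left(1 + e^{x}\right) = \log 2 + \frac{x}{2} + \frac{{x}^{2}}{8} + O\left({x}^{4}\right)\]

and composing with the outer logarithm gives

\[\log \log \left(1 + e^{x}\right) = \log \left(\log 2\right) + \frac{x}{2 \log 2} + \left(\frac{1}{8 \log 2} - \frac{1}{8 {\left(\log 2\right)}^{2}}\right) {x}^{2} + O\left({x}^{3}\right)\]

which matches Herbie's quadratic coefficient, since

\[\frac{1}{\log 2} \left(0.25 - \left(\frac{0.125}{\log 2} + \frac{\frac{1}{2}}{{2}^{2}}\right)\right) = \frac{0.125}{\log 2} - \frac{0.125}{{\left(\log 2\right)}^{2}}.\]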

Reproduce

herbie shell --seed 2019351 +o rules:numerics
(FPCore (x y)
  :name "Logistic regression 2"
  :precision binary64

  :herbie-target
  (if (<= x 0.0) (- (log (+ 1 (exp x))) (* x y)) (- (log (+ 1 (exp (- x)))) (* (- x) (- 1 y))))

  (- (log (+ 1 (exp x))) (* x y)))