Average Error: 0.4 → 0.5
Time: 30.9s
Precision: 64
\[\log \left(1 + e^{x}\right) - x \cdot y\]
\[e^{\mathsf{fma}\left(\frac{1}{8}, \left(\frac{x}{\log 2} - \frac{\frac{x}{\log 2}}{\log 2}\right) \cdot x, \mathsf{fma}\left(\frac{1}{2}, \frac{x}{\log 2}, \log \left(\log 2\right)\right)\right)} - x \cdot y\]
#include <math.h>

double f(double x, double y) {
        // Original program: log(1 + e^x) - x * y
        return log(1.0 + exp(x)) - x * y;
}

double f(double x, double y) {
        // Herbie's output:
        // exp(fma(1/8, (x/ln2 - (x/ln2)/ln2) * x, fma(1/2, x/ln2, log(ln2)))) - x * y
        double ln2 = log(2.0);                  // log 2
        double t = x / ln2;                     // x / log 2
        double inner = fma(0.5, t, log(ln2));   // (1/2) t + log(log 2)
        double expo = fma(0.125, (t - t / ln2) * x, inner);
        return exp(expo) - x * y;
}

Error

Bits of error versus x and versus y (plots omitted). Average bits of error: Original 0.4, Target 0.0, Herbie 0.5.

Target
\[\begin{array}{l} \mathbf{if}\;x \le 0:\\ \;\;\;\;\log \left(1 + e^{x}\right) - x \cdot y\\ \mathbf{else}:\\ \;\;\;\;\log \left(1 + e^{-x}\right) - \left(-x\right) \cdot \left(1 - y\right)\\ \end{array}\]

Derivation

  1. Initial program 0.4

    \[\log \left(1 + e^{x}\right) - x \cdot y\]
  2. Simplified 0.4

    \[\leadsto \color{blue}{\mathsf{log1p}\left(e^{x}\right) - y \cdot x}\]
  3. Using strategy rm
  4. Applied add-exp-log 0.4

    \[\leadsto \color{blue}{e^{\log \left(\mathsf{log1p}\left(e^{x}\right)\right)}} - y \cdot x\]
  5. Taylor expanded around 0 7.3

    \[\leadsto e^{\color{blue}{\left(\frac{1}{8} \cdot \frac{{x}^{2}}{\log 2} + \left(\log \left(\log 2\right) + \frac{1}{2} \cdot \frac{x}{\log 2}\right)\right) - \frac{1}{8} \cdot \frac{{x}^{2}}{{\left(\log 2\right)}^{2}}}} - y \cdot x\]
  6. Simplified 0.5

    \[\leadsto e^{\color{blue}{\mathsf{fma}\left(\frac{1}{8}, x \cdot \left(\frac{x}{\log 2} - \frac{\frac{x}{\log 2}}{\log 2}\right), \mathsf{fma}\left(\frac{1}{2}, \frac{x}{\log 2}, \log \left(\log 2\right)\right)\right)}} - y \cdot x\]
  7. Final simplification 0.5

    \[\leadsto e^{\mathsf{fma}\left(\frac{1}{8}, \left(\frac{x}{\log 2} - \frac{\frac{x}{\log 2}}{\log 2}\right) \cdot x, \mathsf{fma}\left(\frac{1}{2}, \frac{x}{\log 2}, \log \left(\log 2\right)\right)\right)} - x \cdot y\]
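Step 5 can be checked by hand (this derivation is mine, not part of Herbie's output). Write \(g(x) = \log\left(1 + e^{x}\right)\), so that \(g(0) = \log 2\), \(g'(0) = \frac{1}{2}\), and \(g''(0) = \frac{1}{4}\). The second-order Taylor series of \(\log g(x)\) about 0 is then

\[\log g\left(x\right) \approx \log \left(\log 2\right) + \frac{g'(0)}{g(0)} \cdot x + \frac{1}{2} \left(\frac{g''(0)}{g(0)} - \frac{g'(0)^{2}}{g(0)^{2}}\right) {x}^{2} = \log \left(\log 2\right) + \frac{1}{2} \cdot \frac{x}{\log 2} + \frac{1}{8} \cdot \frac{{x}^{2}}{\log 2} - \frac{1}{8} \cdot \frac{{x}^{2}}{{\left(\log 2\right)}^{2}}\]

which is exactly the exponent appearing in step 5.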

Reproduce

herbie shell --seed 2019146 +o rules:numerics
(FPCore (x y)
  :name "Logistic regression 2"

  :herbie-target
  (if (<= x 0) (- (log (+ 1 (exp x))) (* x y)) (- (log (+ 1 (exp (- x)))) (* (- x) (- 1 y))))

  (- (log (+ 1 (exp x))) (* x y)))