Diagrams.Trail:splitAtParam from diagrams-lib-1.3.0.3, C

Percentage Accurate: 100.0% → 100.0%
Time: 4.2s
Alternatives: 8
Speedup: 1.0×

Specification

\[\begin{array}{l} \\ \frac{x - y}{1 - y} \end{array} \]
(FPCore (x y) :precision binary64 (/ (- x y) (- 1.0 y)))
double code(double x, double y) {
	return (x - y) / (1.0 - y);
}
real(8) function code(x, y)
    real(8), intent (in) :: x
    real(8), intent (in) :: y
    code = (x - y) / (1.0d0 - y)
end function
public static double code(double x, double y) {
	return (x - y) / (1.0 - y);
}
def code(x, y):
	return (x - y) / (1.0 - y)
function code(x, y)
	return Float64(Float64(x - y) / Float64(1.0 - y))
end
function tmp = code(x, y)
	tmp = (x - y) / (1.0 - y);
end
code[x_, y_] := N[(N[(x - y), $MachinePrecision] / N[(1.0 - y), $MachinePrecision]), $MachinePrecision]
\begin{array}{l}

\\
\frac{x - y}{1 - y}
\end{array}
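As a quick sanity check of the specification (a Python sketch of our own, not part of the Herbie report; the `exact` helper is our addition), both the numerator and the denominator of (x - y)/(1 - y) cancel in step as y approaches 1, which is why the original program samples as fully accurate:

```python
from fractions import Fraction

def code(x, y):
    # binary64 evaluation of the specification (x - y) / (1 - y)
    return (x - y) / (1.0 - y)

def exact(x, y):
    # reference value computed in exact rational arithmetic
    fx, fy = Fraction(x), Fraction(y)
    return float((fx - fy) / (1 - fy))

# near y = 1 both subtractions are exact (Sterbenz lemma), so the
# quotient carries only the division's single rounding error
x, y = 0.75, 0.9999999
assert abs(code(x, y) - exact(x, y)) <= 1e-9 * abs(exact(x, y))
```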

Sampling outcomes in binary64 precision:

Local Percentage Accuracy vs input value

The average percentage accuracy by input value. The horizontal axis shows the value of an input variable; the variable is chosen in the title. The vertical axis is accuracy; higher is better. Red represents the original program, while blue represents Herbie's suggestion. These can be toggled with buttons below the plot. The line is an average, while dots represent individual samples.

Accuracy vs Speed

Herbie found 8 alternatives:

The accuracy (vertical axis) and speed (horizontal axis) of each alternative. Up and to the right is better. The red square shows the initial program, and each blue circle shows an alternative. The line shows the best available speed-accuracy tradeoffs.

Initial Program: 100.0% accurate, 1.0× speedup

\[\begin{array}{l} \\ \frac{x - y}{1 - y} \end{array} \]
(FPCore (x y) :precision binary64 (/ (- x y) (- 1.0 y)))
double code(double x, double y) {
	return (x - y) / (1.0 - y);
}
real(8) function code(x, y)
    real(8), intent (in) :: x
    real(8), intent (in) :: y
    code = (x - y) / (1.0d0 - y)
end function
public static double code(double x, double y) {
	return (x - y) / (1.0 - y);
}
def code(x, y):
	return (x - y) / (1.0 - y)
function code(x, y)
	return Float64(Float64(x - y) / Float64(1.0 - y))
end
function tmp = code(x, y)
	tmp = (x - y) / (1.0 - y);
end
code[x_, y_] := N[(N[(x - y), $MachinePrecision] / N[(1.0 - y), $MachinePrecision]), $MachinePrecision]
\begin{array}{l}

\\
\frac{x - y}{1 - y}
\end{array}

Alternative 1: 100.0% accurate, 1.0× speedup

\[\begin{array}{l} \\ \frac{x - y}{1 - y} \end{array} \]
(FPCore (x y) :precision binary64 (/ (- x y) (- 1.0 y)))
double code(double x, double y) {
	return (x - y) / (1.0 - y);
}
real(8) function code(x, y)
    real(8), intent (in) :: x
    real(8), intent (in) :: y
    code = (x - y) / (1.0d0 - y)
end function
public static double code(double x, double y) {
	return (x - y) / (1.0 - y);
}
def code(x, y):
	return (x - y) / (1.0 - y)
function code(x, y)
	return Float64(Float64(x - y) / Float64(1.0 - y))
end
function tmp = code(x, y)
	tmp = (x - y) / (1.0 - y);
end
code[x_, y_] := N[(N[(x - y), $MachinePrecision] / N[(1.0 - y), $MachinePrecision]), $MachinePrecision]
\begin{array}{l}

\\
\frac{x - y}{1 - y}
\end{array}
Derivation
  1. Initial program 100.0%

    \[\frac{x - y}{1 - y} \]
  2. Final simplification 100.0%

    \[\leadsto \frac{x - y}{1 - y} \]

Alternative 2: 98.2% accurate, 0.6× speedup

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;y \leq -1.25 \lor \neg \left(y \leq 1\right):\\ \;\;\;\;1 + \frac{1 - x}{y}\\ \mathbf{else}:\\ \;\;\;\;x - y\\ \end{array} \end{array} \]
(FPCore (x y)
 :precision binary64
 (if (or (<= y -1.25) (not (<= y 1.0))) (+ 1.0 (/ (- 1.0 x) y)) (- x y)))
double code(double x, double y) {
	double tmp;
	if ((y <= -1.25) || !(y <= 1.0)) {
		tmp = 1.0 + ((1.0 - x) / y);
	} else {
		tmp = x - y;
	}
	return tmp;
}
real(8) function code(x, y)
    real(8), intent (in) :: x
    real(8), intent (in) :: y
    real(8) :: tmp
    if ((y <= (-1.25d0)) .or. (.not. (y <= 1.0d0))) then
        tmp = 1.0d0 + ((1.0d0 - x) / y)
    else
        tmp = x - y
    end if
    code = tmp
end function
public static double code(double x, double y) {
	double tmp;
	if ((y <= -1.25) || !(y <= 1.0)) {
		tmp = 1.0 + ((1.0 - x) / y);
	} else {
		tmp = x - y;
	}
	return tmp;
}
def code(x, y):
	tmp = 0
	if (y <= -1.25) or not (y <= 1.0):
		tmp = 1.0 + ((1.0 - x) / y)
	else:
		tmp = x - y
	return tmp
function code(x, y)
	tmp = 0.0
	if ((y <= -1.25) || !(y <= 1.0))
		tmp = Float64(1.0 + Float64(Float64(1.0 - x) / y));
	else
		tmp = Float64(x - y);
	end
	return tmp
end
function tmp_2 = code(x, y)
	tmp = 0.0;
	if ((y <= -1.25) || ~((y <= 1.0)))
		tmp = 1.0 + ((1.0 - x) / y);
	else
		tmp = x - y;
	end
	tmp_2 = tmp;
end
code[x_, y_] := If[Or[LessEqual[y, -1.25], N[Not[LessEqual[y, 1.0]], $MachinePrecision]], N[(1.0 + N[(N[(1.0 - x), $MachinePrecision] / y), $MachinePrecision]), $MachinePrecision], N[(x - y), $MachinePrecision]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;y \leq -1.25 \lor \neg \left(y \leq 1\right):\\
\;\;\;\;1 + \frac{1 - x}{y}\\

\mathbf{else}:\\
\;\;\;\;x - y\\


\end{array}
\end{array}
Derivation
  1. Split input into 2 regimes
  2. if y < -1.25 or 1 < y

    1. Initial program 100.0%

      \[\frac{x - y}{1 - y} \]
    2. Step-by-step derivation
      1. sub-neg100.0%

        \[\leadsto \frac{\color{blue}{x + \left(-y\right)}}{1 - y} \]
      2. +-commutative100.0%

        \[\leadsto \frac{\color{blue}{\left(-y\right) + x}}{1 - y} \]
      3. neg-sub0100.0%

        \[\leadsto \frac{\color{blue}{\left(0 - y\right)} + x}{1 - y} \]
      4. associate-+l-100.0%

        \[\leadsto \frac{\color{blue}{0 - \left(y - x\right)}}{1 - y} \]
      5. sub0-neg100.0%

        \[\leadsto \frac{\color{blue}{-\left(y - x\right)}}{1 - y} \]
      6. neg-mul-1100.0%

        \[\leadsto \frac{\color{blue}{-1 \cdot \left(y - x\right)}}{1 - y} \]
      7. sub-neg100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{1 + \left(-y\right)}} \]
      8. +-commutative100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(-y\right) + 1}} \]
      9. neg-sub0100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(0 - y\right)} + 1} \]
      10. associate-+l-100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{0 - \left(y - 1\right)}} \]
      11. sub0-neg100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-\left(y - 1\right)}} \]
      12. neg-mul-1100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-1 \cdot \left(y - 1\right)}} \]
      13. times-frac100.0%

        \[\leadsto \color{blue}{\frac{-1}{-1} \cdot \frac{y - x}{y - 1}} \]
      14. metadata-eval100.0%

        \[\leadsto \color{blue}{1} \cdot \frac{y - x}{y - 1} \]
      15. *-lft-identity100.0%

        \[\leadsto \color{blue}{\frac{y - x}{y - 1}} \]
      16. sub-neg100.0%

        \[\leadsto \frac{y - x}{\color{blue}{y + \left(-1\right)}} \]
      17. metadata-eval100.0%

        \[\leadsto \frac{y - x}{y + \color{blue}{-1}} \]
    3. Simplified 100.0%

      \[\leadsto \color{blue}{\frac{y - x}{y + -1}} \]
    4. Taylor expanded in y around inf 98.0%

      \[\leadsto \color{blue}{\frac{1}{y} + \left(1 + -1 \cdot \frac{x}{y}\right)} \]
    5. Step-by-step derivation
      1. +-commutative98.0%

        \[\leadsto \frac{1}{y} + \color{blue}{\left(-1 \cdot \frac{x}{y} + 1\right)} \]
      2. associate-+r+98.0%

        \[\leadsto \color{blue}{\left(\frac{1}{y} + -1 \cdot \frac{x}{y}\right) + 1} \]
      3. mul-1-neg98.0%

        \[\leadsto \left(\frac{1}{y} + \color{blue}{\left(-\frac{x}{y}\right)}\right) + 1 \]
      4. unsub-neg98.0%

        \[\leadsto \color{blue}{\left(\frac{1}{y} - \frac{x}{y}\right)} + 1 \]
      5. div-sub98.0%

        \[\leadsto \color{blue}{\frac{1 - x}{y}} + 1 \]
      6. unsub-neg98.0%

        \[\leadsto \frac{\color{blue}{1 + \left(-x\right)}}{y} + 1 \]
      7. mul-1-neg98.0%

        \[\leadsto \frac{1 + \color{blue}{-1 \cdot x}}{y} + 1 \]
      8. +-commutative98.0%

        \[\leadsto \frac{\color{blue}{-1 \cdot x + 1}}{y} + 1 \]
      9. metadata-eval98.0%

        \[\leadsto \frac{-1 \cdot x + \color{blue}{-1 \cdot -1}}{y} + 1 \]
      10. distribute-lft-in98.0%

        \[\leadsto \frac{\color{blue}{-1 \cdot \left(x + -1\right)}}{y} + 1 \]
      11. metadata-eval98.0%

        \[\leadsto \frac{-1 \cdot \left(x + \color{blue}{\left(-1\right)}\right)}{y} + 1 \]
      12. sub-neg98.0%

        \[\leadsto \frac{-1 \cdot \color{blue}{\left(x - 1\right)}}{y} + 1 \]
      13. associate-*r/98.0%

        \[\leadsto \color{blue}{-1 \cdot \frac{x - 1}{y}} + 1 \]
      14. +-commutative98.0%

        \[\leadsto \color{blue}{1 + -1 \cdot \frac{x - 1}{y}} \]
      15. associate-*r/98.0%

        \[\leadsto 1 + \color{blue}{\frac{-1 \cdot \left(x - 1\right)}{y}} \]
      16. sub-neg98.0%

        \[\leadsto 1 + \frac{-1 \cdot \color{blue}{\left(x + \left(-1\right)\right)}}{y} \]
      17. metadata-eval98.0%

        \[\leadsto 1 + \frac{-1 \cdot \left(x + \color{blue}{-1}\right)}{y} \]
      18. distribute-lft-in98.0%

        \[\leadsto 1 + \frac{\color{blue}{-1 \cdot x + -1 \cdot -1}}{y} \]
      19. metadata-eval98.0%

        \[\leadsto 1 + \frac{-1 \cdot x + \color{blue}{1}}{y} \]
      20. +-commutative98.0%

        \[\leadsto 1 + \frac{\color{blue}{1 + -1 \cdot x}}{y} \]
      21. mul-1-neg98.0%

        \[\leadsto 1 + \frac{1 + \color{blue}{\left(-x\right)}}{y} \]
      22. unsub-neg98.0%

        \[\leadsto 1 + \frac{\color{blue}{1 - x}}{y} \]
    6. Simplified 98.0%

      \[\leadsto \color{blue}{1 + \frac{1 - x}{y}} \]

    if -1.25 < y < 1

    1. Initial program 100.0%

      \[\frac{x - y}{1 - y} \]
    2. Step-by-step derivation
      1. sub-neg100.0%

        \[\leadsto \frac{\color{blue}{x + \left(-y\right)}}{1 - y} \]
      2. +-commutative100.0%

        \[\leadsto \frac{\color{blue}{\left(-y\right) + x}}{1 - y} \]
      3. neg-sub0100.0%

        \[\leadsto \frac{\color{blue}{\left(0 - y\right)} + x}{1 - y} \]
      4. associate-+l-100.0%

        \[\leadsto \frac{\color{blue}{0 - \left(y - x\right)}}{1 - y} \]
      5. sub0-neg100.0%

        \[\leadsto \frac{\color{blue}{-\left(y - x\right)}}{1 - y} \]
      6. neg-mul-1100.0%

        \[\leadsto \frac{\color{blue}{-1 \cdot \left(y - x\right)}}{1 - y} \]
      7. sub-neg100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{1 + \left(-y\right)}} \]
      8. +-commutative100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(-y\right) + 1}} \]
      9. neg-sub0100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(0 - y\right)} + 1} \]
      10. associate-+l-100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{0 - \left(y - 1\right)}} \]
      11. sub0-neg100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-\left(y - 1\right)}} \]
      12. neg-mul-1100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-1 \cdot \left(y - 1\right)}} \]
      13. times-frac100.0%

        \[\leadsto \color{blue}{\frac{-1}{-1} \cdot \frac{y - x}{y - 1}} \]
      14. metadata-eval100.0%

        \[\leadsto \color{blue}{1} \cdot \frac{y - x}{y - 1} \]
      15. *-lft-identity100.0%

        \[\leadsto \color{blue}{\frac{y - x}{y - 1}} \]
      16. sub-neg100.0%

        \[\leadsto \frac{y - x}{\color{blue}{y + \left(-1\right)}} \]
      17. metadata-eval100.0%

        \[\leadsto \frac{y - x}{y + \color{blue}{-1}} \]
    3. Simplified 100.0%

      \[\leadsto \color{blue}{\frac{y - x}{y + -1}} \]
    4. Taylor expanded in y around 0 99.4%

      \[\leadsto \color{blue}{-1 \cdot \left(y \cdot \left(1 + -1 \cdot x\right)\right) + \left(-1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right) + x\right)} \]
    5. Step-by-step derivation
      1. associate-+r+99.4%

        \[\leadsto \color{blue}{\left(-1 \cdot \left(y \cdot \left(1 + -1 \cdot x\right)\right) + -1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right)\right) + x} \]
      2. +-commutative99.4%

        \[\leadsto \color{blue}{x + \left(-1 \cdot \left(y \cdot \left(1 + -1 \cdot x\right)\right) + -1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right)\right)} \]
      3. associate-*r*99.4%

        \[\leadsto x + \left(\color{blue}{\left(-1 \cdot y\right) \cdot \left(1 + -1 \cdot x\right)} + -1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right)\right) \]
      4. neg-mul-199.4%

        \[\leadsto x + \left(\color{blue}{\left(-y\right)} \cdot \left(1 + -1 \cdot x\right) + -1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right)\right) \]
      5. associate-*r*99.4%

        \[\leadsto x + \left(\left(-y\right) \cdot \left(1 + -1 \cdot x\right) + \color{blue}{\left(-1 \cdot {y}^{2}\right) \cdot \left(1 + -1 \cdot x\right)}\right) \]
      6. distribute-rgt-out99.4%

        \[\leadsto x + \color{blue}{\left(1 + -1 \cdot x\right) \cdot \left(\left(-y\right) + -1 \cdot {y}^{2}\right)} \]
      7. mul-1-neg99.4%

        \[\leadsto x + \left(1 + \color{blue}{\left(-x\right)}\right) \cdot \left(\left(-y\right) + -1 \cdot {y}^{2}\right) \]
      8. unsub-neg99.4%

        \[\leadsto x + \color{blue}{\left(1 - x\right)} \cdot \left(\left(-y\right) + -1 \cdot {y}^{2}\right) \]
      9. mul-1-neg99.4%

        \[\leadsto x + \left(1 - x\right) \cdot \left(\left(-y\right) + \color{blue}{\left(-{y}^{2}\right)}\right) \]
      10. unsub-neg99.4%

        \[\leadsto x + \left(1 - x\right) \cdot \color{blue}{\left(\left(-y\right) - {y}^{2}\right)} \]
      11. unpow299.4%

        \[\leadsto x + \left(1 - x\right) \cdot \left(\left(-y\right) - \color{blue}{y \cdot y}\right) \]
    6. Simplified 99.4%

      \[\leadsto \color{blue}{x + \left(1 - x\right) \cdot \left(\left(-y\right) - y \cdot y\right)} \]
    7. Taylor expanded in x around 0 98.7%

      \[\leadsto x + \color{blue}{-1 \cdot \left(y + {y}^{2}\right)} \]
    8. Step-by-step derivation
      1. unpow298.7%

        \[\leadsto x + -1 \cdot \left(y + \color{blue}{y \cdot y}\right) \]
      2. distribute-lft-in98.7%

        \[\leadsto x + \color{blue}{\left(-1 \cdot y + -1 \cdot \left(y \cdot y\right)\right)} \]
      3. mul-1-neg98.7%

        \[\leadsto x + \left(-1 \cdot y + \color{blue}{\left(-y \cdot y\right)}\right) \]
      4. distribute-lft-neg-in98.7%

        \[\leadsto x + \left(-1 \cdot y + \color{blue}{\left(-y\right) \cdot y}\right) \]
      5. mul-1-neg98.7%

        \[\leadsto x + \left(-1 \cdot y + \color{blue}{\left(-1 \cdot y\right)} \cdot y\right) \]
      6. distribute-rgt-in98.7%

        \[\leadsto x + \color{blue}{y \cdot \left(-1 + -1 \cdot y\right)} \]
      7. mul-1-neg98.7%

        \[\leadsto x + y \cdot \left(-1 + \color{blue}{\left(-y\right)}\right) \]
      8. sub-neg98.7%

        \[\leadsto x + y \cdot \color{blue}{\left(-1 - y\right)} \]
    9. Simplified 98.7%

      \[\leadsto x + \color{blue}{y \cdot \left(-1 - y\right)} \]
    10. Taylor expanded in y around 0 97.8%

      \[\leadsto \color{blue}{-1 \cdot y + x} \]
    11. Step-by-step derivation
      1. neg-mul-197.8%

        \[\leadsto \color{blue}{\left(-y\right)} + x \]
      2. +-commutative97.8%

        \[\leadsto \color{blue}{x + \left(-y\right)} \]
      3. unsub-neg97.8%

        \[\leadsto \color{blue}{x - y} \]
    12. Simplified 97.8%

      \[\leadsto \color{blue}{x - y} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification 97.9%

    \[\leadsto \begin{array}{l} \mathbf{if}\;y \leq -1.25 \lor \neg \left(y \leq 1\right):\\ \;\;\;\;1 + \frac{1 - x}{y}\\ \mathbf{else}:\\ \;\;\;\;x - y\\ \end{array} \]
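To see why the first branch helps, here is a small Python sketch (ours, not Herbie's; the function names are made up) comparing Alternative 2 against the original for large |y|, where (x - y)/(1 - y) behaves like 1 + (1 - x)/y:

```python
def original(x, y):
    return (x - y) / (1.0 - y)

def alt2(x, y):
    # Herbie's Alternative 2: series form for large |y|,
    # direct subtraction otherwise
    if y <= -1.25 or not (y <= 1.0):
        return 1.0 + (1.0 - x) / y
    return x - y

# for huge |y| the quotient is close to 1, and the rewritten
# branch agrees with the original to within a few ulps
for y in (-1e12, 1e15):
    for x in (0.5, -3.0):
        a, b = original(x, y), alt2(x, y)
        assert abs(a - b) <= 1e-12 * max(abs(a), abs(b))
```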

Alternative 3: 98.5% accurate, 0.6× speedup

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;y \leq -1 \lor \neg \left(y \leq 1\right):\\ \;\;\;\;1 + \frac{1 - x}{y}\\ \mathbf{else}:\\ \;\;\;\;x + y \cdot \left(-1 - y\right)\\ \end{array} \end{array} \]
(FPCore (x y)
 :precision binary64
 (if (or (<= y -1.0) (not (<= y 1.0)))
   (+ 1.0 (/ (- 1.0 x) y))
   (+ x (* y (- -1.0 y)))))
double code(double x, double y) {
	double tmp;
	if ((y <= -1.0) || !(y <= 1.0)) {
		tmp = 1.0 + ((1.0 - x) / y);
	} else {
		tmp = x + (y * (-1.0 - y));
	}
	return tmp;
}
real(8) function code(x, y)
    real(8), intent (in) :: x
    real(8), intent (in) :: y
    real(8) :: tmp
    if ((y <= (-1.0d0)) .or. (.not. (y <= 1.0d0))) then
        tmp = 1.0d0 + ((1.0d0 - x) / y)
    else
        tmp = x + (y * ((-1.0d0) - y))
    end if
    code = tmp
end function
public static double code(double x, double y) {
	double tmp;
	if ((y <= -1.0) || !(y <= 1.0)) {
		tmp = 1.0 + ((1.0 - x) / y);
	} else {
		tmp = x + (y * (-1.0 - y));
	}
	return tmp;
}
def code(x, y):
	tmp = 0
	if (y <= -1.0) or not (y <= 1.0):
		tmp = 1.0 + ((1.0 - x) / y)
	else:
		tmp = x + (y * (-1.0 - y))
	return tmp
function code(x, y)
	tmp = 0.0
	if ((y <= -1.0) || !(y <= 1.0))
		tmp = Float64(1.0 + Float64(Float64(1.0 - x) / y));
	else
		tmp = Float64(x + Float64(y * Float64(-1.0 - y)));
	end
	return tmp
end
function tmp_2 = code(x, y)
	tmp = 0.0;
	if ((y <= -1.0) || ~((y <= 1.0)))
		tmp = 1.0 + ((1.0 - x) / y);
	else
		tmp = x + (y * (-1.0 - y));
	end
	tmp_2 = tmp;
end
code[x_, y_] := If[Or[LessEqual[y, -1.0], N[Not[LessEqual[y, 1.0]], $MachinePrecision]], N[(1.0 + N[(N[(1.0 - x), $MachinePrecision] / y), $MachinePrecision]), $MachinePrecision], N[(x + N[(y * N[(-1.0 - y), $MachinePrecision]), $MachinePrecision]), $MachinePrecision]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;y \leq -1 \lor \neg \left(y \leq 1\right):\\
\;\;\;\;1 + \frac{1 - x}{y}\\

\mathbf{else}:\\
\;\;\;\;x + y \cdot \left(-1 - y\right)\\


\end{array}
\end{array}
Derivation
  1. Split input into 2 regimes
  2. if y < -1 or 1 < y

    1. Initial program 100.0%

      \[\frac{x - y}{1 - y} \]
    2. Step-by-step derivation
      1. sub-neg100.0%

        \[\leadsto \frac{\color{blue}{x + \left(-y\right)}}{1 - y} \]
      2. +-commutative100.0%

        \[\leadsto \frac{\color{blue}{\left(-y\right) + x}}{1 - y} \]
      3. neg-sub0100.0%

        \[\leadsto \frac{\color{blue}{\left(0 - y\right)} + x}{1 - y} \]
      4. associate-+l-100.0%

        \[\leadsto \frac{\color{blue}{0 - \left(y - x\right)}}{1 - y} \]
      5. sub0-neg100.0%

        \[\leadsto \frac{\color{blue}{-\left(y - x\right)}}{1 - y} \]
      6. neg-mul-1100.0%

        \[\leadsto \frac{\color{blue}{-1 \cdot \left(y - x\right)}}{1 - y} \]
      7. sub-neg100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{1 + \left(-y\right)}} \]
      8. +-commutative100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(-y\right) + 1}} \]
      9. neg-sub0100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(0 - y\right)} + 1} \]
      10. associate-+l-100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{0 - \left(y - 1\right)}} \]
      11. sub0-neg100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-\left(y - 1\right)}} \]
      12. neg-mul-1100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-1 \cdot \left(y - 1\right)}} \]
      13. times-frac100.0%

        \[\leadsto \color{blue}{\frac{-1}{-1} \cdot \frac{y - x}{y - 1}} \]
      14. metadata-eval100.0%

        \[\leadsto \color{blue}{1} \cdot \frac{y - x}{y - 1} \]
      15. *-lft-identity100.0%

        \[\leadsto \color{blue}{\frac{y - x}{y - 1}} \]
      16. sub-neg100.0%

        \[\leadsto \frac{y - x}{\color{blue}{y + \left(-1\right)}} \]
      17. metadata-eval100.0%

        \[\leadsto \frac{y - x}{y + \color{blue}{-1}} \]
    3. Simplified 100.0%

      \[\leadsto \color{blue}{\frac{y - x}{y + -1}} \]
    4. Taylor expanded in y around inf 98.0%

      \[\leadsto \color{blue}{\frac{1}{y} + \left(1 + -1 \cdot \frac{x}{y}\right)} \]
    5. Step-by-step derivation
      1. +-commutative98.0%

        \[\leadsto \frac{1}{y} + \color{blue}{\left(-1 \cdot \frac{x}{y} + 1\right)} \]
      2. associate-+r+98.0%

        \[\leadsto \color{blue}{\left(\frac{1}{y} + -1 \cdot \frac{x}{y}\right) + 1} \]
      3. mul-1-neg98.0%

        \[\leadsto \left(\frac{1}{y} + \color{blue}{\left(-\frac{x}{y}\right)}\right) + 1 \]
      4. unsub-neg98.0%

        \[\leadsto \color{blue}{\left(\frac{1}{y} - \frac{x}{y}\right)} + 1 \]
      5. div-sub98.0%

        \[\leadsto \color{blue}{\frac{1 - x}{y}} + 1 \]
      6. unsub-neg98.0%

        \[\leadsto \frac{\color{blue}{1 + \left(-x\right)}}{y} + 1 \]
      7. mul-1-neg98.0%

        \[\leadsto \frac{1 + \color{blue}{-1 \cdot x}}{y} + 1 \]
      8. +-commutative98.0%

        \[\leadsto \frac{\color{blue}{-1 \cdot x + 1}}{y} + 1 \]
      9. metadata-eval98.0%

        \[\leadsto \frac{-1 \cdot x + \color{blue}{-1 \cdot -1}}{y} + 1 \]
      10. distribute-lft-in98.0%

        \[\leadsto \frac{\color{blue}{-1 \cdot \left(x + -1\right)}}{y} + 1 \]
      11. metadata-eval98.0%

        \[\leadsto \frac{-1 \cdot \left(x + \color{blue}{\left(-1\right)}\right)}{y} + 1 \]
      12. sub-neg98.0%

        \[\leadsto \frac{-1 \cdot \color{blue}{\left(x - 1\right)}}{y} + 1 \]
      13. associate-*r/98.0%

        \[\leadsto \color{blue}{-1 \cdot \frac{x - 1}{y}} + 1 \]
      14. +-commutative98.0%

        \[\leadsto \color{blue}{1 + -1 \cdot \frac{x - 1}{y}} \]
      15. associate-*r/98.0%

        \[\leadsto 1 + \color{blue}{\frac{-1 \cdot \left(x - 1\right)}{y}} \]
      16. sub-neg98.0%

        \[\leadsto 1 + \frac{-1 \cdot \color{blue}{\left(x + \left(-1\right)\right)}}{y} \]
      17. metadata-eval98.0%

        \[\leadsto 1 + \frac{-1 \cdot \left(x + \color{blue}{-1}\right)}{y} \]
      18. distribute-lft-in98.0%

        \[\leadsto 1 + \frac{\color{blue}{-1 \cdot x + -1 \cdot -1}}{y} \]
      19. metadata-eval98.0%

        \[\leadsto 1 + \frac{-1 \cdot x + \color{blue}{1}}{y} \]
      20. +-commutative98.0%

        \[\leadsto 1 + \frac{\color{blue}{1 + -1 \cdot x}}{y} \]
      21. mul-1-neg98.0%

        \[\leadsto 1 + \frac{1 + \color{blue}{\left(-x\right)}}{y} \]
      22. unsub-neg98.0%

        \[\leadsto 1 + \frac{\color{blue}{1 - x}}{y} \]
    6. Simplified 98.0%

      \[\leadsto \color{blue}{1 + \frac{1 - x}{y}} \]

    if -1 < y < 1

    1. Initial program 100.0%

      \[\frac{x - y}{1 - y} \]
    2. Step-by-step derivation
      1. sub-neg100.0%

        \[\leadsto \frac{\color{blue}{x + \left(-y\right)}}{1 - y} \]
      2. +-commutative100.0%

        \[\leadsto \frac{\color{blue}{\left(-y\right) + x}}{1 - y} \]
      3. neg-sub0100.0%

        \[\leadsto \frac{\color{blue}{\left(0 - y\right)} + x}{1 - y} \]
      4. associate-+l-100.0%

        \[\leadsto \frac{\color{blue}{0 - \left(y - x\right)}}{1 - y} \]
      5. sub0-neg100.0%

        \[\leadsto \frac{\color{blue}{-\left(y - x\right)}}{1 - y} \]
      6. neg-mul-1100.0%

        \[\leadsto \frac{\color{blue}{-1 \cdot \left(y - x\right)}}{1 - y} \]
      7. sub-neg100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{1 + \left(-y\right)}} \]
      8. +-commutative100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(-y\right) + 1}} \]
      9. neg-sub0100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(0 - y\right)} + 1} \]
      10. associate-+l-100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{0 - \left(y - 1\right)}} \]
      11. sub0-neg100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-\left(y - 1\right)}} \]
      12. neg-mul-1100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-1 \cdot \left(y - 1\right)}} \]
      13. times-frac100.0%

        \[\leadsto \color{blue}{\frac{-1}{-1} \cdot \frac{y - x}{y - 1}} \]
      14. metadata-eval100.0%

        \[\leadsto \color{blue}{1} \cdot \frac{y - x}{y - 1} \]
      15. *-lft-identity100.0%

        \[\leadsto \color{blue}{\frac{y - x}{y - 1}} \]
      16. sub-neg100.0%

        \[\leadsto \frac{y - x}{\color{blue}{y + \left(-1\right)}} \]
      17. metadata-eval100.0%

        \[\leadsto \frac{y - x}{y + \color{blue}{-1}} \]
    3. Simplified 100.0%

      \[\leadsto \color{blue}{\frac{y - x}{y + -1}} \]
    4. Taylor expanded in y around 0 99.4%

      \[\leadsto \color{blue}{-1 \cdot \left(y \cdot \left(1 + -1 \cdot x\right)\right) + \left(-1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right) + x\right)} \]
    5. Step-by-step derivation
      1. associate-+r+99.4%

        \[\leadsto \color{blue}{\left(-1 \cdot \left(y \cdot \left(1 + -1 \cdot x\right)\right) + -1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right)\right) + x} \]
      2. +-commutative99.4%

        \[\leadsto \color{blue}{x + \left(-1 \cdot \left(y \cdot \left(1 + -1 \cdot x\right)\right) + -1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right)\right)} \]
      3. associate-*r*99.4%

        \[\leadsto x + \left(\color{blue}{\left(-1 \cdot y\right) \cdot \left(1 + -1 \cdot x\right)} + -1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right)\right) \]
      4. neg-mul-199.4%

        \[\leadsto x + \left(\color{blue}{\left(-y\right)} \cdot \left(1 + -1 \cdot x\right) + -1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right)\right) \]
      5. associate-*r*99.4%

        \[\leadsto x + \left(\left(-y\right) \cdot \left(1 + -1 \cdot x\right) + \color{blue}{\left(-1 \cdot {y}^{2}\right) \cdot \left(1 + -1 \cdot x\right)}\right) \]
      6. distribute-rgt-out99.4%

        \[\leadsto x + \color{blue}{\left(1 + -1 \cdot x\right) \cdot \left(\left(-y\right) + -1 \cdot {y}^{2}\right)} \]
      7. mul-1-neg99.4%

        \[\leadsto x + \left(1 + \color{blue}{\left(-x\right)}\right) \cdot \left(\left(-y\right) + -1 \cdot {y}^{2}\right) \]
      8. unsub-neg99.4%

        \[\leadsto x + \color{blue}{\left(1 - x\right)} \cdot \left(\left(-y\right) + -1 \cdot {y}^{2}\right) \]
      9. mul-1-neg99.4%

        \[\leadsto x + \left(1 - x\right) \cdot \left(\left(-y\right) + \color{blue}{\left(-{y}^{2}\right)}\right) \]
      10. unsub-neg99.4%

        \[\leadsto x + \left(1 - x\right) \cdot \color{blue}{\left(\left(-y\right) - {y}^{2}\right)} \]
      11. unpow299.4%

        \[\leadsto x + \left(1 - x\right) \cdot \left(\left(-y\right) - \color{blue}{y \cdot y}\right) \]
    6. Simplified 99.4%

      \[\leadsto \color{blue}{x + \left(1 - x\right) \cdot \left(\left(-y\right) - y \cdot y\right)} \]
    7. Taylor expanded in x around 0 98.7%

      \[\leadsto x + \color{blue}{-1 \cdot \left(y + {y}^{2}\right)} \]
    8. Step-by-step derivation
      1. unpow298.7%

        \[\leadsto x + -1 \cdot \left(y + \color{blue}{y \cdot y}\right) \]
      2. distribute-lft-in98.7%

        \[\leadsto x + \color{blue}{\left(-1 \cdot y + -1 \cdot \left(y \cdot y\right)\right)} \]
      3. mul-1-neg98.7%

        \[\leadsto x + \left(-1 \cdot y + \color{blue}{\left(-y \cdot y\right)}\right) \]
      4. distribute-lft-neg-in98.7%

        \[\leadsto x + \left(-1 \cdot y + \color{blue}{\left(-y\right) \cdot y}\right) \]
      5. mul-1-neg98.7%

        \[\leadsto x + \left(-1 \cdot y + \color{blue}{\left(-1 \cdot y\right)} \cdot y\right) \]
      6. distribute-rgt-in98.7%

        \[\leadsto x + \color{blue}{y \cdot \left(-1 + -1 \cdot y\right)} \]
      7. mul-1-neg98.7%

        \[\leadsto x + y \cdot \left(-1 + \color{blue}{\left(-y\right)}\right) \]
      8. sub-neg98.7%

        \[\leadsto x + y \cdot \color{blue}{\left(-1 - y\right)} \]
    9. Simplified 98.7%

      \[\leadsto x + \color{blue}{y \cdot \left(-1 - y\right)} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification 98.4%

    \[\leadsto \begin{array}{l} \mathbf{if}\;y \leq -1 \lor \neg \left(y \leq 1\right):\\ \;\;\;\;1 + \frac{1 - x}{y}\\ \mathbf{else}:\\ \;\;\;\;x + y \cdot \left(-1 - y\right)\\ \end{array} \]
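The else branch x + y · (−1 − y) is a truncated Taylor series, so it matches the original only up to the dropped x·y cross term. A Python sketch (ours, with made-up names) checks that for small |y| the absolute disagreement is on the order of |y|:

```python
def original(x, y):
    return (x - y) / (1.0 - y)

def alt3(x, y):
    # Herbie's Alternative 3: series form for large |y|,
    # quadratic Taylor form for -1 < y <= 1
    if y <= -1.0 or not (y <= 1.0):
        return 1.0 + (1.0 - x) / y
    return x + y * (-1.0 - y)

# the quadratic branch drops the x*y term, so for small |y| the
# two programs agree to O(|y|) absolute error
x, y = 0.5, 1e-8
assert abs(original(x, y) - alt3(x, y)) <= 2.0 * abs(y)
```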

Alternative 4: 86.8% accurate, 0.8× speedup

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;y \leq -8.5 \cdot 10^{-13} \lor \neg \left(y \leq 6 \cdot 10^{-10}\right):\\ \;\;\;\;\frac{y}{y + -1}\\ \mathbf{else}:\\ \;\;\;\;x - y\\ \end{array} \end{array} \]
(FPCore (x y)
 :precision binary64
 (if (or (<= y -8.5e-13) (not (<= y 6e-10))) (/ y (+ y -1.0)) (- x y)))
double code(double x, double y) {
	double tmp;
	if ((y <= -8.5e-13) || !(y <= 6e-10)) {
		tmp = y / (y + -1.0);
	} else {
		tmp = x - y;
	}
	return tmp;
}
real(8) function code(x, y)
    real(8), intent (in) :: x
    real(8), intent (in) :: y
    real(8) :: tmp
    if ((y <= (-8.5d-13)) .or. (.not. (y <= 6d-10))) then
        tmp = y / (y + (-1.0d0))
    else
        tmp = x - y
    end if
    code = tmp
end function
public static double code(double x, double y) {
	double tmp;
	if ((y <= -8.5e-13) || !(y <= 6e-10)) {
		tmp = y / (y + -1.0);
	} else {
		tmp = x - y;
	}
	return tmp;
}
def code(x, y):
	tmp = 0
	if (y <= -8.5e-13) or not (y <= 6e-10):
		tmp = y / (y + -1.0)
	else:
		tmp = x - y
	return tmp
function code(x, y)
	tmp = 0.0
	if ((y <= -8.5e-13) || !(y <= 6e-10))
		tmp = Float64(y / Float64(y + -1.0));
	else
		tmp = Float64(x - y);
	end
	return tmp
end
function tmp_2 = code(x, y)
	tmp = 0.0;
	if ((y <= -8.5e-13) || ~((y <= 6e-10)))
		tmp = y / (y + -1.0);
	else
		tmp = x - y;
	end
	tmp_2 = tmp;
end
code[x_, y_] := If[Or[LessEqual[y, -8.5e-13], N[Not[LessEqual[y, 6e-10]], $MachinePrecision]], N[(y / N[(y + -1.0), $MachinePrecision]), $MachinePrecision], N[(x - y), $MachinePrecision]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;y \leq -8.5 \cdot 10^{-13} \lor \neg \left(y \leq 6 \cdot 10^{-10}\right):\\
\;\;\;\;\frac{y}{y + -1}\\

\mathbf{else}:\\
\;\;\;\;x - y\\


\end{array}
\end{array}
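Alternative 4 is markedly less accurate (86.8%) because its first branch, y/(y + −1), comes from Taylor-expanding in x around 0 and drops x entirely. The sketch below (ours, with made-up names and inputs) shows the approximation is fine only when x is negligible:

```python
def original(x, y):
    return (x - y) / (1.0 - y)

def alt4(x, y):
    # Herbie's Alternative 4: ignores x outside a narrow band of y
    if y <= -8.5e-13 or not (y <= 6e-10):
        return y / (y + -1.0)
    return x - y

# when x is negligible, dropping it costs nothing...
assert abs(alt4(1e-300, 2.0) - original(1e-300, 2.0)) < 1e-12
# ...but for ordinary x the dropped term is real error
assert abs(alt4(0.5, 2.0) - original(0.5, 2.0)) > 0.1
```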
Derivation
  1. Split input into 2 regimes
  2. if y < -8.5e-13 or 6e-10 < y

    1. Initial program 100.0%

      \[\frac{x - y}{1 - y} \]
    2. Step-by-step derivation
      1. sub-neg100.0%

        \[\leadsto \frac{\color{blue}{x + \left(-y\right)}}{1 - y} \]
      2. +-commutative100.0%

        \[\leadsto \frac{\color{blue}{\left(-y\right) + x}}{1 - y} \]
      3. neg-sub0100.0%

        \[\leadsto \frac{\color{blue}{\left(0 - y\right)} + x}{1 - y} \]
      4. associate-+l-100.0%

        \[\leadsto \frac{\color{blue}{0 - \left(y - x\right)}}{1 - y} \]
      5. sub0-neg100.0%

        \[\leadsto \frac{\color{blue}{-\left(y - x\right)}}{1 - y} \]
      6. neg-mul-1100.0%

        \[\leadsto \frac{\color{blue}{-1 \cdot \left(y - x\right)}}{1 - y} \]
      7. sub-neg100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{1 + \left(-y\right)}} \]
      8. +-commutative100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(-y\right) + 1}} \]
      9. neg-sub0100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(0 - y\right)} + 1} \]
      10. associate-+l-100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{0 - \left(y - 1\right)}} \]
      11. sub0-neg100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-\left(y - 1\right)}} \]
      12. neg-mul-1100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-1 \cdot \left(y - 1\right)}} \]
      13. times-frac100.0%

        \[\leadsto \color{blue}{\frac{-1}{-1} \cdot \frac{y - x}{y - 1}} \]
      14. metadata-eval100.0%

        \[\leadsto \color{blue}{1} \cdot \frac{y - x}{y - 1} \]
      15. *-lft-identity100.0%

        \[\leadsto \color{blue}{\frac{y - x}{y - 1}} \]
      16. sub-neg100.0%

        \[\leadsto \frac{y - x}{\color{blue}{y + \left(-1\right)}} \]
      17. metadata-eval100.0%

        \[\leadsto \frac{y - x}{y + \color{blue}{-1}} \]
    3. Simplified 100.0%

      \[\leadsto \color{blue}{\frac{y - x}{y + -1}} \]
    4. Taylor expanded in x around 0 75.4%

      \[\leadsto \color{blue}{\frac{y}{y - 1}} \]

    if -8.5e-13 < y < 6e-10

    1. Initial program 100.0%

      \[\frac{x - y}{1 - y} \]
    2. Step-by-step derivation
      1. sub-neg100.0%

        \[\leadsto \frac{\color{blue}{x + \left(-y\right)}}{1 - y} \]
      2. +-commutative (100.0%)

        \[\leadsto \frac{\color{blue}{\left(-y\right) + x}}{1 - y} \]
      3. neg-sub0 (100.0%)

        \[\leadsto \frac{\color{blue}{\left(0 - y\right)} + x}{1 - y} \]
      4. associate-+l- (100.0%)

        \[\leadsto \frac{\color{blue}{0 - \left(y - x\right)}}{1 - y} \]
      5. sub0-neg (100.0%)

        \[\leadsto \frac{\color{blue}{-\left(y - x\right)}}{1 - y} \]
      6. neg-mul-1 (100.0%)

        \[\leadsto \frac{\color{blue}{-1 \cdot \left(y - x\right)}}{1 - y} \]
      7. sub-neg (100.0%)

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{1 + \left(-y\right)}} \]
      8. +-commutative (100.0%)

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(-y\right) + 1}} \]
      9. neg-sub0 (100.0%)

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(0 - y\right)} + 1} \]
      10. associate-+l- (100.0%)

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{0 - \left(y - 1\right)}} \]
      11. sub0-neg (100.0%)

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-\left(y - 1\right)}} \]
      12. neg-mul-1 (100.0%)

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-1 \cdot \left(y - 1\right)}} \]
      13. times-frac (100.0%)

        \[\leadsto \color{blue}{\frac{-1}{-1} \cdot \frac{y - x}{y - 1}} \]
      14. metadata-eval (100.0%)

        \[\leadsto \color{blue}{1} \cdot \frac{y - x}{y - 1} \]
      15. *-lft-identity (100.0%)

        \[\leadsto \color{blue}{\frac{y - x}{y - 1}} \]
      16. sub-neg (100.0%)

        \[\leadsto \frac{y - x}{\color{blue}{y + \left(-1\right)}} \]
      17. metadata-eval (100.0%)

        \[\leadsto \frac{y - x}{y + \color{blue}{-1}} \]
    3. Simplified (100.0%)

      \[\leadsto \color{blue}{\frac{y - x}{y + -1}} \]
    4. Taylor expanded in y around 0 (100.0%)

      \[\leadsto \color{blue}{-1 \cdot \left(y \cdot \left(1 + -1 \cdot x\right)\right) + \left(-1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right) + x\right)} \]
    5. Step-by-step derivation
      1. associate-+r+ (100.0%)

        \[\leadsto \color{blue}{\left(-1 \cdot \left(y \cdot \left(1 + -1 \cdot x\right)\right) + -1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right)\right) + x} \]
      2. +-commutative (100.0%)

        \[\leadsto \color{blue}{x + \left(-1 \cdot \left(y \cdot \left(1 + -1 \cdot x\right)\right) + -1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right)\right)} \]
      3. associate-*r* (100.0%)

        \[\leadsto x + \left(\color{blue}{\left(-1 \cdot y\right) \cdot \left(1 + -1 \cdot x\right)} + -1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right)\right) \]
      4. neg-mul-1 (100.0%)

        \[\leadsto x + \left(\color{blue}{\left(-y\right)} \cdot \left(1 + -1 \cdot x\right) + -1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right)\right) \]
      5. associate-*r* (100.0%)

        \[\leadsto x + \left(\left(-y\right) \cdot \left(1 + -1 \cdot x\right) + \color{blue}{\left(-1 \cdot {y}^{2}\right) \cdot \left(1 + -1 \cdot x\right)}\right) \]
      6. distribute-rgt-out (100.0%)

        \[\leadsto x + \color{blue}{\left(1 + -1 \cdot x\right) \cdot \left(\left(-y\right) + -1 \cdot {y}^{2}\right)} \]
      7. mul-1-neg (100.0%)

        \[\leadsto x + \left(1 + \color{blue}{\left(-x\right)}\right) \cdot \left(\left(-y\right) + -1 \cdot {y}^{2}\right) \]
      8. unsub-neg (100.0%)

        \[\leadsto x + \color{blue}{\left(1 - x\right)} \cdot \left(\left(-y\right) + -1 \cdot {y}^{2}\right) \]
      9. mul-1-neg (100.0%)

        \[\leadsto x + \left(1 - x\right) \cdot \left(\left(-y\right) + \color{blue}{\left(-{y}^{2}\right)}\right) \]
      10. unsub-neg (100.0%)

        \[\leadsto x + \left(1 - x\right) \cdot \color{blue}{\left(\left(-y\right) - {y}^{2}\right)} \]
      11. unpow2 (100.0%)

        \[\leadsto x + \left(1 - x\right) \cdot \left(\left(-y\right) - \color{blue}{y \cdot y}\right) \]
    6. Simplified (100.0%)

      \[\leadsto \color{blue}{x + \left(1 - x\right) \cdot \left(\left(-y\right) - y \cdot y\right)} \]
    7. Taylor expanded in x around 0 (99.7%)

      \[\leadsto x + \color{blue}{-1 \cdot \left(y + {y}^{2}\right)} \]
    8. Step-by-step derivation
      1. unpow2 (99.7%)

        \[\leadsto x + -1 \cdot \left(y + \color{blue}{y \cdot y}\right) \]
      2. distribute-lft-in (99.7%)

        \[\leadsto x + \color{blue}{\left(-1 \cdot y + -1 \cdot \left(y \cdot y\right)\right)} \]
      3. mul-1-neg (99.7%)

        \[\leadsto x + \left(-1 \cdot y + \color{blue}{\left(-y \cdot y\right)}\right) \]
      4. distribute-lft-neg-in (99.7%)

        \[\leadsto x + \left(-1 \cdot y + \color{blue}{\left(-y\right) \cdot y}\right) \]
      5. mul-1-neg (99.7%)

        \[\leadsto x + \left(-1 \cdot y + \color{blue}{\left(-1 \cdot y\right)} \cdot y\right) \]
      6. distribute-rgt-in (99.7%)

        \[\leadsto x + \color{blue}{y \cdot \left(-1 + -1 \cdot y\right)} \]
      7. mul-1-neg (99.7%)

        \[\leadsto x + y \cdot \left(-1 + \color{blue}{\left(-y\right)}\right) \]
      8. sub-neg (99.7%)

        \[\leadsto x + y \cdot \color{blue}{\left(-1 - y\right)} \]
    9. Simplified (99.7%)

      \[\leadsto x + \color{blue}{y \cdot \left(-1 - y\right)} \]
    10. Taylor expanded in y around 0 (99.5%)

      \[\leadsto \color{blue}{-1 \cdot y + x} \]
    11. Step-by-step derivation
      1. neg-mul-1 (99.5%)

        \[\leadsto \color{blue}{\left(-y\right)} + x \]
      2. +-commutative (99.5%)

        \[\leadsto \color{blue}{x + \left(-y\right)} \]
      3. unsub-neg (99.5%)

        \[\leadsto \color{blue}{x - y} \]
    12. Simplified (99.5%)

      \[\leadsto \color{blue}{x - y} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification (87.0%)

    \[\leadsto \begin{array}{l} \mathbf{if}\;y \leq -8.5 \cdot 10^{-13} \lor \neg \left(y \leq 6 \cdot 10^{-10}\right):\\ \;\;\;\;\frac{y}{y + -1}\\ \mathbf{else}:\\ \;\;\;\;x - y\\ \end{array} \]
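The regime split above can be sanity-checked numerically. A minimal sketch (Python, binary64; the helper names `original` and `alternative` are ours, not Herbie's, and the thresholds are rounded from the report's -8.5000000000000001e-13 and 6e-10 cut points):

```python
def original(x, y):
    # The initial program: (x - y) / (1 - y)
    return (x - y) / (1.0 - y)

def alternative(x, y):
    # Herbie's regime split: outside the tiny band, Taylor expansion
    # in x around 0 drops x entirely; inside it, 1 - y is essentially
    # 1, so x - y suffices.
    if y <= -8.5e-13 or not (y <= 6e-10):
        return y / (y + -1.0)
    return x - y

# Inside the band the x - y branch tracks the original closely.
x, y = 0.5, 1e-11
assert abs(alternative(x, y) - original(x, y)) / abs(original(x, y)) < 1e-9

# Outside the band the x-free branch is accurate only for small |x|,
# which is why overall accuracy drops to 87.0%.
x, y = 1e-20, 100.0
assert abs(alternative(x, y) - original(x, y)) / abs(original(x, y)) < 1e-12
```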

Alternative 5: 98.0% accurate, 0.8× speedup

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;y \leq -1 \lor \neg \left(y \leq 1\right):\\ \;\;\;\;1 - \frac{x}{y}\\ \mathbf{else}:\\ \;\;\;\;x - y\\ \end{array} \end{array} \]
(FPCore (x y)
 :precision binary64
 (if (or (<= y -1.0) (not (<= y 1.0))) (- 1.0 (/ x y)) (- x y)))
double code(double x, double y) {
	double tmp;
	if ((y <= -1.0) || !(y <= 1.0)) {
		tmp = 1.0 - (x / y);
	} else {
		tmp = x - y;
	}
	return tmp;
}
real(8) function code(x, y)
    real(8), intent (in) :: x
    real(8), intent (in) :: y
    real(8) :: tmp
    if ((y <= (-1.0d0)) .or. (.not. (y <= 1.0d0))) then
        tmp = 1.0d0 - (x / y)
    else
        tmp = x - y
    end if
    code = tmp
end function
public static double code(double x, double y) {
	double tmp;
	if ((y <= -1.0) || !(y <= 1.0)) {
		tmp = 1.0 - (x / y);
	} else {
		tmp = x - y;
	}
	return tmp;
}
def code(x, y):
	tmp = 0
	if (y <= -1.0) or not (y <= 1.0):
		tmp = 1.0 - (x / y)
	else:
		tmp = x - y
	return tmp
function code(x, y)
	tmp = 0.0
	if ((y <= -1.0) || !(y <= 1.0))
		tmp = Float64(1.0 - Float64(x / y));
	else
		tmp = Float64(x - y);
	end
	return tmp
end
function tmp_2 = code(x, y)
	tmp = 0.0;
	if ((y <= -1.0) || ~((y <= 1.0)))
		tmp = 1.0 - (x / y);
	else
		tmp = x - y;
	end
	tmp_2 = tmp;
end
code[x_, y_] := If[Or[LessEqual[y, -1.0], N[Not[LessEqual[y, 1.0]], $MachinePrecision]], N[(1.0 - N[(x / y), $MachinePrecision]), $MachinePrecision], N[(x - y), $MachinePrecision]]
Derivation
  1. Split input into 2 regimes
  2. if y < -1 or 1 < y

    1. Initial program (100.0%)

      \[\frac{x - y}{1 - y} \]
    2. Step-by-step derivation: the same 17 rewrite steps (sub-neg through metadata-eval, each at 100.0% accuracy) shown for the regime above, rewriting (x - y)/(1 - y) into (y - x)/(y + -1)
    3. Simplified (100.0%)

      \[\leadsto \color{blue}{\frac{y - x}{y + -1}} \]
    4. Taylor expanded in y around inf (98.0%)

      \[\leadsto \color{blue}{\frac{1}{y} + \left(1 + -1 \cdot \frac{x}{y}\right)} \]
    5. Step-by-step derivation
      1. +-commutative (98.0%)

        \[\leadsto \frac{1}{y} + \color{blue}{\left(-1 \cdot \frac{x}{y} + 1\right)} \]
      2. associate-+r+ (98.0%)

        \[\leadsto \color{blue}{\left(\frac{1}{y} + -1 \cdot \frac{x}{y}\right) + 1} \]
      3. mul-1-neg (98.0%)

        \[\leadsto \left(\frac{1}{y} + \color{blue}{\left(-\frac{x}{y}\right)}\right) + 1 \]
      4. unsub-neg (98.0%)

        \[\leadsto \color{blue}{\left(\frac{1}{y} - \frac{x}{y}\right)} + 1 \]
      5. div-sub (98.0%)

        \[\leadsto \color{blue}{\frac{1 - x}{y}} + 1 \]
      6. unsub-neg (98.0%)

        \[\leadsto \frac{\color{blue}{1 + \left(-x\right)}}{y} + 1 \]
      7. mul-1-neg (98.0%)

        \[\leadsto \frac{1 + \color{blue}{-1 \cdot x}}{y} + 1 \]
      8. +-commutative (98.0%)

        \[\leadsto \frac{\color{blue}{-1 \cdot x + 1}}{y} + 1 \]
      9. metadata-eval (98.0%)

        \[\leadsto \frac{-1 \cdot x + \color{blue}{-1 \cdot -1}}{y} + 1 \]
      10. distribute-lft-in (98.0%)

        \[\leadsto \frac{\color{blue}{-1 \cdot \left(x + -1\right)}}{y} + 1 \]
      11. metadata-eval (98.0%)

        \[\leadsto \frac{-1 \cdot \left(x + \color{blue}{\left(-1\right)}\right)}{y} + 1 \]
      12. sub-neg (98.0%)

        \[\leadsto \frac{-1 \cdot \color{blue}{\left(x - 1\right)}}{y} + 1 \]
      13. associate-*r/ (98.0%)

        \[\leadsto \color{blue}{-1 \cdot \frac{x - 1}{y}} + 1 \]
      14. +-commutative (98.0%)

        \[\leadsto \color{blue}{1 + -1 \cdot \frac{x - 1}{y}} \]
      15. associate-*r/ (98.0%)

        \[\leadsto 1 + \color{blue}{\frac{-1 \cdot \left(x - 1\right)}{y}} \]
      16. sub-neg (98.0%)

        \[\leadsto 1 + \frac{-1 \cdot \color{blue}{\left(x + \left(-1\right)\right)}}{y} \]
      17. metadata-eval (98.0%)

        \[\leadsto 1 + \frac{-1 \cdot \left(x + \color{blue}{-1}\right)}{y} \]
      18. distribute-lft-in (98.0%)

        \[\leadsto 1 + \frac{\color{blue}{-1 \cdot x + -1 \cdot -1}}{y} \]
      19. metadata-eval (98.0%)

        \[\leadsto 1 + \frac{-1 \cdot x + \color{blue}{1}}{y} \]
      20. +-commutative (98.0%)

        \[\leadsto 1 + \frac{\color{blue}{1 + -1 \cdot x}}{y} \]
      21. mul-1-neg (98.0%)

        \[\leadsto 1 + \frac{1 + \color{blue}{\left(-x\right)}}{y} \]
      22. unsub-neg (98.0%)

        \[\leadsto 1 + \frac{\color{blue}{1 - x}}{y} \]
    6. Simplified (98.0%)

      \[\leadsto \color{blue}{1 + \frac{1 - x}{y}} \]
    7. Taylor expanded in x around inf (97.3%)

      \[\leadsto 1 + \color{blue}{-1 \cdot \frac{x}{y}} \]
    8. Step-by-step derivation
      1. neg-mul-1 (97.3%)

        \[\leadsto 1 + \color{blue}{\left(-\frac{x}{y}\right)} \]
      2. distribute-neg-frac (97.3%)

        \[\leadsto 1 + \color{blue}{\frac{-x}{y}} \]
    9. Simplified (97.3%)

      \[\leadsto 1 + \color{blue}{\frac{-x}{y}} \]

    if -1 < y < 1

    1. Initial program (100.0%)

      \[\frac{x - y}{1 - y} \]
    2. Step-by-step derivation: the same 17 rewrite steps (sub-neg through metadata-eval, each at 100.0% accuracy) shown for the regime above, rewriting (x - y)/(1 - y) into (y - x)/(y + -1)
    3. Simplified (100.0%)

      \[\leadsto \color{blue}{\frac{y - x}{y + -1}} \]
    4. Taylor expanded in y around 0 (99.4%)

      \[\leadsto \color{blue}{-1 \cdot \left(y \cdot \left(1 + -1 \cdot x\right)\right) + \left(-1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right) + x\right)} \]
    5. Step-by-step derivation, further Taylor expansions in x and y around 0, and simplification: identical to steps 5-12 of the derivation above, with accuracy falling from 99.4% to 97.8% in this regime, ending at

      \[\leadsto \color{blue}{x - y} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification (97.5%)

    \[\leadsto \begin{array}{l} \mathbf{if}\;y \leq -1 \lor \neg \left(y \leq 1\right):\\ \;\;\;\;1 - \frac{x}{y}\\ \mathbf{else}:\\ \;\;\;\;x - y\\ \end{array} \]
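A quick numerical check of this variant (our sketch, not part of the report): dividing numerator and denominator of (y - x)/(y - 1) by y gives (1 - x/y)/(1 - 1/y), roughly (1 - x/y)(1 + 1/y), so the 1 - x/y branch carries a relative error on the order of 1/y for large |y|:

```python
def original(x, y):
    return (x - y) / (1.0 - y)

def alternative(x, y):
    # For |y| > 1 use the series form 1 - x/y; otherwise x - y.
    if y <= -1.0 or not (y <= 1.0):
        return 1.0 - x / y
    return x - y

# Large |y|: relative error on the order of 1/y.
x, y = 3.0, 1e6
assert abs(alternative(x, y) - original(x, y)) / abs(original(x, y)) < 1e-5

# Small |y|: x - y is off by roughly a factor (1 + y), i.e. ~|y| relative.
x, y = 0.3, 1e-8
assert abs(alternative(x, y) - original(x, y)) / abs(original(x, y)) < 1e-6
```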

Alternative 6: 86.0% accurate, 1.0× speedup

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;y \leq -800000000000:\\ \;\;\;\;1\\ \mathbf{elif}\;y \leq 1:\\ \;\;\;\;x - y\\ \mathbf{else}:\\ \;\;\;\;1\\ \end{array} \end{array} \]
(FPCore (x y)
 :precision binary64
 (if (<= y -800000000000.0) 1.0 (if (<= y 1.0) (- x y) 1.0)))
double code(double x, double y) {
	double tmp;
	if (y <= -800000000000.0) {
		tmp = 1.0;
	} else if (y <= 1.0) {
		tmp = x - y;
	} else {
		tmp = 1.0;
	}
	return tmp;
}
real(8) function code(x, y)
    real(8), intent (in) :: x
    real(8), intent (in) :: y
    real(8) :: tmp
    if (y <= (-800000000000.0d0)) then
        tmp = 1.0d0
    else if (y <= 1.0d0) then
        tmp = x - y
    else
        tmp = 1.0d0
    end if
    code = tmp
end function
public static double code(double x, double y) {
	double tmp;
	if (y <= -800000000000.0) {
		tmp = 1.0;
	} else if (y <= 1.0) {
		tmp = x - y;
	} else {
		tmp = 1.0;
	}
	return tmp;
}
def code(x, y):
	tmp = 0
	if y <= -800000000000.0:
		tmp = 1.0
	elif y <= 1.0:
		tmp = x - y
	else:
		tmp = 1.0
	return tmp
function code(x, y)
	tmp = 0.0
	if (y <= -800000000000.0)
		tmp = 1.0;
	elseif (y <= 1.0)
		tmp = Float64(x - y);
	else
		tmp = 1.0;
	end
	return tmp
end
function tmp_2 = code(x, y)
	tmp = 0.0;
	if (y <= -800000000000.0)
		tmp = 1.0;
	elseif (y <= 1.0)
		tmp = x - y;
	else
		tmp = 1.0;
	end
	tmp_2 = tmp;
end
code[x_, y_] := If[LessEqual[y, -800000000000.0], 1.0, If[LessEqual[y, 1.0], N[(x - y), $MachinePrecision], 1.0]]
Derivation
  1. Split input into 2 regimes
  2. if y < -8e11 or 1 < y

    1. Initial program (100.0%)

      \[\frac{x - y}{1 - y} \]
    2. Step-by-step derivation: the same 17 rewrite steps (sub-neg through metadata-eval, each at 100.0% accuracy) shown for the regime above, rewriting (x - y)/(1 - y) into (y - x)/(y + -1)
    3. Simplified (100.0%)

      \[\leadsto \color{blue}{\frac{y - x}{y + -1}} \]
    4. Taylor expanded in y around inf (75.6%)

      \[\leadsto \color{blue}{1} \]

    if -8e11 < y < 1

    1. Initial program (100.0%)

      \[\frac{x - y}{1 - y} \]
    2. Step-by-step derivation: the same 17 rewrite steps (sub-neg through metadata-eval, each at 100.0% accuracy) shown for the regime above, rewriting (x - y)/(1 - y) into (y - x)/(y + -1)
    3. Simplified (100.0%)

      \[\leadsto \color{blue}{\frac{y - x}{y + -1}} \]
    4. Taylor expanded in y around 0 (97.3%)

      \[\leadsto \color{blue}{-1 \cdot \left(y \cdot \left(1 + -1 \cdot x\right)\right) + \left(-1 \cdot \left({y}^{2} \cdot \left(1 + -1 \cdot x\right)\right) + x\right)} \]
    5. Step-by-step derivation, further Taylor expansions in x and y around 0, and simplification: identical to steps 5-12 of the derivation above, with accuracy falling from 97.3% to 95.8% in this regime, ending at

      \[\leadsto \color{blue}{x - y} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification (85.9%)

    \[\leadsto \begin{array}{l} \mathbf{if}\;y \leq -800000000000:\\ \;\;\;\;1\\ \mathbf{elif}\;y \leq 1:\\ \;\;\;\;x - y\\ \mathbf{else}:\\ \;\;\;\;1\\ \end{array} \]
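Numerically, the constant branches reflect the limit (x - y)/(1 - y) → 1 as |y| → ∞. A small check (our sketch, not Herbie output):

```python
def original(x, y):
    return (x - y) / (1.0 - y)

def alternative(x, y):
    # Taylor expansion in y around ±inf reduces both outer regimes to 1.
    if y <= -8e11:
        return 1.0
    elif y <= 1.0:
        return x - y
    return 1.0

# Far from the origin the constant 1 is adequate.
x, y = 2.0, -1e15
assert abs(original(x, y) - 1.0) < 1e-12
assert alternative(x, y) == 1.0
```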

Alternative 7: 74.7% accurate, 1.4× speedup

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;y \leq -800000000000:\\ \;\;\;\;1\\ \mathbf{elif}\;y \leq 1:\\ \;\;\;\;x\\ \mathbf{else}:\\ \;\;\;\;1\\ \end{array} \end{array} \]
(FPCore (x y)
 :precision binary64
 (if (<= y -800000000000.0) 1.0 (if (<= y 1.0) x 1.0)))
double code(double x, double y) {
	double tmp;
	if (y <= -800000000000.0) {
		tmp = 1.0;
	} else if (y <= 1.0) {
		tmp = x;
	} else {
		tmp = 1.0;
	}
	return tmp;
}
real(8) function code(x, y)
    real(8), intent (in) :: x
    real(8), intent (in) :: y
    real(8) :: tmp
    if (y <= (-800000000000.0d0)) then
        tmp = 1.0d0
    else if (y <= 1.0d0) then
        tmp = x
    else
        tmp = 1.0d0
    end if
    code = tmp
end function
public static double code(double x, double y) {
	double tmp;
	if (y <= -800000000000.0) {
		tmp = 1.0;
	} else if (y <= 1.0) {
		tmp = x;
	} else {
		tmp = 1.0;
	}
	return tmp;
}
def code(x, y):
	tmp = 0
	if y <= -800000000000.0:
		tmp = 1.0
	elif y <= 1.0:
		tmp = x
	else:
		tmp = 1.0
	return tmp
function code(x, y)
	tmp = 0.0
	if (y <= -800000000000.0)
		tmp = 1.0;
	elseif (y <= 1.0)
		tmp = x;
	else
		tmp = 1.0;
	end
	return tmp
end
function tmp_2 = code(x, y)
	tmp = 0.0;
	if (y <= -800000000000.0)
		tmp = 1.0;
	elseif (y <= 1.0)
		tmp = x;
	else
		tmp = 1.0;
	end
	tmp_2 = tmp;
end
code[x_, y_] := If[LessEqual[y, -800000000000.0], 1.0, If[LessEqual[y, 1.0], x, 1.0]]
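It is worth seeing why dropping y entirely can be harmless: when |y| is below half an ulp of both x and 1, the subtractions round y away anyway. A sketch (ours; `original` and `alternative` are illustrative names):

```python
def original(x, y):
    return (x - y) / (1.0 - y)

def alternative(x, y):
    # Middle regime returns x alone; y is dropped entirely.
    if y <= -8e11:
        return 1.0
    elif y <= 1.0:
        return x
    return 1.0

# |y| below half an ulp of x and of 1: x - y and 1 - y both round y
# away, so returning x is bit-for-bit identical to the original.
x, y = 0.7, 1e-18
assert original(x, y) == 0.7 == alternative(x, y)

# For moderate y the dropped term matters, hence only 74.7% accuracy.
x, y = 0.5, 0.5
assert alternative(x, y) == 0.5
assert original(x, y) == 0.0
```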
Derivation
  1. Split input into 2 regimes
  2. if y < -8e11 or 1 < y

    1. Initial program (100.0%)

      \[\frac{x - y}{1 - y} \]
    2. Step-by-step derivation: the same 17 rewrite steps (sub-neg through metadata-eval, each at 100.0% accuracy) shown for the regime above, rewriting (x - y)/(1 - y) into (y - x)/(y + -1)
    3. Simplified (100.0%)

      \[\leadsto \color{blue}{\frac{y - x}{y + -1}} \]
    4. Taylor expanded in y around inf 75.6%

      \[\leadsto \color{blue}{1} \]

    if -8e11 < y < 1

    1. Initial program 100.0%

      \[\frac{x - y}{1 - y} \]
    2. Step-by-step derivation
      1. sub-neg 100.0%

        \[\leadsto \frac{\color{blue}{x + \left(-y\right)}}{1 - y} \]
      2. +-commutative 100.0%

        \[\leadsto \frac{\color{blue}{\left(-y\right) + x}}{1 - y} \]
      3. neg-sub0 100.0%

        \[\leadsto \frac{\color{blue}{\left(0 - y\right)} + x}{1 - y} \]
      4. associate-+l- 100.0%

        \[\leadsto \frac{\color{blue}{0 - \left(y - x\right)}}{1 - y} \]
      5. sub0-neg 100.0%

        \[\leadsto \frac{\color{blue}{-\left(y - x\right)}}{1 - y} \]
      6. neg-mul-1 100.0%

        \[\leadsto \frac{\color{blue}{-1 \cdot \left(y - x\right)}}{1 - y} \]
      7. sub-neg 100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{1 + \left(-y\right)}} \]
      8. +-commutative 100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(-y\right) + 1}} \]
      9. neg-sub0 100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(0 - y\right)} + 1} \]
      10. associate-+l- 100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{0 - \left(y - 1\right)}} \]
      11. sub0-neg 100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-\left(y - 1\right)}} \]
      12. neg-mul-1 100.0%

        \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-1 \cdot \left(y - 1\right)}} \]
      13. times-frac 100.0%

        \[\leadsto \color{blue}{\frac{-1}{-1} \cdot \frac{y - x}{y - 1}} \]
      14. metadata-eval 100.0%

        \[\leadsto \color{blue}{1} \cdot \frac{y - x}{y - 1} \]
      15. *-lft-identity 100.0%

        \[\leadsto \color{blue}{\frac{y - x}{y - 1}} \]
      16. sub-neg 100.0%

        \[\leadsto \frac{y - x}{\color{blue}{y + \left(-1\right)}} \]
      17. metadata-eval 100.0%

        \[\leadsto \frac{y - x}{y + \color{blue}{-1}} \]
    3. Simplified 100.0%

      \[\leadsto \color{blue}{\frac{y - x}{y + -1}} \]
    4. Taylor expanded in y around 0 70.0%

      \[\leadsto \color{blue}{x} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification 72.8%

    \[\leadsto \begin{array}{l} \mathbf{if}\;y \leq -800000000000:\\ \;\;\;\;1\\ \mathbf{elif}\;y \leq 1:\\ \;\;\;\;x\\ \mathbf{else}:\\ \;\;\;\;1\\ \end{array} \]
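The recombined program can be written out directly as a branch. The following Python sketch mirrors the structure above (the -8e11 threshold comes from the regime split in the derivation; the spot checks are illustrative and not part of Herbie's output):

```python
def original(x, y):
    # Initial program: (x - y) / (1 - y)
    return (x - y) / (1.0 - y)

def recombined(x, y):
    # Regime split from the derivation:
    #   y <= -8e11      -> 1  (Taylor expansion in y around inf)
    #   -8e11 < y <= 1  -> x  (Taylor expansion in y around 0)
    #   otherwise       -> 1
    if y <= -8e11:
        return 1.0
    elif y <= 1.0:
        return x
    else:
        return 1.0

# Near y = 0 the quotient is approximately x; for huge |y| it tends to 1.
print(recombined(0.5, 1e-30), original(0.5, 1e-30))
print(recombined(0.5, -1e15), original(0.5, -1e15))
```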

Alternative 8: 38.6% accurate, 7.0× speedup?

\[\begin{array}{l} \\ 1 \end{array} \]
(FPCore (x y) :precision binary64 1.0)
double code(double x, double y) {
	return 1.0;
}
real(8) function code(x, y)
    real(8), intent (in) :: x
    real(8), intent (in) :: y
    code = 1.0d0
end function
public static double code(double x, double y) {
	return 1.0;
}
def code(x, y):
	return 1.0
function code(x, y)
	return 1.0
end
function tmp = code(x, y)
	tmp = 1.0;
end
code[x_, y_] := 1.0
\begin{array}{l}

\\
1
\end{array}
Derivation
  1. Initial program 100.0%

    \[\frac{x - y}{1 - y} \]
  2. Step-by-step derivation
    1. sub-neg 100.0%

      \[\leadsto \frac{\color{blue}{x + \left(-y\right)}}{1 - y} \]
    2. +-commutative 100.0%

      \[\leadsto \frac{\color{blue}{\left(-y\right) + x}}{1 - y} \]
    3. neg-sub0 100.0%

      \[\leadsto \frac{\color{blue}{\left(0 - y\right)} + x}{1 - y} \]
    4. associate-+l- 100.0%

      \[\leadsto \frac{\color{blue}{0 - \left(y - x\right)}}{1 - y} \]
    5. sub0-neg 100.0%

      \[\leadsto \frac{\color{blue}{-\left(y - x\right)}}{1 - y} \]
    6. neg-mul-1 100.0%

      \[\leadsto \frac{\color{blue}{-1 \cdot \left(y - x\right)}}{1 - y} \]
    7. sub-neg 100.0%

      \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{1 + \left(-y\right)}} \]
    8. +-commutative 100.0%

      \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(-y\right) + 1}} \]
    9. neg-sub0 100.0%

      \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{\left(0 - y\right)} + 1} \]
    10. associate-+l- 100.0%

      \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{0 - \left(y - 1\right)}} \]
    11. sub0-neg 100.0%

      \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-\left(y - 1\right)}} \]
    12. neg-mul-1 100.0%

      \[\leadsto \frac{-1 \cdot \left(y - x\right)}{\color{blue}{-1 \cdot \left(y - 1\right)}} \]
    13. times-frac 100.0%

      \[\leadsto \color{blue}{\frac{-1}{-1} \cdot \frac{y - x}{y - 1}} \]
    14. metadata-eval 100.0%

      \[\leadsto \color{blue}{1} \cdot \frac{y - x}{y - 1} \]
    15. *-lft-identity 100.0%

      \[\leadsto \color{blue}{\frac{y - x}{y - 1}} \]
    16. sub-neg 100.0%

      \[\leadsto \frac{y - x}{\color{blue}{y + \left(-1\right)}} \]
    17. metadata-eval 100.0%

      \[\leadsto \frac{y - x}{y + \color{blue}{-1}} \]
  3. Simplified 100.0%

    \[\leadsto \color{blue}{\frac{y - x}{y + -1}} \]
  4. Taylor expanded in y around inf 38.7%

    \[\leadsto \color{blue}{1} \]
  5. Final simplification 38.7%

    \[\leadsto 1 \]
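The constant-1 alternative is the leading term of the expansion in y around infinity: since \((x - y)/(1 - y) = 1 + (x - 1)/(1 - y)\), the correction term vanishes as \(|y|\) grows, regardless of x. A small Python sketch with illustrative values (not part of Herbie's output):

```python
def original(x, y):
    # Initial program: (x - y) / (1 - y)
    return (x - y) / (1.0 - y)

# (x - y)/(1 - y) = 1 + (x - 1)/(1 - y): the result approaches 1 as |y|
# grows, but is far from 1 for moderate y -- hence the 38.7% accuracy.
for y in (2.0, 1e3, 1e9, 1e18):
    print(y, original(3.0, y))
```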

Reproduce

herbie shell --seed 2023258 
(FPCore (x y)
  :name "Diagrams.Trail:splitAtParam  from diagrams-lib-1.3.0.3, C"
  :precision binary64
  (/ (- x y) (- 1.0 y)))