Givens Rotation SVD example, simplified

Percentage Accurate: 75.5% → 99.8%
Time: 8.2s
Alternatives: 15
Speedup: 2.0×

Specification

\[\begin{array}{l} \\ 1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \end{array} \]
(FPCore (x)
 :precision binary64
 (- 1.0 (sqrt (* 0.5 (+ 1.0 (/ 1.0 (hypot 1.0 x)))))))
double code(double x) {
	return 1.0 - sqrt((0.5 * (1.0 + (1.0 / hypot(1.0, x)))));
}
public static double code(double x) {
	return 1.0 - Math.sqrt((0.5 * (1.0 + (1.0 / Math.hypot(1.0, x)))));
}
def code(x):
	return 1.0 - math.sqrt((0.5 * (1.0 + (1.0 / math.hypot(1.0, x)))))
function code(x)
	return Float64(1.0 - sqrt(Float64(0.5 * Float64(1.0 + Float64(1.0 / hypot(1.0, x))))))
end
function tmp = code(x)
	tmp = 1.0 - sqrt((0.5 * (1.0 + (1.0 / hypot(1.0, x)))));
end
code[x_] := N[(1.0 - N[Sqrt[N[(0.5 * N[(1.0 + N[(1.0 / N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]
\begin{array}{l}

\\
1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)}
\end{array}

Sampling outcomes in binary64 precision:

Local Percentage Accuracy

The average percentage accuracy by input value. The horizontal axis shows the value of an input variable (the variable is chosen in the title); the vertical axis shows accuracy, and higher is better. Red represents the original program, while blue represents Herbie's suggestion; the two can be toggled with the buttons below the plot. The line is an average, while the dots represent individual samples.

Accuracy vs Speed

Herbie found 15 alternatives:

The accuracy (vertical axis) and speed (horizontal axis) of each alternative. Up and to the right is better. The red square shows the initial program, and each blue circle shows an alternative. The line shows the best available speed-accuracy tradeoffs.

Initial Program: 75.5% accurate, 1.0× speedup

\[\begin{array}{l} \\ 1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \end{array} \]
(FPCore (x)
 :precision binary64
 (- 1.0 (sqrt (* 0.5 (+ 1.0 (/ 1.0 (hypot 1.0 x)))))))
double code(double x) {
	return 1.0 - sqrt((0.5 * (1.0 + (1.0 / hypot(1.0, x)))));
}
public static double code(double x) {
	return 1.0 - Math.sqrt((0.5 * (1.0 + (1.0 / Math.hypot(1.0, x)))));
}
def code(x):
	return 1.0 - math.sqrt((0.5 * (1.0 + (1.0 / math.hypot(1.0, x)))))
function code(x)
	return Float64(1.0 - sqrt(Float64(0.5 * Float64(1.0 + Float64(1.0 / hypot(1.0, x))))))
end
function tmp = code(x)
	tmp = 1.0 - sqrt((0.5 * (1.0 + (1.0 / hypot(1.0, x)))));
end
code[x_] := N[(1.0 - N[Sqrt[N[(0.5 * N[(1.0 + N[(1.0 / N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]
\begin{array}{l}

\\
1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)}
\end{array}
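The initial program cancels catastrophically near x = 0: hypot(1, x) rounds to 1 in binary64, the square root rounds to 1, and the final subtraction returns 0. A minimal Python sketch of the failure (the function name is mine, not Herbie's):

```python
import math

def naive(x):
    # Original program: 1 - sqrt(0.5 * (1 + 1/hypot(1, x)))
    return 1.0 - math.sqrt(0.5 * (1.0 + 1.0 / math.hypot(1.0, x)))

# For tiny x the true value is ~0.125 * x^2 (the leading Taylor term),
# but every intermediate rounds to 1.0 and all significant digits cancel.
x = 1e-8
print(naive(x))  # 0.0, though the true value is ~1.25e-17
```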

Alternative 1: 99.8% accurate, 0.3× speedup

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\ \;\;\;\;0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)\\ \mathbf{else}:\\ \;\;\;\;\frac{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\left(0.25 + \frac{0.25}{1 + x \cdot x}\right) + \frac{0.25}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}\\ \end{array} \end{array} \]
(FPCore (x)
 :precision binary64
 (if (<= (hypot 1.0 x) 1.1)
   (+
    (* 0.125 (pow x 2.0))
    (+ (* 0.0673828125 (pow x 6.0)) (* -0.0859375 (pow x 4.0))))
   (/
    (/
     (- 0.125 (/ 0.125 (pow (hypot 1.0 x) 3.0)))
     (+ (+ 0.25 (/ 0.25 (+ 1.0 (* x x)))) (/ 0.25 (hypot 1.0 x))))
    (+ 1.0 (sqrt (+ 0.5 (/ 0.5 (hypot 1.0 x))))))))
double code(double x) {
	double tmp;
	if (hypot(1.0, x) <= 1.1) {
		tmp = (0.125 * pow(x, 2.0)) + ((0.0673828125 * pow(x, 6.0)) + (-0.0859375 * pow(x, 4.0)));
	} else {
		tmp = ((0.125 - (0.125 / pow(hypot(1.0, x), 3.0))) / ((0.25 + (0.25 / (1.0 + (x * x)))) + (0.25 / hypot(1.0, x)))) / (1.0 + sqrt((0.5 + (0.5 / hypot(1.0, x)))));
	}
	return tmp;
}
public static double code(double x) {
	double tmp;
	if (Math.hypot(1.0, x) <= 1.1) {
		tmp = (0.125 * Math.pow(x, 2.0)) + ((0.0673828125 * Math.pow(x, 6.0)) + (-0.0859375 * Math.pow(x, 4.0)));
	} else {
		tmp = ((0.125 - (0.125 / Math.pow(Math.hypot(1.0, x), 3.0))) / ((0.25 + (0.25 / (1.0 + (x * x)))) + (0.25 / Math.hypot(1.0, x)))) / (1.0 + Math.sqrt((0.5 + (0.5 / Math.hypot(1.0, x)))));
	}
	return tmp;
}
def code(x):
	tmp = 0
	if math.hypot(1.0, x) <= 1.1:
		tmp = (0.125 * math.pow(x, 2.0)) + ((0.0673828125 * math.pow(x, 6.0)) + (-0.0859375 * math.pow(x, 4.0)))
	else:
		tmp = ((0.125 - (0.125 / math.pow(math.hypot(1.0, x), 3.0))) / ((0.25 + (0.25 / (1.0 + (x * x)))) + (0.25 / math.hypot(1.0, x)))) / (1.0 + math.sqrt((0.5 + (0.5 / math.hypot(1.0, x)))))
	return tmp
function code(x)
	tmp = 0.0
	if (hypot(1.0, x) <= 1.1)
		tmp = Float64(Float64(0.125 * (x ^ 2.0)) + Float64(Float64(0.0673828125 * (x ^ 6.0)) + Float64(-0.0859375 * (x ^ 4.0))));
	else
		tmp = Float64(Float64(Float64(0.125 - Float64(0.125 / (hypot(1.0, x) ^ 3.0))) / Float64(Float64(0.25 + Float64(0.25 / Float64(1.0 + Float64(x * x)))) + Float64(0.25 / hypot(1.0, x)))) / Float64(1.0 + sqrt(Float64(0.5 + Float64(0.5 / hypot(1.0, x))))));
	end
	return tmp
end
function tmp_2 = code(x)
	tmp = 0.0;
	if (hypot(1.0, x) <= 1.1)
		tmp = (0.125 * (x ^ 2.0)) + ((0.0673828125 * (x ^ 6.0)) + (-0.0859375 * (x ^ 4.0)));
	else
		tmp = ((0.125 - (0.125 / (hypot(1.0, x) ^ 3.0))) / ((0.25 + (0.25 / (1.0 + (x * x)))) + (0.25 / hypot(1.0, x)))) / (1.0 + sqrt((0.5 + (0.5 / hypot(1.0, x)))));
	end
	tmp_2 = tmp;
end
code[x_] := If[LessEqual[N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision], 1.1], N[(N[(0.125 * N[Power[x, 2.0], $MachinePrecision]), $MachinePrecision] + N[(N[(0.0673828125 * N[Power[x, 6.0], $MachinePrecision]), $MachinePrecision] + N[(-0.0859375 * N[Power[x, 4.0], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]), $MachinePrecision], N[(N[(N[(0.125 - N[(0.125 / N[Power[N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision], 3.0], $MachinePrecision]), $MachinePrecision]), $MachinePrecision] / N[(N[(0.25 + N[(0.25 / N[(1.0 + N[(x * x), $MachinePrecision]), $MachinePrecision]), $MachinePrecision]), $MachinePrecision] + N[(0.25 / N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]), $MachinePrecision] / N[(1.0 + N[Sqrt[N[(0.5 + N[(0.5 / N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\
\;\;\;\;0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)\\

\mathbf{else}:\\
\;\;\;\;\frac{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\left(0.25 + \frac{0.25}{1 + x \cdot x}\right) + \frac{0.25}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}\\


\end{array}
\end{array}
Derivation
  1. Split input into 2 regimes
  2. if (hypot.f64 1 x) < 1.1000000000000001

    1. Initial program 58.9%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in (58.9%)

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval (58.9%)

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ (58.9%)

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval (58.9%)

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified (58.9%)

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around 0 100.0%

      \[\leadsto \color{blue}{0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)} \]

    if 1.1000000000000001 < (hypot.f64 1 x)

    1. Initial program 98.4%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in (98.4%)

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval (98.4%)

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ (98.4%)

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval (98.4%)

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified (98.4%)

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Step-by-step derivation
      1. flip-- (98.5%)

        \[\leadsto \color{blue}{\frac{1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      2. metadata-eval (98.5%)

        \[\leadsto \frac{\color{blue}{1} - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      3. add-sqr-sqrt (100.0%)

        \[\leadsto \frac{1 - \color{blue}{\left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. associate--r+ (100.0%)

        \[\leadsto \frac{\color{blue}{\left(1 - 0.5\right) - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      5. metadata-eval (100.0%)

        \[\leadsto \frac{\color{blue}{0.5} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    5. Applied egg-rr (100.0%)

      \[\leadsto \color{blue}{\frac{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    6. Step-by-step derivation
      1. flip3-- (99.9%)

        \[\leadsto \frac{\color{blue}{\frac{{0.5}^{3} - {\left(\frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}^{3}}{0.5 \cdot 0.5 + \left(\frac{0.5}{\mathsf{hypot}\left(1, x\right)} \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)} + 0.5 \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval (99.9%)

        \[\leadsto \frac{\frac{\color{blue}{0.125} - {\left(\frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}^{3}}{0.5 \cdot 0.5 + \left(\frac{0.5}{\mathsf{hypot}\left(1, x\right)} \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)} + 0.5 \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      3. cube-div (100.0%)

        \[\leadsto \frac{\frac{0.125 - \color{blue}{\frac{{0.5}^{3}}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}}{0.5 \cdot 0.5 + \left(\frac{0.5}{\mathsf{hypot}\left(1, x\right)} \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)} + 0.5 \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval (100.0%)

        \[\leadsto \frac{\frac{0.125 - \frac{\color{blue}{0.125}}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{0.5 \cdot 0.5 + \left(\frac{0.5}{\mathsf{hypot}\left(1, x\right)} \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)} + 0.5 \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      5. associate-+r+ (100.0%)

        \[\leadsto \frac{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\color{blue}{\left(0.5 \cdot 0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)} \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) + 0.5 \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      6. metadata-eval (100.0%)

        \[\leadsto \frac{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\left(\color{blue}{0.25} + \frac{0.5}{\mathsf{hypot}\left(1, x\right)} \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) + 0.5 \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      7. frac-times (100.0%)

        \[\leadsto \frac{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\left(0.25 + \color{blue}{\frac{0.5 \cdot 0.5}{\mathsf{hypot}\left(1, x\right) \cdot \mathsf{hypot}\left(1, x\right)}}\right) + 0.5 \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      8. metadata-eval (100.0%)

        \[\leadsto \frac{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\left(0.25 + \frac{\color{blue}{0.25}}{\mathsf{hypot}\left(1, x\right) \cdot \mathsf{hypot}\left(1, x\right)}\right) + 0.5 \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      9. hypot-udef (100.0%)

        \[\leadsto \frac{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\left(0.25 + \frac{0.25}{\color{blue}{\sqrt{1 \cdot 1 + x \cdot x}} \cdot \mathsf{hypot}\left(1, x\right)}\right) + 0.5 \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      10. hypot-udef (100.0%)

        \[\leadsto \frac{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\left(0.25 + \frac{0.25}{\sqrt{1 \cdot 1 + x \cdot x} \cdot \color{blue}{\sqrt{1 \cdot 1 + x \cdot x}}}\right) + 0.5 \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      11. add-sqr-sqrt (100.0%)

        \[\leadsto \frac{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\left(0.25 + \frac{0.25}{\color{blue}{1 \cdot 1 + x \cdot x}}\right) + 0.5 \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      12. metadata-eval (100.0%)

        \[\leadsto \frac{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\left(0.25 + \frac{0.25}{\color{blue}{1} + x \cdot x}\right) + 0.5 \cdot \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      13. associate-*r/ (100.0%)

        \[\leadsto \frac{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\left(0.25 + \frac{0.25}{1 + x \cdot x}\right) + \color{blue}{\frac{0.5 \cdot 0.5}{\mathsf{hypot}\left(1, x\right)}}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      14. metadata-eval (100.0%)

        \[\leadsto \frac{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\left(0.25 + \frac{0.25}{1 + x \cdot x}\right) + \frac{\color{blue}{0.25}}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    7. Applied egg-rr (100.0%)

      \[\leadsto \frac{\color{blue}{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\left(0.25 + \frac{0.25}{1 + x \cdot x}\right) + \frac{0.25}{\mathsf{hypot}\left(1, x\right)}}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification (100.0%)

    \[\leadsto \begin{array}{l} \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\ \;\;\;\;0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)\\ \mathbf{else}:\\ \;\;\;\;\frac{\frac{0.125 - \frac{0.125}{{\left(\mathsf{hypot}\left(1, x\right)\right)}^{3}}}{\left(0.25 + \frac{0.25}{1 + x \cdot x}\right) + \frac{0.25}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}\\ \end{array} \]
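Alternative 1 pairs a Taylor polynomial near x = 0 with a conjugate-style rewrite elsewhere. A short Python transcription (my own names; the branch bodies follow the FPCore above) shows that the Taylor branch recovers the digits the original program loses:

```python
import math

def alt1(x):
    h = math.hypot(1.0, x)
    if h <= 1.1:
        # Taylor expansion of 1 - sqrt(0.5*(1 + 1/hypot(1, x))) around x = 0
        return 0.125 * x**2 + (0.0673828125 * x**6 + -0.0859375 * x**4)
    # Conjugate-based rewrite, accurate for large hypot(1, x)
    num = 0.125 - 0.125 / h**3
    den = (0.25 + 0.25 / (1.0 + x * x)) + 0.25 / h
    return (num / den) / (1.0 + math.sqrt(0.5 + 0.5 / h))

x = 1e-8
print(alt1(x))  # ~1.25e-17, matching the leading term 0.125 * x**2
```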

Alternative 2: 99.9% accurate, 0.5× speedup

\[\begin{array}{l} \\ \begin{array}{l} t_0 := \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\\ \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\ \;\;\;\;0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)\\ \mathbf{else}:\\ \;\;\;\;\frac{1}{\frac{1 + \sqrt{0.5 + t_0}}{0.5 - t_0}}\\ \end{array} \end{array} \]
(FPCore (x)
 :precision binary64
 (let* ((t_0 (/ 0.5 (hypot 1.0 x))))
   (if (<= (hypot 1.0 x) 1.1)
     (+
      (* 0.125 (pow x 2.0))
      (+ (* 0.0673828125 (pow x 6.0)) (* -0.0859375 (pow x 4.0))))
     (/ 1.0 (/ (+ 1.0 (sqrt (+ 0.5 t_0))) (- 0.5 t_0))))))
double code(double x) {
	double t_0 = 0.5 / hypot(1.0, x);
	double tmp;
	if (hypot(1.0, x) <= 1.1) {
		tmp = (0.125 * pow(x, 2.0)) + ((0.0673828125 * pow(x, 6.0)) + (-0.0859375 * pow(x, 4.0)));
	} else {
		tmp = 1.0 / ((1.0 + sqrt((0.5 + t_0))) / (0.5 - t_0));
	}
	return tmp;
}
public static double code(double x) {
	double t_0 = 0.5 / Math.hypot(1.0, x);
	double tmp;
	if (Math.hypot(1.0, x) <= 1.1) {
		tmp = (0.125 * Math.pow(x, 2.0)) + ((0.0673828125 * Math.pow(x, 6.0)) + (-0.0859375 * Math.pow(x, 4.0)));
	} else {
		tmp = 1.0 / ((1.0 + Math.sqrt((0.5 + t_0))) / (0.5 - t_0));
	}
	return tmp;
}
def code(x):
	t_0 = 0.5 / math.hypot(1.0, x)
	tmp = 0
	if math.hypot(1.0, x) <= 1.1:
		tmp = (0.125 * math.pow(x, 2.0)) + ((0.0673828125 * math.pow(x, 6.0)) + (-0.0859375 * math.pow(x, 4.0)))
	else:
		tmp = 1.0 / ((1.0 + math.sqrt((0.5 + t_0))) / (0.5 - t_0))
	return tmp
function code(x)
	t_0 = Float64(0.5 / hypot(1.0, x))
	tmp = 0.0
	if (hypot(1.0, x) <= 1.1)
		tmp = Float64(Float64(0.125 * (x ^ 2.0)) + Float64(Float64(0.0673828125 * (x ^ 6.0)) + Float64(-0.0859375 * (x ^ 4.0))));
	else
		tmp = Float64(1.0 / Float64(Float64(1.0 + sqrt(Float64(0.5 + t_0))) / Float64(0.5 - t_0)));
	end
	return tmp
end
function tmp_2 = code(x)
	t_0 = 0.5 / hypot(1.0, x);
	tmp = 0.0;
	if (hypot(1.0, x) <= 1.1)
		tmp = (0.125 * (x ^ 2.0)) + ((0.0673828125 * (x ^ 6.0)) + (-0.0859375 * (x ^ 4.0)));
	else
		tmp = 1.0 / ((1.0 + sqrt((0.5 + t_0))) / (0.5 - t_0));
	end
	tmp_2 = tmp;
end
code[x_] := Block[{t$95$0 = N[(0.5 / N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision]), $MachinePrecision]}, If[LessEqual[N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision], 1.1], N[(N[(0.125 * N[Power[x, 2.0], $MachinePrecision]), $MachinePrecision] + N[(N[(0.0673828125 * N[Power[x, 6.0], $MachinePrecision]), $MachinePrecision] + N[(-0.0859375 * N[Power[x, 4.0], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]), $MachinePrecision], N[(1.0 / N[(N[(1.0 + N[Sqrt[N[(0.5 + t$95$0), $MachinePrecision]], $MachinePrecision]), $MachinePrecision] / N[(0.5 - t$95$0), $MachinePrecision]), $MachinePrecision]), $MachinePrecision]]]
\begin{array}{l}

\\
\begin{array}{l}
t_0 := \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\\
\mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\
\;\;\;\;0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)\\

\mathbf{else}:\\
\;\;\;\;\frac{1}{\frac{1 + \sqrt{0.5 + t_0}}{0.5 - t_0}}\\


\end{array}
\end{array}
Derivation
  1. Split input into 2 regimes
  2. if (hypot.f64 1 x) < 1.1000000000000001

    1. Initial program 58.9%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in (58.9%)

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval (58.9%)

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ (58.9%)

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval (58.9%)

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified (58.9%)

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around 0 100.0%

      \[\leadsto \color{blue}{0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)} \]

    if 1.1000000000000001 < (hypot.f64 1 x)

    1. Initial program 98.4%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in (98.4%)

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval (98.4%)

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ (98.4%)

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval (98.4%)

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified (98.4%)

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Step-by-step derivation
      1. flip-- (98.5%)

        \[\leadsto \color{blue}{\frac{1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      2. div-inv (98.5%)

        \[\leadsto \color{blue}{\left(1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      3. metadata-eval (98.5%)

        \[\leadsto \left(\color{blue}{1} - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. add-sqr-sqrt (100.0%)

        \[\leadsto \left(1 - \color{blue}{\left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      5. associate--r+ (100.0%)

        \[\leadsto \color{blue}{\left(\left(1 - 0.5\right) - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      6. metadata-eval (100.0%)

        \[\leadsto \left(\color{blue}{0.5} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    5. Applied egg-rr (100.0%)

      \[\leadsto \color{blue}{\left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    6. Step-by-step derivation
      1. *-commutative (100.0%)

        \[\leadsto \color{blue}{\frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \cdot \left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \]
      2. associate-/r/ (100.0%)

        \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    7. Simplified (100.0%)

      \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification (100.0%)

    \[\leadsto \begin{array}{l} \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\ \;\;\;\;0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)\\ \mathbf{else}:\\ \;\;\;\;\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}\\ \end{array} \]

Alternative 3: 99.9% accurate, 0.5× speedup

\[\begin{array}{l} \\ \begin{array}{l} t_0 := \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\\ \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\ \;\;\;\;0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)\\ \mathbf{else}:\\ \;\;\;\;\frac{0.5 - t_0}{1 + \sqrt{0.5 + t_0}}\\ \end{array} \end{array} \]
(FPCore (x)
 :precision binary64
 (let* ((t_0 (/ 0.5 (hypot 1.0 x))))
   (if (<= (hypot 1.0 x) 1.1)
     (+
      (* 0.125 (pow x 2.0))
      (+ (* 0.0673828125 (pow x 6.0)) (* -0.0859375 (pow x 4.0))))
     (/ (- 0.5 t_0) (+ 1.0 (sqrt (+ 0.5 t_0)))))))
double code(double x) {
	double t_0 = 0.5 / hypot(1.0, x);
	double tmp;
	if (hypot(1.0, x) <= 1.1) {
		tmp = (0.125 * pow(x, 2.0)) + ((0.0673828125 * pow(x, 6.0)) + (-0.0859375 * pow(x, 4.0)));
	} else {
		tmp = (0.5 - t_0) / (1.0 + sqrt((0.5 + t_0)));
	}
	return tmp;
}
public static double code(double x) {
	double t_0 = 0.5 / Math.hypot(1.0, x);
	double tmp;
	if (Math.hypot(1.0, x) <= 1.1) {
		tmp = (0.125 * Math.pow(x, 2.0)) + ((0.0673828125 * Math.pow(x, 6.0)) + (-0.0859375 * Math.pow(x, 4.0)));
	} else {
		tmp = (0.5 - t_0) / (1.0 + Math.sqrt((0.5 + t_0)));
	}
	return tmp;
}
def code(x):
	t_0 = 0.5 / math.hypot(1.0, x)
	tmp = 0
	if math.hypot(1.0, x) <= 1.1:
		tmp = (0.125 * math.pow(x, 2.0)) + ((0.0673828125 * math.pow(x, 6.0)) + (-0.0859375 * math.pow(x, 4.0)))
	else:
		tmp = (0.5 - t_0) / (1.0 + math.sqrt((0.5 + t_0)))
	return tmp
function code(x)
	t_0 = Float64(0.5 / hypot(1.0, x))
	tmp = 0.0
	if (hypot(1.0, x) <= 1.1)
		tmp = Float64(Float64(0.125 * (x ^ 2.0)) + Float64(Float64(0.0673828125 * (x ^ 6.0)) + Float64(-0.0859375 * (x ^ 4.0))));
	else
		tmp = Float64(Float64(0.5 - t_0) / Float64(1.0 + sqrt(Float64(0.5 + t_0))));
	end
	return tmp
end
function tmp_2 = code(x)
	t_0 = 0.5 / hypot(1.0, x);
	tmp = 0.0;
	if (hypot(1.0, x) <= 1.1)
		tmp = (0.125 * (x ^ 2.0)) + ((0.0673828125 * (x ^ 6.0)) + (-0.0859375 * (x ^ 4.0)));
	else
		tmp = (0.5 - t_0) / (1.0 + sqrt((0.5 + t_0)));
	end
	tmp_2 = tmp;
end
code[x_] := Block[{t$95$0 = N[(0.5 / N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision]), $MachinePrecision]}, If[LessEqual[N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision], 1.1], N[(N[(0.125 * N[Power[x, 2.0], $MachinePrecision]), $MachinePrecision] + N[(N[(0.0673828125 * N[Power[x, 6.0], $MachinePrecision]), $MachinePrecision] + N[(-0.0859375 * N[Power[x, 4.0], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]), $MachinePrecision], N[(N[(0.5 - t$95$0), $MachinePrecision] / N[(1.0 + N[Sqrt[N[(0.5 + t$95$0), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]]]
\begin{array}{l}

\\
\begin{array}{l}
t_0 := \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\\
\mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\
\;\;\;\;0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)\\

\mathbf{else}:\\
\;\;\;\;\frac{0.5 - t_0}{1 + \sqrt{0.5 + t_0}}\\


\end{array}
\end{array}
Derivation
  1. Split input into 2 regimes
  2. if (hypot.f64 1 x) < 1.1000000000000001

    1. Initial program 58.9%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in (58.9%)

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval (58.9%)

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ (58.9%)

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval (58.9%)

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified (58.9%)

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around 0 100.0%

      \[\leadsto \color{blue}{0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)} \]

    if 1.1000000000000001 < (hypot.f64 1 x)

    1. Initial program 98.4%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in (98.4%)

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval (98.4%)

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ (98.4%)

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval (98.4%)

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified (98.4%)

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Step-by-step derivation
      1. flip-- (98.5%)

        \[\leadsto \color{blue}{\frac{1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      2. metadata-eval (98.5%)

        \[\leadsto \frac{\color{blue}{1} - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      3. add-sqr-sqrt (100.0%)

        \[\leadsto \frac{1 - \color{blue}{\left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. associate--r+ (100.0%)

        \[\leadsto \frac{\color{blue}{\left(1 - 0.5\right) - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      5. metadata-eval (100.0%)

        \[\leadsto \frac{\color{blue}{0.5} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    5. Applied egg-rr (100.0%)

      \[\leadsto \color{blue}{\frac{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification (100.0%)

    \[\leadsto \begin{array}{l} \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\ \;\;\;\;0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)\\ \mathbf{else}:\\ \;\;\;\;\frac{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}\\ \end{array} \]
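Alternative 3's else branch is the standard conjugate rewrite: multiplying 1 - sqrt(s) by (1 + sqrt(s))/(1 + sqrt(s)) replaces a subtraction of nearly equal quantities with a benign division. A minimal check that the rewrite agrees with the original formula in the regime where the original is already accurate (function names and tolerance are my choices):

```python
import math

def alt3(x):
    t0 = 0.5 / math.hypot(1.0, x)
    if math.hypot(1.0, x) <= 1.1:
        return 0.125 * x**2 + (0.0673828125 * x**6 + -0.0859375 * x**4)
    # Conjugate rewrite of 1 - sqrt(0.5 + t0)
    return (0.5 - t0) / (1.0 + math.sqrt(0.5 + t0))

def naive(x):
    return 1.0 - math.sqrt(0.5 * (1.0 + 1.0 / math.hypot(1.0, x)))

# For moderate-to-large x the naive form is accurate, so the rewrite
# should agree with it to near machine precision.
for x in (1.0, 2.0, 10.0, 1e6):
    assert math.isclose(alt3(x), naive(x), rel_tol=1e-12)
print("branches agree on the else regime")
```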

Alternative 4: 99.1% accurate, 0.5× speedup

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\ \;\;\;\;0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)\\ \mathbf{else}:\\ \;\;\;\;\frac{1}{\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\\ \end{array} \end{array} \]
(FPCore (x)
 :precision binary64
 (if (<= (hypot 1.0 x) 1.1)
   (+
    (* 0.125 (pow x 2.0))
    (+ (* 0.0673828125 (pow x 6.0)) (* -0.0859375 (pow x 4.0))))
   (/ 1.0 (/ 1.0 (- 1.0 (sqrt (+ 0.5 (/ 0.5 (hypot 1.0 x)))))))))
double code(double x) {
	double tmp;
	if (hypot(1.0, x) <= 1.1) {
		tmp = (0.125 * pow(x, 2.0)) + ((0.0673828125 * pow(x, 6.0)) + (-0.0859375 * pow(x, 4.0)));
	} else {
		tmp = 1.0 / (1.0 / (1.0 - sqrt((0.5 + (0.5 / hypot(1.0, x))))));
	}
	return tmp;
}
public static double code(double x) {
	double tmp;
	if (Math.hypot(1.0, x) <= 1.1) {
		tmp = (0.125 * Math.pow(x, 2.0)) + ((0.0673828125 * Math.pow(x, 6.0)) + (-0.0859375 * Math.pow(x, 4.0)));
	} else {
		tmp = 1.0 / (1.0 / (1.0 - Math.sqrt((0.5 + (0.5 / Math.hypot(1.0, x))))));
	}
	return tmp;
}
def code(x):
	tmp = 0
	if math.hypot(1.0, x) <= 1.1:
		tmp = (0.125 * math.pow(x, 2.0)) + ((0.0673828125 * math.pow(x, 6.0)) + (-0.0859375 * math.pow(x, 4.0)))
	else:
		tmp = 1.0 / (1.0 / (1.0 - math.sqrt((0.5 + (0.5 / math.hypot(1.0, x))))))
	return tmp
function code(x)
	tmp = 0.0
	if (hypot(1.0, x) <= 1.1)
		tmp = Float64(Float64(0.125 * (x ^ 2.0)) + Float64(Float64(0.0673828125 * (x ^ 6.0)) + Float64(-0.0859375 * (x ^ 4.0))));
	else
		tmp = Float64(1.0 / Float64(1.0 / Float64(1.0 - sqrt(Float64(0.5 + Float64(0.5 / hypot(1.0, x)))))));
	end
	return tmp
end
function tmp_2 = code(x)
	tmp = 0.0;
	if (hypot(1.0, x) <= 1.1)
		tmp = (0.125 * (x ^ 2.0)) + ((0.0673828125 * (x ^ 6.0)) + (-0.0859375 * (x ^ 4.0)));
	else
		tmp = 1.0 / (1.0 / (1.0 - sqrt((0.5 + (0.5 / hypot(1.0, x))))));
	end
	tmp_2 = tmp;
end
code[x_] := If[LessEqual[N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision], 1.1], N[(N[(0.125 * N[Power[x, 2.0], $MachinePrecision]), $MachinePrecision] + N[(N[(0.0673828125 * N[Power[x, 6.0], $MachinePrecision]), $MachinePrecision] + N[(-0.0859375 * N[Power[x, 4.0], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]), $MachinePrecision], N[(1.0 / N[(1.0 / N[(1.0 - N[Sqrt[N[(0.5 + N[(0.5 / N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]), $MachinePrecision]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\
\;\;\;\;0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)\\

\mathbf{else}:\\
\;\;\;\;\frac{1}{\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\\


\end{array}
\end{array}
Derivation
  1. Split input into 2 regimes
  2. if (hypot.f64 1 x) < 1.1000000000000001

    1. Initial program 58.9%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 58.9%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 58.9%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 58.9%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 58.9%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 58.9%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around 0 100.0%

      \[\leadsto \color{blue}{0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)} \]

    if 1.1000000000000001 < (hypot.f64 1 x)

    1. Initial program 98.4%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 98.4%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 98.4%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 98.4%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 98.4%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 98.4%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Step-by-step derivation
      1. flip-- 98.5%

        \[\leadsto \color{blue}{\frac{1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      2. div-inv 98.5%

        \[\leadsto \color{blue}{\left(1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      3. metadata-eval 98.5%

        \[\leadsto \left(\color{blue}{1} - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. add-sqr-sqrt 100.0%

        \[\leadsto \left(1 - \color{blue}{\left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      5. associate--r+ 100.0%

        \[\leadsto \color{blue}{\left(\left(1 - 0.5\right) - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      6. metadata-eval 100.0%

        \[\leadsto \left(\color{blue}{0.5} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    5. Applied egg-rr 100.0%

      \[\leadsto \color{blue}{\left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    6. Step-by-step derivation
      1. *-commutative 100.0%

        \[\leadsto \color{blue}{\frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \cdot \left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \]
      2. associate-/r/ 100.0%

        \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    7. Simplified 100.0%

      \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    8. Step-by-step derivation
      1. expm1-log1p-u 98.4%

        \[\leadsto \frac{1}{\color{blue}{\mathsf{expm1}\left(\mathsf{log1p}\left(\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right)\right)}} \]
      2. expm1-udef 98.4%

        \[\leadsto \frac{1}{\color{blue}{e^{\mathsf{log1p}\left(\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right)} - 1}} \]
      3. clear-num 98.4%

        \[\leadsto \frac{1}{e^{\mathsf{log1p}\left(\color{blue}{\frac{1}{\frac{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}}\right)} - 1} \]
      4. metadata-eval 98.4%

        \[\leadsto \frac{1}{e^{\mathsf{log1p}\left(\frac{1}{\frac{\color{blue}{\left(1 - 0.5\right)} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\right)} - 1} \]
      5. associate--r+ 98.4%

        \[\leadsto \frac{1}{e^{\mathsf{log1p}\left(\frac{1}{\frac{\color{blue}{1 - \left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\right)} - 1} \]
      6. metadata-eval 98.4%

        \[\leadsto \frac{1}{e^{\mathsf{log1p}\left(\frac{1}{\frac{\color{blue}{1 \cdot 1} - \left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\right)} - 1} \]
      7. add-sqr-sqrt 98.4%

        \[\leadsto \frac{1}{e^{\mathsf{log1p}\left(\frac{1}{\frac{1 \cdot 1 - \color{blue}{\sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\right)} - 1} \]
      8. flip-- 98.4%

        \[\leadsto \frac{1}{e^{\mathsf{log1p}\left(\frac{1}{\color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\right)} - 1} \]
    9. Applied egg-rr 98.4%

      \[\leadsto \frac{1}{\color{blue}{e^{\mathsf{log1p}\left(\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}\right)} - 1}} \]
    10. Step-by-step derivation
      1. expm1-def 98.4%

        \[\leadsto \frac{1}{\color{blue}{\mathsf{expm1}\left(\mathsf{log1p}\left(\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}\right)\right)}} \]
      2. expm1-log1p 98.5%

        \[\leadsto \frac{1}{\color{blue}{\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}} \]
    11. Simplified 98.5%

      \[\leadsto \frac{1}{\color{blue}{\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification 99.3%

    \[\leadsto \begin{array}{l} \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\ \;\;\;\;0.125 \cdot {x}^{2} + \left(0.0673828125 \cdot {x}^{6} + -0.0859375 \cdot {x}^{4}\right)\\ \mathbf{else}:\\ \;\;\;\;\frac{1}{\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\\ \end{array} \]
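The two regimes meet at hypot(1, x) = 1.1, i.e. x = sqrt(1.1² − 1) ≈ 0.4583. A quick continuity check at that cutoff (our own sketch, not part of the report; variable names are illustrative) shows the polynomial and direct branches agree to roughly 1e-4 there, which is the Taylor truncation error at the edge of the small-x regime:

```python
import math

# x at the regime boundary: hypot(1, x) == 1.1
x = math.sqrt(1.1**2 - 1.0)

# small-x branch: three-term Taylor polynomial
poly = 0.125 * x**2 + (0.0673828125 * x**6 + -0.0859375 * x**4)

# large-x branch of Alternative 4 (the double reciprocal is an identity)
h = math.hypot(1.0, x)
direct = 1.0 / (1.0 / (1.0 - math.sqrt(0.5 + 0.5 / h)))

print(abs(poly - direct))  # ~9e-5: the branches agree to about 1e-4 at the cutoff
```

Note the else branch here is just the simplified expression wrapped in 1/(1/·), a mathematical identity; the two extra divisions account for Alternative 4's 0.5× speedup.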

Alternative 5: 99.0% accurate, 0.7× speedup

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\ \;\;\;\;\mathsf{fma}\left(0.125, x \cdot x, -0.0859375 \cdot {x}^{4}\right)\\ \mathbf{else}:\\ \;\;\;\;\frac{1}{\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\\ \end{array} \end{array} \]
(FPCore (x)
 :precision binary64
 (if (<= (hypot 1.0 x) 1.1)
   (fma 0.125 (* x x) (* -0.0859375 (pow x 4.0)))
   (/ 1.0 (/ 1.0 (- 1.0 (sqrt (+ 0.5 (/ 0.5 (hypot 1.0 x)))))))))
double code(double x) {
	double tmp;
	if (hypot(1.0, x) <= 1.1) {
		tmp = fma(0.125, (x * x), (-0.0859375 * pow(x, 4.0)));
	} else {
		tmp = 1.0 / (1.0 / (1.0 - sqrt((0.5 + (0.5 / hypot(1.0, x))))));
	}
	return tmp;
}
function code(x)
	tmp = 0.0
	if (hypot(1.0, x) <= 1.1)
		tmp = fma(0.125, Float64(x * x), Float64(-0.0859375 * (x ^ 4.0)));
	else
		tmp = Float64(1.0 / Float64(1.0 / Float64(1.0 - sqrt(Float64(0.5 + Float64(0.5 / hypot(1.0, x)))))));
	end
	return tmp
end
code[x_] := If[LessEqual[N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision], 1.1], N[(0.125 * N[(x * x), $MachinePrecision] + N[(-0.0859375 * N[Power[x, 4.0], $MachinePrecision]), $MachinePrecision]), $MachinePrecision], N[(1.0 / N[(1.0 / N[(1.0 - N[Sqrt[N[(0.5 + N[(0.5 / N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]), $MachinePrecision]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\
\;\;\;\;\mathsf{fma}\left(0.125, x \cdot x, -0.0859375 \cdot {x}^{4}\right)\\

\mathbf{else}:\\
\;\;\;\;\frac{1}{\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\\


\end{array}
\end{array}
Derivation
  1. Split input into 2 regimes
  2. if (hypot.f64 1 x) < 1.1000000000000001

    1. Initial program 58.9%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 58.9%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 58.9%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 58.9%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 58.9%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 58.9%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around 0 99.9%

      \[\leadsto \color{blue}{0.125 \cdot {x}^{2} + -0.0859375 \cdot {x}^{4}} \]
    5. Step-by-step derivation
      1. fma-def 99.9%

        \[\leadsto \color{blue}{\mathsf{fma}\left(0.125, {x}^{2}, -0.0859375 \cdot {x}^{4}\right)} \]
      2. unpow2 99.9%

        \[\leadsto \mathsf{fma}\left(0.125, \color{blue}{x \cdot x}, -0.0859375 \cdot {x}^{4}\right) \]
    6. Simplified 99.9%

      \[\leadsto \color{blue}{\mathsf{fma}\left(0.125, x \cdot x, -0.0859375 \cdot {x}^{4}\right)} \]

    if 1.1000000000000001 < (hypot.f64 1 x)

    1. Initial program 98.4%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 98.4%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 98.4%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 98.4%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 98.4%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 98.4%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Step-by-step derivation
      1. flip-- 98.5%

        \[\leadsto \color{blue}{\frac{1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      2. div-inv 98.5%

        \[\leadsto \color{blue}{\left(1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      3. metadata-eval 98.5%

        \[\leadsto \left(\color{blue}{1} - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. add-sqr-sqrt 100.0%

        \[\leadsto \left(1 - \color{blue}{\left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      5. associate--r+ 100.0%

        \[\leadsto \color{blue}{\left(\left(1 - 0.5\right) - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      6. metadata-eval 100.0%

        \[\leadsto \left(\color{blue}{0.5} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    5. Applied egg-rr 100.0%

      \[\leadsto \color{blue}{\left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    6. Step-by-step derivation
      1. *-commutative 100.0%

        \[\leadsto \color{blue}{\frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \cdot \left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \]
      2. associate-/r/ 100.0%

        \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    7. Simplified 100.0%

      \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    8. Step-by-step derivation
      1. expm1-log1p-u 98.4%

        \[\leadsto \frac{1}{\color{blue}{\mathsf{expm1}\left(\mathsf{log1p}\left(\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right)\right)}} \]
      2. expm1-udef 98.4%

        \[\leadsto \frac{1}{\color{blue}{e^{\mathsf{log1p}\left(\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right)} - 1}} \]
      3. clear-num 98.4%

        \[\leadsto \frac{1}{e^{\mathsf{log1p}\left(\color{blue}{\frac{1}{\frac{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}}\right)} - 1} \]
      4. metadata-eval 98.4%

        \[\leadsto \frac{1}{e^{\mathsf{log1p}\left(\frac{1}{\frac{\color{blue}{\left(1 - 0.5\right)} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\right)} - 1} \]
      5. associate--r+ 98.4%

        \[\leadsto \frac{1}{e^{\mathsf{log1p}\left(\frac{1}{\frac{\color{blue}{1 - \left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\right)} - 1} \]
      6. metadata-eval 98.4%

        \[\leadsto \frac{1}{e^{\mathsf{log1p}\left(\frac{1}{\frac{\color{blue}{1 \cdot 1} - \left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\right)} - 1} \]
      7. add-sqr-sqrt 98.4%

        \[\leadsto \frac{1}{e^{\mathsf{log1p}\left(\frac{1}{\frac{1 \cdot 1 - \color{blue}{\sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\right)} - 1} \]
      8. flip-- 98.4%

        \[\leadsto \frac{1}{e^{\mathsf{log1p}\left(\frac{1}{\color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\right)} - 1} \]
    9. Applied egg-rr 98.4%

      \[\leadsto \frac{1}{\color{blue}{e^{\mathsf{log1p}\left(\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}\right)} - 1}} \]
    10. Step-by-step derivation
      1. expm1-def 98.4%

        \[\leadsto \frac{1}{\color{blue}{\mathsf{expm1}\left(\mathsf{log1p}\left(\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}\right)\right)}} \]
      2. expm1-log1p 98.5%

        \[\leadsto \frac{1}{\color{blue}{\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}} \]
    11. Simplified 98.5%

      \[\leadsto \frac{1}{\color{blue}{\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification 99.3%

    \[\leadsto \begin{array}{l} \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\ \;\;\;\;\mathsf{fma}\left(0.125, x \cdot x, -0.0859375 \cdot {x}^{4}\right)\\ \mathbf{else}:\\ \;\;\;\;\frac{1}{\frac{1}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}}\\ \end{array} \]
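Alternative 5 drops the x⁶ term and fuses the remaining two into an fma. Since Python's `math.fma` only arrived in 3.13, the sketch below (our own, with illustrative names `taylor2` and `taylor3`) evaluates the same two-term polynomial unfused and measures what dropping the x⁶ term costs against Alternative 4's three-term series:

```python
import math

def taylor2(x):
    # two-term series used in Alternative 5, written without the fused multiply-add
    return 0.125 * (x * x) + -0.0859375 * x**4

def taylor3(x):
    # three-term series used in Alternative 4
    return 0.125 * x**2 + (0.0673828125 * x**6 + -0.0859375 * x**4)

x = 0.1
print(abs(taylor2(x) - taylor3(x)))  # ~6.7e-8: the dropped term 0.0673828125 * x**6
```

The dropped term shrinks like x⁶, so it only matters near the top of the small-x regime; that is why the two alternatives score within 0.1% of each other.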

Alternative 6: 99.0% accurate, 0.7× speedup

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.0000002:\\ \;\;\;\;0.125 \cdot \left(x \cdot x\right)\\ \mathbf{else}:\\ \;\;\;\;1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\\ \end{array} \end{array} \]
(FPCore (x)
 :precision binary64
 (if (<= (hypot 1.0 x) 1.0000002)
   (* 0.125 (* x x))
   (- 1.0 (sqrt (+ 0.5 (/ 0.5 (hypot 1.0 x)))))))
double code(double x) {
	double tmp;
	if (hypot(1.0, x) <= 1.0000002) {
		tmp = 0.125 * (x * x);
	} else {
		tmp = 1.0 - sqrt((0.5 + (0.5 / hypot(1.0, x))));
	}
	return tmp;
}
public static double code(double x) {
	double tmp;
	if (Math.hypot(1.0, x) <= 1.0000002) {
		tmp = 0.125 * (x * x);
	} else {
		tmp = 1.0 - Math.sqrt((0.5 + (0.5 / Math.hypot(1.0, x))));
	}
	return tmp;
}
def code(x):
	tmp = 0
	if math.hypot(1.0, x) <= 1.0000002:
		tmp = 0.125 * (x * x)
	else:
		tmp = 1.0 - math.sqrt((0.5 + (0.5 / math.hypot(1.0, x))))
	return tmp
function code(x)
	tmp = 0.0
	if (hypot(1.0, x) <= 1.0000002)
		tmp = Float64(0.125 * Float64(x * x));
	else
		tmp = Float64(1.0 - sqrt(Float64(0.5 + Float64(0.5 / hypot(1.0, x)))));
	end
	return tmp
end
function tmp_2 = code(x)
	tmp = 0.0;
	if (hypot(1.0, x) <= 1.0000002)
		tmp = 0.125 * (x * x);
	else
		tmp = 1.0 - sqrt((0.5 + (0.5 / hypot(1.0, x))));
	end
	tmp_2 = tmp;
end
code[x_] := If[LessEqual[N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision], 1.0000002], N[(0.125 * N[(x * x), $MachinePrecision]), $MachinePrecision], N[(1.0 - N[Sqrt[N[(0.5 + N[(0.5 / N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.0000002:\\
\;\;\;\;0.125 \cdot \left(x \cdot x\right)\\

\mathbf{else}:\\
\;\;\;\;1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\\


\end{array}
\end{array}
Derivation
  1. Split input into 2 regimes
  2. if (hypot.f64 1 x) < 1.00000019999999989

    1. Initial program 58.9%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 58.9%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 58.9%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 58.9%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 58.9%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 58.9%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around 0 99.8%

      \[\leadsto \color{blue}{0.125 \cdot {x}^{2}} \]
    5. Step-by-step derivation
      1. unpow2 99.8%

        \[\leadsto 0.125 \cdot \color{blue}{\left(x \cdot x\right)} \]
    6. Simplified 99.8%

      \[\leadsto \color{blue}{0.125 \cdot \left(x \cdot x\right)} \]

    if 1.00000019999999989 < (hypot.f64 1 x)

    1. Initial program 98.2%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 98.2%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 98.2%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 98.2%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 98.2%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 98.2%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification 99.1%

    \[\leadsto \begin{array}{l} \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.0000002:\\ \;\;\;\;0.125 \cdot \left(x \cdot x\right)\\ \mathbf{else}:\\ \;\;\;\;1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\\ \end{array} \]
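Alternative 6 uses a much tighter cutoff than the 1.1 of the earlier branches. The condition hypot(1, x) ≤ 1.0000002 can be translated into a bound on |x|; the sketch below (ours, not Herbie's) does the arithmetic and checks that the single Taylor term x²/8 is adequate inside that band:

```python
import math

# hypot(1, x) <= 1.0000002  means  sqrt(1 + x**2) <= 1.0000002,
# i.e. |x| <= sqrt(1.0000002**2 - 1)
x_max = math.sqrt(1.0000002**2 - 1.0)
print(x_max)  # ~6.32e-4

# Inside the band, the relative truncation error of x**2/8 is about
# (11/16) * x**2 (from the next series term, -11/128 * x**4): at most ~2.8e-7.
x = x_max
approx = 0.125 * (x * x)
exact = 1.0 - math.sqrt(0.5 + 0.5 / math.hypot(1.0, x))
print(abs(approx - exact) / exact)
```

Such a narrow polynomial regime keeps the single term accurate while letting the cheap direct expression cover everything else, which is where the speedup comes from.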

Alternative 7: 99.0% accurate, 0.7× speedup

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\ \;\;\;\;\mathsf{fma}\left(0.125, x \cdot x, -0.0859375 \cdot {x}^{4}\right)\\ \mathbf{else}:\\ \;\;\;\;1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\\ \end{array} \end{array} \]
(FPCore (x)
 :precision binary64
 (if (<= (hypot 1.0 x) 1.1)
   (fma 0.125 (* x x) (* -0.0859375 (pow x 4.0)))
   (- 1.0 (sqrt (+ 0.5 (/ 0.5 (hypot 1.0 x)))))))
double code(double x) {
	double tmp;
	if (hypot(1.0, x) <= 1.1) {
		tmp = fma(0.125, (x * x), (-0.0859375 * pow(x, 4.0)));
	} else {
		tmp = 1.0 - sqrt((0.5 + (0.5 / hypot(1.0, x))));
	}
	return tmp;
}
function code(x)
	tmp = 0.0
	if (hypot(1.0, x) <= 1.1)
		tmp = fma(0.125, Float64(x * x), Float64(-0.0859375 * (x ^ 4.0)));
	else
		tmp = Float64(1.0 - sqrt(Float64(0.5 + Float64(0.5 / hypot(1.0, x)))));
	end
	return tmp
end
code[x_] := If[LessEqual[N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision], 1.1], N[(0.125 * N[(x * x), $MachinePrecision] + N[(-0.0859375 * N[Power[x, 4.0], $MachinePrecision]), $MachinePrecision]), $MachinePrecision], N[(1.0 - N[Sqrt[N[(0.5 + N[(0.5 / N[Sqrt[1.0 ^ 2 + x ^ 2], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\
\;\;\;\;\mathsf{fma}\left(0.125, x \cdot x, -0.0859375 \cdot {x}^{4}\right)\\

\mathbf{else}:\\
\;\;\;\;1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\\


\end{array}
\end{array}
Derivation
  1. Split input into 2 regimes
  2. if (hypot.f64 1 x) < 1.1000000000000001

    1. Initial program 58.9%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 58.9%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 58.9%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 58.9%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 58.9%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 58.9%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around 0 99.9%

      \[\leadsto \color{blue}{0.125 \cdot {x}^{2} + -0.0859375 \cdot {x}^{4}} \]
    5. Step-by-step derivation
      1. fma-def 99.9%

        \[\leadsto \color{blue}{\mathsf{fma}\left(0.125, {x}^{2}, -0.0859375 \cdot {x}^{4}\right)} \]
      2. unpow2 99.9%

        \[\leadsto \mathsf{fma}\left(0.125, \color{blue}{x \cdot x}, -0.0859375 \cdot {x}^{4}\right) \]
    6. Simplified 99.9%

      \[\leadsto \color{blue}{\mathsf{fma}\left(0.125, x \cdot x, -0.0859375 \cdot {x}^{4}\right)} \]

    if 1.1000000000000001 < (hypot.f64 1 x)

    1. Initial program 98.4%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 98.4%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 98.4%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 98.4%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 98.4%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 98.4%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification 99.3%

    \[\leadsto \begin{array}{l} \mathbf{if}\;\mathsf{hypot}\left(1, x\right) \leq 1.1:\\ \;\;\;\;\mathsf{fma}\left(0.125, x \cdot x, -0.0859375 \cdot {x}^{4}\right)\\ \mathbf{else}:\\ \;\;\;\;1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\\ \end{array} \]
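The else branch of Alternative 7 is the original expression with the factor 0.5 distributed through the sum, which is algebraically an identity. A quick sanity check (our own sketch; `original` and `simplified` are illustrative names) confirms the two forms agree to machine precision away from the cancellation region:

```python
import math

def original(x):
    # the initial program
    return 1.0 - math.sqrt(0.5 * (1.0 + 1.0 / math.hypot(1.0, x)))

def simplified(x):
    # else-branch of Alternative 7: same expression with 0.5 distributed
    return 1.0 - math.sqrt(0.5 + 0.5 / math.hypot(1.0, x))

for x in (2.0, 10.0, 1e6):
    print(original(x), simplified(x))  # pairs agree to within a few ulps
```

The distributed form saves one multiplication per call; the accuracy gain in this alternative comes entirely from the fma polynomial on the small-x branch.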

Alternative 8: 98.4% accurate, 1.8× speedup

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;x \leq -1.5:\\ \;\;\;\;\frac{0.5}{1 + \sqrt{0.5}}\\ \mathbf{elif}\;x \leq 1.25:\\ \;\;\;\;0.125 \cdot \left(x \cdot x\right)\\ \mathbf{else}:\\ \;\;\;\;1 + \left(1 - \left(1 + \sqrt{0.5 + \frac{0.5}{x}}\right)\right)\\ \end{array} \end{array} \]
(FPCore (x)
 :precision binary64
 (if (<= x -1.5)
   (/ 0.5 (+ 1.0 (sqrt 0.5)))
   (if (<= x 1.25)
     (* 0.125 (* x x))
     (+ 1.0 (- 1.0 (+ 1.0 (sqrt (+ 0.5 (/ 0.5 x)))))))))
double code(double x) {
	double tmp;
	if (x <= -1.5) {
		tmp = 0.5 / (1.0 + sqrt(0.5));
	} else if (x <= 1.25) {
		tmp = 0.125 * (x * x);
	} else {
		tmp = 1.0 + (1.0 - (1.0 + sqrt((0.5 + (0.5 / x)))));
	}
	return tmp;
}
real(8) function code(x)
    real(8), intent (in) :: x
    real(8) :: tmp
    if (x <= (-1.5d0)) then
        tmp = 0.5d0 / (1.0d0 + sqrt(0.5d0))
    else if (x <= 1.25d0) then
        tmp = 0.125d0 * (x * x)
    else
        tmp = 1.0d0 + (1.0d0 - (1.0d0 + sqrt((0.5d0 + (0.5d0 / x)))))
    end if
    code = tmp
end function
public static double code(double x) {
	double tmp;
	if (x <= -1.5) {
		tmp = 0.5 / (1.0 + Math.sqrt(0.5));
	} else if (x <= 1.25) {
		tmp = 0.125 * (x * x);
	} else {
		tmp = 1.0 + (1.0 - (1.0 + Math.sqrt((0.5 + (0.5 / x)))));
	}
	return tmp;
}
def code(x):
	tmp = 0
	if x <= -1.5:
		tmp = 0.5 / (1.0 + math.sqrt(0.5))
	elif x <= 1.25:
		tmp = 0.125 * (x * x)
	else:
		tmp = 1.0 + (1.0 - (1.0 + math.sqrt((0.5 + (0.5 / x)))))
	return tmp
function code(x)
	tmp = 0.0
	if (x <= -1.5)
		tmp = Float64(0.5 / Float64(1.0 + sqrt(0.5)));
	elseif (x <= 1.25)
		tmp = Float64(0.125 * Float64(x * x));
	else
		tmp = Float64(1.0 + Float64(1.0 - Float64(1.0 + sqrt(Float64(0.5 + Float64(0.5 / x))))));
	end
	return tmp
end
function tmp_2 = code(x)
	tmp = 0.0;
	if (x <= -1.5)
		tmp = 0.5 / (1.0 + sqrt(0.5));
	elseif (x <= 1.25)
		tmp = 0.125 * (x * x);
	else
		tmp = 1.0 + (1.0 - (1.0 + sqrt((0.5 + (0.5 / x)))));
	end
	tmp_2 = tmp;
end
code[x_] := If[LessEqual[x, -1.5], N[(0.5 / N[(1.0 + N[Sqrt[0.5], $MachinePrecision]), $MachinePrecision]), $MachinePrecision], If[LessEqual[x, 1.25], N[(0.125 * N[(x * x), $MachinePrecision]), $MachinePrecision], N[(1.0 + N[(1.0 - N[(1.0 + N[Sqrt[N[(0.5 + N[(0.5 / x), $MachinePrecision]), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]), $MachinePrecision]), $MachinePrecision]]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;x \leq -1.5:\\
\;\;\;\;\frac{0.5}{1 + \sqrt{0.5}}\\

\mathbf{elif}\;x \leq 1.25:\\
\;\;\;\;0.125 \cdot \left(x \cdot x\right)\\

\mathbf{else}:\\
\;\;\;\;1 + \left(1 - \left(1 + \sqrt{0.5 + \frac{0.5}{x}}\right)\right)\\


\end{array}
\end{array}
Derivation
  1. Split input into 3 regimes
  2. if x < -1.5

    1. Initial program 98.4%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 98.4%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 98.4%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 98.4%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 98.4%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 98.4%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Step-by-step derivation
      1. flip-- 98.5%

        \[\leadsto \color{blue}{\frac{1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      2. div-inv 98.4%

        \[\leadsto \color{blue}{\left(1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      3. metadata-eval 98.4%

        \[\leadsto \left(\color{blue}{1} - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. add-sqr-sqrt 100.0%

        \[\leadsto \left(1 - \color{blue}{\left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      5. associate--r+ 100.0%

        \[\leadsto \color{blue}{\left(\left(1 - 0.5\right) - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      6. metadata-eval 100.0%

        \[\leadsto \left(\color{blue}{0.5} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    5. Applied egg-rr 100.0%

      \[\leadsto \color{blue}{\left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    6. Step-by-step derivation
      1. *-commutative 100.0%

        \[\leadsto \color{blue}{\frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \cdot \left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \]
      2. associate-/r/ 100.0%

        \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    7. Simplified 100.0%

      \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    8. Taylor expanded in x around inf 97.7%

      \[\leadsto \color{blue}{\frac{0.5}{\sqrt{0.5} + 1}} \]

    if -1.5 < x < 1.25

    1. Initial program 59.2%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 59.2%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 59.2%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 59.2%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 59.2%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 59.2%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around 0 99.0%

      \[\leadsto \color{blue}{0.125 \cdot {x}^{2}} \]
    5. Step-by-step derivation
      1. unpow2 99.0%

        \[\leadsto 0.125 \cdot \color{blue}{\left(x \cdot x\right)} \]
    6. Simplified 99.0%

      \[\leadsto \color{blue}{0.125 \cdot \left(x \cdot x\right)} \]

    if 1.25 < x

    1. Initial program 98.5%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 98.5%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 98.5%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 98.5%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 98.5%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 98.5%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around inf 97.6%

      \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5}{x}}} \]
    5. Step-by-step derivation
      1. expm1-log1p-u 96.2%

        \[\leadsto 1 - \color{blue}{\mathsf{expm1}\left(\mathsf{log1p}\left(\sqrt{0.5 + \frac{0.5}{x}}\right)\right)} \]
      2. expm1-udef 96.2%

        \[\leadsto 1 - \color{blue}{\left(e^{\mathsf{log1p}\left(\sqrt{0.5 + \frac{0.5}{x}}\right)} - 1\right)} \]
      3. log1p-udef 97.6%

        \[\leadsto 1 - \left(e^{\color{blue}{\log \left(1 + \sqrt{0.5 + \frac{0.5}{x}}\right)}} - 1\right) \]
      4. add-exp-log 97.6%

        \[\leadsto 1 - \left(\color{blue}{\left(1 + \sqrt{0.5 + \frac{0.5}{x}}\right)} - 1\right) \]
    6. Applied egg-rr 97.6%

      \[\leadsto 1 - \color{blue}{\left(\left(1 + \sqrt{0.5 + \frac{0.5}{x}}\right) - 1\right)} \]
  3. Recombined 3 regimes into one program.
  4. Final simplification 98.4%

    \[\leadsto \begin{array}{l} \mathbf{if}\;x \leq -1.5:\\ \;\;\;\;\frac{0.5}{1 + \sqrt{0.5}}\\ \mathbf{elif}\;x \leq 1.25:\\ \;\;\;\;0.125 \cdot \left(x \cdot x\right)\\ \mathbf{else}:\\ \;\;\;\;1 + \left(1 - \left(1 + \sqrt{0.5 + \frac{0.5}{x}}\right)\right)\\ \end{array} \]
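The Taylor step above replaces the cancellation-prone middle regime with 0.125·x². A minimal Python sketch (standard library only; the function names `naive` and `taylor` are mine, not Herbie's) shows why this matters: for small x, `hypot(1, x)` rounds to exactly 1.0 in binary64, so the original program returns 0 even though the true value is about x²/8.

```python
import math

def naive(x):
    # Original program: 1 - sqrt(0.5 * (1 + 1/hypot(1, x))).
    # The sqrt argument is ~1 for small |x|, so the subtraction cancels.
    return 1.0 - math.sqrt(0.5 * (1.0 + 1.0 / math.hypot(1.0, x)))

def taylor(x):
    # Herbie's middle-regime replacement: leading Taylor term x^2 / 8.
    return 0.125 * (x * x)

x = 1e-8
print(naive(x))   # 0.0 -- hypot(1, 1e-8) rounds to 1.0 and everything cancels
print(taylor(x))  # ~1.25e-17, the correct magnitude
```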

Alternative 9: 98.5% accurate, 1.9× speedup?

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;x \leq -1.5:\\ \;\;\;\;\frac{0.5}{1 + \sqrt{0.5}}\\ \mathbf{elif}\;x \leq 1.25:\\ \;\;\;\;0.125 \cdot \left(x \cdot x\right)\\ \mathbf{else}:\\ \;\;\;\;1 - \sqrt{0.5 + \frac{0.5}{x}}\\ \end{array} \end{array} \]
(FPCore (x)
 :precision binary64
 (if (<= x -1.5)
   (/ 0.5 (+ 1.0 (sqrt 0.5)))
   (if (<= x 1.25) (* 0.125 (* x x)) (- 1.0 (sqrt (+ 0.5 (/ 0.5 x)))))))
double code(double x) {
	double tmp;
	if (x <= -1.5) {
		tmp = 0.5 / (1.0 + sqrt(0.5));
	} else if (x <= 1.25) {
		tmp = 0.125 * (x * x);
	} else {
		tmp = 1.0 - sqrt((0.5 + (0.5 / x)));
	}
	return tmp;
}
real(8) function code(x)
    real(8), intent (in) :: x
    real(8) :: tmp
    if (x <= (-1.5d0)) then
        tmp = 0.5d0 / (1.0d0 + sqrt(0.5d0))
    else if (x <= 1.25d0) then
        tmp = 0.125d0 * (x * x)
    else
        tmp = 1.0d0 - sqrt((0.5d0 + (0.5d0 / x)))
    end if
    code = tmp
end function
public static double code(double x) {
	double tmp;
	if (x <= -1.5) {
		tmp = 0.5 / (1.0 + Math.sqrt(0.5));
	} else if (x <= 1.25) {
		tmp = 0.125 * (x * x);
	} else {
		tmp = 1.0 - Math.sqrt((0.5 + (0.5 / x)));
	}
	return tmp;
}
def code(x):
	tmp = 0
	if x <= -1.5:
		tmp = 0.5 / (1.0 + math.sqrt(0.5))
	elif x <= 1.25:
		tmp = 0.125 * (x * x)
	else:
		tmp = 1.0 - math.sqrt((0.5 + (0.5 / x)))
	return tmp
function code(x)
	tmp = 0.0
	if (x <= -1.5)
		tmp = Float64(0.5 / Float64(1.0 + sqrt(0.5)));
	elseif (x <= 1.25)
		tmp = Float64(0.125 * Float64(x * x));
	else
		tmp = Float64(1.0 - sqrt(Float64(0.5 + Float64(0.5 / x))));
	end
	return tmp
end
function tmp_2 = code(x)
	tmp = 0.0;
	if (x <= -1.5)
		tmp = 0.5 / (1.0 + sqrt(0.5));
	elseif (x <= 1.25)
		tmp = 0.125 * (x * x);
	else
		tmp = 1.0 - sqrt((0.5 + (0.5 / x)));
	end
	tmp_2 = tmp;
end
code[x_] := If[LessEqual[x, -1.5], N[(0.5 / N[(1.0 + N[Sqrt[0.5], $MachinePrecision]), $MachinePrecision]), $MachinePrecision], If[LessEqual[x, 1.25], N[(0.125 * N[(x * x), $MachinePrecision]), $MachinePrecision], N[(1.0 - N[Sqrt[N[(0.5 + N[(0.5 / x), $MachinePrecision]), $MachinePrecision]], $MachinePrecision]), $MachinePrecision]]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;x \leq -1.5:\\
\;\;\;\;\frac{0.5}{1 + \sqrt{0.5}}\\

\mathbf{elif}\;x \leq 1.25:\\
\;\;\;\;0.125 \cdot \left(x \cdot x\right)\\

\mathbf{else}:\\
\;\;\;\;1 - \sqrt{0.5 + \frac{0.5}{x}}\\


\end{array}
\end{array}
Derivation
  1. Split input into 3 regimes
  2. if x < -1.5

    1. Initial program 98.4%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 98.4%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 98.4%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 98.4%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 98.4%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 98.4%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Step-by-step derivation
      1. flip-- 98.5%

        \[\leadsto \color{blue}{\frac{1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      2. div-inv 98.4%

        \[\leadsto \color{blue}{\left(1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      3. metadata-eval 98.4%

        \[\leadsto \left(\color{blue}{1} - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. add-sqr-sqrt 100.0%

        \[\leadsto \left(1 - \color{blue}{\left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      5. associate--r+ 100.0%

        \[\leadsto \color{blue}{\left(\left(1 - 0.5\right) - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      6. metadata-eval 100.0%

        \[\leadsto \left(\color{blue}{0.5} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    5. Applied egg-rr 100.0%

      \[\leadsto \color{blue}{\left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    6. Step-by-step derivation
      1. *-commutative 100.0%

        \[\leadsto \color{blue}{\frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \cdot \left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \]
      2. associate-/r/ 100.0%

        \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    7. Simplified 100.0%

      \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    8. Taylor expanded in x around inf 97.7%

      \[\leadsto \color{blue}{\frac{0.5}{\sqrt{0.5} + 1}} \]

    if -1.5 < x < 1.25

    1. Initial program 59.2%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 59.2%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 59.2%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 59.2%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 59.2%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 59.2%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around 0 99.0%

      \[\leadsto \color{blue}{0.125 \cdot {x}^{2}} \]
    5. Step-by-step derivation
      1. unpow2 99.0%

        \[\leadsto 0.125 \cdot \color{blue}{\left(x \cdot x\right)} \]
    6. Simplified 99.0%

      \[\leadsto \color{blue}{0.125 \cdot \left(x \cdot x\right)} \]

    if 1.25 < x

    1. Initial program 98.5%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 98.5%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 98.5%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 98.5%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 98.5%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 98.5%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around inf 97.6%

      \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5}{x}}} \]
  3. Recombined 3 regimes into one program.
  4. Final simplification 98.4%

    \[\leadsto \begin{array}{l} \mathbf{if}\;x \leq -1.5:\\ \;\;\;\;\frac{0.5}{1 + \sqrt{0.5}}\\ \mathbf{elif}\;x \leq 1.25:\\ \;\;\;\;0.125 \cdot \left(x \cdot x\right)\\ \mathbf{else}:\\ \;\;\;\;1 - \sqrt{0.5 + \frac{0.5}{x}}\\ \end{array} \]
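The `flip--` step in the derivation above is the standard conjugate trick: 1 − √a = (1 − a)/(1 + √a), trading an ill-conditioned subtraction for a benign division. A small sketch of why it is better conditioned near a = 1, using Python's `decimal` module as the high-precision reference (the variable names are mine):

```python
import math
from decimal import Decimal, getcontext

getcontext().prec = 50                      # 50-digit reference arithmetic

a = 1.0 - 1e-12                             # sqrt argument just below 1: worst case for 1 - sqrt(a)

naive = 1.0 - math.sqrt(a)                  # cancellation amplifies the sqrt rounding error
stable = (1.0 - a) / (1.0 + math.sqrt(a))   # conjugate form produced by flip--

ref = Decimal(1) - Decimal(a).sqrt()        # high-precision reference value
err_naive = abs(Decimal(naive) - ref)
err_stable = abs(Decimal(stable) - ref)
print(err_naive, err_stable)                # the conjugate form sits far closer to ref
```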

Alternative 10: 98.4% accurate, 1.9× speedup?

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;x \leq -1.5 \lor \neg \left(x \leq 1.55\right):\\ \;\;\;\;\frac{0.5}{1 + \sqrt{0.5}}\\ \mathbf{else}:\\ \;\;\;\;0.125 \cdot \left(x \cdot x\right)\\ \end{array} \end{array} \]
(FPCore (x)
 :precision binary64
 (if (or (<= x -1.5) (not (<= x 1.55)))
   (/ 0.5 (+ 1.0 (sqrt 0.5)))
   (* 0.125 (* x x))))
double code(double x) {
	double tmp;
	if ((x <= -1.5) || !(x <= 1.55)) {
		tmp = 0.5 / (1.0 + sqrt(0.5));
	} else {
		tmp = 0.125 * (x * x);
	}
	return tmp;
}
real(8) function code(x)
    real(8), intent (in) :: x
    real(8) :: tmp
    if ((x <= (-1.5d0)) .or. (.not. (x <= 1.55d0))) then
        tmp = 0.5d0 / (1.0d0 + sqrt(0.5d0))
    else
        tmp = 0.125d0 * (x * x)
    end if
    code = tmp
end function
public static double code(double x) {
	double tmp;
	if ((x <= -1.5) || !(x <= 1.55)) {
		tmp = 0.5 / (1.0 + Math.sqrt(0.5));
	} else {
		tmp = 0.125 * (x * x);
	}
	return tmp;
}
def code(x):
	tmp = 0
	if (x <= -1.5) or not (x <= 1.55):
		tmp = 0.5 / (1.0 + math.sqrt(0.5))
	else:
		tmp = 0.125 * (x * x)
	return tmp
function code(x)
	tmp = 0.0
	if ((x <= -1.5) || !(x <= 1.55))
		tmp = Float64(0.5 / Float64(1.0 + sqrt(0.5)));
	else
		tmp = Float64(0.125 * Float64(x * x));
	end
	return tmp
end
function tmp_2 = code(x)
	tmp = 0.0;
	if ((x <= -1.5) || ~((x <= 1.55)))
		tmp = 0.5 / (1.0 + sqrt(0.5));
	else
		tmp = 0.125 * (x * x);
	end
	tmp_2 = tmp;
end
code[x_] := If[Or[LessEqual[x, -1.5], N[Not[LessEqual[x, 1.55]], $MachinePrecision]], N[(0.5 / N[(1.0 + N[Sqrt[0.5], $MachinePrecision]), $MachinePrecision]), $MachinePrecision], N[(0.125 * N[(x * x), $MachinePrecision]), $MachinePrecision]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;x \leq -1.5 \lor \neg \left(x \leq 1.55\right):\\
\;\;\;\;\frac{0.5}{1 + \sqrt{0.5}}\\

\mathbf{else}:\\
\;\;\;\;0.125 \cdot \left(x \cdot x\right)\\


\end{array}
\end{array}
Derivation
  1. Split input into 2 regimes
  2. if x < -1.5 or 1.55000000000000004 < x

    1. Initial program 98.5%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 98.5%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 98.5%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 98.5%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 98.5%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 98.5%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Step-by-step derivation
      1. flip-- 98.5%

        \[\leadsto \color{blue}{\frac{1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      2. div-inv 98.5%

        \[\leadsto \color{blue}{\left(1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      3. metadata-eval 98.5%

        \[\leadsto \left(\color{blue}{1} - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. add-sqr-sqrt 100.0%

        \[\leadsto \left(1 - \color{blue}{\left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      5. associate--r+ 100.0%

        \[\leadsto \color{blue}{\left(\left(1 - 0.5\right) - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      6. metadata-eval 100.0%

        \[\leadsto \left(\color{blue}{0.5} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    5. Applied egg-rr 100.0%

      \[\leadsto \color{blue}{\left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    6. Step-by-step derivation
      1. *-commutative 100.0%

        \[\leadsto \color{blue}{\frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \cdot \left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \]
      2. associate-/r/ 100.0%

        \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    7. Simplified 100.0%

      \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    8. Taylor expanded in x around inf 97.5%

      \[\leadsto \color{blue}{\frac{0.5}{\sqrt{0.5} + 1}} \]

    if -1.5 < x < 1.55000000000000004

    1. Initial program 59.2%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 59.2%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 59.2%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 59.2%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 59.2%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 59.2%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around 0 99.0%

      \[\leadsto \color{blue}{0.125 \cdot {x}^{2}} \]
    5. Step-by-step derivation
      1. unpow2 99.0%

        \[\leadsto 0.125 \cdot \color{blue}{\left(x \cdot x\right)} \]
    6. Simplified 99.0%

      \[\leadsto \color{blue}{0.125 \cdot \left(x \cdot x\right)} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification 98.3%

    \[\leadsto \begin{array}{l} \mathbf{if}\;x \leq -1.5 \lor \neg \left(x \leq 1.55\right):\\ \;\;\;\;\frac{0.5}{1 + \sqrt{0.5}}\\ \mathbf{else}:\\ \;\;\;\;0.125 \cdot \left(x \cdot x\right)\\ \end{array} \]
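Alternative 10's constant branch keeps the division form 0.5/(1 + √0.5) rather than the algebraically equal 1 − √0.5 (Alternative 11's choice): since (1 + √0.5)(1 − √0.5) = 1 − 0.5 = 0.5, the two are identical in exact arithmetic, but the division form avoids a subtraction. A quick check, standard library only:

```python
import math

div_form = 0.5 / (1.0 + math.sqrt(0.5))   # Alternative 10's constant
sub_form = 1.0 - math.sqrt(0.5)           # Alternative 11's constant

# Algebraically identical: 0.5 / (1 + s) = 1 - s because (1 + s)(1 - s) = 1 - 0.5.
# In binary64 they agree to within a couple of ulps of ~0.2928932188.
print(div_form, sub_form)
```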

Alternative 11: 97.7% accurate, 2.0× speedup?

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;x \leq -1.5 \lor \neg \left(x \leq 1.55\right):\\ \;\;\;\;1 - \sqrt{0.5}\\ \mathbf{else}:\\ \;\;\;\;0.125 \cdot \left(x \cdot x\right)\\ \end{array} \end{array} \]
(FPCore (x)
 :precision binary64
 (if (or (<= x -1.5) (not (<= x 1.55))) (- 1.0 (sqrt 0.5)) (* 0.125 (* x x))))
double code(double x) {
	double tmp;
	if ((x <= -1.5) || !(x <= 1.55)) {
		tmp = 1.0 - sqrt(0.5);
	} else {
		tmp = 0.125 * (x * x);
	}
	return tmp;
}
real(8) function code(x)
    real(8), intent (in) :: x
    real(8) :: tmp
    if ((x <= (-1.5d0)) .or. (.not. (x <= 1.55d0))) then
        tmp = 1.0d0 - sqrt(0.5d0)
    else
        tmp = 0.125d0 * (x * x)
    end if
    code = tmp
end function
public static double code(double x) {
	double tmp;
	if ((x <= -1.5) || !(x <= 1.55)) {
		tmp = 1.0 - Math.sqrt(0.5);
	} else {
		tmp = 0.125 * (x * x);
	}
	return tmp;
}
def code(x):
	tmp = 0
	if (x <= -1.5) or not (x <= 1.55):
		tmp = 1.0 - math.sqrt(0.5)
	else:
		tmp = 0.125 * (x * x)
	return tmp
function code(x)
	tmp = 0.0
	if ((x <= -1.5) || !(x <= 1.55))
		tmp = Float64(1.0 - sqrt(0.5));
	else
		tmp = Float64(0.125 * Float64(x * x));
	end
	return tmp
end
function tmp_2 = code(x)
	tmp = 0.0;
	if ((x <= -1.5) || ~((x <= 1.55)))
		tmp = 1.0 - sqrt(0.5);
	else
		tmp = 0.125 * (x * x);
	end
	tmp_2 = tmp;
end
code[x_] := If[Or[LessEqual[x, -1.5], N[Not[LessEqual[x, 1.55]], $MachinePrecision]], N[(1.0 - N[Sqrt[0.5], $MachinePrecision]), $MachinePrecision], N[(0.125 * N[(x * x), $MachinePrecision]), $MachinePrecision]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;x \leq -1.5 \lor \neg \left(x \leq 1.55\right):\\
\;\;\;\;1 - \sqrt{0.5}\\

\mathbf{else}:\\
\;\;\;\;0.125 \cdot \left(x \cdot x\right)\\


\end{array}
\end{array}
Derivation
  1. Split input into 2 regimes
  2. if x < -1.5 or 1.55000000000000004 < x

    1. Initial program 98.5%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 98.5%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 98.5%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 98.5%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 98.5%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 98.5%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around inf 96.0%

      \[\leadsto \color{blue}{1 - \sqrt{0.5}} \]

    if -1.5 < x < 1.55000000000000004

    1. Initial program 59.2%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 59.2%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 59.2%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 59.2%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 59.2%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 59.2%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around 0 99.0%

      \[\leadsto \color{blue}{0.125 \cdot {x}^{2}} \]
    5. Step-by-step derivation
      1. unpow2 99.0%

        \[\leadsto 0.125 \cdot \color{blue}{\left(x \cdot x\right)} \]
    6. Simplified 99.0%

      \[\leadsto \color{blue}{0.125 \cdot \left(x \cdot x\right)} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification 97.7%

    \[\leadsto \begin{array}{l} \mathbf{if}\;x \leq -1.5 \lor \neg \left(x \leq 1.55\right):\\ \;\;\;\;1 - \sqrt{0.5}\\ \mathbf{else}:\\ \;\;\;\;0.125 \cdot \left(x \cdot x\right)\\ \end{array} \]
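Replacing the outer branches with the constant 1 − √0.5 buys speed (the constant folds at compile time) but gives up accuracy for moderate |x|, which is why this alternative scores 97.7% rather than 98.4%. A sketch of that error profile (function and variable names are mine):

```python
import math

def original(x):
    # The initial program, evaluated naively in binary64.
    return 1.0 - math.sqrt(0.5 * (1.0 + 1.0 / math.hypot(1.0, x)))

CONST = 1.0 - math.sqrt(0.5)   # the x -> +/-inf limit, ~0.29289

# Near the regime boundary the constant is a poor stand-in...
print(abs(original(2.0) - CONST))   # > 0.1
# ...but for large |x| the original converges to it like 1/(2*sqrt(2)*x).
print(abs(original(1e6) - CONST))   # < 1e-6
```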

Alternative 12: 58.8% accurate, 23.0× speedup?

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;x \leq -1.2:\\ \;\;\;\;0.18181818181818182\\ \mathbf{elif}\;x \leq 1.2:\\ \;\;\;\;0.125 \cdot \left(x \cdot x\right)\\ \mathbf{else}:\\ \;\;\;\;0.18181818181818182\\ \end{array} \end{array} \]
(FPCore (x)
 :precision binary64
 (if (<= x -1.2)
   0.18181818181818182
   (if (<= x 1.2) (* 0.125 (* x x)) 0.18181818181818182)))
double code(double x) {
	double tmp;
	if (x <= -1.2) {
		tmp = 0.18181818181818182;
	} else if (x <= 1.2) {
		tmp = 0.125 * (x * x);
	} else {
		tmp = 0.18181818181818182;
	}
	return tmp;
}
real(8) function code(x)
    real(8), intent (in) :: x
    real(8) :: tmp
    if (x <= (-1.2d0)) then
        tmp = 0.18181818181818182d0
    else if (x <= 1.2d0) then
        tmp = 0.125d0 * (x * x)
    else
        tmp = 0.18181818181818182d0
    end if
    code = tmp
end function
public static double code(double x) {
	double tmp;
	if (x <= -1.2) {
		tmp = 0.18181818181818182;
	} else if (x <= 1.2) {
		tmp = 0.125 * (x * x);
	} else {
		tmp = 0.18181818181818182;
	}
	return tmp;
}
def code(x):
	tmp = 0
	if x <= -1.2:
		tmp = 0.18181818181818182
	elif x <= 1.2:
		tmp = 0.125 * (x * x)
	else:
		tmp = 0.18181818181818182
	return tmp
function code(x)
	tmp = 0.0
	if (x <= -1.2)
		tmp = 0.18181818181818182;
	elseif (x <= 1.2)
		tmp = Float64(0.125 * Float64(x * x));
	else
		tmp = 0.18181818181818182;
	end
	return tmp
end
function tmp_2 = code(x)
	tmp = 0.0;
	if (x <= -1.2)
		tmp = 0.18181818181818182;
	elseif (x <= 1.2)
		tmp = 0.125 * (x * x);
	else
		tmp = 0.18181818181818182;
	end
	tmp_2 = tmp;
end
code[x_] := If[LessEqual[x, -1.2], 0.18181818181818182, If[LessEqual[x, 1.2], N[(0.125 * N[(x * x), $MachinePrecision]), $MachinePrecision], 0.18181818181818182]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;x \leq -1.2:\\
\;\;\;\;0.18181818181818182\\

\mathbf{elif}\;x \leq 1.2:\\
\;\;\;\;0.125 \cdot \left(x \cdot x\right)\\

\mathbf{else}:\\
\;\;\;\;0.18181818181818182\\


\end{array}
\end{array}
Derivation
  1. Split input into 2 regimes
  2. if x < -1.19999999999999996 or 1.19999999999999996 < x

    1. Initial program 98.5%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 98.5%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 98.5%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 98.5%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 98.5%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 98.5%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Step-by-step derivation
      1. flip-- 98.5%

        \[\leadsto \color{blue}{\frac{1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      2. div-inv 98.5%

        \[\leadsto \color{blue}{\left(1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      3. metadata-eval 98.5%

        \[\leadsto \left(\color{blue}{1} - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. add-sqr-sqrt 100.0%

        \[\leadsto \left(1 - \color{blue}{\left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      5. associate--r+ 100.0%

        \[\leadsto \color{blue}{\left(\left(1 - 0.5\right) - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      6. metadata-eval 100.0%

        \[\leadsto \left(\color{blue}{0.5} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    5. Applied egg-rr 100.0%

      \[\leadsto \color{blue}{\left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    6. Step-by-step derivation
      1. *-commutative 100.0%

        \[\leadsto \color{blue}{\frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \cdot \left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \]
      2. associate-/r/ 100.0%

        \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    7. Simplified 100.0%

      \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    8. Taylor expanded in x around 0 19.5%

      \[\leadsto \frac{1}{\color{blue}{5.5 + 8 \cdot \frac{1}{{x}^{2}}}} \]
    9. Step-by-step derivation
      1. associate-*r/ 19.5%

        \[\leadsto \frac{1}{5.5 + \color{blue}{\frac{8 \cdot 1}{{x}^{2}}}} \]
      2. metadata-eval 19.5%

        \[\leadsto \frac{1}{5.5 + \frac{\color{blue}{8}}{{x}^{2}}} \]
      3. unpow2 19.5%

        \[\leadsto \frac{1}{5.5 + \frac{8}{\color{blue}{x \cdot x}}} \]
    10. Simplified 19.5%

      \[\leadsto \frac{1}{\color{blue}{5.5 + \frac{8}{x \cdot x}}} \]
    11. Taylor expanded in x around inf 19.5%

      \[\leadsto \color{blue}{0.18181818181818182} \]

    if -1.19999999999999996 < x < 1.19999999999999996

    1. Initial program 59.2%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 59.2%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 59.2%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 59.2%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 59.2%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 59.2%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around 0 99.0%

      \[\leadsto \color{blue}{0.125 \cdot {x}^{2}} \]
    5. Step-by-step derivation
      1. unpow2 99.0%

        \[\leadsto 0.125 \cdot \color{blue}{\left(x \cdot x\right)} \]
    6. Simplified 99.0%

      \[\leadsto \color{blue}{0.125 \cdot \left(x \cdot x\right)} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification 64.2%

    \[\leadsto \begin{array}{l} \mathbf{if}\;x \leq -1.2:\\ \;\;\;\;0.18181818181818182\\ \mathbf{elif}\;x \leq 1.2:\\ \;\;\;\;0.125 \cdot \left(x \cdot x\right)\\ \mathbf{else}:\\ \;\;\;\;0.18181818181818182\\ \end{array} \]
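The constant 0.18181818181818182 is just 1/5.5 = 2/11, the x → ∞ limit of the intermediate form 1/(5.5 + 8/x²). That intermediate form came from a Taylor expansion around 0 (step 8 above), so its limit disagrees with the true limit of the original program, 1 − √0.5 ≈ 0.2929; this mismatch is what drags the overall accuracy down to 58.8% despite the 23× speedup. A quick check, standard library only:

```python
import math

const = 1.0 / 5.5                      # the folded constant, 2/11
true_limit = 1.0 - math.sqrt(0.5)      # actual large-|x| limit of the original program

print(const)        # 0.18181818181818182, matching the literal in the branch
print(true_limit)   # ~0.2928932188: the constant misses it by more than 0.11
```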

Alternative 13: 58.4% accurate, 23.3× speedup?

\[\begin{array}{l} \\ \frac{1}{5.5 + \frac{8}{x \cdot x}} \end{array} \]
(FPCore (x) :precision binary64 (/ 1.0 (+ 5.5 (/ 8.0 (* x x)))))
double code(double x) {
	return 1.0 / (5.5 + (8.0 / (x * x)));
}
real(8) function code(x)
    real(8), intent (in) :: x
    code = 1.0d0 / (5.5d0 + (8.0d0 / (x * x)))
end function
public static double code(double x) {
	return 1.0 / (5.5 + (8.0 / (x * x)));
}
def code(x):
	return 1.0 / (5.5 + (8.0 / (x * x)))
function code(x)
	return Float64(1.0 / Float64(5.5 + Float64(8.0 / Float64(x * x))))
end
function tmp = code(x)
	tmp = 1.0 / (5.5 + (8.0 / (x * x)));
end
code[x_] := N[(1.0 / N[(5.5 + N[(8.0 / N[(x * x), $MachinePrecision]), $MachinePrecision]), $MachinePrecision]), $MachinePrecision]
\begin{array}{l}

\\
\frac{1}{5.5 + \frac{8}{x \cdot x}}
\end{array}
Derivation
  1. Initial program 76.4%

    \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
  2. Step-by-step derivation
    1. distribute-lft-in 76.4%

      \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
    2. metadata-eval 76.4%

      \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
    3. associate-*r/ 76.4%

      \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. metadata-eval 76.4%

      \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
  3. Simplified 76.4%

    \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
  4. Step-by-step derivation
    1. flip-- 76.4%

      \[\leadsto \color{blue}{\frac{1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    2. div-inv 76.4%

      \[\leadsto \color{blue}{\left(1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    3. metadata-eval 76.4%

      \[\leadsto \left(\color{blue}{1} - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. add-sqr-sqrt 77.1%

      \[\leadsto \left(1 - \color{blue}{\left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    5. associate--r+ 77.1%

      \[\leadsto \color{blue}{\left(\left(1 - 0.5\right) - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    6. metadata-eval 77.1%

      \[\leadsto \left(\color{blue}{0.5} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
  5. Applied egg-rr 77.1%

    \[\leadsto \color{blue}{\left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
  6. Step-by-step derivation
    1. *-commutative 77.1%

      \[\leadsto \color{blue}{\frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \cdot \left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. associate-/r/ 77.1%

      \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
  7. Simplified 77.1%

    \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
  8. Taylor expanded in x around 0 64.0%

    \[\leadsto \frac{1}{\color{blue}{5.5 + 8 \cdot \frac{1}{{x}^{2}}}} \]
  9. Step-by-step derivation
    1. associate-*r/ 64.0%

      \[\leadsto \frac{1}{5.5 + \color{blue}{\frac{8 \cdot 1}{{x}^{2}}}} \]
    2. metadata-eval 64.0%

      \[\leadsto \frac{1}{5.5 + \frac{\color{blue}{8}}{{x}^{2}}} \]
    3. unpow2 64.0%

      \[\leadsto \frac{1}{5.5 + \frac{8}{\color{blue}{x \cdot x}}} \]
  10. Simplified 64.0%

    \[\leadsto \frac{1}{\color{blue}{5.5 + \frac{8}{x \cdot x}}} \]
  11. Final simplification 64.0%

    \[\leadsto \frac{1}{5.5 + \frac{8}{x \cdot x}} \]
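As a quick numerical sanity check (an illustration, not part of the report), the rewritten form agrees closely with the original near x = 0 but levels off at 1/5.5 ≈ 0.1818 for large |x|, where the true limit is 1 − √0.5 ≈ 0.293; this is why the alternative's overall accuracy drops:

```python
import math

def original(x):
    return 1.0 - math.sqrt(0.5 * (1.0 + 1.0 / math.hypot(1.0, x)))

def alt13(x):
    # Taylor-derived rewrite: 1 / (5.5 + 8/x^2)
    return 1.0 / (5.5 + 8.0 / (x * x))

# Close relative agreement for small x ...
x = 0.01
assert abs(original(x) - alt13(x)) / original(x) < 1e-3

# ... but a large absolute error for big x
x = 100.0
assert abs(original(x) - alt13(x)) > 0.05
```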

Alternative 14: 35.2% accurate, 41.0× speedup

\[\begin{array}{l} \\ \begin{array}{l} \mathbf{if}\;x \leq -1.9 \cdot 10^{-77}:\\ \;\;\;\;0.18181818181818182\\ \mathbf{elif}\;x \leq 1.9 \cdot 10^{-77}:\\ \;\;\;\;0\\ \mathbf{else}:\\ \;\;\;\;0.18181818181818182\\ \end{array} \end{array} \]
(FPCore (x)
 :precision binary64
 (if (<= x -1.9e-77)
   0.18181818181818182
   (if (<= x 1.9e-77) 0.0 0.18181818181818182)))
double code(double x) {
	double tmp;
	if (x <= -1.9e-77) {
		tmp = 0.18181818181818182;
	} else if (x <= 1.9e-77) {
		tmp = 0.0;
	} else {
		tmp = 0.18181818181818182;
	}
	return tmp;
}
real(8) function code(x)
    real(8), intent (in) :: x
    real(8) :: tmp
    if (x <= (-1.9d-77)) then
        tmp = 0.18181818181818182d0
    else if (x <= 1.9d-77) then
        tmp = 0.0d0
    else
        tmp = 0.18181818181818182d0
    end if
    code = tmp
end function
public static double code(double x) {
	double tmp;
	if (x <= -1.9e-77) {
		tmp = 0.18181818181818182;
	} else if (x <= 1.9e-77) {
		tmp = 0.0;
	} else {
		tmp = 0.18181818181818182;
	}
	return tmp;
}
def code(x):
	tmp = 0
	if x <= -1.9e-77:
		tmp = 0.18181818181818182
	elif x <= 1.9e-77:
		tmp = 0.0
	else:
		tmp = 0.18181818181818182
	return tmp
function code(x)
	tmp = 0.0
	if (x <= -1.9e-77)
		tmp = 0.18181818181818182;
	elseif (x <= 1.9e-77)
		tmp = 0.0;
	else
		tmp = 0.18181818181818182;
	end
	return tmp
end
function tmp_2 = code(x)
	tmp = 0.0;
	if (x <= -1.9e-77)
		tmp = 0.18181818181818182;
	elseif (x <= 1.9e-77)
		tmp = 0.0;
	else
		tmp = 0.18181818181818182;
	end
	tmp_2 = tmp;
end
code[x_] := If[LessEqual[x, -1.9e-77], 0.18181818181818182, If[LessEqual[x, 1.9e-77], 0.0, 0.18181818181818182]]
\begin{array}{l}

\\
\begin{array}{l}
\mathbf{if}\;x \leq -1.9 \cdot 10^{-77}:\\
\;\;\;\;0.18181818181818182\\

\mathbf{elif}\;x \leq 1.9 \cdot 10^{-77}:\\
\;\;\;\;0\\

\mathbf{else}:\\
\;\;\;\;0.18181818181818182\\


\end{array}
\end{array}
Derivation
  1. Split input into 2 regimes
  2. if x < -1.8999999999999999e-77 or 1.8999999999999999e-77 < x

    1. Initial program 79.3%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 79.3%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 79.3%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 79.3%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 79.3%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 79.3%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Step-by-step derivation
      1. flip-- 79.3%

        \[\leadsto \color{blue}{\frac{1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      2. div-inv 79.3%

        \[\leadsto \color{blue}{\left(1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
      3. metadata-eval 79.3%

        \[\leadsto \left(\color{blue}{1} - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. add-sqr-sqrt 80.5%

        \[\leadsto \left(1 - \color{blue}{\left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      5. associate--r+ 80.6%

        \[\leadsto \color{blue}{\left(\left(1 - 0.5\right) - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
      6. metadata-eval 80.6%

        \[\leadsto \left(\color{blue}{0.5} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    5. Applied egg-rr 80.6%

      \[\leadsto \color{blue}{\left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    6. Step-by-step derivation
      1. *-commutative 80.6%

        \[\leadsto \color{blue}{\frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \cdot \left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \]
      2. associate-/r/ 80.6%

        \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    7. Simplified 80.6%

      \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    8. Taylor expanded in x around 0 36.3%

      \[\leadsto \frac{1}{\color{blue}{5.5 + 8 \cdot \frac{1}{{x}^{2}}}} \]
    9. Step-by-step derivation
      1. associate-*r/ 36.3%

        \[\leadsto \frac{1}{5.5 + \color{blue}{\frac{8 \cdot 1}{{x}^{2}}}} \]
      2. metadata-eval 36.3%

        \[\leadsto \frac{1}{5.5 + \frac{\color{blue}{8}}{{x}^{2}}} \]
      3. unpow2 36.3%

        \[\leadsto \frac{1}{5.5 + \frac{8}{\color{blue}{x \cdot x}}} \]
    10. Simplified 36.3%

      \[\leadsto \frac{1}{\color{blue}{5.5 + \frac{8}{x \cdot x}}} \]
    11. Taylor expanded in x around inf 16.8%

      \[\leadsto \color{blue}{0.18181818181818182} \]

    if -1.8999999999999999e-77 < x < 1.8999999999999999e-77

    1. Initial program 72.6%

      \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. Step-by-step derivation
      1. distribute-lft-in 72.6%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
      2. metadata-eval 72.6%

        \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
      3. associate-*r/ 72.6%

        \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
      4. metadata-eval 72.6%

        \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
    3. Simplified 72.6%

      \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. Taylor expanded in x around 0 72.6%

      \[\leadsto 1 - \color{blue}{1} \]
  3. Recombined 2 regimes into one program.
  4. Final simplification 41.4%

    \[\leadsto \begin{array}{l} \mathbf{if}\;x \leq -1.9 \cdot 10^{-77}:\\ \;\;\;\;0.18181818181818182\\ \mathbf{elif}\;x \leq 1.9 \cdot 10^{-77}:\\ \;\;\;\;0\\ \mathbf{else}:\\ \;\;\;\;0.18181818181818182\\ \end{array} \]
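The zero branch is exact for tiny inputs: once x² is far below machine epsilon, hypot(1, x) rounds to exactly 1.0 in binary64 and the whole original expression evaluates to exactly 0. A small check (an illustration, not part of the report; the 1.9·10⁻⁷⁷ cutoff itself is Herbie's empirically chosen regime boundary, not a rounding threshold):

```python
import math

def original(x):
    return 1.0 - math.sqrt(0.5 * (1.0 + 1.0 / math.hypot(1.0, x)))

# For |x| well inside the middle regime, hypot(1, x) rounds to exactly 1.0 ...
assert math.hypot(1.0, 1e-80) == 1.0

# ... so the original program returns exactly 0, matching the 0 branch
assert original(1e-80) == 0.0
assert original(-1e-80) == 0.0
```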

Alternative 15: 12.0% accurate, 210.0× speedup

\[\begin{array}{l} \\ 0.18181818181818182 \end{array} \]
(FPCore (x) :precision binary64 0.18181818181818182)
double code(double x) {
	return 0.18181818181818182;
}
real(8) function code(x)
    real(8), intent (in) :: x
    code = 0.18181818181818182d0
end function
public static double code(double x) {
	return 0.18181818181818182;
}
def code(x):
	return 0.18181818181818182
function code(x)
	return 0.18181818181818182
end
function tmp = code(x)
	tmp = 0.18181818181818182;
end
code[x_] := 0.18181818181818182
\begin{array}{l}

\\
0.18181818181818182
\end{array}
Derivation
  1. Initial program 76.4%

    \[1 - \sqrt{0.5 \cdot \left(1 + \frac{1}{\mathsf{hypot}\left(1, x\right)}\right)} \]
  2. Step-by-step derivation
    1. distribute-lft-in 76.4%

      \[\leadsto 1 - \sqrt{\color{blue}{0.5 \cdot 1 + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}}} \]
    2. metadata-eval 76.4%

      \[\leadsto 1 - \sqrt{\color{blue}{0.5} + 0.5 \cdot \frac{1}{\mathsf{hypot}\left(1, x\right)}} \]
    3. associate-*r/ 76.4%

      \[\leadsto 1 - \sqrt{0.5 + \color{blue}{\frac{0.5 \cdot 1}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. metadata-eval 76.4%

      \[\leadsto 1 - \sqrt{0.5 + \frac{\color{blue}{0.5}}{\mathsf{hypot}\left(1, x\right)}} \]
  3. Simplified 76.4%

    \[\leadsto \color{blue}{1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
  4. Step-by-step derivation
    1. flip-- 76.4%

      \[\leadsto \color{blue}{\frac{1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    2. div-inv 76.4%

      \[\leadsto \color{blue}{\left(1 \cdot 1 - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
    3. metadata-eval 76.4%

      \[\leadsto \left(\color{blue}{1} - \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}} \cdot \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    4. add-sqr-sqrt 77.1%

      \[\leadsto \left(1 - \color{blue}{\left(0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    5. associate--r+ 77.1%

      \[\leadsto \color{blue}{\left(\left(1 - 0.5\right) - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
    6. metadata-eval 77.1%

      \[\leadsto \left(\color{blue}{0.5} - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \]
  5. Applied egg-rr 77.1%

    \[\leadsto \color{blue}{\left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right) \cdot \frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
  6. Step-by-step derivation
    1. *-commutative 77.1%

      \[\leadsto \color{blue}{\frac{1}{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}} \cdot \left(0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}\right)} \]
    2. associate-/r/ 77.1%

      \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
  7. Simplified 77.1%

    \[\leadsto \color{blue}{\frac{1}{\frac{1 + \sqrt{0.5 + \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}{0.5 - \frac{0.5}{\mathsf{hypot}\left(1, x\right)}}}} \]
  8. Taylor expanded in x around 0 64.0%

    \[\leadsto \frac{1}{\color{blue}{5.5 + 8 \cdot \frac{1}{{x}^{2}}}} \]
  9. Step-by-step derivation
    1. associate-*r/ 64.0%

      \[\leadsto \frac{1}{5.5 + \color{blue}{\frac{8 \cdot 1}{{x}^{2}}}} \]
    2. metadata-eval 64.0%

      \[\leadsto \frac{1}{5.5 + \frac{\color{blue}{8}}{{x}^{2}}} \]
    3. unpow2 64.0%

      \[\leadsto \frac{1}{5.5 + \frac{8}{\color{blue}{x \cdot x}}} \]
  10. Simplified 64.0%

    \[\leadsto \frac{1}{\color{blue}{5.5 + \frac{8}{x \cdot x}}} \]
  11. Taylor expanded in x around inf 10.8%

    \[\leadsto \color{blue}{0.18181818181818182} \]
  12. Final simplification 10.8%

    \[\leadsto 0.18181818181818182 \]

Reproduce

herbie shell --seed 2023182 
(FPCore (x)
  :name "Given's Rotation SVD example, simplified"
  :precision binary64
  (- 1.0 (sqrt (* 0.5 (+ 1.0 (/ 1.0 (hypot 1.0 x)))))))