fma_test2

Average Accuracy: 0.0% → 99.5%
Time: 11.4s
Precision: binary64
Cost: 6592

Program

Precondition: \[1.9 \leq t \land t \leq 2.1\]

Original: \[1.7 \cdot 10^{+308} \cdot t - 1.7 \cdot 10^{+308} \]

Herbie result: \[\mathsf{fma}\left(1.7 \cdot 10^{+308}, t, -1.7 \cdot 10^{+308}\right) \]

FPCore:
(FPCore (t) :precision binary64 (- (* 1.7e+308 t) 1.7e+308))
(FPCore (t) :precision binary64 (fma 1.7e+308 t -1.7e+308))

C:
#include <math.h>
double code(double t) {
	return (1.7e+308 * t) - 1.7e+308;
}
double code(double t) {
	return fma(1.7e+308, t, -1.7e+308);
}

Julia:
function code(t)
	return Float64(Float64(1.7e+308 * t) - 1.7e+308)
end
function code(t)
	return fma(1.7e+308, t, -1.7e+308)
end

Mathematica:
code[t_] := N[(N[(1.7*^308 * t), $MachinePrecision] - 1.7*^308), $MachinePrecision]
code[t_] := N[(1.7*^308 * t - 1.7*^308), $MachinePrecision]

LaTeX:
1.7 \cdot 10^{+308} \cdot t - 1.7 \cdot 10^{+308}
\mathsf{fma}\left(1.7 \cdot 10^{+308}, t, -1.7 \cdot 10^{+308}\right)

Error

Original: 0.0%
Target: 99.5%
Herbie: 99.5%

Target expression: \[\mathsf{fma}\left(1.7 \cdot 10^{+308}, t, -1.7 \cdot 10^{+308}\right) \]

Derivation

  1. Initial program 0.0%

    \[1.7 \cdot 10^{+308} \cdot t - 1.7 \cdot 10^{+308} \]
  2. Simplified 99.5%

    \[\leadsto \color{blue}{\mathsf{fma}\left(1.7 \cdot 10^{+308}, t, -1.7 \cdot 10^{+308}\right)} \]
    Proof

    [Start] 0.0

    \[ 1.7 \cdot 10^{+308} \cdot t - 1.7 \cdot 10^{+308} \]

    fma-neg [=>] 99.5

    \[ \color{blue}{\mathsf{fma}\left(1.7 \cdot 10^{+308}, t, -1.7 \cdot 10^{+308}\right)} \]

    metadata-eval [=>] 99.5

    \[ \mathsf{fma}\left(1.7 \cdot 10^{+308}, t, \color{blue}{-1.7 \cdot 10^{+308}}\right) \]
  3. Final simplification 99.5%

    \[\leadsto \mathsf{fma}\left(1.7 \cdot 10^{+308}, t, -1.7 \cdot 10^{+308}\right) \]
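Why the rewrite works, sketched in exact arithmetic (my notation, not Herbie's: let \(a = 1.7 \cdot 10^{308}\) and let \(\mathrm{fl}\) denote rounding to binary64):

```latex
% The original rounds twice, and the inner product overflows first:
\[
  \mathrm{fl}\!\left(\mathrm{fl}(a\,t) - a\right) = \infty
  \quad\text{whenever}\quad a\,t > \mathrm{DBL\_MAX},
  \;\text{i.e.}\; t > \mathrm{DBL\_MAX}/a \approx 1.057,
\]
% which covers the entire precondition [1.9, 2.1]. The fused form
% rounds only the final value:
\[
  \mathsf{fma}(a, t, -a) = \mathrm{fl}(a\,t - a) = \mathrm{fl}\!\left(a\,(t - 1)\right),
\]
% and a\,(t-1) fits in binary64 for t \leq 1 + \mathrm{DBL\_MAX}/a \approx 2.057,
% i.e. for most of [1.9, 2.1]; above that the true value itself exceeds
% DBL_MAX, so returning \infty is the correctly rounded result.
```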

Alternatives

Alternative 1
Accuracy: 0.0%
Cost: 64
\[-1.7 \cdot 10^{+308} \]


Reproduce

herbie shell --seed 2023122 
(FPCore (t)
  :name "fma_test2"
  :precision binary64
  :pre (and (<= 1.9 t) (<= t 2.1))

  :herbie-target
  (fma 1.7e+308 t (- 1.7e+308))

  (- (* 1.7e+308 t) 1.7e+308))