Statistics.Distribution.Binomial:directEntropy from math-functions-0.1.5.2

Average Error: 0.3 → 0.3
Time: 4.6s
Precision: binary64
Cost: 6784


In each language below, the first listing is the initial program and the second is the rewritten version.

Math:

\[x \cdot \log x \]
\[\log \left(\frac{1}{x}\right) \cdot \left(-x\right) \]

FPCore:

(FPCore (x) :precision binary64 (* x (log x)))
(FPCore (x) :precision binary64 (* (log (/ 1.0 x)) (- x)))

C:

double code(double x) {
	return x * log(x);
}
double code(double x) {
	return log((1.0 / x)) * -x;
}

Fortran:

real(8) function code(x)
    real(8), intent (in) :: x
    code = x * log(x)
end function
real(8) function code(x)
    real(8), intent (in) :: x
    code = log((1.0d0 / x)) * -x
end function

Java:

public static double code(double x) {
	return x * Math.log(x);
}
public static double code(double x) {
	return Math.log((1.0 / x)) * -x;
}

Python:

def code(x):
	return x * math.log(x)
def code(x):
	return math.log((1.0 / x)) * -x

Julia:

function code(x)
	return Float64(x * log(x))
end
function code(x)
	return Float64(log(Float64(1.0 / x)) * Float64(-x))
end

MATLAB:

function tmp = code(x)
	tmp = x * log(x);
end
function tmp = code(x)
	tmp = log((1.0 / x)) * -x;
end

Mathematica:

code[x_] := N[(x * N[Log[x], $MachinePrecision]), $MachinePrecision]
code[x_] := N[(N[Log[N[(1.0 / x), $MachinePrecision]], $MachinePrecision] * (-x)), $MachinePrecision]

TeX:

x \cdot \log x
\log \left(\frac{1}{x}\right) \cdot \left(-x\right)
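Both listings compute the same quantity, since log(1/x) = -log(x); the rewrite changes only how the value is evaluated in floating point. A quick sanity check (a Python sketch using only the standard math module):

```python
import math

def initial(x):
    # Initial program: x * log(x)
    return x * math.log(x)

def rewritten(x):
    # Rewritten program: log(1/x) * -x
    return math.log(1.0 / x) * -x

# The two forms agree to within rounding for typical positive inputs.
for x in (0.5, 1.0, 2.0, 1e10):
    assert math.isclose(initial(x), rewritten(x), rel_tol=1e-12)
```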


Derivation

  1. Initial program (error 0.3)

    \[x \cdot \log x \]
  2. Taylor expanded in x around inf (error 0.3)

    \[\leadsto \color{blue}{-1 \cdot \left(\log \left(\frac{1}{x}\right) \cdot x\right)} \]
  3. Final simplification (error 0.3)

    \[\leadsto \log \left(\frac{1}{x}\right) \cdot \left(-x\right) \]
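Step 2 is an exact rewrite rather than an approximation: since log(1/x) = -log x, the rewritten form recovers the original expression term for term:

    \[\log \left(\frac{1}{x}\right) \cdot \left(-x\right) = \left(-\log x\right) \cdot \left(-x\right) = x \cdot \log x \]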

Alternatives

Alternative 1
Error: 0.3
Cost: 6592
\[x \cdot \log x \]
Alternative 2
Error: 61.6
Cost: 64
\[0 \]
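Alternative 2 replaces the whole expression with the constant 0, which is extremely cheap (cost 64) but accurate only where x * log(x) itself is near zero, i.e. as x approaches 0 or near x = 1. A quick illustration (Python sketch):

```python
import math

# Absolute error of the constant-0 approximation to x * log(x):
# small near x = 1 and as x -> 0, but unbounded for large x.
for x in (1e-9, 1.0, 10.0, 1e6):
    err = abs(x * math.log(x) - 0.0)
    print(f"x = {x:g}: |x*log(x) - 0| = {err:.6g}")
```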


Reproduce

herbie shell --seed 2023039
(FPCore (x)
  :name "Statistics.Distribution.Binomial:directEntropy from math-functions-0.1.5.2"
  :precision binary64
  (* x (log x)))