Description
When I run the code for the quadratic approximation (attached below), I get an SD four times larger than the one in the book (0.64 instead of the expected 0.16). Similarly, the Gaussian "approximations" of the binomial have a much too large SD. Since I don't have an equation to compare against, I assume there is a mistake in the current code:
```python
import numpy as np
import pymc as pm

data = np.repeat((0, 1), (3, 6))

with pm.Model() as normal_approximation:
    p = pm.Uniform("p", 0, 1)  # uniform prior
    w = pm.Binomial("w", n=len(data), p=p, observed=data.sum())  # binomial likelihood
    mean_q = pm.find_MAP()

    # try to compute the Hessian on the untransformed (0, 1) scale
    p_value = normal_approximation.rvs_to_values[p]
    p_value.tag.transform = None
    p_value.name = p.name

    std_q = ((1 / pm.find_hessian(mean_q, vars=[p])) ** 0.5)[0]

# display summary of quadratic approximation
print("Mean, Standard deviation\np {:.2}, {:.2}".format(mean_q["p"], std_q[0]))
```
On my machine this produces

```
Mean, Standard deviation
p 0.67, 0.64
```

instead of the expected SD of 0.16.
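For reference, the expected value can be verified without PyMC. With a flat prior and 6 successes out of 9 trials, the unnormalized log-posterior is 6·log(p) + 3·log(1 − p); the quadratic approximation uses its curvature at the MAP. A minimal sketch of that check (my own code, not from the book repo):

```python
# Analytic check of the quadratic approximation, no PyMC needed.
# Unnormalized log-posterior: 6*log(p) + 3*log(1 - p)
p_map = 6 / 9  # MAP estimate for 6 successes in 9 trials under a flat prior

# Second derivative of the log-posterior at the MAP: -6/p^2 - 3/(1-p)^2
hessian = -6 / p_map**2 - 3 / (1 - p_map) ** 2

# SD of the Gaussian approximation = sqrt of the negative inverse Hessian
std_q = (-1 / hessian) ** 0.5
print(f"MAP {p_map:.2f}, SD {std_q:.2f}")  # SD comes out to 0.16
```

This agrees with the book's 0.16, so the 0.64 produced by the PyMC code above is wrong, presumably because the Hessian is being evaluated on the transformed scale.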
I am running this on Win64, Python 3.10, PyMC 4.