
Variance computation #35

Open
Hebbalali opened this issue Feb 13, 2019 · 1 comment

Comments

@Hebbalali
Hello Hugh,
I come back to discuss another subject concerning the doubly stochastic DGP.
When predicting with the doubly stochastic DGP, you suggest using a Gaussian mixture of the variational posteriors obtained by drawing S samples through the DGP. However, wouldn't a mixture of Gaussians make the variance of the resulting distribution tend to zero as the number of samples S increases (Eq. 18 in your doubly stochastic DGP paper)? If so, is this a reliable approach for uncertainty quantification in DGPs?

@hughsalimbeni
Collaborator

I think you're mixing up averaging over variables and averaging over densities: in Eq. 18 the averaging is over densities, not variables. To obtain the empirical mean and variance of the mixture from samples, you can use the formula (from e.g. here)

import numpy as np

ms, vs = model.predict_y(Xs, S)   # means and variances, each of shape (S, N, Dy)

# the first two moments of the Gaussian mixture
m = np.average(ms, 0)                 # mixture mean
v = np.average(vs + ms**2, 0) - m**2  # mixture variance, via the law of total variance

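A quick standalone check of the point above (the model output is faked here with random Gaussians, so `ms`, `vs`, and the chosen shapes are stand-ins, not a real DGP): the mixture variance computed this way settles to the spread of the component means plus the average component variance, rather than shrinking toward zero as S grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for model.predict_y(Xs, S): S sampled predictive Gaussians
# for N inputs with Dy outputs (fabricated values, not a real model).
S, N, Dy = 1000, 5, 1
ms = rng.normal(0.0, 1.0, size=(S, N, Dy))   # per-sample predictive means
vs = np.full((S, N, Dy), 0.1)                # per-sample predictive variances

# Mixture moments, as in the snippet above (law of total variance)
m = np.average(ms, 0)
v = np.average(vs + ms**2, 0) - m**2

# v stays near Var(means) + mean(vs) ~= 1.0 + 0.1, regardless of S
print(v.mean())
```

Increasing S only makes these moment estimates less noisy; it does not collapse the mixture variance, because the densities are averaged, not the sampled variables.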