Hello Hugh,
I'd like to discuss another aspect of the doubly stochastic DGP with you.
When predicting with the doubly stochastic DGP, you suggest using a Gaussian mixture of the variational posteriors obtained by drawing S samples through the DGP. However, wouldn't a mixture of Gaussians make the variance of the resulting distribution tend to zero as the number of samples S increases (eq. 18 in your doubly stochastic DGP paper)? If so, is this a reliable approach for uncertainty quantification in DGPs?
I think you're mixing up averaging over the variables and averaging over the densities: in eq. 18 the averaging is over densities, not variables. To obtain the empirical mean and variance from the samples you can use the following (from e.g. here):

import numpy as np

ms, vs = model.predict_y(Xs, S)  # each is (S, N, Dy)
# the first two moments of the equally weighted Gaussian mixture
m = np.average(ms, 0)
v = np.average(vs + ms**2, 0) - m**2
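As a quick sanity check on why the mixture variance does not collapse as S grows, here is a minimal standalone sketch (toy per-sample means and variances, made up for illustration, not drawn from the DGP model): the moment-matched variance converges to the spread of the per-sample means plus the average per-sample variance, not to zero.

import numpy as np

rng = np.random.default_rng(0)

def mixture_moments(ms, vs):
    # moment-matched mean and variance of an equally weighted Gaussian mixture
    m = np.average(ms, 0)
    v = np.average(vs + ms**2, 0) - m**2
    return m, v

# toy "samples through the DGP": each draw s gives a Gaussian N(mu_s, sigma_s^2)
for S in [10, 100, 10000]:
    ms = rng.normal(loc=1.0, scale=0.5, size=S)  # per-sample means, spread 0.5
    vs = np.full(S, 0.2**2)                      # per-sample variances
    m, v = mixture_moments(ms, vs)
    print(S, m, v)  # v approaches 0.5**2 + 0.2**2 = 0.29 as S grows, not zero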