bayesian-inference
Here are 1,024 public repositories matching this topic...
var_context builder
Summary:
It'd be nice to have a builder pattern for var contexts to make them easy to construct for testing. Something that could be used like this:
MatrixXd m(3, 2);
...
var_context vc
= var_context::builder()
.matrix("a", m)
.real("f", 2.3)
.build();
Current Version:
v2.23.0
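For illustration only, here is a minimal, language-agnostic sketch of the proposed builder pattern, written in Python with hypothetical names; the real feature would live in Stan's C++ var_context hierarchy and is not implemented here:
# Hypothetical sketch of the builder pattern for assembling a test context.
class VarContextBuilder:
    def __init__(self):
        self._values = {}
    def matrix(self, name, value):
        self._values[name] = ("matrix", value)
        return self  # returning self enables method chaining
    def real(self, name, value):
        self._values[name] = ("real", value)
        return self
    def build(self):
        # Freeze the accumulated values into a plain mapping.
        return dict(self._values)
# Usage, mirroring the snippet above:
vc = (VarContextBuilder()
      .matrix("a", [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
      .real("f", 2.3)
      .build())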
Quoted from the forum: The current wrappers in numpyro.contrib.tfp seem not to work for special distributions like MixtureSameFamily. For now, to make it work, we'll need to do
from numpyro.contrib.tfp.distributions import TFPDistribution
import tensorflow_probability...distributions as tfd
def mo
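The quote is cut off mid-definition; below is a hedged sketch of what such a workaround could look like, assuming the JAX substrate of TensorFlow Probability and assuming that TFPDistribution can wrap a constructed TFP distribution (the exact wrapping call varies across numpyro versions, so treat it as hypothetical):
# Hedged sketch only: assumes the JAX substrate of TFP.
import numpyro
from numpyro.contrib.tfp.distributions import TFPDistribution
from tensorflow_probability.substrates import jax as tfp
tfd = tfp.distributions
def model():
    mixture = tfd.MixtureSameFamily(
        mixture_distribution=tfd.Categorical(probs=[0.3, 0.7]),
        components_distribution=tfd.Normal(loc=[-1.0, 1.0], scale=[0.1, 0.5]),
    )
    # Hypothetical wrapping call; check the numpyro docs for the exact API.
    numpyro.sample("x", TFPDistribution(mixture))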
trace_to_dataframe() in PyMC3 is currently used to save traces in the Rethinking_2 notebooks (e.g. Chp_04). But the function is planned for deprecation, with ArviZ being the intended package for saving traces. As per this comment by @AlexAndorra, ArviZ's InferenceData format is a superior replacement for this function as it
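For context, a minimal sketch of the ArviZ-based replacement, assuming a PyMC3 trace object named trace and the az.from_pymc3 converter that was the recommended path at the time (the filename is made up):
# Convert a PyMC3 trace to ArviZ InferenceData and save it to disk.
import arviz as az
idata = az.from_pymc3(trace=trace)   # `trace` comes from pm.sample(...)
idata.to_netcdf("chp_04_trace.nc")   # hypothetical filename
# Later, reload it without re-running the sampler.
idata = az.from_netcdf("chp_04_trace.nc")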
Hi @JavierAntoran @stratisMarkou,
First of all, thanks for making all of this code available - it's been great to look through!
I'm currently spending some time working through the Weight Uncertainty in Neural Networks paper in order to implement Bayes-by-Backprop. I was struggling to understand the difference between your implementation of `Bayes-by-Bac
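Since the excerpt is cut off, here is a minimal sketch of the standard Bayes-by-Backprop linear layer (Blundell et al., 2015) for reference; it is not the repository's implementation, and the prior and initialisation choices below are assumptions:
# Bayes-by-Backprop linear layer: each weight is a Gaussian with a learnable
# mean and a softplus-parameterised standard deviation, sampled with the
# reparameterisation trick; the KL term is added to the training loss.
import torch
import torch.nn as nn
import torch.nn.functional as F
class BayesLinear(nn.Module):
    def __init__(self, in_features, out_features, prior_std=1.0):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_rho = nn.Parameter(torch.full((out_features, in_features), -5.0))
        self.b_mu = nn.Parameter(torch.zeros(out_features))
        self.b_rho = nn.Parameter(torch.full((out_features,), -5.0))
        self.prior_std = prior_std
    def forward(self, x):
        w_std = F.softplus(self.w_rho)
        b_std = F.softplus(self.b_rho)
        # Reparameterisation trick: sample weights as mu + std * eps.
        w = self.w_mu + w_std * torch.randn_like(w_std)
        b = self.b_mu + b_std * torch.randn_like(b_std)
        # KL(q(w) || p(w)) between diagonal Gaussians, summed over parameters.
        self.kl = self._kl(self.w_mu, w_std) + self._kl(self.b_mu, b_std)
        return F.linear(x, w, b)
    def _kl(self, mu, std):
        prior_var = self.prior_std ** 2
        return 0.5 * torch.sum(
            (std ** 2 + mu ** 2) / prior_var - 1.0
            - 2.0 * torch.log(std / self.prior_std)
        )
# Training minimises: negative log-likelihood + layer.kl / num_batches.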
Hi,
is there any plan to implement the Generalized Pareto Distribution in brms
(paul-buerkner/brms#110 (comment))? I am playing around with an extreme value analysis, and it looks like extremes collected as Peak Over Threshold are better represented by the GPD than by the generalized extreme value distribution, which I am so happy to see already in `b
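To make the Peak Over Threshold point concrete, here is a small illustrative sketch in Python (using scipy rather than brms; the data and the threshold choice are made up):
# Fit a generalized Pareto distribution to the exceedances over a threshold.
import numpy as np
from scipy import stats
rng = np.random.default_rng(0)
data = rng.gumbel(loc=10.0, scale=2.0, size=5000)   # made-up daily maxima
threshold = np.quantile(data, 0.95)                 # assumed threshold choice
exceedances = data[data > threshold] - threshold
# Fix the location at 0 because exceedances are already threshold-centred.
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0)
print(shape, scale)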
There are a variety of interesting optimisations that can be performed on kernels of the form
k(x, z) = w_1 * k_1(x, z) + w_2 * k_2(x, z) + ... + w_L * k_L(x, z)
A naive recursive implementation in terms of the current Sum and Scaled kernels hides opportunities for parallelism, both in the computation of each term and in the summation over terms.
Notable examples of kernels with th
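As a concrete illustration of the batched formulation, here is a NumPy sketch (not the package's Sum/Scaled kernel types; the component kernels and weights are placeholders):
# Evaluate k(x, z) = sum_l w_l * k_l(x, z) by stacking all L component
# kernel matrices and doing one weighted reduction, instead of recursing
# through nested Sum/Scaled nodes.
import numpy as np
def rbf(x, z, lengthscale):
    d2 = np.sum((x[:, None, :] - z[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)
def weighted_sum_kernel(x, z, weights, lengthscales):
    # Each k_l can be computed independently (and in parallel in principle).
    Ks = np.stack([rbf(x, z, ell) for ell in lengthscales])   # (L, N, M)
    return np.einsum("l,lnm->nm", np.asarray(weights), Ks)    # weighted sum
x = np.random.randn(5, 2)
z = np.random.randn(4, 2)
K = weighted_sum_kernel(x, z, weights=[0.5, 2.0], lengthscales=[0.1, 1.0])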
Plotting Docs
GPU Support
Pyro's HMC and NUTS implementations are feature-complete and well-tested, but they are quite slow in models like the one in our Bayesian regression tutorial that operate on small tensors, for reasons that are largely beyond our control (mostly having to do with the design and implementation of torch.autograd), which is unfortunate because these