Welcome to JAXNS’s documentation!
JAXNS is a probabilistic programming framework built on top of JAX for high performance. JAXNS was initially designed for nested-sampling-powered Bayesian computations; it has since grown into a full-fledged probabilistic programming framework.
Here are some of the things you can do with JAXNS:
Build Bayesian models in an easy-to-use, high-level language.
Compute and sample the posterior distribution of your model.
Compute the Bayesian evidence of your model.
Use deep learning models in your Bayesian models.
Use Bayesian models in your deep learning models.
Maximise the Bayesian evidence of your model.
Perform global optimisation (maximum likelihood determination).
Use JAX’s automatic differentiation to compute gradients of your model.
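To make the "Bayesian evidence" item above concrete (this is a plain-Python numerical sketch of the quantity itself, not the JAXNS API; the model below is a made-up example): for a single Gaussian observation y with a uniform prior on theta over [0, 10], the evidence is Z = ∫ L(theta) π(theta) dtheta, which a simple quadrature can approximate.

```python
import math

def log_likelihood(theta, y=5.0, sigma=0.5):
    # Gaussian log-likelihood of one observation y given location theta.
    return -0.5 * ((y - theta) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def evidence(a=0.0, b=10.0, n=10_000):
    # Z = integral over [a, b] of L(theta) * prior(theta) dtheta,
    # with uniform prior density 1/(b - a), via the trapezoid rule.
    h = (b - a) / n
    prior = 1.0 / (b - a)
    total = 0.0
    for i in range(n + 1):
        theta = a + i * h
        weight = 0.5 if i in (0, n) else 1.0
        total += weight * math.exp(log_likelihood(theta)) * prior
    return total * h

Z = evidence()
# The Gaussian's mass lies well inside [0, 10], so Z is approximately 1/10.
```

Nested sampling, which JAXNS implements, estimates this same integral stochastically and scales it to high-dimensional models where quadrature is infeasible.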
JAXNS’s Mission Statement
Our mission is to make nested sampling faster, easier, and more powerful.
- Bayesian computations with Neural Networks
- Inference of Jones scalar observables (noisy angular quantities)
- Lennard-Jones Potentials for modelling phase transitions in materials
- Constant Likelihood
- Dual Moons likelihood
- Egg-box Likelihood with Uniform Prior
- Evidence Maximisation
- Generate data
- Define the model with parameters
- Poisson likelihood and Gamma prior
- Gaussian processes with outliers
- Thin Gaussian Shells with Uniform Prior
- Multivariate Normal Likelihood with Multivariate Normal Prior
- Measuring placebo effect on hunger
- Simulate Data
- Define likelihood
- Make a model that is unaware of the effect
- Logic rules
- Self-Exciting process (Hawkes process)