jaxns.experimental.global_optimisation

Module Contents

class GlobalOptimisationState

Bases: NamedTuple

key: jaxns.internals.types.PRNGKey
samples: jaxns.internals.types.Sample
num_likelihood_evaluations: jaxns.internals.types.IntArray
num_samples: jaxns.internals.types.IntArray

class GlobalOptimisationResults

Bases: NamedTuple

U_solution: jaxns.internals.types.UType
X_solution: jaxns.internals.types.XType
solution: jaxns.internals.types.LikelihoodInputType
log_L_solution: jaxns.internals.types.FloatArray
num_likelihood_evaluations: jaxns.internals.types.IntArray
num_samples: jaxns.internals.types.IntArray
termination_reason: jaxns.internals.types.IntArray
relative_spread: jaxns.internals.types.FloatArray
absolute_spread: jaxns.internals.types.FloatArray
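
The results object is a plain NamedTuple, so its fields can be read directly. A minimal sketch, assuming results is a GlobalOptimisationResults instance returned by a finished global optimisation run:

    # X_solution holds the best-found point in prior (X) space; U_solution is the
    # same point in the unit-cube (U) space. log_L_solution is the log-likelihood there.
    print("solution:", results.X_solution)
    print("log L at solution:", results.log_L_solution)

    # Diagnostics: how much work was done and how tightly the search converged.
    print("likelihood evaluations:", int(results.num_likelihood_evaluations))
    print("relative / absolute spread:", results.relative_spread, results.absolute_spread)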

class GlobalOptimisationTerminationCondition

Bases: NamedTuple

max_likelihood_evaluations: jaxns.internals.types.IntArray | int | None
log_likelihood_contour: jaxns.internals.types.FloatArray | float | None
rtol: jaxns.internals.types.FloatArray | float | None
atol: jaxns.internals.types.FloatArray | float | None
min_efficiency: jaxns.internals.types.FloatArray | float | None

__and__(other)

__or__(other)
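
Because the termination condition is itself a NamedTuple, separate criteria can be built and then combined via the __and__/__or__ operators above. A minimal sketch; it assumes that unspecified fields default to None and that & / | mean "both must hold" / "either may hold" respectively (check the source for the exact semantics):

    from jaxns.experimental.global_optimisation import GlobalOptimisationTerminationCondition

    # Stop once the spread of candidate solutions is small, both relatively and absolutely.
    tight = GlobalOptimisationTerminationCondition(rtol=1e-5, atol=1e-7)

    # Or stop once a likelihood-evaluation budget is exhausted.
    budget = GlobalOptimisationTerminationCondition(max_likelihood_evaluations=100_000)

    # Combine with __or__: terminate when either condition is satisfied (assumed semantics).
    term_cond = tight | budget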

class SimpleGlobalOptimisation(sampler, num_search_chains, model, num_parallel_workers=1)

Simple global optimisation leveraging the building blocks of nested sampling.
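
A minimal construction sketch follows. It uses the standard JAXNS Model/Prior API; the choice of UniDimSliceSampler and its constructor arguments are assumptions that may differ between JAXNS versions, as is how the optimisation is subsequently run:

    import jax.numpy as jnp
    import tensorflow_probability.substrates.jax as tfp
    from jaxns import Model, Prior
    from jaxns.samplers import UniDimSliceSampler  # sampler choice is an assumption
    from jaxns.experimental.global_optimisation import SimpleGlobalOptimisation

    tfpd = tfp.distributions

    def prior_model():
        # 2D uniform search box; global optimisation maximises log_likelihood over it.
        x = yield Prior(tfpd.Uniform(low=-5.0 * jnp.ones(2), high=5.0 * jnp.ones(2)), name='x')
        return x

    def log_likelihood(x):
        # Objective to maximise: a concave quadratic with its maximum at x = 0.
        return -jnp.sum(x ** 2)

    model = Model(prior_model=prior_model, log_likelihood=log_likelihood)

    # The sampler arguments below are assumptions; consult your JAXNS version's docs.
    sampler = UniDimSliceSampler(
        model=model,
        num_slices=model.U_ndims * 5,
        num_phantom_save=0,
        midpoint_shrink=True,
        perfect=True,
    )

    go = SimpleGlobalOptimisation(
        sampler=sampler,
        num_search_chains=100,
        model=model,
    )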

Parameters: