API

GPDiffEq.gp_negloglikelihoodMethod
gp_negloglikelihood(gp, x, y)

Returns two functions: 1) one that computes the negative log-likelihood and 2) one that rebuilds the GP from a new set of parameters.

Arguments:

  • gp: An AbstractGP.FiniteGP
  • x: input data
  • y: output data
source
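A minimal usage sketch (not from the package docs): build a FiniteGP with AbstractGPs and ask for the loss/rebuild pair. The kernel, noise value, and the assumption that the two functions come back as a tuple operating on a flat parameter vector are illustrative assumptions.

    using AbstractGPs, KernelFunctions, GPDiffEq

    x = collect(range(0.0, 1.0; length=20))        # input data
    y = sin.(2π .* x) .+ 0.05 .* randn(20)         # noisy output data

    f  = GP(SqExponentialKernel())
    fx = f(x, 0.01)                                # FiniteGP with observation noise 0.01

    # 1) `loss` evaluates the negative log-likelihood for a parameter vector,
    # 2) `rebuild` turns such a vector back into a GP (parameter layout assumed).
    loss, rebuild = gp_negloglikelihood(fx, x, y)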
GPDiffEq.gp_trainFunction

Simplified copy of sciml_train

gp_train(loss, θ, opt=ADAGrad(0.5), adtype=Optimization.AutoZygote(), args...;
         lower_bounds=nothing, upper_bounds=nothing, maxiters=1000, kwargs...)
source
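Continuing the sketch above, the negative log-likelihood can be minimised with gp_train. The initial parameter vector θ0 and passing the result straight to rebuild are illustrative assumptions; the exact parameter layout and return type follow the package's conventions.

    θ0  = [0.0, 0.0]           # hypothetical flat initial kernel parameters
    res = gp_train(loss, θ0)   # defaults: opt=ADAGrad(0.5), adtype=AutoZygote(), maxiters=1000
    fitted_gp = rebuild(res)   # GP with the optimised parameters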

Solver API

GPDiffEq.PullSolversModule.GPODEProblemType

Defines a Gaussian Process differential equation problem.

Problem Type

In a GPODE problem, we assume that in the ODE

\[\dot{u}(t) = f(u(t), t, p)\]

the function $f$ is given by a Gaussian process.

ToDo: More details on GP interface and options (Regular, sparse, ...)

Fields

source
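A hedged sketch of the problem setup: fit a GP to samples of a one-dimensional vector field and wrap it as a GPODEProblem. The GPODEFunction/GPODEProblem constructor calls, the PULLEuler() solver name, and the dt option are assumptions based on this page's naming; see the package tutorials for the exact interface.

    using AbstractGPs, KernelFunctions, GPDiffEq

    u  = collect(range(-2.0, 2.0; length=15))     # states where the vector field was observed
    du = -0.5 .* u                                # observed derivatives, f(u) = -u/2

    f  = GP(SqExponentialKernel())
    fp = posterior(f(u, 1e-4), du)                # GP posterior over the vector field

    gpff = GPODEFunction(fp)                      # the GP acting as the ODE right-hand side
    prob = GPODEProblem(gpff, 1.0, (0.0, 5.0))    # u0 = 1.0, tspan = (0.0, 5.0)
    sol  = solve(prob, PULLEuler(); dt=0.1)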
GPDiffEq.PullSolversModule.GPODESampledEnsembleProblemType
GPODESampledEnsembleProblem(prob::SampledGPODEProblem, nGPSamples, [nInValSamples]; kwargs...)

Defines a GPODESampledEnsembleProblem, which is a convenience wrapper around a SciML.EnsembleProblem (https://docs.sciml.ai/DiffEqDocs/stable/features/ensemble/) to compute the empirical distribution of the solution of a GPODEProblem.

Arguments:

  • prob: a SampledGPODEProblem
  • reduction: an AbstractReduction, which defines how to reduce the ensemble of solutions. Default is OnlineReduction(FitGenNormal()), which returns a (Mv)Normal distribution for each state.
  • nGPSamples: number of samples from the GPODEFunction
  • nInValSamples: number of samples from the initial value distribution. Only needed if u0 is a normal distribution (i.e. Normal or MvNormal).

Keyword arguments: The usual keyword arguments for SciML EnsembleProblems are supported, except for prob_func and reduction.

source
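A heavily hedged sketch of the ensemble workflow, reusing gpff from the sketch above. The SampledGPODEProblem constructor and the solve call are guessed for illustration; only the GPODESampledEnsembleProblem(prob, nGPSamples) signature and the default OnlineReduction(FitGenNormal()) reduction follow the docstring.

    using GPDiffEq, OrdinaryDiffEq

    sprob = SampledGPODEProblem(gpff, 1.0, (0.0, 5.0))   # guessed constructor for the sampled problem
    eprob = GPODESampledEnsembleProblem(sprob, 100)      # 100 draws from the GPODEFunction
    esol  = solve(eprob, Euler(); dt=0.1)                # each state summarised as a (Mv)Normal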

Derivative GPs

AbstractGPs.posteriorMethod
posterior(fx::FiniteGP{<:PosteriorGP}, y::AbstractVector{<:Real})

Construct the posterior distribution over fx.f when f is itself a PosteriorGP. Rather than recomputing the Cholesky factorisation of the covariance matrix from scratch, the existing factorisation is updated via the update_chol functionality. Other aspects are similar to a regular posterior.

source
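For example, sequential conditioning hits this method: conditioning an existing posterior on further data updates its Cholesky factor instead of refactorising.

    using AbstractGPs, KernelFunctions

    x1, y1 = rand(10), rand(10)
    x2, y2 = rand(5), rand(5)

    f  = GP(SqExponentialKernel())
    p1 = posterior(f(x1, 0.1), y1)      # regular posterior
    p2 = posterior(p1(x2, 0.1), y2)     # updates p1's Cholesky factor with the new data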
AbstractGPs.posteriorMethod
posterior(fx::FiniteGP{<:DerivativeGP}, y::AbstractVector{<:Real})

The posterior of a derivative GP, conditioned on data y from the output space of the undifferentiated GP. Evaluating this posterior at a point x returns the posterior of the derivative at x; it does not reproduce the original data y.

source
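A hedged sketch of the intended use: condition a derivative GP on function-value observations and read off the derivative posterior. The DerivativeGP(f) constructor name is a guess for illustration; only the posterior call and its semantics follow the docstring.

    using AbstractGPs, KernelFunctions, GPDiffEq

    x = collect(range(0.0, 2π; length=30))
    y = sin.(x)                               # observations of the undifferentiated GP

    f  = GP(SqExponentialKernel())
    df = DerivativeGP(f)                      # guessed constructor for the derivative GP

    dpost = posterior(df(x, 1e-6), y)         # condition on the function values y
    mean(dpost([0.0, π/2, π]))                # ≈ cos at those points, not the data y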