API
GPDiffEq.gp_negloglikelihood
— Method
gp_negloglikelihood(gp, x, y)
Returns two functions: 1) one that computes the negative log-likelihood of the data under the GP, and 2) one that rebuilds the GP from a new set of parameters.
Arguments:
- gp: an AbstractGPs.FiniteGP
- x: input data
- y: output data
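A minimal usage sketch, assuming a GP built with AbstractGPs.jl; the toy data, kernel choice, and noise variance are illustrative, not part of the documented API:

```julia
using AbstractGPs, GPDiffEq

# Toy 1-D regression data (illustrative only).
x = collect(range(0.0, 3.0; length=20))
y = sin.(x)

# A squared-exponential GP observed at x with noise variance 0.01.
gp = GP(SEKernel())
fx = gp(x, 0.01)

# loss computes the negative log-likelihood for a parameter vector;
# rebuild constructs the GP corresponding to a new parameter vector.
loss, rebuild = gp_negloglikelihood(fx, x, y)
```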
GPDiffEq.gp_train
— Function
gp_train(loss, θ, opt=ADAGrad(0.5), adtype=Optimization.AutoZygote(), args...;
    lower_bounds=nothing, upper_bounds=nothing, maxiters=1000, kwargs...)
Simplified copy of sciml_train.
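A hedged sketch of gp_train. The stand-in loss, the parameter layout, and the assumption that the result exposes its minimiser as res.u (SciML convention) are all illustrative; in practice the loss would be the one returned by gp_negloglikelihood:

```julia
using GPDiffEq

# Stand-in quadratic loss with minimum at θ = [2.0, 2.0] (illustrative).
loss(θ) = sum(abs2, θ .- 2.0)

θ0 = [1.0, 1.0]
res = gp_train(loss, θ0; maxiters=500)
# The optimised parameters (assumed accessible as res.u) can then be
# passed to the rebuild function returned by gp_negloglikelihood.
```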
Solver API
GPDiffEq.PullSolversModule.AbstractGPODEFunction
— Type
abstract type AbstractGPODEFunction{iip} <: SciMLBase.AbstractODEFunction{iip}
GPDiffEq.PullSolversModule.GPODEProblem
— Type
GPODEProblem(f::ODEFunction,u0,tspan,p=NullParameters(),callback=CallbackSet())
Define a GPODE problem from an ...
GPDiffEq.PullSolversModule.GPODEProblem
— Type
Defines a Gaussian Process differential equation problem.
Problem Type
In a GPODE problem, we assume that in the ODE
\[\dot{u}(t) = f(u(t), t, p)\]
the function $f$ is given by a Gaussian process.
ToDo: More details on GP interface and options (Regular, sparse, ...)
Fields
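A hedged construction sketch. The GPODEFunction wrapper name, the scalar initial condition, and the time span are assumptions based on the problem-type description above, not confirmed signatures:

```julia
using AbstractGPs, GPDiffEq

# A posterior GP fit to observed derivative data plays the role of f.
x = collect(0.0:0.5:2.0)
post = posterior(GP(SEKernel())(x, 0.01), sin.(x))

gpf = GPODEFunction(post)                  # assumed wrapper constructor
prob = GPODEProblem(gpf, 1.0, (0.0, 2.0))  # u0 and tspan as in ODEProblem
```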
GPDiffEq.PullSolversModule.GPODESampledEnsembleProblem
— Type
GPODESampledEnsembleProblem(prob::SampledGPODEProblem, nGPSamples, [nInValSamples]; kwargs...)
Defines a GPODESampledEnsembleProblem, which is a convenience wrapper around a SciML.EnsembleProblem (https://docs.sciml.ai/DiffEqDocs/stable/features/ensemble/) to compute the empirical distribution of the solution of a GPODEProblem.
Arguments:
- prob: a SampledGPODEProblem
- reduction: an AbstractReduction, which defines how to reduce the ensemble of solutions. Default is OnlineReduction(FitGenNormal()), which returns a (Mv)Normal distribution for each state.
- nGPSamples: number of samples from the GPODEFunction
- nInValSamples: number of samples from the initial value distribution. Only needed if u0 is a normal distribution (i.e. Normal or MvNormal).
Keyword arguments: the normal keyword arguments for SciML.EnsembleProblems are supported, except for prob_func and reduction.
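A usage sketch, assuming an existing SampledGPODEProblem bound to sprob; the sample count and the reliance on the default reduction are illustrative:

```julia
using GPDiffEq

# Wrap an existing SampledGPODEProblem; draw 100 GP samples and reduce
# them with the default OnlineReduction(FitGenNormal()), giving a
# (Mv)Normal distribution per state.
eprob = GPODESampledEnsembleProblem(sprob, 100)

# eprob is then solved like any SciML EnsembleProblem.
```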
Derivative GPs
GPDiffEq.DerivativeGPModule.Derivative01Kernel
— Type
Derivative01Kernel(kernel)
Derivative of kernel with respect to the second argument:
$D^{(0,1)} k(x,y) = \frac{\partial}{\partial y} k(x,y)$
Can be evaluated like a normal kernel.
GPDiffEq.DerivativeGPModule.Derivative10Kernel
— Type
Derivative10Kernel(kernel)
Derivative of kernel with respect to the first argument:
$D^{(1,0)} k(x,y) = \frac{\partial}{\partial x} k(x,y)$
Can be evaluated like a normal kernel.
GPDiffEq.DerivativeGPModule.Derivative11Kernel
— Type
Derivative11Kernel(kernel)
Differentiate kernel with respect to both arguments.
$D^{(1,1)} k(x,y) = \frac{\partial^2}{\partial x \partial y} k(x,y)$
Can be evaluated like a normal kernel.
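The derivative kernels above can be sanity-checked against a kernel with a known closed-form derivative. For the squared-exponential kernel $k(x,y) = \exp(-(x-y)^2/2)$, the derivative in the second argument is $\partial_y k(x,y) = (x-y)\,k(x,y)$. A sketch, assuming the kernel types are exported as named above:

```julia
using GPDiffEq, KernelFunctions

k = SEKernel()
d01 = Derivative01Kernel(k)
d10 = Derivative10Kernel(k)
d11 = Derivative11Kernel(k)

# Each derivative kernel is evaluated like a normal kernel:
d01(0.3, 1.2)  # ≈ ∂k/∂y at (0.3, 1.2), i.e. (0.3 - 1.2) * k(0.3, 1.2)
```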
GPDiffEq.DerivativeGPModule.DerivativeGP
— Type
DerivativeGP
The Gaussian Process (GP)
GPDiffEq.DerivativeGPModule.DerivativeKernelCollection
— Type
DerivativeKernelCollection(kernel)
AbstractGPs.posterior
— Method
posterior(fx::FiniteGP{<:PosteriorGP}, y::AbstractVector{<:Real})
Construct the posterior distribution over fx.f when f is itself a PosteriorGP, by updating the Cholesky factorisation of the covariance matrix and avoiding recomputing it from the original covariance matrix. It does this by using the update_chol functionality. Other aspects are similar to a regular posterior.
AbstractGPs.posterior
— Method
posterior(FiniteGP{<:DerivativeGP}, y::AbstractVector{<:Real})
The posterior of a derivative GP, conditioned on the data y from the output space of the undifferentiated GP. Evaluating this posterior at a point x will return the posterior of the derivative at x, and therefore not return the original data y.
GPDiffEq.DerivativeGPModule.differentiate
— Method
differentiate(gpp::PosteriorGP)
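A hedged sketch of the derivative-GP workflow, assuming differentiate returns a GP that can be evaluated like any other AbstractGPs GP; the data and kernel are illustrative:

```julia
using AbstractGPs, GPDiffEq

x = collect(0.0:0.25:2.0)
y = sin.(x)

# Condition a GP on the data, then differentiate the posterior.
post = posterior(GP(SEKernel())(x, 1e-4), y)
dpost = differentiate(post)

dfx = dpost(x)   # finite-dimensional projection of the derivative GP
mean(dfx)        # posterior mean of the derivative at x
```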