# Likelihood

bilby likelihood objects are used to calculate the likelihood of the data for some specific set of parameters. In mathematical notation, the likelihood can be generically written as $$\mathcal{L}(d| \theta)$$. How this is coded up will depend on the problem, but bilby expects all likelihood objects to have a parameters attribute (a dictionary of key-value pairs) and a log_likelihood() method. On this page, we'll discuss how to write your own Likelihood, and the standard likelihoods provided by bilby.

## The simplest likelihood

To start with let’s consider perhaps the simplest likelihood we could write down, namely a Gaussian likelihood for a set of data $$\vec{x}=[x_1, x_2, \ldots, x_N]$$. The likelihood for a single data point, given the mean $$\mu$$ and standard-deviation $$\sigma$$ is given by

$\mathcal{L}(x_i| \mu, \sigma) = \frac{1}{\sqrt{2\pi\sigma^2}}\mathrm{exp}\left( \frac{-(x_i - \mu)^2}{2\sigma^2}\right)$

Then, the likelihood for all $$N$$ data points is

$\mathcal{L}(\vec{x}| \mu, \sigma) = \prod_{i=1}^N \mathcal{L}(x_i| \mu, \sigma)$
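Expanding the product and taking the logarithm gives the expression the code below actually evaluates:

$\ln \mathcal{L}(\vec{x}| \mu, \sigma) = -\frac{1}{2}\left[\sum_{i=1}^{N} \frac{(x_i - \mu)^2}{\sigma^2} + N \ln\left(2\pi\sigma^2\right)\right]$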

In practice, we implement the log-likelihood to avoid numerical overflow errors. To code this up in bilby, we would write a class like this:

```python
import numpy as np

import bilby


class SimpleGaussianLikelihood(bilby.Likelihood):
    def __init__(self, data):
        """
        A very simple Gaussian likelihood

        Parameters
        ----------
        data: array_like
            The data to analyse
        """
        super().__init__(parameters={'mu': None, 'sigma': None})
        self.data = data
        self.N = len(data)

    def log_likelihood(self):
        mu = self.parameters['mu']
        sigma = self.parameters['sigma']
        res = self.data - mu
        return -0.5 * (np.sum((res / sigma)**2)
                       + self.N * np.log(2 * np.pi * sigma**2))
```


This demonstrates the two required features of a bilby Likelihood object:

1. It has a parameters attribute (a dictionary with keys for all the parameters, in this case, initialised to None)

2. It has a log_likelihood method which, when called, returns the log likelihood for all the data.

You can find an example that uses this likelihood here.
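As a quick sanity check of the expression returned by log_likelihood, the vectorized form can be compared against a direct sum of the per-point log densities (a minimal sketch using only numpy; it does not require bilby):

```python
import numpy as np

# Hypothetical data drawn from the model itself
rng = np.random.default_rng(42)
data = rng.normal(loc=1.0, scale=2.0, size=100)
mu, sigma = 1.0, 2.0
N = len(data)

# Vectorized log-likelihood, as computed inside log_likelihood()
res = data - mu
vectorized = -0.5 * (np.sum((res / sigma)**2)
                     + N * np.log(2 * np.pi * sigma**2))

# Direct sum of the log of the single-point Gaussian density
per_point = np.sum(np.log(1 / np.sqrt(2 * np.pi * sigma**2))
                   - (data - mu)**2 / (2 * sigma**2))

# The two forms agree to floating-point precision
print(vectorized, per_point)
```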

Tip

Note that the example above subclasses the bilby.Likelihood base class; this simply provides a few built-in functions. We recommend you also do this when writing your own likelihood.

## General likelihood for fitting a function $$y(x)$$ to some data with known noise

The previous example was rather simplistic. Let's now consider that we have some dependent data $$\vec{y}=y_1, y_2, \ldots, y_N$$ measured at $$\vec{x}=x_1, x_2, \ldots, x_N$$. We believe that the data is generated by additive Gaussian noise with a known variance $$\sigma^2$$ and a function $$y(x; \theta)$$, where $$\theta$$ are some unknown parameters; that is

$y_i = y(x_i; \theta) + n_i$

where $$n_i$$ is drawn from a normal distribution with zero mean and standard deviation $$\sigma$$. As such, $$y_i - y(x_i; \theta)$$ itself will have a likelihood

$\mathcal{L}(y_i; x_i, \theta) = \frac{1}{\sqrt{2\pi\sigma^{2}}} \mathrm{exp}\left(\frac{-(y_i - y(x_i; \theta))^2}{2\sigma^2}\right)$

As with the previous case, the likelihood for all the data is the product over the likelihood for each data point.

In bilby, we can code this up as a likelihood in the following way:

```python
import inspect

import numpy as np

import bilby


class GaussianLikelihoodKnownNoise(bilby.Likelihood):
    def __init__(self, x, y, sigma, function):
        """
        A general Gaussian likelihood - the parameters are inferred from the
        arguments of function

        Parameters
        ----------
        x, y: array_like
            The data to analyse
        sigma: float
            The standard deviation of the noise
        function:
            The python function to fit to the data. Note, this must take the
            independent variable as its first argument. The other arguments
            will require a prior and will be sampled over (unless a fixed
            value is given).
        """
        self.x = x
        self.y = y
        self.sigma = sigma
        self.N = len(x)
        self.function = function

        # These lines of code infer the parameters from the provided function
        parameters = inspect.getfullargspec(function).args
        parameters.pop(0)  # discard the independent variable, x
        super().__init__(parameters=dict.fromkeys(parameters))

    def log_likelihood(self):
        res = self.y - self.function(self.x, **self.parameters)
        return -0.5 * (np.sum((res / self.sigma)**2)
                       + self.N * np.log(2 * np.pi * self.sigma**2))

    def noise_log_likelihood(self):
        # The likelihood when there is no signal, i.e., y(x; theta) = 0
        return -0.5 * (np.sum((self.y / self.sigma)**2)
                       + self.N * np.log(2 * np.pi * self.sigma**2))
```


This likelihood can be given any python function, the data (in the form of x and y), and the standard deviation of the noise. The parameters are inferred from the arguments of function; for example, if, when instantiating the likelihood, you passed in the following function:

```python
def f(x, a, b):
    return a * x**2 + b
```



Then you would also need to provide priors for a and b. For this likelihood, the first argument to function is always assumed to be the independent variable.
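The parameter inference relies on Python's inspect module: the function's argument names are read off and the first one (the independent variable) is discarded. A minimal sketch, using a hypothetical model function f:

```python
import inspect


def f(x, a, b):
    return a * x**2 + b


# Read the argument names off the function signature
parameters = inspect.getfullargspec(f).args
parameters.pop(0)  # drop the independent variable, x

print(parameters)                 # ['a', 'b']
print(dict.fromkeys(parameters))  # {'a': None, 'b': None}
```

This dictionary of None-initialised keys is exactly what the likelihood passes to the bilby.Likelihood constructor, and each remaining key will need a prior.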

Note

Here we have explicitly defined the noise_log_likelihood method as the likelihood in the case when there is no signal (i.e., $$y(x; \theta)=0$$).

You can see an example of this likelihood in the linear regression example.

## General likelihood for fitting a function $$y(x)$$ to some data with unknown noise

In the last example, we considered only the case of known noise (e.g., a prespecified standard deviation). We now present a general class which can handle unknown noise (in which case you need to specify a prior for $$\sigma$$) or known noise (in which case you pass the known noise in when instantiating the likelihood):

```python
import inspect

import numpy as np

import bilby


class GaussianLikelihood(bilby.Likelihood):
    def __init__(self, x, y, function, sigma=None):
        """
        A general Gaussian likelihood for known or unknown noise - the model
        parameters are inferred from the arguments of function

        Parameters
        ----------
        x, y: array_like
            The data to analyse
        function:
            The python function to fit to the data. Note, this must take the
            independent variable as its first argument. The other arguments
            will require a prior and will be sampled over (unless a fixed
            value is given).
        sigma: None, float, array_like
            If None, the standard deviation of the noise is unknown and will be
            estimated (note: this requires a prior to be given for sigma). If
            not None, this defines the standard deviation of the data points.
            This can either be a single float, or an array with length equal
            to that for x and y.
        """
        self.x = x
        self.y = y
        self.N = len(x)
        self.sigma = sigma
        self.function = function

        # These lines of code infer the parameters from the provided function
        parameters = inspect.getfullargspec(function).args
        parameters.pop(0)  # discard the independent variable, x
        super().__init__(parameters=dict.fromkeys(parameters))

        # Snapshot the model-parameter names before (possibly) adding sigma
        self.function_keys = list(self.parameters.keys())
        if self.sigma is None:
            self.parameters['sigma'] = None

    def log_likelihood(self):
        sigma = self.parameters.get('sigma', self.sigma)
        model_parameters = {k: self.parameters[k] for k in self.function_keys}
        res = self.y - self.function(self.x, **model_parameters)
        return -0.5 * (np.sum((res / sigma)**2)
                       + self.N * np.log(2 * np.pi * sigma**2))
```


We provide this general-purpose class as part of bilby: bilby.core.likelihood.GaussianLikelihood.

An example using this likelihood can be found on this page.
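One subtlety worth flagging: dict.keys() returns a live view, so if 'sigma' is added to the parameters dictionary after the keys are taken, an un-copied view would grow to include it and 'sigma' would end up being passed to the model function. Snapshotting the model-parameter names with list() avoids this. A minimal sketch (pure Python, hypothetical parameter names):

```python
parameters = dict.fromkeys(['a', 'b'])

live_view = parameters.keys()       # a live view of the keys
snapshot = list(parameters.keys())  # a frozen copy

parameters['sigma'] = None          # sigma added for the unknown-noise case

print(list(live_view))  # the view grew to include 'sigma'
print(snapshot)         # the snapshot did not
```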

## Common likelihood functions

As well as the Gaussian likelihood defined above, bilby provides the following common likelihood functions:

• bilby.core.likelihood.PoissonLikelihood

• bilby.core.likelihood.StudentTLikelihood

• bilby.core.likelihood.ExponentialLikelihood
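These follow the same parameters-plus-log_likelihood pattern, with the Gaussian log-likelihood expression swapped for the appropriate distribution. As an illustration only (not the bilby implementation), the quantity a Poisson likelihood computes for integer counts $$\vec{k}$$ with rate $$\mu$$ can be sketched with the standard library and numpy:

```python
import math

import numpy as np

counts = np.array([3, 1, 4, 1, 5])  # hypothetical count data
mu = 2.5                            # hypothetical Poisson rate

# log L = sum_i [ k_i * log(mu) - mu - log(k_i!) ],
# using lgamma(k + 1) = log(k!) for numerical stability
log_l = np.sum(counts * np.log(mu) - mu
               - np.array([math.lgamma(k + 1) for k in counts]))
print(log_l)
```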

## Empty likelihood for subclassing

We provide an empty parent class which can be subclassed for alternative use cases: bilby.Likelihood.