Usage and Examples

Usage notes

The steps to analyse data with Parallel Bilby are:

  1. Ini Creation:

    Create an ini file with paths to the prior, PSD, and data files, along with other kwargs (see Data Generation for a list of all the required kwargs).

  2. Parallel Bilby Generation:

    Setup your Parallel Bilby jobs with

    $ parallel_bilby_generation <ini file>
    

    This generates

    • Plots of the PSD (review before submitting your job)

    • Slurm submission scripts

    • A data dump pickle (an object packed with the PSD, data, etc.)

  3. Parallel Bilby Analysis:

    To submit the Slurm jobs on a cluster, run

    $ bash outdir/submit/bash_<label>.sh
    

    Alternatively, to run locally without submitting a job, check the bash file for the required command. It should look something like:

    $ mpirun parallel_bilby_analysis outdir/data/<label>_data_dump.pickle --label <label> --outdir outdir/result --sampling-seed 1234
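
Once the analysis has finished, the result can be inspected with bilby. The snippet below is a minimal sketch, not part of the generated scripts: it assumes bilby's default result file name, outdir/result/<label>_result.json, so substitute your own outdir and label.

import bilby

# Minimal sketch: load the result written by parallel_bilby_analysis.
# The path assumes bilby's default naming convention, <outdir>/<label>_result.json.
result = bilby.result.read_in_result(filename="outdir/result/<label>_result.json")

# The posterior samples are stored as a pandas DataFrame keyed by parameter name.
print(result.posterior.describe())

# Corner plot of the sampled parameters (pass parameters=[...] to plot a subset).
result.plot_corner(filename="outdir/result/corner.png")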
    

Example ini files

Refer to the Parallel Bilby Examples Folder for example ini files along with Jupyter Notebooks explaining how to set up Parallel Bilby jobs.

The folder has three examples:

GW150914

To analyse GW150914 with Parallel Bilby, you may use the GW150914 ini file provided in the examples folder. An explanation of the ini file’s contents is presented in the GW150914 tutorial.ipynb.

In this example, we automate the download of the analysis data. We also include the priors for the analysis inside the ini file.

GW170817

To analyse GW170817 with Parallel Bilby, you may use the following ini file.

################################################################################
## Data generation arguments
################################################################################

trigger_time = 1187008882.43

################################################################################
## Detector arguments
################################################################################

detectors = [H1, L1, V1]
psd_dict = {H1=psd_data/h1_psd.txt, L1=psd_data/l1_psd.txt, V1=psd_data/v1_psd.txt}

# Download the data from https://www.gw-openscience.org/events/GW170817/ and place in raw_data/
data-dict = {H1:raw_data/H-H1_LOSC_CLN_4_V1-1187007040-2048.gwf, L1:raw_data/L-L1_LOSC_CLN_4_V1-1187007040-2048.gwf, V1:raw_data/V-V1_LOSC_CLN_4_V1-1187007040-2048.gwf}
channel_dict = {H1:LOSC-STRAIN, L1:LOSC-STRAIN, V1:LOSC-STRAIN}
duration = 128
# Temporary fix for the prior duration issue
enforce_signal_duration = False

################################################################################
## Job submission arguments
################################################################################

label = GW170817
outdir = outdir

################################################################################
## Likelihood arguments
################################################################################

distance-marginalization=True
phase-marginalization=True
time-marginalization=True

################################################################################
## Prior arguments
################################################################################

prior-file = GW170817.prior

################################################################################
## Waveform arguments
################################################################################

waveform_approximant = IMRPhenomPv2_NRTidal
frequency-domain-source-model = lal_binary_neutron_star

################################################################################
## Sampler settings
################################################################################

sampler = dynesty
nlive = 1000
nact = 5

################################################################################
## Slurm Settings
################################################################################
nodes = 10
ntasks-per-node = 16
time = 24:00:00
n-check-point = 10000

In this example, the user must download the analysis data manually (the source URL is given in the comment above the data-dict entry).

For this analysis, the priors are contained in a separate prior file, GW170817.prior:

chirp_mass = Uniform(name='chirp_mass', minimum=1.18, maximum=1.21)
mass_ratio = Uniform(name='mass_ratio', minimum=0.125, maximum=1)
mass_1 = Constraint(name='mass_1', minimum=1.001398, maximum=4.313897948277728)
mass_2 = Constraint(name='mass_2', minimum=1.001398, maximum=4.313897948277728)
a_1 = Uniform(name='a_1', minimum=0, maximum=0.05)
a_2 = Uniform(name='a_2', minimum=0, maximum=0.05)
tilt_1 = Sine(name='tilt_1')
tilt_2 = Sine(name='tilt_2')
phi_12 = Uniform(name='phi_12', minimum=0, maximum=2 * np.pi, boundary='periodic')
phi_jl = Uniform(name='phi_jl', minimum=0, maximum=2 * np.pi, boundary='periodic')
luminosity_distance = bilby.gw.prior.UniformComovingVolume(name='luminosity_distance', minimum=1, maximum=75)
dec = -0.408084
ra = 3.44616
cos_theta_jn = Uniform(name='cos_theta_jn', minimum=-1, maximum=1)
psi = Uniform(name='psi', minimum=0, maximum=np.pi, boundary='periodic')
phase = Uniform(name='phase', minimum=0, maximum=2 * np.pi, boundary='periodic')
lambda_1 = Uniform(name='lambda_1', minimum=0, maximum=5000)
lambda_2 = Uniform(name='lambda_2', minimum=0, maximum=5000)
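
Before submitting the job, the prior file can be sanity-checked by loading it with bilby. The snippet below is a minimal sketch, not part of the example itself; it assumes the file above is saved as GW170817.prior in the working directory.

import bilby

# Load the prior file. BNSPriorDict supplies the conversion function needed to
# evaluate the Constraint priors on mass_1 and mass_2, which are derived from
# chirp_mass and mass_ratio.
priors = bilby.gw.prior.BNSPriorDict(filename="GW170817.prior")

# Fixed values such as ra and dec are stored as DeltaFunction priors.
print(priors)

# Draw a few samples to confirm that every prior evaluates as expected.
print(priors.sample(5))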

Again, an explanation of the ini file’s contents is presented in the GW170817 tutorial.ipynb, along with the commands needed to download the analysis data.

Multiple Injections

You may need to analyse multiple injections with Parallel Bilby. The Multiple Injections folder contains some code to help create submission files for each injection.
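
One way to set this up, sketched below, is to write one ini file per injection and run parallel_bilby_generation on each. The template fields, file names, and the injection-file / injection-numbers options are illustrative placeholders rather than the exact contents of that folder.

import subprocess

# Illustrative template: the detector, likelihood, sampler and Slurm settings would
# mirror the ini files above; only the label, outdir and injection index change.
INI_TEMPLATE = """\
label = injection_{idx}
outdir = outdir_injection_{idx}
injection-file = injections.json
injection-numbers = [{idx}]
# ... remaining data-generation, likelihood, sampler and Slurm settings ...
"""

n_injections = 5  # illustrative number of injections

for idx in range(n_injections):
    ini_path = f"injection_{idx}.ini"
    with open(ini_path, "w") as f:
        f.write(INI_TEMPLATE.format(idx=idx))
    # Set up the data dump and Slurm submission scripts for this injection.
    subprocess.run(["parallel_bilby_generation", ini_path], check=True)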