LALInference 4.1.9.1-3a66518
lalinference.io.hdf5 Namespace Reference

Functions

def read_samples (filename, path=None, tablename=POSTERIOR_SAMPLES)
 Read an HDF5 sample chain file. More...
 
def write_samples (table, filename, metadata=None, **kwargs)
 Write an HDF5 sample chain file. More...
 
def update_metadata (metadata, level, attrs, strict_versions, collision='raise')
 Update the sub-dictionary of 'metadata' at 'level' with the values from 'attrs', while enforcing that existing values are equal to those with which the dict is updated. More...
 
def extract_metadata (filename, metadata, log_noise_evidences=[], log_max_likelihoods=[], nlive=[], dset_name=None, nest=False, strict_versions=True)
 Extract metadata from HDF5 sample chain file. More...
 

Function Documentation

◆ read_samples()

def lalinference.io.hdf5.read_samples (   filename,
  path = None,
  tablename = POSTERIOR_SAMPLES 
)

Read an HDF5 sample chain file.

Parameters

filename : str
    The path of the HDF5 file on the filesystem.
path : str, optional
    The path of the dataset within the HDF5 file.
tablename : str, optional
    The name of the table to search for recursively within the HDF5 file. By default, search for 'posterior_samples'.

Returns

table : astropy.table.Table
    The sample chain as an Astropy table.

Test reading a file written using the Python API:

>>> import os.path
>>> from tempfile import TemporaryDirectory
>>> table = Table([
...     Column(np.ones(10), name='foo', meta={'vary': FIXED}),
...     Column(np.arange(10), name='bar', meta={'vary': LINEAR}),
...     Column(np.arange(10) * np.pi, name='bat', meta={'vary': CIRCULAR}),
...     Column(np.arange(10), name='baz', meta={'vary': OUTPUT})
... ])
>>> with TemporaryDirectory() as dir:
...     filename = os.path.join(dir, 'test.hdf5')
...     write_samples(table, filename, path='foo/bar/posterior_samples')
...     len(read_samples(filename))
10

Test reading a file that was written using the LAL HDF5 C API:

>>> table = read_samples('test.hdf5')
>>> table.colnames
['uvw', 'opq', 'lmn', 'ijk', 'def', 'abc', 'rst', 'ghi']
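When `path` is not given, `read_samples` searches the file recursively for a table named `tablename`. A minimal sketch of that lookup, with nested dicts standing in for HDF5 groups; the helper `find_table` is hypothetical and not part of the LAL API:

```python
# Model an HDF5 group tree with nested dicts: dicts are groups,
# anything else is a dataset. Hypothetical helper for illustration only.
def find_table(group, tablename='posterior_samples'):
    for name, node in group.items():
        if isinstance(node, dict):  # subgroup: descend into it
            found = find_table(node, tablename)
            if found is not None:
                return found
        elif name == tablename:     # dataset with the requested name
            return node
    return None

f = {'foo': {'bar': {'posterior_samples': [1.0, 2.0, 3.0]}}}
samples = find_table(f)  # located at foo/bar/posterior_samples
```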

Definition at line 168 of file hdf5.py.

◆ write_samples()

def lalinference.io.hdf5.write_samples (   table,
  filename,
  metadata = None,
**  kwargs 
)

Write an HDF5 sample chain file.

Parameters

table : astropy.table.Table
    The sample chain as an Astropy table.
filename : str
    The path of the HDF5 file on the filesystem.
metadata : dict, optional
    Dictionary of (path, value) pairs of metadata attributes to add to the output file.
kwargs : dict
    Any keyword arguments for astropy.table.Table.write.

Check that we catch columns that are supposed to be FIXED but are not:

>>> table = Table([
...     Column(np.arange(10), name='foo', meta={'vary': FIXED})
... ])
>>> write_samples(table, 'bar.hdf5', 'bat/baz')  # doctest: +ELLIPSIS
Traceback (most recent call last):
    ...
AssertionError:
Arrays are not equal
Column foo is a fixed column, but its values are not identical
...

And now try writing an arbitrary example to a temporary file:

>>> import os.path
>>> from tempfile import TemporaryDirectory
>>> table = Table([
...     Column(np.ones(10), name='foo', meta={'vary': FIXED}),
...     Column(np.arange(10), name='bar', meta={'vary': LINEAR}),
...     Column(np.arange(10) * np.pi, name='bat', meta={'vary': CIRCULAR}),
...     Column(np.arange(10), name='baz', meta={'vary': OUTPUT})
... ])
>>> with TemporaryDirectory() as dir:
...     write_samples(
...         table, os.path.join(dir, 'test.hdf5'), path='bat/baz')
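The failing doctest above stems from a consistency check: a column whose 'vary' metadata is FIXED must hold a single repeated value. A rough sketch of that check, where `check_fixed` is a hypothetical helper rather than the actual LAL implementation:

```python
# Hypothetical sketch of the FIXED-column validation that write_samples
# performs before writing; not the actual LAL code.
def check_fixed(values, name):
    first = values[0]
    if any(v != first for v in values):
        raise AssertionError(
            'Column %s is a fixed column, but its values are not identical'
            % name)

check_fixed([1.0] * 10, 'foo')  # all values identical: passes silently
```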

Definition at line 243 of file hdf5.py.

◆ update_metadata()

def lalinference.io.hdf5.update_metadata (   metadata,
  level,
  attrs,
  strict_versions,
  collision = 'raise' 
)

Update the sub-dictionary of 'metadata' at 'level' with the values from 'attrs', while enforcing that existing values are equal to those with which the dict is updated.
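The merge semantics can be sketched as follows; `merge_attrs` and its handling of a non-'raise' collision mode are illustrative assumptions, not the LAL implementation:

```python
# Illustrative sketch: merge attributes into metadata[level] while
# enforcing that pre-existing values agree. Hypothetical helper.
def merge_attrs(metadata, level, attrs, collision='raise'):
    sub = metadata.setdefault(level, {})
    for key, value in attrs.items():
        if key in sub and sub[key] != value:
            if collision == 'raise':
                raise ValueError('metadata collision at %s/%s' % (level, key))
            continue  # non-'raise' mode: keep the existing value
        sub[key] = value

meta = {}
merge_attrs(meta, '/lalinference', {'version': '4.1'})
merge_attrs(meta, '/lalinference', {'version': '4.1'})  # identical: fine
```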

Definition at line 281 of file hdf5.py.

◆ extract_metadata()

def lalinference.io.hdf5.extract_metadata (   filename,
  metadata,
  log_noise_evidences = [],
  log_max_likelihoods = [],
  nlive = [],
  dset_name = None,
  nest = False,
  strict_versions = True 
)

Extract metadata from HDF5 sample chain file.

Parameters

filename : str
    The path of the HDF5 file on the filesystem.
metadata : dict
    Dict into which to place metadata.
log_noise_evidences : array, optional
    Array into which to place log noise evidences (if nest=True).
log_max_likelihoods : array, optional
    Array into which to place log max likelihoods (if nest=True).
nlive : array, optional
    Array into which to place numbers of live points (if nest=True).
return_run_identifier : Boolean, optional (default False)
    Whether to return the run identifier.
nest : Boolean, optional (default False)
    Whether to output quantities that only exist for nest runs.

Returns

run_identifier : str
    The run identifier.
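Note that `extract_metadata` fills the caller-supplied dict and lists in place and returns the run identifier. That calling pattern can be sketched like this, where `collect_run` and the `stats` dict are hypothetical stand-ins for reading the HDF5 file:

```python
# Hypothetical stand-in for extract_metadata's side-effect style: append
# per-run quantities into caller-owned lists, return the run identifier.
def collect_run(stats, metadata, log_noise_evidences, nlive, nest=False):
    metadata.setdefault('/lalinference', {}).update(stats['attrs'])
    if nest:  # nested-sampling-only quantities
        log_noise_evidences.append(stats['log_noise_evidence'])
        nlive.append(stats['nlive'])
    return stats['run_id']

metadata, log_noise_evidences, nlive = {}, [], []
run_id = collect_run(
    {'attrs': {'version': '4.1'}, 'log_noise_evidence': -1234.5,
     'nlive': 1024, 'run_id': 'abc123'},
    metadata, log_noise_evidences, nlive, nest=True)
```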

Definition at line 332 of file hdf5.py.