bundle.hdf5

Submodules

Classes

Store

Simple HDF5 store for reading and writing datasets and attributes.

Package Contents

class bundle.hdf5.Store(path: pathlib.Path | str, mode: str = 'r')

Simple HDF5 store for reading and writing datasets and attributes.

Usage:

import numpy as np

with Store("data.h5", mode="w") as store:
    store.write_dataset("group/data", np.array([1, 2, 3]))
    store.write_attrs("group", {"version": "1.0"})

with Store("data.h5", mode="r") as store:
    arr = store.read_dataset("group/data")
    attrs = store.read_attrs("group")

path

Path to the HDF5 file.

mode = 'r'

Mode the file was opened in ('r' by default).

property file: h5py.File

The underlying open h5py.File handle.
write_dataset(name: str, data: numpy.ndarray, **kwargs)

Write or overwrite a dataset at the given path.

read_dataset(name: str) numpy.ndarray

Read a dataset and return it as a numpy array.

write_attrs(path: str, attrs: dict | bundle.core.data.Data)

Write attributes to a group or dataset (creating a group if path doesn’t exist).

Accepts a plain dict or a bundle.core.data.Data instance (uses model_dump()). Scalar and string values are stored directly; complex nested objects are JSON-serialised to strings.
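A minimal sketch of the serialisation rule described above. The helper name encode_attr is hypothetical and not part of the public API; it only illustrates how values might be prepared for storage.

```python
import json

def encode_attr(value):
    """Return a value that can be stored as an HDF5 attribute.

    Scalars and strings pass through unchanged; anything else
    (lists, dicts, nested objects) is JSON-serialised to a string.
    Hypothetical helper sketching the rule write_attrs applies.
    """
    if isinstance(value, (str, int, float, bool)):
        return value
    return json.dumps(value)
```

For example, encode_attr({"shape": [2, 3]}) yields the string '{"shape": [2, 3]}', which HDF5 can store as an ordinary string attribute.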

read_attrs(path: str) dict

Read all attributes from a group or dataset.

read_attrs_as(path: str, model: type[D]) D

Read attributes and construct a Data (or any Pydantic model) instance.

Values that were JSON-serialised on write are automatically deserialised.
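A sketch of the inverse operation, assuming the deserialisation works by attempting to parse string attributes as JSON (the real implementation may differ; decode_attr is a hypothetical helper):

```python
import json

def decode_attr(value):
    """Undo JSON serialisation applied on write, if any.

    String attributes that parse as JSON are assumed to have been
    serialised by write_attrs; other values pass through unchanged.
    Hypothetical helper, not part of the public API.
    """
    if isinstance(value, str):
        try:
            return json.loads(value)
        except json.JSONDecodeError:
            return value
    return value
```

One caveat of this approach: a plain string attribute that happens to be valid JSON (e.g. "1.0") would also be decoded, so round-tripping through a typed Pydantic model helps restore the intended types.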

list_datasets(group_path: str = '/') list[str]

List all dataset names under a group.

list_groups(group_path: str = '/') list[str]

List all group names under a group.
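A sketch of how the two listing methods might distinguish the children of a group, assuming they rely on h5py's type checks (hypothetical, not the actual implementation):

```python
import h5py

def list_children(f: h5py.File, group_path: str = "/"):
    """Split the immediate children of a group into datasets and groups."""
    grp = f[group_path]
    datasets = [name for name, obj in grp.items() if isinstance(obj, h5py.Dataset)]
    groups = [name for name, obj in grp.items() if isinstance(obj, h5py.Group)]
    return datasets, groups
```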

has(name: str) bool

Check if a dataset or group exists.