# bundle.hdf5

Thin wrapper around h5py for structured HDF5 file access, with native support for `bundle.core.data.Data` (Pydantic) models.
## Public API

| Class | Module | Description |
|---|---|---|
| `Store` | `bundle.hdf5` | Context-managed HDF5 file handle with helpers for datasets, attributes, and groups. |
## Usage

```python
from bundle.hdf5 import Store
import numpy as np

# Write
with Store("data.h5", mode="w") as store:
    store.write_dataset("experiment/values", np.array([1.0, 2.0, 3.0]))
    store.write_attrs("experiment", {"version": "1.0", "machine": "host-abc"})

# Read
with Store("data.h5", mode="r") as store:
    arr = store.read_dataset("experiment/values")   # np.ndarray
    meta = store.read_attrs("experiment")           # dict
    names = store.list_datasets("experiment")       # ["values"]
    store.has("experiment/values")                  # True
```
## Data model support

`write_attrs` accepts a `dict` or a `bundle.core.data.Data` instance (serialized via `model_dump()`). Complex nested values are JSON-serialized automatically.
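The JSON-serialization convention for nested values can be sketched as follows. This is an illustration only: `_encode_attr` is a hypothetical helper, not part of the public API.

```python
import json

def _encode_attr(value):
    # Hypothetical helper: scalars and strings pass through unchanged;
    # nested dicts/lists become JSON strings, since HDF5 attributes can
    # only hold scalars, strings, and flat arrays.
    if isinstance(value, (dict, list)):
        return json.dumps(value)
    return value

# Nested values become JSON strings; flat values are stored as-is.
attrs = {"version": "1.0", "limits": {"lo": 0.0, "hi": 1.0}}
encoded = {k: _encode_attr(v) for k, v in attrs.items()}
```

On read, the inverse step decodes any value that parses as JSON back into its nested form.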
`read_attrs_as` reconstructs a Pydantic model from stored attributes:
```python
from bundle.core.data import Data
from bundle.hdf5 import Store

class RunMeta(Data):
    version: str
    machine: str

with Store("data.h5", mode="w") as store:
    store.write_attrs("run", RunMeta(version="1.0", machine="host"))

with Store("data.h5", mode="r") as store:
    meta = store.read_attrs_as("run", RunMeta)  # RunMeta(version='1.0', machine='host')
```
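Conceptually, `read_attrs_as` maps stored attributes onto the model's declared fields. A stdlib-only sketch of the same pattern, using a dataclass stand-in for the real `Data` base class (`attrs_as` is a hypothetical name, not the library's implementation):

```python
from dataclasses import dataclass, fields

@dataclass
class RunMeta:
    version: str
    machine: str

def attrs_as(attrs: dict, model_cls):
    # Sketch: keep only the keys the model declares as fields, then
    # construct the model from them (extra stored attributes are ignored).
    names = {f.name for f in fields(model_cls)}
    return model_cls(**{k: v for k, v in attrs.items() if k in names})

meta = attrs_as({"version": "1.0", "machine": "host", "extra": 1}, RunMeta)
# RunMeta(version='1.0', machine='host')
```

The real method additionally benefits from Pydantic validation: ill-typed or missing attributes raise an error rather than producing a malformed model.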
## Store methods

| Method | Description |
|---|---|
| `write_dataset` | Write or overwrite a dataset. |
| `read_dataset` | Read a dataset as an `np.ndarray`. |
| `write_attrs` | Write attributes from a `dict` or `Data` instance. |
| `read_attrs` | Read all attributes as a `dict`. |
| `read_attrs_as` | Read attributes and reconstruct as a `Data` model. |
| `list_datasets` | List dataset names under a group. |
| `list_groups` | List sub-group names under a group. |
| `has` | Check if a dataset or group exists. |
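To make the method table concrete, here is a minimal in-memory stand-in exposing the same interface. This is a dict-backed sketch for illustration; the real `Store` wraps an `h5py.File` and persists to disk.

```python
class MemoryStore:
    """Dict-backed sketch of the Store interface (not the real h5py-backed class)."""

    def __init__(self):
        self._datasets = {}  # path -> data
        self._attrs = {}     # path -> attribute dict

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        return False  # nothing to close in the in-memory sketch

    def write_dataset(self, path, data):
        self._datasets[path] = data

    def read_dataset(self, path):
        return self._datasets[path]

    def write_attrs(self, path, attrs):
        self._attrs.setdefault(path, {}).update(attrs)

    def read_attrs(self, path):
        return dict(self._attrs.get(path, {}))

    def list_datasets(self, group):
        # Direct children of `group` only (no "/" left in the remainder).
        prefix = group.rstrip("/") + "/"
        return [p[len(prefix):] for p in self._datasets
                if p.startswith(prefix) and "/" not in p[len(prefix):]]

    def has(self, path):
        return path in self._datasets or path in self._attrs

with MemoryStore() as store:
    store.write_dataset("experiment/values", [1.0, 2.0, 3.0])
    store.write_attrs("experiment", {"version": "1.0"})
```

The context-manager protocol (`__enter__`/`__exit__`) is what allows the `with Store(...)` usage shown above; in the real class, `__exit__` is where the underlying file handle is closed.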
## Dependencies

- h5py
- numpy