From RCSWiki


The HDF5 technology suite is designed to organize, store, discover, access, analyze, share, and preserve diverse, complex data in continuously evolving heterogeneous computing and storage environments.

HDF5 supports all types of data stored digitally, regardless of origin or size. Petabytes of remote sensing data collected by satellites, terabytes of computational results from nuclear testing models, and megabytes of high-resolution MRI brain scans are stored in HDF5 files, together with metadata necessary for efficient data sharing, processing, visualization, and archiving.

Parallel HDF5

Unfortunately, the parallel version of HDF5 (Parallel HDF5, which relies on MPI-IO) is not supported on ARC, due to the specifics of the hardware systems that provide storage for the cluster. Serial HDF5 is available and works normally.

Modules on ARC

To list the HDF5 modules for the installed versions of the library:

$ module avail lib/hdf5

---------------- /global/software/Modules/4.6.0/modulefiles -------------
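Once you have picked a version from the listing, a typical workflow is to load the matching module and build against the library with the compiler wrapper that ships with serial HDF5. The module name and version below are illustrative only; substitute one actually shown by `module avail lib/hdf5` on ARC.

```shell
# Load an HDF5 module (hypothetical version string -- use one from the
# "module avail lib/hdf5" listing on ARC).
$ module load lib/hdf5/1.12.0

# h5cc is the compiler wrapper provided by serial HDF5 builds; it adds
# the correct include and library flags for the loaded version.
$ h5cc -o my_app my_app.c

# h5dump, also part of the HDF5 tools, prints a file's contents as text.
$ h5dump output.h5
```

Loading the module also makes the other HDF5 command-line tools (such as `h5ls`) available in your `PATH` for the duration of the session.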