[NeXus] large overhead on small HDF5 files
Paul Kienzle
paul.kienzle at nist.gov
Wed Sep 24 00:05:44 BST 2014
Anyone have any hints for reducing HDF5 file sizes?
According to h5stat, the file has 750k of metadata for 75k of raw data. Each individual item is small, but with 1300 data items in the file the overhead adds up.
I've been using the h5py Python wrapper to create the HDF5 files, but even after copying them with napi they are still huge.
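To make the pattern concrete, here is a minimal sketch of the kind of file I am creating. The names, sizes, and flat layout are illustrative only, not the actual instrument definition, but the shape of the problem is the same: many tiny datasets, each carrying its own HDF5 object metadata.

    import numpy as np
    import h5py

    # Illustrative only: ~1300 small data items, similar in spirit to my files.
    with h5py.File("example.nxs", "w") as f:
        entry = f.create_group("entry")
        entry.attrs["NX_class"] = "NXentry"
        for i in range(1300):
            # Each item holds only a handful of values, but every dataset
            # still gets its own object header and B-tree/heap metadata.
            entry.create_dataset("field%04d" % i,
                                 data=np.arange(5, dtype="float64"))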
Thanks in advance,
- Paul