This is a read-only copy of the old FEniCS QA forum. Please visit the new QA forum to ask questions.

hdf file read/write for time series

0 votes

Hello,
I've managed to read/write functions using the HDF5 format, but I'm unsure how to read data stored as a time series.

My code is basically the following:

writing (MPI)

  t = 0.0
  T = 2
  file << (u, t)
  Hdf = HDF5File(mesh.mpi_comm(), hdfFile, "w")
  Hdf.write(mesh, "mesh")

  i = 0
  while t < T:
      i += 1
      t += dt
      u0.vector()[:] = u.vector()
      solver.solve(problem, u.vector())
      file << (u, t)
      r = u.vector()

      Hdf.write(u, "u", i)

  Hdf.close()

reading (single proc)

  f = HDF5File(mpi_comm_world(), hdfFile, 'r')
  mesh = Mesh()
  f.read(mesh, "mesh", False)
  V = FunctionSpace(mesh, "CG", 1)
  u = Function(V)
  f.read(u, 'u', 1.)
  f.close()

In the 'reading' routine, I can only call f.read as f.read(u,'u'), so it appears I can grab only one time slice. Is there a different syntax I should use for grabbing other iterations?

Thanks in advance and I apologize if I missed this in the documentation somewhere.

Pete

asked Mar 3, 2015 by huskeypm FEniCS Novice (330 points)

1 Answer

+1 vote

I guess it's undocumented, but you should be able to read back an individual time step; in your case, e.g.,

f.read(u, 'u/vector_2')

I think. If you look at the structure of the .h5 file (e.g. with h5ls), you can read a Function from any one of its stored Vectors.

answered Mar 3, 2015 by chris_richardson FEniCS Expert (31,740 points)
edited Mar 4, 2015 by chris_richardson

OK, I understand what you mean now. The following code seems to work AFAIK:

  i=0
  while (t < T):
      i+=1
      t += dt
      u0.vector()[:] = u.vector()
      solver.solve(problem, u.vector())
      file << (u,t)
      r = u.vector()

      Hdf.write(u, "u/Vector/vector_%d"%i)

  Hdf.close()

def ReadIt():
  f = HDF5File(mpi_comm_world(),hdfFile,'r')
  mesh = Mesh()
  f.read(mesh,"mesh",False)
  V = FunctionSpace(mesh,"CG",1)
  u = Function(V)
  for i in range(2):
    name = "u/Vector/vector_%d"%(i+1)
    f.read(u,name) 
  f.close()

I'll look into h5ls more, since a cursory look didn't reveal much:

pmke226@kafka:~/scripts/dolfin$ h5ls a.h5 
mesh                     Group
u                        Group

Thanks again
Pete

Try h5ls -r
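For anyone without h5ls handy, the same recursive listing can be produced from Python with h5py. This is a sketch under two assumptions: that h5py is installed, and that the file has the per-step dataset layout discussed above ("fun/vector_0", "fun/vector_1", ...); here the file is created with h5py directly rather than with dolfin's HDF5File.

```python
import os
import tempfile

import h5py  # assumption: h5py is available; it reads the same HDF5 files
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "a.h5")

# Mimic the layout HDF5File.write(F, "fun", i) is assumed to produce:
# one dataset per time step under the "fun" group.
with h5py.File(path, "w") as f:
    for i in range(3):
        f.create_dataset("fun/vector_%d" % i, data=np.zeros(4))

# Walk the file and collect every group/dataset path -- the h5py
# equivalent of `h5ls -r a.h5`.
with h5py.File(path, "r") as f:
    names = []
    f.visit(names.append)

print(names)
```

Once the paths are known, any one of them can be passed as the dataset name when reading a single time step back.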

Here is a complete example, which should work. What you have above may work, but is probably a bit redundant...

from dolfin import *

mesh = UnitSquareMesh(20,20)
Q = FunctionSpace(mesh, "CG", 1)
F = Function(Q)

hdf = HDF5File(mesh.mpi_comm(), "a.h5", "w")

for i in range(10):
    hdf.write(F, "fun", i)

# Delete HDF5File object, closing file
del hdf


# Read functions back in

hdf = HDF5File(mesh.mpi_comm(), "a.h5", "r")
attr = hdf.attributes("fun")
nsteps = attr['count']
for i in range(nsteps):
    dataset = "fun/vector_%d"%i
    attr = hdf.attributes(dataset)
    print 'Retrieving time step:', attr['timestamp']
    hdf.read(F, dataset)

Great - I like this better!
Pete

One further related question: the help for hdf.write indicates it can write a simple vector of doubles, but I can't seem to get the syntax right. How do you recommend writing double arrays?

In [17]: help(hdf.write)
....
* write(values, name)

  Write simple vector of double to file

.....
vals = np.array([1.], dtype=np.float)
hdf.write(vals, "att2")

HDF5-DIAG: Error detected in HDF5 (1.8.12) MPI-process 0:
#000: ../../../src/H5D.c line 170 in H5Dcreate2(): unable to create dataset
major: Dataset
minor: Unable to initialize object
#001: ../../../src/H5Dint.c line 439 in H5Dcreate_named(): unable to create and link to dataset
major: Dataset
minor: Unable to initialize object
#002: ../../../src/H5L.c line 1638 in H5L_link_object(): unable to create new link to object
major: Links
minor: Unable to initialize object
#003: ../../../src/H5L.c line 1882 in H5L_create_real(): can't insert link
major: Symbol table
minor: Unable to insert object
#004: ../../../src/H5Gtraverse.c line 861 in H5G_traverse(): internal path traversal failed
major: Symbol table
minor: Object not found
#005: ../../../src/H5Gtraverse.c line 641 in H5G_traverse_real(): traversal operator failed
major: Symbol table
minor: Callback failed
#006: ../../../src/H5L.c line 1674 in H5L_link_cb(): name already exists
major: Symbol table
minor: Object already exists
HDF5-DIAG: Error detected in HDF5 (1.8.12) MPI-process 0:
#000: ../../../src/H5D.c line 437 in H5Dget_space(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.12) MPI-process 0:
#000: ../../../src/H5Shyper.c line 6593 in H5Sselect_hyperslab(): not a data space
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.12) MPI-process 0:
#000: ../../../src/H5Dio.c line 231 in H5Dwrite(): can't prepare for writing data
major: Dataset
minor: Write failed
#001: ../../../src/H5Dio.c line 263 in H5D__pre_write(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.12) MPI-process 0:
#000: ../../../src/H5D.c line 391 in H5Dclose(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.12) MPI-process 0:
#000: ../../../src/H5S.c line 405 in H5Sclose(): not a dataspace
major: Invalid arguments to routine
minor: Inappropriate type
Out[24]: array([ 1.])

Probably an API bug; try using "/att2" with a leading slash. If you can register an issue on Bitbucket, it will be fixed...

"/att2" was a no-go as well. Hopefully I posted the issue correctly: https://bitbucket.org/fenics-project/dolfin/issue/486/hdf5-diag-error-detected-in-hdf5-1812-mpi
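Until the bug is sorted out, one possible workaround is to write the raw double array with h5py directly, since HDF5File produces ordinary HDF5 files. This is a sketch under that assumption; the dataset name "att2" and the array follow the snippet above.

```python
import os
import tempfile

import h5py  # assumption: h5py is installed; it writes the same HDF5 format
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "b.h5")

vals = np.array([1.0], dtype=np.float64)

# Write the double array as a plain dataset, sidestepping HDF5File.write()
with h5py.File(path, "w") as f:
    f.create_dataset("att2", data=vals)

# Read it back to check the round trip
with h5py.File(path, "r") as f:
    back = f["att2"][:]

print(back)
```

Note this writes serially; under MPI you would want only one rank to do the write, or an h5py build with parallel HDF5.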

Thanks
Pete

...