Greetings,
I'm trying to transition to a parallel solution. I had previously used XML as the storage medium for the mesh, but to parallelise we need to use HDF5.
Following code from elsewhere in the support forum, I now have:
# Import macro mesh
macromesh = Mesh()
hdf = HDF5File(mpi_comm_world(),'./meshes/channel_10k.h5', "r")
hdf.read(mesh, '/mesh', False)
macroboundaries = FacetFunction("size_t", macromesh)
hdf.read(boundaries, "/boundaries")
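For reference, the h5 file was written beforehand with dolfin's HDF5File, along these lines (a minimal sketch; the XML source filenames are placeholders, not my actual paths):
# Sketch: converting the old XML mesh and facet markers to HDF5.
from dolfin import *
xmlmesh = Mesh("./meshes/channel_10k.xml")
xmlboundaries = MeshFunction("size_t", xmlmesh, "./meshes/channel_10k_facet_region.xml")
hdf = HDF5File(mpi_comm_world(), "./meshes/channel_10k.h5", "w")
hdf.write(xmlmesh, "/mesh")
hdf.write(xmlboundaries, "/boundaries")
hdf.close()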
Looking at the h5 file with h5ls, I have:
/ Group
/boundaries Group
/boundaries/coordinates Dataset {5436, 2}
/boundaries/topology Dataset {15921, 2}
/boundaries/values Dataset {15921}
/mesh Group
/mesh/cell_indices Dataset {10486}
/mesh/coordinates Dataset {5436, 2}
/mesh/topology Dataset {10486, 3}
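In case it helps, the same structure can be checked programmatically with h5py (a quick sketch; h5py is an assumption on my part, separate from the dolfin install):
import h5py
# Walk the file and print each group/dataset, mirroring the h5ls listing above.
with h5py.File("./meshes/channel_10k.h5", "r") as f:
    def show(name, obj):
        if isinstance(obj, h5py.Dataset):
            print(name, obj.shape, obj.dtype)
        else:
            print(name, "(group)")
    f.visititems(show)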
Running the code gives me:
Traceback (most recent call last):
  File "upscaled_ch.py", line 354, in <module>
    main()
  File "upscaled_ch.py", line 267, in main
    hdf.read(mesh, "/mesh", False)
TypeError: in method 'HDF5File_read', argument 2 of type 'dolfin::GenericVector &'
Aborted (core dumped)
I cannot see what is causing the TypeError. Is it a problem with the h5 file, or with the arguments I'm passing? (None of them is a vector, nor should any be, as far as I can see.)