
Import a submesh in a MPI code (OK) and create the associated function space (ERROR)

0 votes

Hi everyone,
I have posted a few posts recently about MPI code; here is a new one. This is definitely a hard step for a beginner in parallel computation like me.

Please consider my issue:

In a first Python program (without MPI), I create a mesh and several submeshes:

#...    
# Create submesh
mesh_phase_A = SubMesh(mesh_Vol_TOT, subdomains_Vol_TOT, id_A)
# Save submesh
File(Folder_Mesh_fenics + 'DOMAIN_mesh_A.xml.gz') << mesh_phase_A
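
For context, a minimal sketch of what the elided setup above might look like, assuming the phases are marked with a CellFunction; mesh_Vol_TOT, subdomains_Vol_TOT and id_A are the names used in the snippet, while the geometry and the marking rule here are purely hypothetical:

from dolfin import *

# Hypothetical setup: build a mesh and mark the cells belonging to phase A
mesh_Vol_TOT = UnitCubeMesh(10, 10, 10)
subdomains_Vol_TOT = CellFunction("size_t", mesh_Vol_TOT, 0)
id_A = 1
AutoSubDomain(lambda x: x[0] < 0.5 + DOLFIN_EPS).mark(subdomains_Vol_TOT, id_A)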

Then, in a second program (with MPI this time), I import ONE of these submeshes
(so I don't have to use SubMesh, which is not supported in parallel):

mpirun -np 4 python second_program.py

mesh_phase = Mesh(filepath)

If I plot it, I can see the mesh has been correctly partitioned between the different processes:

plot_title= 'MESH for the process: {value}'.format(value=rank_process)
plot(mesh_phase, title=plot_title, interactive=True)    
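
(rank_process is not defined in the snippet above; in this DOLFIN version it can presumably be obtained from the built-in MPI helpers, something along these lines:)

from dolfin import MPI, mpi_comm_world

# Assumed old-style DOLFIN API: rank of the current process in the world communicator
rank_process = MPI.rank(mpi_comm_world())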

But then, when I create the function space

V = FunctionSpace(mesh_phase, 'Lagrange', 1)

I get this error from all processes:

[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run 
[0]PETSC ERROR: to get more information on the crash.
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
with errorcode 59.

Where is the error? I believe it's because the imported mesh is a submesh (created in the first program).
How could I ask FEniCS to consider this submesh as a mesh?

Thanks for any help.

EDIT: when I run the second program with only one process (mpirun -np 1 *), everything works correctly.

asked Apr 17, 2017 by fussegli FEniCS Novice (700 points)
edited Apr 17, 2017 by fussegli

1 Answer

0 votes

A little while ago I implemented serial SubMesh generation followed by a parallel import under mpirun, without any problems. Please try the following code snippet extracted from my implementation; it works fine on two different Unix platforms, and hopefully it will help you find your error.

First part (serial SubMesh generation):

from dolfin import *

# Build a cube mesh and rescale/recenter its coordinates
mesh = UnitCubeMesh(10,10,10)
mesh.coordinates()[:] = (mesh.coordinates()[:] - 0.5) * 20000

class Dom1(SubDomain):
    def inside(self, x, on_boundary):
        return x[1] < 7000 + 1e-2

class B1(SubDomain):
    def inside(self, x, on_boundary):
        return on_boundary and x[0] > -1e4 and x[1] > -1e4 and x[2] > -1e4 \
            and x[0] < 1e4 and x[1] < 1e4 and x[2] < 1e4

dom1 = Dom1()
b1 = B1()

# Mark the cells of the subdomain, extract the submesh and save it
domains = CellFunction("size_t", mesh, 0)
dom1.mark(domains, 1)
submesh1 = SubMesh(mesh, domains, 1)
File('submesh1.xml') << submesh1

# Extract and save a submesh of the exterior boundary mesh as well
bm1 = BoundaryMesh(submesh1, 'exterior')
boundaries1 = CellFunction("size_t", bm1, 0)
b1 = AutoSubDomain(lambda x: (x[0] > -1e4 and x[1] > -1e4 and \
                   x[2] > -1e4 and x[0] < 1e4 and x[1] < 1e4 and x[2] < 1e4))
b1.mark(boundaries1, 1)
bmesh1 = SubMesh(bm1, boundaries1, 1)
File('bmesh1.xml') << bmesh1

Second part (import back with mpirun and build function spaces):

from dolfin import *

# Read the meshes back in; under mpirun they are partitioned automatically
submesh1 = Mesh('submesh1.xml')
bmesh1 = Mesh('bmesh1.xml')
# Build function spaces (including a mixed space) on the imported meshes
F1 = FunctionSpace(bmesh1, 'CG', 2)
F1s = FunctionSpace(submesh1, 'N1curl', 1)
ele = FiniteElement("Nedelec 1st kind H(curl)", submesh1.ufl_cell(), 1)
MixedSpace = FunctionSpace(submesh1, ele * ele)
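
As a quick sanity check (not part of the original answer, reusing the names above), each process can report its rank and the global dimension of one of the spaces after launching with e.g. mpirun -np 4:

# Sanity check: every rank should report the same global number of dofs
rank = MPI.rank(mpi_comm_world())
print("rank %d: F1s has %d global dofs" % (rank, F1s.dim()))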
answered Apr 21, 2017 by RR FEniCS User (3,330 points)

Thanks,
Your code example works well.
After debugging, I finally found that the error in my code comes from the ghost_mode parameter:

parameters["ghost_mode"] = "shared_facet" # Induces the segmentation error
# or 
parameters["ghost_mode"] = "shared_vertex" # Induces the segmentation error
# or
parameters["ghost_mode"] = "none" # OK

I don't really understand why...
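
One possible caveat (an assumption on my side, not something verified for this case): the ghost_mode parameter is used when the mesh is partitioned, so it only takes effect if it is set before the mesh is constructed or read, e.g.:

# Assumption: ghost_mode is applied at mesh partitioning time,
# so set it before the mesh is read in parallel
parameters["ghost_mode"] = "none"   # the value reported to work here
mesh_phase = Mesh(filepath)         # filepath as in the question
V = FunctionSpace(mesh_phase, 'Lagrange', 1)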

...