I am trying to build a domain decomposition preconditioner that requires the solution of subdomain problems. To accomplish this, I make a local copy of the mesh on each processor: using MeshEditor(), I copy the local vertices and cells, in the same ordering as the local part of the global mesh, into a serial mesh.
I was under the impression that if the vertices of each cell are ordered identically in two meshes, the edge ordering would also be identical, based on the numbering conventions outlined in Section 16.5 of the FEniCS book. This turns out not to be true.
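For reference, my reading of that convention (this is my own illustration, not taken from the book) concerns the cell-local order of edges: after Mesh.order(), local edge i of a triangle is the edge opposite local vertex i. A minimal check:

from dolfin import *

m = UnitSquareMesh(2, 2)
m.init(1)   # build the edges
m.order()   # enforce the UFC ordering convention
for c in cells(m):
    vs = c.entities(0)                      # cell-local vertex indices
    for i, ei in enumerate(c.entities(1)):  # cell-local edge indices
        e = Edge(m, int(ei))
        assert vs[i] not in e.entities(0)   # edge i is opposite vertex i

Note that this only fixes the order of the edges within each cell; it says nothing about which global index each edge receives.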
Why do these turn out to be different, and how can I force the edges to have the same ordering?
Here is a code snippet that reproduces the problem; the printout shows that the local and global cell-edge connectivities differ.
from dolfin import *

comm = mpi_comm_world()
mpirank = MPI.rank(comm)
mpisize = MPI.size(comm)

ny = 5
mesh = UnitSquareMesh(ny-1, ny-1)
meshtopo = mesh.topology()

# Offsets separating owned entities from ghost entities
cell_ghost_offset = meshtopo.ghost_offset(2)
vertex_ghost_offset = meshtopo.ghost_offset(0)

# Serial mesh that will hold this processor's part of the global mesh
mesh_local = Mesh(mpi_comm_self())
mesh_editor = MeshEditor()
mesh_editor.open(mesh_local, 2, 2)  # topological and geometric dimension 2
mesh_editor.init_vertices(vertex_ghost_offset)
mesh_editor.init_cells(cell_ghost_offset)
print 'global mesh connectivity'
used_verts = []
for c in cells(mesh):
    cell_vs = []
    cell_fs = []
    for f in facets(c):
        cell_fs.append(f.index())
    print mpirank, c.index(), cell_fs
    for v in vertices(c):
        cell_vs.append(v.index())
        if v.index() not in used_verts:
            # Copy the vertex with the same local index and coordinates
            mesh_editor.add_vertex(v.index(), v.x(0), v.x(1))
            used_verts.append(v.index())
    # Copy the cell with the same local index and vertex ordering
    mesh_editor.add_cell(c.index(), cell_vs[0], cell_vs[1], cell_vs[2])
mesh_editor.close()
MPI.barrier(comm)
print 'local mesh connectivity'
for c in cells(mesh_local):
    cell_fs = []
    for f in facets(c):
        cell_fs.append(f.index())
    print mpirank, c.index(), cell_fs
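In case it is useful: the workaround I am considering (a minimal sketch under my own assumptions, not a confirmed approach) is to stop relying on matching edge numbers and instead build an explicit map between the two meshes' edge indices, keying each edge on its sorted pair of vertex indices. This works because the code above gives both meshes the same vertex numbering:

def edge_key(e):
    # An edge is identified by its (sorted) pair of vertex indices
    return tuple(sorted(e.entities(0)))

mesh.init(1)
mesh_local.init(1)

# Global-mesh edge index for each vertex pair
global_edge = dict((edge_key(e), e.index()) for e in edges(mesh))

# Local edge index -> corresponding edge index in the global mesh
local_to_global_edge = dict((e.index(), global_edge[edge_key(e)])
                            for e in edges(mesh_local))

This does not force the two numberings to agree, but it lets subdomain edge data be translated back to the processor-local part of the global mesh.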