Dear all,
I'm trying to set function values in parallel using the following code, proposed as a solution to this question (1):
from dolfin import *
import numpy
from mpi4py import MPI as nMPI
comm = nMPI.COMM_WORLD
mesh = UnitSquareMesh(10, 10)
V = FunctionSpace(mesh, "CG", 1)
u = Function(V)
v = u.vector()
dofmap = V.dofmap()
rank = comm.Get_rank()
ranks = []
visited = []
node_min, node_max = v.local_range()
for cell in cells(mesh):
    nodes = dofmap.cell_dofs(cell.index())
    for node in nodes:
        if (node not in visited) and node_min <= node <= node_max:
            visited.append(node)
            ranks.append(float(rank))
ranks = numpy.array(ranks)
v.set_local(ranks)
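Since the error below complains about a size mismatch, a quick per-rank check like the following (reusing the names from the code above, placed just before the v.set_local call) should show how the length of ranks compares to the vector's local size:

# Diagnostic: set_local() requires len(ranks) to equal the local
# vector size on every rank; print both to see where they diverge.
print("rank %d: len(ranks) = %d, v.local_size() = %d"
      % (rank, len(ranks), v.local_size()))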
The above code works fine on a single process, but in contrast to the discussion presented in (1), when I run it in parallel the following error message appears:
*** Error:  Unable to set local values of PETSc vector.
*** Reason: Size of values array is not equal to local vector size.
*** Where:  This error was encountered inside PETScVector.cpp.
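For what it's worth, I would expect a variant that sizes the array directly from the vector, rather than counting dofs in the cell loop, to avoid the size mismatch. This is only a sketch for the case of a constant value per process, and I have not verified it on 1.6:

# Sketch of a workaround: take the array size from the vector itself,
# so it matches the local vector size by construction.
ranks = float(rank)*numpy.ones(v.local_size())
v.set_local(ranks)
v.apply("insert")  # finalize the parallel assignment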
I'm using FEniCS 1.6. Any ideas about this?
Thanks in advance!