
Parallel solve fails


Hi,

I have the following simple mixed DG FEM problem:

from dolfin import *

# Create mesh and define function space
degree = 0
delta_p = 2
N = 20
mesh = UnitSquareMesh(N,N)

u0 = Constant(0.0)

# Trial spaces
U  = FunctionSpace(mesh, "DG", degree)                          # solution
S  = VectorFunctionSpace(mesh, "DG", degree)                    # gradient
Uh = FunctionSpace(mesh, "CG", degree+1, restriction="facet")   # traces
Sh = FunctionSpace(mesh, "RT", degree+1, restriction="facet")   # fluxes
# Test spaces
V = FunctionSpace(mesh, "DG", degree+delta_p)
T = VectorFunctionSpace(mesh, "DG", degree+delta_p)
#
E = MixedFunctionSpace([U,S,Uh,Sh,V,T])
#

(u,sig,uhat,sighat,v,tau)       = TrialFunctions(E)
(du,dsig,duhat,dsighat,dv,dtau) = TestFunctions(E)
n = FacetNormal(mesh)

# bilinear forms
# inner product in the test space
def iprod(v1, tau1, v2, tau2):
    return (inner(grad(v1), grad(v2))*dx + inner(div(tau1), div(tau2))*dx
            + inner(v1, v2)*dx + inner(tau1, tau2)*dx)
# linear differential operator
def b(u, sigma, uhat, sighat, v, tau):
    return (inner(sigma, grad(v))*dx + inner(sigma, tau)*dx + inner(u, div(tau))*dx
            - inner(uhat('+'), dot(tau('+'), n('+')) + dot(tau('-'), n('-')))*dS
            - inner(uhat, dot(tau, n))*ds
            - inner(dot(sighat('+'), n('+')), jump(v))*dS
            - inner(dot(sighat, n), v)*ds)

a = iprod(v,tau,dv,dtau) + b(u,sig,uhat,sighat,dv,dtau) + b(du,dsig,duhat,dsighat,v,tau)

# linear functional
L = inner(Constant(1.0),dv)*dx

bcs = []
bcs.append(DirichletBC(E.sub(2), u0, DomainBoundary())) # boundary conditions on uhat

uSol = Function(E)
solve(a == L, uSol, bcs,
      solver_parameters={"linear_solver": "tfqmr", "preconditioner": "hypre_parasails"})
#solve(a==L, uSol, bcs)
(u,sigma,uhat,sighat,v,tau) = uSol.split()

u_file = File("u.pvd",'compressed')
u_file << u

It runs fine on one CPU. In parallel, however, it fails with a direct solver, and with an iterative solver it exits without error but produces a wrong (zero) solution. Am I doing something wrong, or is integration over interior facets not supported in parallel?

Here is what happens with a direct solver:

mpirun -np 2 python poisson_dpg.py
Number of global vertices: 441
Number of global cells: 800
Solving linear variational problem.
Solving linear variational problem.
Traceback (most recent call last):
  File "poisson_dpg.py", line 47, in <module>
    solve(a==L, uSol, bcs)
  File "/usr/lib/python2.7/dist-packages/dolfin/fem/solving.py", line 269, in solve
    _solve_varproblem(*args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/dolfin/fem/solving.py", line 298, in _solve_varproblem
    solver.solve()
RuntimeError:

*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
***     fenics@fenicsproject.org
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error:   Unable to successfully call PETSc function 'KSPSolve'.
*** Reason:  PETSc error code is: 76.
*** Where:   This error was encountered inside /build/buildd/dolfin-1.5.0+dfsg/dolfin/la/PETScLUSolver.cpp.
*** Process: unknown
***
*** DOLFIN version: 1.5.0
*** Git changeset:  unknown
*** -------------------------------------------------------------------------

(The second MPI process prints an identical traceback.)

--------------------------------------------------------------------------
mpirun noticed that the job aborted, but has no info as to the process
that caused that situation.
--------------------------------------------------------------------------
asked May 14, 2015 by ae FEniCS Novice (290 points)

1 Answer

Best answer

You need to set the parameter 'ghost_mode' to 'shared_facet', I think.
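
For example, a minimal sketch (the parameter must be set before the mesh is created, so that each process also stores the ghost cells needed to assemble the interior-facet dS integrals):

from dolfin import *

# Share ghost cells across process boundaries so that both sides of an
# interior facet are available on each process; set this before the
# mesh is constructed.
parameters["ghost_mode"] = "shared_facet"

mesh = UnitSquareMesh(20, 20)
# ... rest of the script unchanged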

answered May 16, 2015 by chris_richardson FEniCS Expert (31,740 points)
selected Apr 14, 2017 by ae

Great, this indeed allows me to solve the problem using a direct solver.
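
For reference, a parallel-capable direct solver such as MUMPS can be requested explicitly (assuming it is available in the local PETSc build):

solve(a == L, uSol, bcs,
      solver_parameters={"linear_solver": "mumps"})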

...