
Preconditioner speed: Running hypre_amg and ilu in parallel


I am trying to solve a 2D dynamic plasticity problem with Drucker-Prager and von Mises models. Initially I used the 'hypre_amg' preconditioner with the von Mises model. After switching to the Drucker-Prager model, 'hypre_amg' has become quite slow: with the von Mises model it runs about twice as fast as with the Drucker-Prager model. The 'ilu' preconditioner is about twice as fast as 'hypre_amg', but it gives an error when run in parallel with MPI. Can someone suggest what can be done to improve the efficiency?

The following are the relevant parameters for the nonlinear solver:

  dolfin::NewtonSolver nonlinear_solver;
  nonlinear_solver.parameters["convergence_criterion"] = "residual";
  nonlinear_solver.parameters["linear_solver"]         = "gmres";
  nonlinear_solver.parameters["preconditioner"]        = "hypre_amg";

I run the program with mpirun -np 4 demo_name. This works with 'hypre_amg' but not with 'ilu'; the error is noted in the comment below.
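
In the meantime, one workaround I am considering (untested, continuing from the snippet above) is to select the preconditioner from the number of MPI processes, since the error below says PETSc's built-in ILU does not support the parallel mpiaij matrix format:

  // Untested sketch: keep 'hypre_amg' in parallel and use 'ilu' only when
  // running on a single process (PETSc's ILU has no parallel mpiaij version).
  const std::size_t num_processes = dolfin::MPI::size(MPI_COMM_WORLD);
  if (num_processes > 1)
    nonlinear_solver.parameters["preconditioner"] = "hypre_amg";
  else
    nonlinear_solver.parameters["preconditioner"] = "ilu";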

asked Jul 27, 2016 by Chaitanya_Raj_Goyal FEniCS User (4,150 points)
Time: 0
Time: 0
Time: 0
Time: 0
Time: 1e-09
Time: 1e-09
Time: 1e-09
Time: 1e-09
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: No support for this operation for this object type!
[0]PETSC ERROR: Matrix format mpiaij does not have a built-in PETSc ILU!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Unknown Name on a linux-gnu-c-opt named Jarvis by crg Tue Jul 26 18:30:45 2016
[0]PETSC ERROR: Libraries linked from /build/buildd/petsc-3.4.2.dfsg1/linux-gnu-c-opt/lib
[0]PETSC ERROR: Configure run at Tue Dec 17 23:10:14 2013
[0]PETSC ERROR: Configure options --with-shared-libraries --with-debugging=0 --useThreads 0 --with-clanguage=C++ --with-c-support --with-fortran-interfaces=1 --with-mpi-dir=/usr/lib/openmpi --with-mpi-shared=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-blacs=1 --with-blacs-include=/usr/include --with-blacs-lib="[/usr/lib/libblacsCinit-openmpi.so,/usr/lib/libblacs-openmpi.so]" --with-scalapack=1 --with-scalapack-include=/usr/include --with-scalapack-lib=/usr/lib/libscalapack-openmpi.so --with-mumps=1 --with-mumps-include=/usr/include --with-mumps-lib="[/usr/lib/libdmumps.so,/usr/lib/libzmumps.so,/usr/lib/libsmumps.so,/usr/lib/libcmumps.so,/usr/lib/libmumps_common.so,/usr/lib/libpord.so]" --with-umfpack=1 --with-umfpack-include=/usr/include/suitesparse --with-umfpack-lib="[/usr/lib/libumfpack.so,/usr/lib/libamd.so]" --with-cholmod=1 --with-cholmod-include=/usr/include/suitesparse --with-cholmod-lib=/usr/lib/libcholmod.so --with-spooles=1 --with-spooles-include=/usr/include/spooles --with-spooles-lib=/usr/lib/libspooles.so --with-hypre=1 --with-hypre-dir=/usr --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch --with-ptscotch-lib="[/usr/lib/libptesmumps.so,/usr/lib/libptscotch.so,/usr/lib/libptscotcherr.so]" --with-fftw=1 --with-fftw-include=/usr/include --with-fftw-lib="[/usr/lib/x86_64-linux-gnu/libfftw3.so,/usr/lib/x86_64-linux-gnu/libfftw3_mpi.so]" --CXX_LINKER_FLAGS=-Wl,--no-as-needed
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: MatGetFactor() line 3966 in src/mat/interface/matrix.c
[0]PETSC ERROR: PCSetUp_ILU() line 202 in src/ksp/pc/impls/factor/ilu/ilu.c
[0]PETSC ERROR: PCSetUp() line 890 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: KSPSolve() line 399 in src/ksp/ksp/interface/itfunc.c
terminate called after throwing an instance of 'std::runtime_error'
  what():  

*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
***     fenics@fenicsproject.org
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error:   Unable to successfully call PETSc function 'KSPSolve'.
*** Reason:  PETSc error code is: 56.
*** Where:   This error was encountered inside /build/dolfin-k_QrtL/dolfin-1.6.0/dolfin/la/PETScKrylovSolver.cpp.
*** Process: 0

User Tianyikillua suggested:

"bjacobi" should be an extension of "ilu" for the parallel case. Try the "bjacobi" preconditioner (it may no longer be available simply by invoking "bjacobi", see https://bitbucket.org/fenics-project/dolfin/pull-requests/236/reentry-of-jacobi-and-bjacobi-as-petsc/diff, but you can set the equivalent options in other ways, e.g. via PETScOptions).

  • I am looking into this suggestion. As he pointed out, 'bjacobi' indeed no longer exists as a named preconditioner; the error is below, and after it I sketch what I intend to try with PETScOptions:

*** Error: Unable to solve linear system using Krylov iteration.
*** Reason: Unknown preconditioner "bjacobi". Use list_krylov_solver_preconditioners() to list available preconditioners().
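
Following the suggestion above, what I plan to try next (untested, and I am assuming the Krylov solver inside NewtonSolver reads the global PETSc options database, possibly requiring the DOLFIN preconditioner to be left at "default" so it does not override them) is to request block Jacobi with ILU on each local block directly through PETScOptions:

  // Untested sketch: block Jacobi across processes with ILU on each local
  // (sequential) block, set through the global PETSc options database instead
  // of DOLFIN's named preconditioners. std::string() makes the value type
  // explicit. Assumes the solver actually picks these options up.
  dolfin::PETScOptions::set("ksp_type", std::string("gmres"));
  dolfin::PETScOptions::set("pc_type", std::string("bjacobi"));
  dolfin::PETScOptions::set("sub_pc_type", std::string("ilu"));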

...