Dear all,
I am moving my simple linear elastic program to parallel and, eventually, distributed environments. Right now I would just like to get threads and MPI working, recreating Figure 10.8 of the book (page 216), where I want to create a field based on the thread number or MPI rank.
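To make it concrete, this is roughly the field I am trying to produce (my own minimal sketch; I am assuming that MPI.rank(mpi_comm_world()) and a DG0 function are the right tools in 1.4):

    from dolfin import *

    mesh = UnitSquareMesh(32, 32)
    rank = MPI.rank(mpi_comm_world())  # rank of this process

    # Piecewise-constant field, one value per cell, set to the owning rank
    V = FunctionSpace(mesh, "DG", 0)
    u = Function(V)
    u.vector()[:] = rank

    File("rank.pvd") << u  # colour the mesh by MPI rank in ParaView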
First problem: I don't know how to retrieve the thread number. All I did was set parameters["num_threads"] = 6, but I cannot tell whether it is actually working and spawning threads.
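For completeness, this is how I set it; the only check I could think of is reading the parameter back and timing the assembly (a rough sketch, assuming the parameter works the way I understand it):

    from dolfin import *

    parameters["num_threads"] = 6      # request multi-threaded assembly
    print parameters["num_threads"]    # confirms the value is stored, not that threads actually run

    mesh = UnitCubeMesh(24, 24, 24)
    V = VectorFunctionSpace(mesh, "CG", 1)
    u, v = TrialFunction(V), TestFunction(V)

    timer = Timer("assembly")
    A = assemble(inner(grad(u), grad(v))*dx)
    timer.stop()
    list_timings()                     # compare wall-clock time against a run with num_threads = 1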
Second problem: MPI won't work on my system (Mac OS X 10.9.4, FEniCS binary application, not built from source). When I set parameters["mesh_partitioner"] = "SCOTCH" and call solve(A, u.vector(), b, 'petsc') under mpirun, I would have expected an error message at worst, but not a crash, as you can see below.
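The relevant part of linelastic.py looks roughly like this (trimmed; a, L, bc and V are the usual linear-elasticity bilinear form, right-hand side, boundary condition and function space):

    parameters["mesh_partitioner"] = "SCOTCH"

    A = assemble(a)
    b = assemble(L)
    bc.apply(A, b)

    u = Function(V)
    print "I am rank", MPI.process_number()  # triggers the deprecation warnings below
    solve(A, u.vector(), b, 'petsc')         # line 134, where the run aborts under mpirun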
If I could get MPI and threads to work I would be very happy! The book isn't much help to me here (I only found Section 10.4.1, "Parallel computing"), so any hint would be really appreciated!
Thanks!
$ mpirun -np 4 python linelastic.py
Number of global vertices: 1028
Number of global cells: 1929
*** -------------------------------------------------------------------------
*** Warning: Parameters *_domains of assemble has been deprecated in DOLFIN version 1.4.0.
*** It will be removed from version 1.6.0.
*** Parameter *_domains of assemble will be removed. Include this in the ufl form instead.
*** -------------------------------------------------------------------------
I am rank *** -------------------------------------------------------------------------
*** Warning: MPI::process_number has been deprecated in DOLFIN version 1.4.
*** It will be removed from version 1.5.
*** MPI::process_number() has been replaced by MPI::rank(MPI_Comm).
I am rank *** -------------------------------------------------------------------------
*** Warning: MPI::process_number has been deprecated in DOLFIN version 1.4.
*** It will be removed from version 1.5.
*** MPI::process_number() has been replaced by MPI::rank(MPI_Comm).
I am rank *** -------------------------------------------------------------------------
I am rank *** -------------------------------------------------------------------------
*** Warning: MPI::process_number has been deprecated in DOLFIN version 1.4.
*** It will be removed from version 1.5.
*** MPI::process_number() has been replaced by MPI::rank(MPI_Comm).
*** -------------------------------------------------------------------------
*** -------------------------------------------------------------------------
0
2
*** -------------------------------------------------------------------------
3
*** Warning: MPI::process_number has been deprecated in DOLFIN version 1.4.
*** It will be removed from version 1.5.
*** MPI::process_number() has been replaced by MPI::rank(MPI_Comm).
*** -------------------------------------------------------------------------
1
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: No support for this operation for this object type!
[0]PETSC ERROR: Matrix format mpiaij does not have a built-in PETSc LU!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Unknown Name on a darwin13. named Senseis-MacBook-Pro.local by sensei Fri Jul 25 17:01:28 2014
[0]PETSC ERROR: Libraries linked from /Users/johannr/fenics-1.4.0/local/lib
[0]PETSC ERROR: Configure run at Tue Jun 3 13:32:33 2014
[0]PETSC ERROR: Configure options --prefix=/Users/johannr/fenics-1.4.0/local COPTFLAGS=-O2 --with-debugging=0 --with-clanguage=cxx --with-c-support=1 --with-blas-lapack-dir=/usr --with-umfpack=1 --with-umfpack-include=/Users/johannr/fenics-1.4.0/local/include/suitesparse --with-umfpack-lib="[/Users/johannr/fenics-1.4.0/local/lib/libumfpack.a,/Users/johannr/fenics-1.4.0/local/lib/libamd.a]" --with-spooles=1 --with-spooles-include=/Users/johannr/fenics-1.4.0/local/include --with-spooles-lib=/Users/johannr/fenics-1.4.0/local/lib/libspooles.a --with-ptscotch=1 --with-ptscotch-dir=/Users/johannr/fenics-1.4.0/local --with-ml=1 --with-ml-include=/Users/johannr/fenics-1.4.0/local/include/trilinos --with-ml-lib=/Users/johannr/fenics-1.4.0/local/lib/libml.dylib --with-hdf5=1 --with-hdf5-dir=/Users/johannr/fenics-1.4.0/local --with-x=0 -with-x11=0 --with-fortran=0 --with-shared-libraries=1 PETSC_DIR=/Users/johannr/fenics-1.4.0/fenics-superbuild/build-fenics/CMakeExternals/src/PETSc PETSC_ARCH=darwin13.2.0-cxx-opt
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: MatGetFactor() line 3876 in /Users/johannr/fenics-1.4.0/fenics-superbuild/build-fenics/CMakeExternals/src/PETSc/src/mat/interface/matrix.c
[0]PETSC ERROR: PCSetUp_LU() line 133 in /Users/johannr/fenics-1.4.0/fenics-superbuild/build-fenics/CMakeExternals/src/PETSc/src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: PCSetUp() line 832 in /Users/johannr/fenics-1.4.0/fenics-superbuild/build-fenics/CMakeExternals/src/PETSc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSetUp() line 278 in /Users/johannr/fenics-1.4.0/fenics-superbuild/build-fenics/CMakeExternals/src/PETSc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: KSPSolve() line 402 in /Users/johannr/fenics-1.4.0/fenics-superbuild/build-fenics/CMakeExternals/src/PETSc/src/ksp/ksp/interface/itfunc.c
Traceback (most recent call last):
File "linelastic.py", line 134, in <module>
solve(A, u.vector(), b, 'petsc')
File "/Applications/FEniCS.app/Contents/Resources/lib/python2.7/site-packages/dolfin/fem/solving.py", line 278, in solve
return cpp.la_solve(*args)
File "/Applications/FEniCS.app/Contents/Resources/lib/python2.7/site-packages/dolfin/cpp/la.py", line 4482, in la_solve
return _la.la_solve(*args)
RuntimeError:
*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
*** fenics@fenicsproject.org
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error: Unable to successfully call PETSc function 'KSPSolve'.
*** Reason: PETSc error code is: 56.
*** Where: This error was encountered inside /Users/johannr/fenics-1.4.0/fenics-superbuild/build-fenics/CMakeExternals/src/DOLFIN/dolfin/la/PETScLUSolver.cpp.
*** Process: unknown
***
*** DOLFIN version: 1.4.0
*** Git changeset: 3b6582dfb45139c906c13b9ad57395632a2090f4
*** -------------------------------------------------------------------------
>>>> A LONG DUMP...
--------------------------------------------------------------------------
mpirun noticed that process rank 2 with PID 59263 on node Senseis-MacBook-Pro.local exited on signal 6 (Abort trap: 6).
--------------------------------------------------------------------------