wiki:InstallingHypreAndOrMumpsWithPetscThree

This page documents how to install PETSc 3 with HYPRE (an algebraic multigrid preconditioner), or with both HYPRE and MUMPS (a direct solver for linear systems). It consolidates information from #756 and #1258.

Installing PETSc 3 with HYPRE

Download and unpack the PETSc tarball.
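
For example (a sketch only; the download URL below is the historical release-snapshots location, so alter it as appropriate):

wget http://ftp.mcs.anl.gov/pub/petsc/release-snapshots/petsc-3.0.0-p8.tar.gz   ## URL assumed; alter as appropriate
tar -xzf petsc-3.0.0-p8.tar.gz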

The debug build:

cd petsc-3.0.0-p8/     ## alter as appropriate
export PETSC_DIR=`pwd`
export PETSC_ARCH=linux-gnu
./config/configure.py --with-gnu-compilers=true --download-f-blas-lapack=1 \
                      --with-x=false --with-clanguage=cxx --download-hypre=yes \
                      --download-openmpi=yes --CXXFLAGS=-fPIC --CFLAGS=-fPIC \
                      --download-hdf5=yes
make all test
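
To check that HYPRE was actually built and enabled (a sketch, assuming the standard PETSc 3.0.x layout in which downloaded packages are installed under $PETSC_ARCH):

ls $PETSC_DIR/$PETSC_ARCH/lib | grep -i hypre                  ## expect libHYPRE* to be listed
grep HYPRE $PETSC_DIR/$PETSC_ARCH/include/petscconf.h          ## expect PETSC_HAVE_HYPRE to be defined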

The optimised build:

export PETSC_DIR=`pwd`
export PETSC_ARCH=linux-gnu-opt
./config/configure.py --with-gnu-compilers=true --download-f-blas-lapack=1 \
                      --with-x=false --with-clanguage=cxx --download-hypre=yes \
                      --download-openmpi=yes --CXXFLAGS=-fPIC --CFLAGS=-fPIC \
                      --with-debugging=0 --download-hdf5=ifneeded
make all test

The intel-production build (for userpc60.comlab; alter the configure arguments as appropriate):

export PETSC_DIR=`pwd`
export PETSC_ARCH=linux-intel-opt-mkl
./config/configure.py --with-cxx="icpc -gcc-version=410 -I /usr/include/c++/4.1.3/x86_64-linux-gnu/ -I/usr/include/c++/4.1.3/ -I/usr/include/c++/4.1.3/backward" \
                      --with-cc=icc --with-vendor-compiler=intel \
                      --with-blas-lapack-dir=/opt/intel/mkl/9.1.023/lib/em64t/ \
                      --with-fortran=1 --with-x=false --with-clanguage=cxx \
                      --download-hypre=yes --download-openmpi=yes \
                      --CXXFLAGS=-fPIC --CFLAGS=-fPIC \
                      --with-debugging=0 --download-hdf5=ifneeded
make all test

Linking

The Chaste host configuration file in use, found in <ChasteFolder>/python/hostconfig/, needs the following changes (a quick check that the new libraries can be found is sketched after the list):

  1. Add the PETSc 3 path, for example after the PETSc 2 path(s):
    petsc_3_0_path = '../../../petsc-3.0.0-p8/'
    
  2. Change the following
    blas_lapack = ['f2clapack', 'f2cblas']
    other_libraries = ['boost_serialization', 'xerces-c', 'hdf5', 'z', 'metis']
    
    to
    blas_lapack = ['flapack', 'fblas']
    other_libraries = ['boost_serialization', 'xerces-c', 'hdf5', 'z', 'metis', 'libgfortran', 'HYPRE']
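
A quick check that the two new libraries can actually be found (a sketch; the first path assumes the PETSc build above, and gcc's -print-file-name simply reports where it would pick up libgfortran):

ls $PETSC_DIR/$PETSC_ARCH/lib/libHYPRE*    ## HYPRE is installed into the PETSc arch lib directory
gcc -print-file-name=libgfortran.so        ## prints a full path if libgfortran is on gcc's search path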
    

Installing PETSc 3 with HYPRE and MUMPS

For each of the configure commands above, add

--with-mumps=yes --download-mumps=yes --with-parmetis=yes \
--download-parmetis=yes --download-scalapack=yes \
--with-blacs=yes --download-blacs=yes  

to the configure options.

So overall:

The debug build:

cd petsc-3.0.0-p8/     ## alter as appropriate
export PETSC_DIR=`pwd`
export PETSC_ARCH=linux-gnu
./config/configure.py --with-gnu-compilers=true --download-f-blas-lapack=1 \
                      --with-x=false --with-clanguage=cxx --download-hypre=yes \
                      --download-openmpi=yes --CXXFLAGS=-fPIC --CFLAGS=-fPIC \
                      --download-hdf5=yes \
                      --with-mumps=yes --download-mumps=yes --with-parmetis=yes \
                      --download-parmetis=yes --download-scalapack=yes \
                      --with-blacs=yes --download-blacs=yes  
make all test

The optimised build:

export PETSC_DIR=`pwd`
export PETSC_ARCH=linux-gnu-opt
./config/configure.py --with-gnu-compilers=true --download-f-blas-lapack=1 \
                      --with-x=false --with-clanguage=cxx --download-hypre=yes \
                      --download-openmpi=yes --CXXFLAGS=-fPIC --CFLAGS=-fPIC \
                      --with-debugging=0 --download-hdf5=ifneeded \
                      --with-mumps=yes --download-mumps=yes --with-parmetis=yes \
                      --download-parmetis=yes --download-scalapack=yes \
                      --with-blacs=yes --download-blacs=yes  
make all test

The intel-production build (for userpc60.comlab; alter the configure arguments as appropriate):

export PETSC_DIR=`pwd`
export PETSC_ARCH=linux-intel-opt-mkl
./config/configure.py --with-cxx="icpc -gcc-version=410 -I /usr/include/c++/4.1.3/x86_64-linux-gnu/ -I/usr/include/c++/4.1.3/ -I/usr/include/c++/4.1.3/backward" \
                      --with-cc=icc --with-vendor-compiler=intel \
                      --with-blas-lapack-dir=/opt/intel/mkl/9.1.023/lib/em64t/ \
                      --with-fortran=1 --with-x=false --with-clanguage=cxx \
                      --download-hypre=yes --download-openmpi=yes \
                      --CXXFLAGS=-fPIC --CFLAGS=-fPIC \
                      --with-debugging=0 --download-hdf5=ifneeded \
                      --with-mumps=yes --download-mumps=yes --with-parmetis=yes \
                      --download-parmetis=yes --download-scalapack=yes \
                      --with-blacs=yes --download-blacs=yes  
make all test
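
As a sanity check that the MUMPS-related packages were built (a sketch, assuming that, as with HYPRE, the downloaded packages are installed into the PETSc arch lib directory):

ls $PETSC_DIR/$PETSC_ARCH/lib | grep -E -i 'mumps|scalapack|blacs|parmetis'   ## expect libdmumps, libmumps_common, libpord, libscalapack, libblacs and libparmetis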

Linking

As well as the petsc_3_0_path and blas_lapack changes described above, change other_libraries to

other_libraries = ['boost_serialization', 'xerces-c', 'hdf5', 'z',
                   'libgfortran', 'HYPRE', 
                   'dmumps', 'mumps_common', 'scalapack', 'blacs', 'pord', 'mpi_f77',
                   'parmetis', 'metis'
                  ]

Notes:
  * libgfortran may or may not be needed for MUMPS.
  * The third line of other_libraries lists the MUMPS dependencies. mpi_f77 was required on userpc60 to prevent linking failures: MUMPS expects the Fortran MPI name bindings (with a trailing underscore), but mpicxx does not pull in the Fortran MPI library.
  * parmetis (and probably metis) have to be at the end.
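
The mpi_f77 point can be verified directly if linking problems recur (a sketch, assuming the OpenMPI built by PETSc is installed under $PETSC_ARCH/lib; exact symbol decorations vary with the Fortran compiler):

nm $PETSC_DIR/$PETSC_ARCH/lib/libdmumps.a | grep ' U mpi_' | head     ## MUMPS references underscored Fortran MPI symbols, e.g. mpi_send_
nm -D $PETSC_DIR/$PETSC_ARCH/lib/libmpi_f77.so | grep -i 'mpi_send'   ## libmpi_f77 provides these bindings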
