wiki:InstallPetscAndMpiForProductionBuild

PETSc installation for (normal and) production build

April 2017 Instructions

First get an up-to-date copy of the Intel C/C++ compiler, as well as the Intel Fortran compiler (ifort), which is needed to build HYPRE.

Changes from previous instructions:

  • --with-clanguage=cxx does not seem to be necessary, and the PETSc installation guide recommends against it
  • It is possible to specify which versions of dependencies to install, which is necessary for MPICH and PETSc 3.6.2 as the default MPICH does not seem to play well with the Intel MKL.
# Needed for the Intel compiler
source /opt/intel/bin/compilervars.sh intel64
export INTEL_LICENSE_FILE="28519@flexlm.nsms.ox.ac.uk"
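
Before running configure, it can save time to check that the Intel toolchain is actually visible. A small sketch (the tool names are the ones used in the configure lines below):

```shell
# Report which of the required compilers are visible on the PATH.
# icc/icpc/ifort are the Intel compilers used by the configure runs below.
for tool in icc icpc ifort; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "missing: $tool"
  fi
done
```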

wget http://ftp.mcs.anl.gov/pub/petsc/release-snapshots/petsc-lite-3.6.2.tar.gz
tar -zxf petsc-lite-3.6.2.tar.gz

MPICH=http://www.mpich.org/static/downloads/3.2/mpich-3.2.tar.gz

cd ~/petsc-3.6.2/
export PETSC_DIR=`pwd`

PETSc with support for GCC builds

export PETSC_ARCH=linux-gnu
./configure --with-make-np=10 --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-f2cblaslapack=1 --download-mpich=$MPICH --download-hdf5=1 --download-parmetis=1 --download-metis=1 --download-hypre=1 --with-shared-libraries
make all test

export PETSC_ARCH=linux-gnu-opt
./configure --with-make-np=10 --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-f2cblaslapack=1 --download-mpich=$MPICH --download-hdf5=1 --download-parmetis=1 --download-metis=1 --download-hypre=1 --with-shared-libraries --with-debugging=0
make all test

export PETSC_ARCH=linux-gnu-profile
./configure --with-make-np=10 --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-f2cblaslapack=1 --download-mpich=$MPICH --download-hdf5=1 --download-parmetis=1 --download-metis=1 --download-hypre=1 --with-shared-libraries --CFLAGS="-fno-omit-frame-pointer -pg" -CXXFLAGS="-fno-omit-frame-pointer -pg" -LDFLAGS=-pg
make all test

PETSc with support for Intel build

export PETSC_ARCH=linux-intel
./configure --with-make-np=10 --with-cc=icc --with-cxx=icpc --with-fc=ifort --download-f2cblaslapack=1 --download-mpich=$MPICH --download-hdf5=1 --download-parmetis=1 --download-metis=1 --download-hypre=1 --with-shared-libraries
make all test

PETSc with support for IntelProduction build

export PETSC_ARCH=linux-intel-opt-mkl
./configure --with-make-np=10 --with-cc=icc --with-cxx=icpc --with-fc=ifort --download-mpich=$MPICH --download-hdf5=1 --download-parmetis=1 --download-metis=1 --download-hypre=1 --with-shared-libraries --with-debugging=0 --with-blas-lapack-dir=/opt/intel/composerxe/mkl/lib/intel64/
make all test
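
Once the builds above finish, an application selects one of them through the PETSC_DIR/PETSC_ARCH pair. A minimal sketch (the arch names are the ones exported before each configure run above):

```shell
# Point at the PETSc tree and pick one of the arches built above.
export PETSC_DIR=$HOME/petsc-3.6.2
export PETSC_ARCH=linux-intel-opt-mkl  # or linux-gnu, linux-gnu-opt, linux-gnu-profile, linux-intel
# The compiled libraries for that arch live under:
echo "$PETSC_DIR/$PETSC_ARCH/lib"
```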

Older Instructions

First get an up to date copy of the Intel compiler.

PETSc now downloads a decent version of MPI during installation, so there is no need to do that separately.

wget http://ftp.mcs.anl.gov/pub/petsc/release-snapshots/petsc-3.1-p8.tar.gz
tar -zxf petsc-3.1-p8.tar.gz
cd petsc-3.1-p8
export PETSC_DIR=`pwd`

PETSc with support for GCC builds

export PETSC_ARCH=linux-gnu
./config/configure.py --with-x=false --with-clanguage=cxx --download-openmpi=yes --download-hdf5=yes --download-parmetis=yes --download-c-blas-lapack=1 --with-shared --with-fortran=0
make all

export PETSC_ARCH=linux-gnu-opt
./config/configure.py --with-x=false --with-clanguage=cxx --download-openmpi=yes --download-hdf5=yes --download-parmetis=yes --download-c-blas-lapack=1 --with-shared --with-fortran=0 --with-debugging=0
make all

export PETSC_ARCH=linux-gnu-profile 
./config/configure.py --with-x=false --with-clanguage=cxx --download-openmpi=yes --download-hdf5=yes --download-parmetis=yes --download-c-blas-lapack=1 --with-shared --with-fortran=0  --CXXFLAGS=-pg --CFLAGS=-pg -LDFLAGS=-pg
make all

PETSc with support for Intel build

export PETSC_ARCH=linux-intel
./config/configure.py --with-cxx=icpc --with-cc=icc --with-vendor-compiler=intel --with-x=false --with-clanguage=cxx --download-openmpi=yes --CXXFLAGS=-fPIC --CFLAGS=-fPIC --download-hdf5=yes --download-parmetis=yes --with-shared --with-fortran=0 --download-c-blas-lapack=1
make all

PETSc with support for IntelProduction build

export PETSC_ARCH=linux-intel-opt-mkl
./config/configure.py --with-cxx=icpc --with-cc=icc --with-vendor-compiler=intel --with-x=false --with-clanguage=cxx --download-openmpi=yes --CXXFLAGS=-fPIC --CFLAGS=-fPIC --download-hdf5=yes --download-parmetis=yes --with-shared --with-fortran=0  --with-debugging=0 --with-blas-lapack-dir=/opt/intel/composerxe/mkl/lib/intel64 --with-static-mpi=1
make all

That is the end of the new instructions.

Experimental support for HYPRE library in Petsc 2.3

We tell PETSc to download and install HYPRE for us:

./config/configure.py --with-cxx=icpc --with-cc=icc --with-vendor-compiler=intel --with-mpi-dir=${HOME}/mpi --with-x=false  --with-debugging=0 -PETSC_ARCH=linux-intel-opt-mkl --with-clanguage=cxx --with-blas-lapack-dir=/opt/intel/mkl/9.1.023/lib/em64t/ --with-fortran=0 --download-hypre=yes

Note: adjust --with-blas-lapack-dir to match your system configuration, and -PETSC_ARCH to your build name.

There's a compatibility issue between PETSc, icc and HYPRE. See the STRICMP section of the following wiki: http://www.csafe.utah.edu/wiki/index.php/Information/Thirdparty/PETSc.

On some machines with some compilers (e.g. ICC), PETSc configure believes that stricmp exists, and thus compiles against it, but when we try to link with the PETSc libraries, we get an undefined symbol error.

This can be fixed by running configure, then manually editing bmake/linux-intel-opt-mkl/petscconf.h and commenting out (using C-style /* */ comments) the line #define PETSC_HAVE_STRICMP 1.
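
The manual edit can also be scripted with sed. A sketch, demonstrated on a scratch copy so it can be tried safely; run the same sed on bmake/linux-intel-opt-mkl/petscconf.h in the real tree and verify the result by eye:

```shell
# Wrap the offending define in a C-style comment so PETSc stops
# assuming stricmp exists. Shown here on a throwaway temp file.
f=$(mktemp)
echo '#define PETSC_HAVE_STRICMP 1' > "$f"
sed -i 's|^#define PETSC_HAVE_STRICMP 1|/* #define PETSC_HAVE_STRICMP 1 */|' "$f"
cat "$f"   # -> /* #define PETSC_HAVE_STRICMP 1 */
```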

Although PETSc's configuration script downloads and compiles HYPRE, the generated object code expects BLAS symbols to end with two underscores. Unfortunately, MKL exports them with either one underscore or none. We need to rebuild HYPRE to match this configuration.

Edit the file externalpackages/hypre-1.11.1b/src/utilities/fortran.h and look for the following block of text:

#elif defined(HYPRE_LINUX)

#  define hypre_NAME_C_CALLING_FORT(name,NAME) name##__
#  define hypre_NAME_FORT_CALLING_C(name,NAME) name##__

The strings name##__ should be replaced by name##_ (a single underscore).
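
This substitution can be done with sed. A sketch, demonstrated on a scratch copy; run the same sed on externalpackages/hypre-1.11.1b/src/utilities/fortran.h in the real tree:

```shell
# Trim the double trailing underscore in the HYPRE_LINUX name-mangling
# macros to a single underscore. Shown here on a throwaway temp file.
f=$(mktemp)
printf '#  define hypre_NAME_C_CALLING_FORT(name,NAME) name##__\n' > "$f"
sed -i 's/name##__/name##_/g' "$f"
cat "$f"   # -> #  define hypre_NAME_C_CALLING_FORT(name,NAME) name##_
```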

HYPRE now needs to be rebuilt. This happens during PETSc configuration, so we remove the previous build directory to force PETSc to recompile it, even though it has just done so:

rm -rf externalpackages/hypre-1.11.1b/linux-intel-opt-mkl/
./config/configure.py --with-cxx=icpc --with-cc=icc --with-vendor-compiler=intel --with-mpi-dir=${HOME}/mpi --with-x=false  --with-debugging=0 -PETSC_ARCH=linux-intel-opt-mkl --with-clanguage=cxx --with-blas-lapack-dir=/opt/intel/mkl/9.1.023/lib/em64t/ --with-fortran=0 --download-hypre=yes

Finally, set the PETSC_ARCH variable and compile PETSc:

PETSC_ARCH=linux-intel-opt-mkl; export PETSC_ARCH
make all test

OLD INSTRUCTIONS

Note that the production build uses

  • Intel icpc as the compiler
  • Intel Math Kernel Libraries (MKL) for BLAS and LAPACK (see InstallMkl)
  • PETSc compiled with icc and MKL
  • (Possibly) MPI compiled with icc. If not, the instructions below will fool the PETSc configuration into running mpicc -cc=icc.

Optional

Download MPICH from ftp://ftp.mcs.anl.gov/pub/mpi/mpich.tar.gz to $HOME

cd 
tar -xzvf mpich.tar.gz
cd mpich-1.2.7p1
#The following line shows how you built the regular MPI (for comparison)
#./configure --prefix=${HOME}/mpi --with-comm=shared --with-device=ch_shmem --enable-sharedlib --disable-f77
./configure --prefix=${HOME}/mpi-intel -cc=icc -c++=icpc --with-comm=shared --with-device=ch_shmem --enable-sharedlib --disable-f77
make
cd examples/test/
make testing
cd ../..
make install
cd
rm -rf mpich-1.2.7p1/  mpich.tar.gz 

(Optional, since you probably have the install in place)

Download Petsc from ftp://ftp.mcs.anl.gov/pub/petsc/release-snapshots/petsc-2.3.2-p4.tar.gz to $HOME

tar -xzvf petsc-2.3.2-p4.tar.gz 
rm petsc-2.3.2-p4.tar.gz 

Start here if you're using the existing MPI and PETSc

cd petsc-2.3.2-p4/
export PETSC_DIR=`pwd`
#(Optional) If you rebuilt a new MPI use the following (and make sure that your PATH is set correctly in .bashrc to point to the new MPI installation)
#./config/configure.py --with-vendor-compiler=intel --with-mpi-dir=${HOME}/mpi-intel --with-x=false  --with-debugging=0 -PETSC_ARCH=linux-intel-opt-mkl --with-clanguage=cxx --with-blas-lapack-dir=/opt/intel/mkl/9.1.023/lib/32/ --with-fortran=0
./config/configure.py --with-cxx=icpc --with-cc=icc --with-vendor-compiler=intel --with-mpi-dir=${HOME}/mpi --with-x=false  --with-debugging=0 -PETSC_ARCH=linux-intel-opt-mkl --with-clanguage=cxx --with-blas-lapack-dir=/opt/intel/mkl/9.1.023/lib/32/ --with-fortran=0
make all

#On 64-bit
./config/configure.py --with-cxx=icpc --with-cc=icc --with-vendor-compiler=intel --with-mpi-dir=${HOME}/mpi --with-x=false  --with-debugging=0 -PETSC_ARCH=linux-intel-opt-mkl --with-clanguage=cxx --with-blas-lapack-dir=/opt/intel/mkl/9.1.023/lib/em64t/ --with-fortran=0
#Read the STRICMP note in the HYPRE section above!
make all test

As above, if you have an Intel build of MPI, add the correct MPI programs to your path by adding the following to ~/.bashrc:

# Add MPI programs to path
export PATH=$HOME/mpi-intel/bin:$PATH
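
A quick way to confirm the ordering took effect in a new shell (a sketch; mpi-intel is the install prefix used in the MPICH configure above):

```shell
# The first PATH entry should now be the Intel MPI bin directory,
# so its mpicc shadows any system copy.
export PATH="$HOME/mpi-intel/bin:$PATH"
echo "${PATH%%:*}"
```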
Last modified on Apr 3, 2017, 10:53:10 AM