Installing PEITS
For Ubuntu 16.x the guide is the same, except use Zoltan 3.83 and the PEITS branch upgrade_zoltan.
1. Install required libraries.
sudo apt-get update
sudo apt-get install git gcc build-essential clang make autotools-dev autoconf automake libtool cmake scons pkg-config
sudo apt-get install libblas-dev liblapack-dev gfortran libboost-dev mpi-default-bin mpi-default-dev libgmp-dev libopenmpi-dev
2. Create a new folder, e.g. PEITS_root (Parallel EIT Solver), where the modules will be installed.
mkdir PEITS_root
cd PEITS_root
3. Download the PETSc library:
git clone -b maint https://bitbucket.org/petsc/petsc petsc
This will create a petsc subfolder in the PEITS_root directory.
4. Configure the PETSc library from within the petsc folder. Installation has been tested with an older version of PETSc (commit 8695de0; the nearest release is v3.6.3), as newer versions cause an error during configuration:
cd petsc
git checkout 8695de0
./configure --prefix=/home/username/PEITS_root/petscBUILD --with-x=0 --with-debugging=0 -CFLAGS="-O3 -DNDEBUG -ffast-math" --with-parmetis=1 --download-parmetis=yes --with-hypre=1 --download-hypre=yes --with-superlu_dist=1 --download-superlu_dist=yes --with-mumps=1 --download-mumps=yes --with-ml=1 --download-ml=yes --with-metis=1 --download-metis=yes --download-scalapack=yes --download-blacs=yes
Important: change the --prefix=/home/username/PEITS_root/petscBUILD option to reflect the actual location of the PEITS installation, e.g. --prefix=/home/tom/PEITS_root/petscBUILD. The petscBUILD folder will be created during the configuration process.
5. Build the PETSc source code.
make all test
make install
6. Download the Zoltan library (v3.8 required) and extract it to the PEITS_root folder.
cd ..
wget http://www.cs.sandia.gov/~kddevin/Zoltan_Distributions/zoltan_distrib_v3.8.tar.gz
tar xf zoltan_distrib_v3.8.tar.gz --warning=no-unknown-keyword
There may be some warning messages saying 'Ignoring unknown extended header keyword'; these can be ignored. The Zoltan library can also be downloaded from http://www.cs.sandia.gov/~web1400/1400_download.html.
7. Configure and build the Zoltan library. This must be run from a build subdirectory:
mkdir Zoltan_v3.8/BUILD_DIR
cd Zoltan_v3.8/BUILD_DIR
../configure --prefix=/home/username/PEITS_root/Zoltan_v3.8/BUILD_DIR --with-parmetis --with-parmetis-incdir="/home/username/PEITS_root/petscBUILD/include" --with-parmetis-libdir="/home/username/PEITS_root/petscBUILD/lib"
make everything
make install
As before, replace /home/username/PEITS_root with the PEITS path.
8. Download Parallel EIT Solver (PEITS) code into PEITS_root folder:
cd ../..
git clone https://github.com/EIT-team/PEITS
9. Edit the PEITS/config.opts_example file: change PETSCPATH and ZOLTANPATH to the appropriate directories by replacing /home/username/PEITS_root/ with the directory where PEITS_root is located, then save the edited file as config.opts.
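Since both variables only differ from the example file in their path prefix, the edit can be sketched with a single sed substitution. This is only an illustration: /home/tom is a placeholder for your actual PEITS_root location, and it assumes you are currently in the PEITS_root folder.

```shell
# Copy the example config, then rewrite the placeholder prefix wherever
# it appears (this covers both PETSCPATH and ZOLTANPATH in one pass).
# /home/tom is a placeholder -- use your actual PEITS_root location.
cd PEITS
cp config.opts_example config.opts
sed -i 's|/home/username/PEITS_root|/home/tom/PEITS_root|g' config.opts
```

Using `|` as the sed delimiter avoids having to escape the slashes in the paths.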
10. Run the install script, located in PEITS_root/PEITS/:
sh INSTALL
The most likely cause of failure at this step is an error in the config.opts file; double-check PETSCPATH and ZOLTANPATH if this step fails.
11. Test installation.
A simple regression test is included in the PEITS/tests folder. This checks that the default configuration produces the expected output (by comparing file sizes).
cd /home/tom/PEITS_root/PEITS/tests (change path as appropriate)
sh initial_test.sh
If the test is successful, the message 'file sizes match - test OK' will be displayed, meaning the installation process should have been carried out correctly.
If there is an error about loading libparmetis.so, the PETSc library directory needs to be added to the LD_LIBRARY_PATH environment variable:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/username/PEITS_root/petscBUILD/lib/
As previously, replace /home/username/PEITS_root/ with the appropriate location.
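The export above only lasts for the current shell session. A common convention (not something the PEITS documentation requires) is to append the same line to ~/.bashrc so new shells pick it up automatically:

```shell
# Append the export to ~/.bashrc so new shells set the path automatically.
# Replace /home/username/PEITS_root with your actual location.
echo 'export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/username/PEITS_root/petscBUILD/lib/' >> ~/.bashrc
```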
12. Running PEITS. If installation completed successfully, the solver can be called from the PEITS_root/PEITS/src folder:
cd /home/tom/PEITS_root/PEITS/src (change path as appropriate)
mpirun -np 2 ./dune_peits
where -np specifies the number of parallel processes the solver should be run on.
If the solver needs an unreasonably long time to assemble the system matrix, the pre-allocation of memory in PETSc needs to be adjusted. In the file PEITS_root/PEITS/dune-fem-1.4.0/dune/fem/misc/petsc/petsccommon.hh, the number of allocated non-zeros can be changed in the call MatMPIAIJSetPreallocation(mat,100,PETSC_NULL,40,PETSC_NULL). A safe way of adjusting this is to use very high numbers (e.g. 1000 and 150) and then run the solver with the option -info, which outputs the precise number of non-zeros required on the mesh in use.

Also, on some meshes ML preconditioning fails for certain numbers of parallel processes. If this happens, either change the number of processes or use hypre preconditioning instead.
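That tuning step can be sketched as below. This assumes the call appears in petsccommon.hh exactly as written above (if the file uses different spacing, edit it by hand instead), that the paths follow this guide's layout, and that rerunning the INSTALL script is enough to rebuild after the header change.

```shell
# Raise the pre-allocated non-zeros from (100, 40) to generous values
# (1000, 150), then rerun the solver with -info to read off the precise
# non-zero counts required on your mesh. Adjust paths to your install.
cd /home/tom/PEITS_root/PEITS
sed -i 's/MatMPIAIJSetPreallocation(mat,100,PETSC_NULL,40,PETSC_NULL)/MatMPIAIJSetPreallocation(mat,1000,PETSC_NULL,150,PETSC_NULL)/' \
    dune-fem-1.4.0/dune/fem/misc/petsc/petsccommon.hh
sh INSTALL                         # rebuild so the change takes effect
cd src
mpirun -np 2 ./dune_peits -info
```

Once -info has reported the actual non-zero counts, the two numbers can be lowered to just above those values and the solver rebuilt again.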