TheCNBCorner


Installing XMIPP on...

Crunchy

Our new cluster, also called the humble number cruncher. We use OpenMPI.


# JAVA 1.6
export PATH=/gpfs/fs1/apps/jdk1.6.0_14/bin:$PATH
export JAVA_HOME=/gpfs/fs1/apps/jdk1.6.0_14

 ./scons.configure QTDIR=/usr/lib64/qt3 MPI_LIB=mpi MPI_LIBDIR=/usr/lib64  java=yes 
./scons.compile -j 8


Jumilla

Jumilla is our local Compaq AlphaServer cluster at the CNB. It has 5 ES45 nodes, each with 4 Alpha EV68 1 GHz CPUs and a Compaq Tru64 5.1a operating system. This requires some adaptations of the standard configure command:


./scons.configure CXXFLAGS=-mieee QTDIR=/mnt/bioinfo/sharedfs/app/qt MPI_LIBDIR=/home2/bioinfo/bioinfo/MPICH/lib MPI_INCLUDE=/home2/bioinfo/bioinfo/MPICH/include MPI_CXX=mpiCC QT_LIB=qt TIFF_LIBDIR=/mnt/bioinfo/sharedfs/app/lib TIFF_INCLUDE=/mnt/bioinfo/sharedfs/app/include fast=1
./scons.compile


Important: should you get the error "Error 127", ignore it and execute ./scons.compile again. You may need to repeat this several times.
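
If the error keeps appearing, a small shell loop can do the retries for you. A minimal sketch (note that it retries on any failure, not just Error 127):


# keep re-running the compilation until it succeeds
until ./scons.compile; do
    echo "scons.compile failed, retrying..."
done
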

Teide

Teide is a Silicon Graphics Altix 3700 Bx2 cluster with a Linux operating system based on Red Hat. The peculiarity here is that you have to use the Intel icc compiler, as the GNU compiler is not installed correctly.


./scons.configure CC=icc CXX=icc MPI_CC=icc MPI_CXX=icc MPI_LIB=mpi QTDIR=/usr/lib/qt-3.1 LINKERFORPROGRAMS=icpc MPI_LINKERFORPROGRAMS=icpc
./scons.compile


Cygwin

Not tested since version 0.9. See InstallXmipp09OnCygWin.

Claude

Claude is a 4-processor Linux (Ubuntu) machine with 32 GB of RAM. However, its setup is a bit chaotic: there are three MPI installations, and parts of qt are available in different places. We compiled our own copy of openmpi at /home/roberto/OpenMpi and this seems to work. Do not forget to add it to the path:


 
export PATH=/home/roberto/OpenMpi/bin:$PATH
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/roberto/OpenMpi/lib/


Then compile the package using:


 ./scons.configure  QTDIR=/usr/share/qt3 MPI_LIBDIR=/home/roberto/OpenMpi/lib/ MPI_INCLUDE=/home/roberto/OpenMpi/include/ QT_LIB=qt-mt  MPI_LIB='mpi'
./scons.compile


Okazaki

Okazaki is a 4-processor Linux machine with 28 GB of RAM.


 ./scons.configure  QTDIR=/usr/share/qt3
./scons.compile -j 4


Trueno

Trueno is a cluster owned by the Spanish CSIC. Its operating system is RedHat Linux (release 4). It has two queues that send jobs to different systems:

  • The x86_64 queue sends jobs to a 14-node cluster, each node containing 2 Xeon 2 GHz CPUs. These nodes have 2 GB of RAM each.

  
./scons.configure  MPI_LIBDIR=/opt/mpich-ch_p4-gcc-1.2.7/lib/ MPI_INCLUDE=/opt/mpich-ch_p4-gcc-1.2.7/include/  MPI_LIB='mpich' QTDIR=/usr/lib64/qt-3.3
./scons.compile
 


  • The ia64 queue sends jobs to a single-node cluster with 15 Itanium CPUs. These CPUs share a total of 64 GB of RAM. It seems that no qt-devel is installed (the qt libraries, but not the includes, are at /usr/lib/qt-3.3).

 
 export PATH=/opt/mpich-ch_shmem-icc-1.2.7/bin/:$PATH
 ./scons.configure  MPI_LIBDIR=/opt/mpich-ch_shmem-icc-1.2.7/lib/ MPI_INCLUDE=/opt/mpich-ch_shmem-icc-1.2.7/include/  MPI_LIB='mpich' 
./scons.compile


Suse 10.3

If you install Xmipp without the parallel version of the programs, just type:


./scons.configure
./scons.compile


If you want the MPI versions as well, keep in mind that the MPI libraries in Suse 10.3 are not in a standard place:


   
export PATH=$PATH:/opt/mpich/ch-p4/bin/
./scons.configure  MPI_LIBDIR=/opt/mpich/ch-p4/lib MPI_INCLUDE=/opt/mpich/ch-p4/include/  MPI_LIB='mpich' MPI_CC=/opt/mpich/ch-p4/bin/mpicc MPI_CXX=/opt/mpich/ch-p4/bin/mpiCC MPI_LINKERFORPROGRAMS=/opt/mpich/ch-p4/bin/mpiCC  JNI_INCLUDE=/usr/lib64/jvm/java-1.6.0-sun-1.6.0/include/
./scons.compile


Or using openmpi:


   
export PATH=$PATH:/usr/lib/mpi/gcc/openmpi/bin
./scons.configure  MPI_LIBDIR=/usr/lib/mpi/gcc/openmpi/lib MPI_INCLUDE=/usr/lib/mpi/gcc/openmpi/include MPI_LIB="mpi"
./scons.compile


Suse 11 64 bits (Crater: gcc 4.3)


   
./scons.configure  MPI_LIBDIR=/usr/lib64/mpi/gcc/openmpi/lib64/ MPI_INCLUDE=/usr/lib64/mpi/gcc/openmpi/include/  MPI_LIB='mpi' 
./scons.compile


Suse 11 32 bits


   
export PATH=$PATH:/usr/lib/mpi/gcc/openmpi/bin
./scons.configure MPI_LIBDIR=/usr/lib/mpi/gcc/openmpi/lib MPI_INCLUDE=/usr/lib/mpi/gcc/openmpi/include MPI_LIB='mpi'  QTDIR=/usr/lib/qt3
./scons.compile


Suse 11.1

Package libstdc++33 is required (for libstdc++-5.0)
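
Assuming the standard openSUSE package tools are available, installing it should amount to something like:


zypper install libstdc++33
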

Opensuse 11.4 64 bits


   
export PATH=$PATH:/usr/lib64/mpi/gcc/openmpi/bin
./scons.configure MPI_LIBDIR=/usr/lib64/mpi/gcc/openmpi/lib MPI_INCLUDE=/usr/lib64/mpi/gcc/openmpi/include MPI_LIB='mpi'  QTDIR=/usr/lib/qt3
./scons.compile


Ubuntu 10.04

If your .bashrc starts with the following lines:


# If not running interactively, don't do anything
[ -z "$PS1" ] && return


Make sure to set the PATH and LD_LIBRARY_PATH for the Xmipp installation before these lines; otherwise non-interactive shells, such as those started by mpirun, return at that check and never see them, and the MPI programs will fail to run.
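
For example, the top of the .bashrc would then look as follows (here /home/user/xmipp is a hypothetical installation directory; substitute your own):


# Xmipp paths must come before the interactive check, so that
# non-interactive shells (e.g. those spawned by mpirun) also get them
export PATH=/home/user/xmipp/bin:$PATH
export LD_LIBRARY_PATH=/home/user/xmipp/lib:$LD_LIBRARY_PATH

# If not running interactively, don't do anything
[ -z "$PS1" ] && return
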


./scons.configure QTDIR=/usr/share/qt3/ java=yes JNI_CPPPATH='/usr/lib/jvm/java-6-sun/include' QT_LIB=qt-mt MPI_LIBDIR=/usr/lib/openmpi/lib/ MPI_INCLUDE=/usr/lib/openmpi/include/ MPI_LIB='mpi'



Ubuntu 11.04

./scons.configure QTDIR=/usr/share/qt3/ java=yes JNI_CPPPATH='/usr/lib/jvm/java-6-openjdk/include' QT_LIB=qt-mt MPI_LIBDIR=/usr/lib/openmpi/lib/ MPI_INCLUDE=/usr/lib/openmpi/include/ MPI_LIB='mpi'

Curienite

Fedora Core 4, cluster with 64 bit processors


   
./scons.configure MPI_LIB=mpi_cxx MPI_LIBDIR=/usr/lib64 CXXFLAGS=-L/usr/lib64
./scons.compile


Clark, Clark2, Clark3

clark: 4 cores × 2 CPUs, Debian, 32 GB memory
clark2: 4 cores × 2 CPUs, Debian, 64 GB memory
clark3: 4 cores × 2 CPUs, Debian, 64 GB memory


   
./scons.configure  MPI_LIBDIR=/usr/lib/ MPI_INCLUDE=/usr/include/  MPI_LIB='mpi' java=yes
./scons.compile -j 4 


Idris


   
  ./scons.configure CC=xlC_r CXX=xlC_r LINKERFORPROGRAMS=xlC_r CXXFLAGS=-lm MPI_CC=mpcc_r MPI_CXX=mpCC_r MPI_LINKERFORPROGRAMS=mpCC_r MPI_LIB='' MPI_INCLUDE=''
./scons.compile


Bioing4 (OpenSuse 10.3, x86 64 bits)


   
./scons.configure QTDIR=/usr/lib/qt3
./scons.compile


Lepus (OpenSuse 10.3, 64 bits processor)


   
./scons.configure QTDIR=/usr/lib/qt3
./scons.compile


BSC

Magerit/MareNostrum/LaPalma

MareNostrum is the most powerful supercomputer in Europe, hosted by the Barcelona Supercomputing Center (Centro Nacional de Supercomputacion) [http://www.bsc.es]. Magerit is its smaller brother in Madrid; LaPalma is even smaller and situated on the Canary Islands. They are IBM BladeCenter JS21 clusters with PowerPC CPUs (PPC 970, 2.3 GHz), connected by a Myrinet network and running a Linux operating system.

Using GNU compilers

Previously, some modifications were necessary (see the OldCompilationMareNostrum page), but now building Xmipp on these machines is straightforward. Make sure you have the following environment variables set (in ~/.bash_profile):


export CXX=g++
export CC=gcc
export MP_CXX=g++
export OBJECT_MODE=32
export MP_CC=gcc
export PATH=/gpfs/apps/PYTHON/2.5.2/32/bin/:/gpfs/apps/AUTOCONF/2.63/bin/:$PATH


And then, just type:


./scons.configure
./scons.compile


Using IBM (xlC) compilers

This code may be 30% faster. For some reason the executables have to be statically linked; as a consequence tiff and qt no longer work, but that may be acceptable given the gain in speed. One could also install two versions (a GNU build with tiff and qt next to the xlC build). Make sure you have the following environment variables set (in ~/.bash_profile):


export CXX=xlC_r
export CC=xlC_r
export F77=xlf_r
export MP_CXX=xlC
export OBJECT_MODE=64


And then, just type:


./scons.configure CC=xlC_r CXX=xlC_r LINKERFORPROGRAMS=xlC_r CXXFLAGS=-lm  MPI_LIB='' MPI_INCLUDE='' static=yes tiff=no gui=no MPI_CXX=mpicxx MPI_LINKERFORPROGRAMS=mpicxx
./scons.compile


CESGA

Finis terrae

SUSE Linux Enterprise Server 10 (ia64):

  • 142 HP Integrity rx7640 nodes with 16 Itanium Montvale cores and 128 GB of memory (in each node?)

  • No qt or tiff
  • start interactive mode for compilation

 compute


  • start the MPI environment for HP-MPI

 module load hp-mpi


NOTE that PBS files are different for GCC and ICC compilers.
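
The PBS job scripts themselves are not reproduced here. As a rough, hypothetical sketch, a script for the icc build might look like the following (the resource lines and the name xmipp_program are assumptions, not taken from this page):


#!/bin/bash
#PBS -l nodes=1:ppn=16
#PBS -l walltime=01:00:00
# load the same environment that was used at compile time
module load icc
module load hp-mpi
# xmipp_program is a placeholder for the actual Xmipp binary to run
mpirun -np 16 xmipp_program
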

Compilation using Intel C compiler (icc)

  • start environment for Intel compiler

 module load icc


  • configure

  ./scons.configure  MPI_LIBDIR=/opt/hpmpi/lib/linux_ia64 MPI_INCLUDE=/opt/hpmpi/include/linux_ia64  MPI_LIB='mpi' CC=icc CXX=icc
     


  • compile (do not use the -j flag)

 ./scons.compile 


  • IMPORTANT: GUI only available in compute mode

Compilation using gcc

  • configure

  ./scons.configure  MPI_LIBDIR=/opt/hpmpi/lib/linux_ia64 MPI_INCLUDE=/opt/hpmpi/include/linux_ia64  MPI_LIB='mpi' MPI_CC=gcc MPI_CXX=g++
     


  • compile (do not use the -j flag)

 ./scons.compile 


  • IMPORTANT: GUI only available in compute mode

Alternative using openmpi

  • do NOT execute

 module load hp-mpi


  • add the path to mpirun in your .bashrc file

 
export LD_LIBRARY_PATH=/home/csic/eda/msp/OpenMPI/lib/:$LD_LIBRARY_PATH
export PATH=/home/csic/eda/msp/OpenMPI/bin:$PATH
     


  • configure

 ./scons.configure  MPI_LIBDIR=/home/csic/eda/msp/OpenMPI/lib/ MPI_INCLUDE=/home/csic/eda/msp/OpenMPI/include/  MPI_LIB='mpi' 
     


Note: I tried to compile with icc, without success.

CIC BioGUNE

Workstation HP xw8400

A dual quad-core machine with RedHat 5.1 x86_64 and with openmpi installed:


export PATH=$PATH:/usr/lib64/openmpi/1.2.5-gcc/bin
./scons.configure QTDIR=/usr/lib64/qt-3.3/ MPI_LIBDIR=/usr/lib64/openmpi/1.2.5-gcc/lib/ MPI_INCLUDE=/usr/lib64/openmpi/1.2.5-gcc/include/ MPI_LIB="mpi" QT_LIB=qt-mt
./scons.compile


Almeria

Vermeer

Vermeer is an HP ProLiant DL360 G3 cluster at the University of Almeria. Each of its 32 nodes has 2 Intel Xeon 3.06 GHz CPUs, each with 2 GB of RAM. The operating system is Gentoo Linux. Compiling Xmipp is straightforward:


./scons.configure
./scons.compile


Botero

  • ./scons.configure QTDIR=/usr/lib/qt3_64
  • ./scons.compile -j 8

Gaudi

  • ./scons.configure MPI_LIB=mpi MPI_LIBDIR=$HOME/lib
  • ./scons.compile -j 32

MPI, Martinsried

BlueGene

Remember that on the BlueGene machine you are "cross-compiling": the code generated by these compilers will only run on the working nodes, not on the user interface. Compilation required some patches to the 2.2 release that will be incorporated in the next release; for now, things worked with the repository version r3559 (13 Feb 2009). Note that the arguments MPI_LIB="" MPI_LIBDIR="" MPI_INCLUDE="" are essential: if they are not given, the programs cannot be executed but give core dumps (somehow user-interface compiler stuff gets mixed in).

Using IBMs XLC (bgxlc_r):


./scons.configure   CC=mpixlc_r CXX=mpixlcxx_r CXXFLAGS=-DMPICH_IGNORE_CXX_SEEK LINKERFORPROGRAMS=mpixlcxx_r MPI_CC=mpixlc_r MPI_CXX=mpixlcxx_r MPI_LINKERFORPROGRAMS=mpixlcxx_r MPI_LIB="" static=yes gui=no tiff=no warn=no FFTWFLAGS="CC=mpixlc_r CXX=mpixlcxx_r F77=mpixlf77_r" prefix=/u/thaller/BlueGene/xmipp/13feb09_xlc MPI_LIBDIR="" MPI_INCLUDE=""
./scons.compile


Using GNUs gcc:


./scons.configure CC=mpicc CXX=mpicxx CXXFLAGS=-DMPICH_IGNORE_CXX_SEEK LINKERFORPROGRAMS=mpicxx MPI_CC=mpicc MPI_CXX=mpicxx MPI_LINKERFORPROGRAMS=mpicxx MPI_LIB="" gui=no tiff=no mpi=yes static=yes warn=no FFTWFLAGS="CC=mpicc CXX=mpicxx CXXFLAGS=-DMPICH_IGNORE_CXX_SEEK" prefix=/u/thaller/BlueGene/xmipp/13feb09 MPI_LIBDIR="" MPI_INCLUDE=""
./scons.compile


Then one needs to modify applications/scripts/protocols/launch_parallel_job.py as described in BlueGeneAdaptedLaunchParralelJob, and run ./scons.compile again. See also RunningXmippOnBlueGeneMartinsried.

CIB-CSIC

Cluster Drug


  ./scons.configure MPI_LIBDIR=/opt/openmpi/1.1.1/lib/ MPI_INCLUDE=/opt/openmpi/1.1.1/include/ MPI_LIB="mpi"
  ./scons.compile


SGI Altix 1300


./scons.configure CC=gcc CXX=g++ MPI_CC=mpicc MPI_CXX=mpicxx MPI_LIB=mpi MPI_LINKERFORPROGRAMS=mpicxx
./scons.compile


Equuleus


# add the mpirun directory to PATH first (this path is assumed from MPI_LIBDIR below)
export PATH=$PATH:/usr/lib64/mpi/gcc/openmpi/bin
./scons.configure  MPI_LIBDIR=/usr/lib64/mpi/gcc/openmpi/lib64  MPI_INCLUDE=/usr/lib64/mpi/gcc/openmpi/include MPI_LIB='mpi' QTDIR=/usr/lib/qt3
./scons.compile -j 8


Parc Scientific de Barcelona (OpenSuse 11.04 with MPICH)


# add the mpirun directory to PATH first (this path is assumed from MPI_CC below)
export PATH=$PATH:/opt/mpich/ch-p4/bin
./scons.configure  MPI_CC=/opt/mpich/ch-p4/bin/mpicc MPI_CXX=/opt/mpich/ch-p4/bin/mpiCC MPI_LINKERFORPROGRAMS=/opt/mpich/ch-p4/bin/mpiCC MPI_INCLUDE=/opt/mpich/ch-p4/include/ MPI_LIBDIR=/opt/mpich/ch-p4/lib QTDIR=/usr/lib/qt3
./scons.compile -j 2

