Example Usages:
- Other examples may be found in config/*.py
- Using the bash shell, assuming BLAS, LAPACK, and MPICH are not currently installed (configure will download and install them if they are not already on the system):
    export PETSC_DIR=/home/petsc/petsc-2.2.1
    cd $PETSC_DIR
    ./config/configure.py --download-f-blas-lapack=ifneeded --download-mpich=ifneeded
    make
    make test
- Using the bash shell, assuming BLAS, LAPACK, and MPICH are installed at the specified locations:
    export PETSC_DIR=/home/petsc/petsc-2.2.1
    cd $PETSC_DIR
    ./config/configure.py --with-blas-lapack-dir=/usr/local/lib --with-mpi-dir=/usr/local/mpich
    make
    make test
- Using the tcsh shell, install PETSc with the default BLAS and LAPACK and no MPI:
    setenv PETSC_DIR /home/petsc/petsc-2.2.1
    cd $PETSC_DIR
    ./config/configure.py --with-mpi=0
    make
    make test
- Install PETSc with the default BLAS, LAPACK, and MPI using the GNU compilers, and build the optimized version of the libraries:
    ./config/configure.py --with-vendor-compilers=0
    make BOPT=O
    make BOPT=O test
- Using the csh shell, install PETSc on Linux with two different sets of compilers:
    setenv PETSC_ARCH linux-gnu
    ./config/configure.py --with-vendor-compilers=0 --with-mpi-dir=/usr/local/mpich-gnu
    make BOPT=O
    make BOPT=O test
    setenv PETSC_ARCH linux-gnu-intel
    ./config/configure.py --with-vendor-compilers=intel --with-gnu-compilers=0 --with-blas-lapack-dir=/usr/local/mkl --with-mpi-dir=/usr/local/mpich-intel
    make BOPT=O
    make BOPT=O test
Several variables control the configuration and build process of PETSc. They can either be given as arguments to make or be set as environment variables. In particular, they are:
PETSC_DIR: this environment/make variable should point to the location of the PETSc installation being used. You can add export PETSC_DIR=value to your .profile (sh/bash) or setenv PETSC_DIR value to your .cshrc or .tcshrc (csh/tcsh). Multiple PETSc versions can coexist on the same file system; by changing PETSC_DIR one can switch between the versions.
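For example, with two installations present (the second path is purely illustrative), switching versions is just a matter of resetting PETSC_DIR before running make:
    export PETSC_DIR=/home/petsc/petsc-2.2.1   # use the 2.2.1 installation
    make BOPT=g ex1
    export PETSC_DIR=/home/petsc/petsc-2.2.0   # hypothetical older installation
    make BOPT=g ex1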
PETSC_ARCH: this environment/make variable is used to specify the configuration that should currently be used. It corresponds to the configuration files in bmake/${PETSC_ARCH}. Multiple variants of the PETSc libraries can be installed, each variant corresponding to a different PETSC_ARCH. One can switch between these variants by changing the value of PETSC_ARCH. If PETSC_ARCH is not set, the configuration from the last time you ran ./config/configure.py will be used.
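For example, assuming the linux-gnu and linux-gnu-intel configurations from the csh example above have both been built, one can pick which set of libraries to use directly on the make command line:
    make BOPT=O PETSC_ARCH=linux-gnu ex1
    make BOPT=O PETSC_ARCH=linux-gnu-intel ex1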
BOPT: this environment/make variable is used to distinguish debug/optimized or C/C++/complex versions of the PETSc libraries. One can install any or all variants of the libraries and switch between them by changing the value of BOPT. If BOPT is not set, the value g is used. The table below explains the most common BOPT values.
| BOPT value | Description                                                                  |
| g          | debug version of the PETSc libraries - usable from C, Fortran                |
| g_c++      | debug version of the PETSc libraries - usable from C++, Fortran              |
| g_complex  | debug complex version of the PETSc libraries - usable from C++, Fortran      |
| O          | optimized version of the PETSc libraries - usable from C, Fortran            |
| O_c++      | optimized version of the PETSc libraries - usable from C++, Fortran          |
| O_complex  | optimized complex version of the PETSc libraries - usable from C++, Fortran  |
Example usage, assuming ex1.c is user code and a makefile with a target to compile ex1 exists:
    make BOPT=g PETSC_ARCH=linux PETSC_DIR=/home/petsc/petsc-2.1.5 ex1
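Similarly, to build and link the same example against the optimized libraries (assuming they have been built with make BOPT=O), only the BOPT value changes:
    make BOPT=O PETSC_ARCH=linux PETSC_DIR=/home/petsc/petsc-2.1.5 ex1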
bmake/${PETSC_ARCH}/packages: this configuration file, corresponding to the PETSC_ARCH used, contains the configuration for all external software packages that PETSc is installed with. The specification is in the compiler notation for include paths, library paths, and library names.
| Compiler flag | Example                           | Explanation                                                                  |
| -D            | -DPETSC_HAVE_X11                  | define a macro                                                               |
| -I            | -I/home/petsc/mpich-1.2.5/include | specify include paths to the compiler                                        |
| -L            | -L/home/petsc/mpich-1.2.5/lib     | specify library paths to the compiler                                        |
| -l            | -lmpich                           | specify the name of the library to link to (here the library is libmpich.a)  |
| filename      | /home/petsc/foo.a                 | specify a library to link to that cannot be specified using -l               |
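For example, an X11 entry written in this notation might look like the following sketch (the paths are illustrative, and the variable names simply follow the pattern of the ParMetis example given later in this document; consult the default bmake/*/packages files for the exact names used):
    X11_INCLUDE = -I/usr/X11R6/include
    X11_LIB = -L/usr/X11R6/lib -lX11
    PETSC_HAVE_X11 = -DPETSC_HAVE_X11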
bmake/${PETSC_ARCH}/variables: this file contains the configuration for the C, C++, and Fortran compilers that are associated with the PETSC_ARCH used.
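As a purely hypothetical sketch of what such compiler settings look like in make-variable form (the actual variable names differ; consult the default bmake/*/variables files shipped with the distribution for the real ones):
    CC = gcc
    CXX = g++
    FC = g77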
Cygwin provides UNIX tools on Microsoft Windows. When installing Cygwin, make sure you install the following additional packages.
Run /bin/rebaseall after the install, and use bash as the working shell.
Cygwin also has the GNU compilers (gcc, g++, g77), which can be used if the Microsoft, Intel, or Borland compilers are not available.
Note: to ensure success of rebaseall, make sure no other Cygwin processes (such as ssh or sshd) are running, apart from bash itself.
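For example, from the Cygwin bash prompt (this sketch assumes the default Cygwin layout):
    ps               # confirm no other Cygwin processes (ssh, sshd, etc.) are running
    /bin/rebaseall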
BLAS/LAPACK: these packages provide some basic numeric kernels used by PETSc.
- If ./config/configure.py cannot locate BLAS and LAPACK, you can use the options --with-blas-lapack-dir=directory, or --with-blas-lib=library and --with-lapack-lib=library, or --with-blas-lapack-lib=library to indicate the location of the libraries.
- On Microsoft Windows and Linux, one can use Intel MKL.
- On AIX (IBM) use ESSL, on OSF (Compaq/Alpha) DXML, on Solaris Sunperf, and on SGI libcomplib.sgimath.
- On some Linux boxes, BLAS/LAPACK is installed as [/usr/lib/liblapack.a /usr/lib/libblas.a].
- Alternatively, download the source distribution fblaslapack or cblaslapack from the PETSc download page.
- Use the ./config/configure.py option --download-f-blas-lapack=yes or --download-f-blas-lapack=ifneeded, and PETSc will download and install BLAS and LAPACK for you.
- Use --download-c-blas-lapack (as opposed to --download-f-blas-lapack) only if there is no Fortran compiler.
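For example, either of the following forms (paths illustrative, reusing those mentioned above) points configure at an existing BLAS/LAPACK:
    ./config/configure.py --with-blas-lapack-dir=/usr/local/lib
    ./config/configure.py --with-blas-lib=/usr/lib/libblas.a --with-lapack-lib=/usr/lib/liblapack.a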
MPI: this software provides the parallel functionality for PETSc.
- If ./config/configure.py cannot locate MPI, you can indicate its location with --with-mpi-dir=directory, or with --with-mpi-include=directory --with-mpi-lib=libraryname --with-mpi-run=pathofmpirun.
- On parallel machines, vendor-provided MPI might already be installed; IBM, SGI, Cray, etc. provide their own.
- If vendor-provided MPI is not available, it can be installed from sources. We recommend MPICH.
- Use the ./config/configure.py option --download-mpich=yes, and PETSc will download and install MPICH for you.
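For example, to use the MPICH installation from the examples at the top of this document (all paths illustrative):
    ./config/configure.py --with-mpi-dir=/usr/local/mpich
or, specifying the pieces individually:
    ./config/configure.py --with-mpi-include=/usr/local/mpich/include --with-mpi-lib=/usr/local/mpich/lib/libmpich.a --with-mpi-run=/usr/local/mpich/bin/mpirun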
I don't want to use MPI: you can build a sequential version of PETSc without MPI, as shown below.
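As in the tcsh example near the top of this document, this is done by disabling MPI at configure time:
    ./config/configure.py --with-mpi=0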
Optional Packages: PETSc can be installed with various optional packages, including SuperLU, Spooles, Matlab, etc.
The optional package information is specified in the bmake/${PETSC_ARCH}/packages file. An example specification is as follows:
    PARMETIS_INCLUDE = -I/home/petsc/soft/solaris-9/ParMetis-3.0
    PARMETIS_LIB = -L/home/petsc/soft/solaris-9/ParMetis-3.0 -lparmetis -lmetis
    PETSC_HAVE_PARMETIS = -DPETSC_HAVE_PARMETIS
Notice that the include paths, library paths, and library files are specified, and a flag is set to enable the package within PETSc. For more example usage for other packages, check the default bmake/*/packages files in the distribution.
Compilers: Supported compilers are Microsoft (win32_ms*), Intel (win32_intel), Borland (win32_borland), and the GNU compilers (win32_gnu).
Project Files: We provide default project files that work with PETSC_ARCH=win32_ms_mpich. They are located in ${PETSC_DIR}/projects. Be sure to build the libraries that correspond to the project configuration you are using; to use a C++ project with the Debug configuration, the BOPT=g_c++ libraries are required (see the build command below).
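For example (assuming the win32_ms_mpich configuration mentioned above), the matching debug C++ libraries can be built with:
    make BOPT=g_c++ PETSC_ARCH=win32_ms_mpich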
Debugger: Running PETSc programs with -start_in_debugger is not supported on this platform, so debuggers will need to be initiated manually. Make sure your environment is properly configured to use the appropriate debugger for your compiler. The debuggers can be initiated using Microsoft Visual Studio 6 (msdev ex1.exe), Microsoft Visual Studio .NET (devenv ex1.exe), the Intel Enhanced Debugger (edb ex1.exe), or the GNU Debugger (gdb ex1.exe).
PETSc Win32 front end - win32fe: This tool is used as a wrapper to the Microsoft/Borland/Intel compilers and associated tools, to enable building the PETSc libraries using make and other UNIX tools. For additional information, run ${PETSC_DIR}/bin/win32/win32fe without any options.
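For example, from the Cygwin bash shell, invoking the tool with no arguments should print its usage information:
    ${PETSC_DIR}/bin/win32/win32fe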
Manual Installation:
- Apply the patches (using the GNU patch utility):
    cd petsc-2.2.1
    patch -Np1 < petsc_patches_all_2.2.1
- Set PETSC_DIR:
    sh/bash shell: PETSC_DIR=`pwd`; export PETSC_DIR (note the backquotes)
    csh/tcsh shell: setenv PETSC_DIR `pwd`
- Run bin/petscarch -suggest and, from the list of choices, select the one that most closely resembles your system; call it arch.
- Select a name for your configuration; we recommend arch followed by _local, for example linux_local.
- Set PETSC_ARCH:
    sh/bash shell: PETSC_ARCH=chosen name; export PETSC_ARCH
    csh/tcsh shell: setenv PETSC_ARCH chosen name
- mkdir bmake/${PETSC_ARCH}
- cp bmake/predefined/arch/* bmake/${PETSC_ARCH} (ignore possible warning messages about not copying directories)
- Edit bmake/${PETSC_ARCH}/packages and specify the location of BLAS, LAPACK, and MPI (see "I don't want to use MPI" above if building without it).
- Add optional packages to bmake/${PETSC_ARCH}/packages.
- make BOPT=g all
- make BOPT=g test
Examples:
- Install PETSc manually on Solaris with the Sun compilers by modifying the configuration files:
    export PETSC_DIR=/home/petsc/petsc-2.2.1
    cd $PETSC_DIR
    bin/petscarch -suggest
    PETSC_ARCH=solaris_local; export PETSC_ARCH
    mkdir bmake/solaris_local
    cp bmake/solaris/* bmake/solaris_local
    edit bmake/solaris_local/packages and update the location of BLAS, LAPACK, and MPI
    make BOPT=g
    make BOPT=g test
- Install PETSc on Microsoft Windows with MPICH-NT, the Microsoft compilers, and Intel MKL. Install the Cygwin packages and use bash from Cygwin as the working shell:
    export PETSC_DIR=/home/petsc/petsc-2.2.1
    cd $PETSC_DIR
    bin/petscarch -suggest
    export PETSC_ARCH=win32_ms_mpich_local
    mkdir bmake/win32_ms_mpich_local
    cp bmake/win32_ms_mpich/* bmake/win32_ms_mpich_local
    edit bmake/win32_ms_mpich_local/packages and make sure the paths to MPICH-NT and Intel MKL (BLAS, LAPACK) are correct
    make BOPT=g
    make BOPT=g test