gLite 3.1

glite-MPI_utils - Update to version 3.1.1-0


Date: 21.02.08
Priority: Normal

Description


glite-MPI_utils

This is the first public release of the glite-MPI_utils metapackage for gLite 3.1, including the YAIM configuration for MPI on the LCG CE and gLite WN. This module implements the configuration described at http://www.grid.ie/mpi/wiki/YaimConfig. It caters both for sites with a shared home file system and, via i2g-mpi-start, for sites without one, in which case files are distributed using ssh, mpiexec, or copying to a shared area.
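
As an illustration, the MPI-related settings in the YAIM site configuration file (typically site-info.def) might look like the fragment below. The variable names follow the YAIM MPI configuration page referenced above; treat the values as examples and check that page for the authoritative list and defaults.

    # Illustrative site-info.def fragment (example values only)
    MPI_MPICH_ENABLE="yes"                 # enable the MPICH flavour
    MPI_MPICH_PATH="/opt/mpich-1.2.7p1/"   # MPICH install location on the WNs
    MPI_MPICH_VERSION="1.2.7p1"
    MPI_SHARED_HOME="yes"                  # "no" for sites without a shared home file system
    MPI_SSH_HOST_BASED_AUTH="yes"          # needed if mpi-start distributes files via ssh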

On the CE, the installed implementations of MPI are published in the information system, along with other MPI-related tags.
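
To verify what is being published, one can query the CE's resource BDII for the MPI tags. The command below is a sketch: <ce-host> is a placeholder and 2170 is the usual resource BDII port.

    ldapsearch -x -h <ce-host> -p 2170 -b "o=grid" \
        '(GlueHostApplicationSoftwareRunTimeEnvironment=MPI*)' \
        GlueHostApplicationSoftwareRunTimeEnvironment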

On the WN, environment variables are configured to indicate the location of MPI implementations and a "dummy" mpirun script is created to work around limitations in the LCG RB and gLite WMS.
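
On a configured WN, a quick sanity check (illustrative commands; the exact variable names are those set by the YAIM MPI module) is:

    env | grep MPI_        # lists the MPI_*_PATH / MPI_*_VERSION variables
    which mpirun           # should resolve to the "dummy" wrapper created by YAIM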

During configuration, the MPI targets, i.e. -n MPI_CE and -n MPI_WN, must be listed first on the YAIM command line, as in the example below.
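
An illustrative YAIM invocation follows. The YAIM path and the node types listed after the MPI target are placeholders; use whatever your site normally configures.

    /opt/glite/yaim/bin/yaim -c -s site-info.def -n MPI_CE -n lcg-CE
    /opt/glite/yaim/bin/yaim -c -s site-info.def -n MPI_WN -n glite-WN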



Please also have a look at the list of known issues.

This update fixes various bugs. For the full list, please see the table below.

Fixed bugs

Number | Description
#22970 | CPU totals not configurable in YAIM
#32786 | config_lcgenv doesn't setup MYPROXY_SERVER
#33018 | [ YAIM ] config_gip_service_release should check whether the info provider file exists
#33230 | [ YAIM ] GLITE_LOCATION_VAR should be used in config_gip_only
#33271 | yaim configuration for 64bit Worker Node

Updated RPMs

Name | Version | Full RPM name | Description
glite-MPI_utils | 3.1.1-0 | glite-MPI_utils-3.1.1-0.i386.rpm | gLite metapackage (glite-MPI_utils)
glite-yaim-core | 4.0.3-13 | glite-yaim-core-4.0.3-13.noarch.rpm | glite-yaim-core
glite-yaim-mpi | 0.1.6-3 | glite-yaim-mpi-0.1.6-3.noarch.rpm | The glite-yaim-mpi module configures MPI on the WN and LCG CE.
i2g-mpi-start | 0.0.52-1 | i2g-mpi-start-0.0.52-1.noarch.rpm | A generic startup mechanism for different MPI installations in a cluster/grid.
mpich | 1.2.7p1-2.slc4 | mpich-1.2.7p1-2.slc4.i386.rpm | mpich
mpiexec | 0.82-1.slc4 | mpiexec-0.82-1.slc4.i386.rpm | mpiexec
torque-client | 2.1.9-4cri.slc4 | torque-client-2.1.9-4cri.slc4.i386.rpm | Client part of Torque
torque | 2.1.9-4cri.slc4 | torque-2.1.9-4cri.slc4.i386.rpm | Tera-scale Open-source Resource and QUEue manager

The RPMs can be updated using yum, for example as shown below.
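
The following is a sketch; it assumes the gLite 3.1 update repositories are already enabled on the node.

    yum update glite-MPI_utils
    # or, to apply all pending gLite updates:
    yum update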

Service reconfiguration after update

Not needed.

Service restart after update

Not needed.

How to apply the fix

  1. Update the RPMs (see above)
  2. Update configuration (see above)
  3. Restart the service if necessary (see above)