%INCLUDE{"WebHome" section="Virgotemplate_eng"}%

---++!! How to run analysis pipelines at the Bologna (CNAF) computer center

%TOC%

---+++ Introduction

The Bologna computer center runs the LSF batch system and is part of the EGEE Grid. Using the concept of so-called [[http://www.grid.kfki.hu/twiki/bin/view/RmkiVirgo/UsingTheEGEEGrid][pilot jobs]], a virtual Condor metacluster is maintained which is accessible from the User Interface machines (see below), so CBC Condor pipelines can be submitted to it. Follow the procedure below and, if you have questions, send an e-mail to:
<verbatim>
Gergely.Debreczeni@cern.ch
</verbatim>

---+++ The procedure

Perform the following steps:

   1 Log in to one of the following machines:
<verbatim>
ui01-virgo.cr.cnaf.infn.it
ui02-virgo.cr.cnaf.infn.it
</verbatim>
   These are the User Interface machines, which share the following directories with the Worker Nodes:
<verbatim>
/storage/gpfs_virgo3/home
/storage/gpfs_virgo3/scratch
/storage/gpfs_virgo3/virgo
</verbatim>
   If possible use the scratch area, since it is world readable.
   1 Change to one of the shared directories.
<verbatim>
(cmd.: cd /storage/gpfs_virgo3/home/gdebrecz)
</verbatim>
   1 *IMPORTANT:* Set the file creation mask to 0002, i.e. the files you create from now on will be group writeable.
<verbatim>
(cmd.: umask 0002)
</verbatim>
   1 Create a new directory and enter it.
<verbatim>
(cmd.: mkdir mywd; cd mywd)
</verbatim>
   1 Source the VIRGO environment setting file:
<verbatim>
(cmd.: . /opt/exp_software/virgo/lscsoft/etc/virgoenv.sh)
</verbatim>
   1 Launch a Condor test job and check whether it works or not.
<verbatim>
(cmd.: cp /storage/gpfs_virgo3/home/gdebrecz/condortest/* .
       condor_submit vog-testjob.des)
</verbatim>
   1 If the test jobs finished OK, then from this point on you can do everything in the same way as on other clusters.
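For reference when adapting the test job, a Condor submit description file has roughly the following shape. This is a minimal sketch only; the executable, arguments, and file names below are illustrative and are *not* the actual contents of =vog-testjob.des=:

```
# Illustrative Condor submit description file (NOT the real vog-testjob.des).
universe   = vanilla
executable = testjob.sh
arguments  = "hello"
output     = testjob.out
error      = testjob.err
log        = testjob.log
# Queue a single instance of the job.
queue
```

Submitting it with =condor_submit= and inspecting the =.out= and =.log= files is a quick way to verify that the metacluster accepts and runs your jobs.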
The only exception is that you have to copy the
<verbatim>
/opt/exp_software/virgo/lscsoft/etc/LSCdataFind
</verbatim>
executable over the one that comes with the standard LSCsoft installation!

*BE AWARE* that all your files and software _should be installed on the shared areas_ and should be group read/writeable, so that your jobs on the Worker Nodes - which run as a different user - can access them!

---+++ Useful remarks

   1 You can find a test installation as an example here:
<verbatim>
/opt/exp_software/virgo/lscsoft/
</verbatim>
   for S5 GRB runs, or under
<verbatim>
/storage/gpfs_virgo3/home/gdebrecz/lscsoft/
/storage/gpfs_virgo3/home/gdebrecz/lscsoft/non-lsc
</verbatim>
   for the s6_20090722 branch. Have a look at
<verbatim>
/opt/exp_software/virgo/lscsoft/etc/s5grbenv.sh
</verbatim>
   to see how I source the various files.
   1 The system Python installation should not be used; use the one installed in the above software area. This happens automatically if you source the
<verbatim>
/opt/exp_software/virgo/lscsoft/etc/virgoenv.sh
</verbatim>
   file.
   1 An automated software installation script is available under
<verbatim>
/storage/gpfs_virgo3/home/gdebrecz/scripts
</verbatim>
   See the description inside on how to use it.

%INCLUDE{"WebHome" section="Virgofooter_eng"}%
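The group-writeability requirement above can be checked locally. The following sketch (using a temporary directory rather than the cluster paths) shows the effect of =umask 0002= on newly created files:

```shell
#!/bin/sh
# Demonstrate that with umask 0002 newly created files are group-writeable,
# so jobs running as a different user in the same group can access them.
# Uses a throwaway temporary directory, not the shared cluster areas.
workdir=$(mktemp -d)
cd "$workdir" || exit 1

umask 0002          # new files: rw-rw-r-- ; new directories: rwxrwxr-x
touch shared_file
mkdir shared_dir

# Show the resulting permission bits.
ls -ld shared_file shared_dir

cd / && rm -rf "$workdir"
```

If an existing tree was created with the wrong mask, `chmod -R g+rw <dir>` on the shared area fixes it up after the fact.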
Topic revision: r3 - 2009-09-30 - GergelyDebreczeni