setup on lxplus

* your account should be able to run the athena "hello world" example as described in the workbook [[https://twiki.cern.ch/twiki/bin/view/Atlas/WorkBookRunAthenaHelloWorld]]
* login to lxplus
* set up athena
> source ~/cmthome/setup.sh -tag=12.0.4
* set up the grid
> source /afs/cern.ch/project/gd/LCG-share/sl3/etc/profile.d/grid_env.sh
> export LFC_HOST=lfc-atlas.cern.ch
> export LCG_CATALOG_TYPE=lfc
> voms-proxy-init
* create a long-term proxy (recommended for MC production, default lifetime 1 week)
> myproxy-init
* check it with
> myproxy-info
* the long-term proxy is stored on "myproxy.cern.ch" and is used automatically by ganga
* set up dq2 (dataset management)
> source /afs/usatlas.bnl.gov/Grid/Don-Quijote/dq2_user_client/setup.sh.CERN
* start ganga (4.2.7) as an atlas user with the graphical user interface
> /afs/cern.ch/sw/ganga/install/slc3_gcc323/4.2.7/bin/ganga --config-path=GangaAtlas/Atlas.ini --gui
* type "y" to create a default ganga config file ".gangarc"
* ganga should start
* open a second shell and repeat the setup there:
> source ~/cmthome/setup.sh -tag=12.0.4
> source /afs/cern.ch/project/gd/LCG-share/sl3/etc/profile.d/grid_env.sh
> export LFC_HOST=lfc-atlas.cern.ch
> export LCG_CATALOG_TYPE=lfc
> voms-proxy-init
> source /afs/usatlas.bnl.gov/Grid/Don-Quijote/dq2_user_client/setup.sh.CERN
* change into a run directory
> cd testarea/12.0.4/PhysicsAnalysis/AnalysisCommon/UserAnalysis/run

setup at DESY Zeuthen

* set up athena
> source /usr1/scratch/userscratch/USERNAME/cmthome/setup.sh -tag=12.0.3
* set up the grid
> source /afs/desy.de/group/grid/UI/GLITE-3_0_4/etc/profile.d/grid_env.sh
> export LFC_HOST=lfc-atlas.cern.ch
> export LCG_CATALOG_TYPE=lfc
> voms-proxy-init
* log into lxplus with another shell and create a long-term proxy with "myproxy-init" as described above (myproxy-init fails at atlhltX)
* set up dq2
> source /afs/ifh.de/project/grid/dq2/setup.sh
* start ganga
> ini ganga
> ganga --config-path=GangaAtlas/Atlas.ini --gui

hello world with Ganga in the graphical user interface

* make sure "HelloWorldOptions.py" is present in your run directory
* click on "new" in the ganga window to create a new job
* click on "application" and change "Executable" to "Athena" in the drop-down box on the right side
* expand the application menu
* click on "option file" and browse your disk for "HelloWorldOptions.py" to load it into ganga
* click on "max events" and enter "10"
* click on "backend" and change "Local" to "LCG"
* click on the "submit" button
* watch the status of the job go from "submitted" via "running" to "completed", hopefully not to "failed"
* retrieve the output by right-clicking on the job and choosing "retrieve output", then click on the file you want to read
* you can also inspect input/output directly in ~/gangadir/workspace/...

generating events (step 1 in the chain)

* the normal way would be to download the atlas production transform archive in the second shell; current releases can be found here
> wget http://cern.ch/atlas-computing/links/kitsDirectory/Production/kits/AtlasProduction_12_0_31_08_noarch.tar.gz
* by default you have to create a minimum of 5000 events, but for this educational test and to save time we create only 3 events
* creating more than 3 is simple and straightforward (but takes longer)
* you can change the default of 5000 events as described here
* to save time, download a file in which the default has already been changed to 3 events (it is changed for only one example: "DC3.005711.PythiaB_bbmu6mu4X.py")
> cp "/afs/cern.ch/user/v/volkmann/public/AtlasProduction_12_0_31_8_noarch.tar.gz" .
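As an alternative to clicking through the GUI, the hello world job from the section above can also be submitted from Ganga's Python prompt (the GPI). This is only a sketch for an interactive Ganga 4.2.x session; the attribute names are assumed to mirror the GUI fields shown above and may differ slightly in your version:

```python
# Sketch for an interactive Ganga session (not a standalone script):
# Job(), Athena() and LCG() are provided by the Ganga GPI once ganga is running.
j = Job()
j.application = Athena()
j.application.option_file = 'HelloWorldOptions.py'  # the "option file" field
j.application.max_events = 10                       # the "max events" field
j.backend = LCG()                                   # backend changed from Local to LCG
j.submit()
# follow the status (submitted/running/completed) with the "jobs" command
```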
* click on "new" in the ganga window
* change "application" to "AthenaMC"
* enter "12.0.31" into atlas_release
* enter "DC3.005711.PythiaB_bbmu6mu4X.py" into evgen_job_option
* enter "1" into firstevent
* make sure that mode is "evgen"
* enter "3" into number_events_job
* enter something meaningful like "bbmu6mu4X" into process_name
* enter another meaningful name like "bphys" into production_name
* change the random seed to a 9-digit number
* enter a 6-digit run number like "000001"
* browse the transform archive "AtlasProduction_12_0_31_8_noarch.tar.gz" into ganga
* enter "csc_evgen_trf.py" into transform_script
* click on backend and choose "LCG"
* expand LCG and choose one of the following computing elements
** tbn20.nikhef.nl:2119/jobmanager-pbs-qlong
** lcg-ce0.ifh.de:2119/jobmanager-lcgpbs-atlas
** grid-ce1.desy.de:2119/jobmanager-lcgpbs-atlas
* (you can list all of them by typing "lcg-infosites --vo atlas ce" in the shell)
* click on outputdata and select "DQ2OutputDataset"
* click on splitter and choose "AthenaMCSplitterJob"
* click on numsubjobs and enter "3"
* click on nsubjobs_inputfile and also enter "3"
* submit the job
* once the 3 jobs reach "completed", inspect the output with dq2 in the other open shell (sometimes dq2 doesn't work at atlhltX)
* you find the output dataset name in ganga
> dq2_ls -g datasetname
* you should see the three root files (each contains 1 event in this example)

simulating and digitizing events (steps 2 and 3 in the chain)

* create a new job and again choose "AthenaMC" in the application form
* enter the atlas_release, process_name, production_name and run_number of the previous generation job into the form
* enter "1" into first_event
* enter "1" into number_events_job
* choose a 9-digit random seed
* browse the job transform archive "AtlasProduction_12_0_31_8_noarch.tar.gz" into the job again
* enter "csc_simul_trf.py" into transform_script
* choose "LCG" in backend and enter a computing element (as previously mentioned)
* click on inputdata, choose "DQ2OutputDataset" and expand the menu
* enter the name of the dataset from the previous job into datasetname
* click on outputdata and choose "DQ2OutputDataset"
* choose "AthenaMCSplitterJob" in splitter and expand it
* enter "1" into n_subjobs_inputfile
* enter "3" into number_input_jobs
* enter "3" into numsubjobs
* submit the job and inspect the result with
> dq2_ls -g datasetname

reconstructing events (step 4 in the chain)

* everything remains the same as for simulation except:
* change mode to "recon"
* enter "csc_reco_trf.py" into transform_script
* enter the corresponding name of your simulation dataset into inputdata
* check the output dataset with
> dq2_ls -g datasetname
* download the dataset with
> dq2_get datasetname
* and inspect it with root

links

* http://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/BPHYSICS/underlying/grid/gangainstructions.htm
* http://gridui02.usatlas.bnl.gov:25880/server/pandamon/query?overview=dslist
* https://twiki.cern.ch/twiki/bin/view/Atlas/AtlasEvgenTrf
* http://www.cern.ch/ganga
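For reference, the evgen production job configured in the GUI above can also be written as a Ganga GPI script. Again this is only a sketch for an interactive Ganga session with GangaAtlas loaded; the attribute names below are taken from the GUI fields in this tutorial (the transform-archive attribute name in particular is an assumption), so check them against your Ganga version before use:

```python
# Sketch for an interactive Ganga session with GangaAtlas loaded.
j = Job()
j.application = AthenaMC()
j.application.atlas_release = '12.0.31'
j.application.mode = 'evgen'
j.application.evgen_job_option = 'DC3.005711.PythiaB_bbmu6mu4X.py'
j.application.firstevent = 1
j.application.number_events_job = 3
j.application.process_name = 'bbmu6mu4X'
j.application.production_name = 'bphys'
j.application.run_number = '000001'          # 6-digit run number
j.application.random_seed = '123456789'      # any 9-digit number
j.application.transform_archive = 'AtlasProduction_12_0_31_8_noarch.tar.gz'  # attribute name assumed
j.application.transform_script = 'csc_evgen_trf.py'
j.backend = LCG()
j.backend.CE = 'lcg-ce0.ifh.de:2119/jobmanager-lcgpbs-atlas'  # any CE from "lcg-infosites --vo atlas ce"
j.outputdata = DQ2OutputDataset()
j.splitter = AthenaMCSplitterJob()
j.splitter.numsubjobs = 3
j.splitter.nsubjobs_inputfile = 3
j.submit()
# afterwards, check the output dataset in the second shell with: dq2_ls -g <datasetname>
```

The simulation and reconstruction steps follow the same pattern with transform_script set to "csc_simul_trf.py" or "csc_reco_trf.py", mode set to "recon" for reconstruction, and inputdata pointing at the dataset produced by the previous step.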