Offline Trigger Shifter Guide


Offline trigger shifts are an important part of data validation. The shifter analyzes the performance and data quality of the L1 and HLT triggers within a day of data taking and reports the findings in the Run Registry, the e-log, and, if needed, directly to an on-call expert.

An offline trigger shift usually takes about four hours and can be done at any time of day. The shifter is expected to review all the runs taken since the last run certified by the previous shifter. Shifts can be done from any CMS Centre (CERN Meyrin, Fermilab, DESY) or from a home institute.

If you would like to volunteer for offline trigger shifts, please contact Manfred Jeitler.

Shifter Instructions

March 26 - Please state in your elog entry whether or not you agree with how each run was marked in the run registry.

Your shift report will be regularly checked by experts. Please start it with a general summary of the run conditions and fill it in on a run-by-run basis. If you are in doubt, or you think your observation is important for current data taking (e.g. a problem appeared and was not resolved), please notify the TPG On-Call person and put a message in the corresponding ELOG (Trigger or DQM).

Please follow the REPORTING section to organize your report properly!

Starting your shift

  1. Keep this page open in a separate window when starting your shift; use the links to open independent sessions for the ELOG, DQM GUI, etc.
  2. To be able to open the DQM GUI you need a GRID certificate.
  3. First, find out the last run analyzed by the previous shifter.
    • You can check this in the e-log.
  4. Please also check the following areas; if you find information relevant to your shift, include it in your summary. Potentially important information includes: trigger problems, DQM problems, general CMS problems, and special beam/running conditions such as splashes, cosmics, or single-beam running.
    • Subsystems -> HLT on call
    • General -> Shift
    • Subsystems -> Event Display and DQM
    • Subsystems -> Trigger
  5. Find out the run numbers of the physics runs taken after the previous shift.
    • You can find a list of runs that were taken in the last 24 hours here. From this list, select the runs for which:
    • the sequence contains the word GLOBAL-RUN
    • the run has more than 100k events
    • the run was included by DQM in the list of runs for certification (you will see these runs in the next step)
    • Keep the window open; you will need it for further checks.
  6. Look at the run registry to check how the online and offline shifters marked those runs for the HLT/L1T.
    • (To access this from offsite you need to set up a SOCKS tunnel, as described at .)
    • If HLT/L1T is marked BAD, click on it to see the comments provided by the DQM shifter. You need to confirm this observation.
      • The instructions used by DQM are linked at the bottom of this page; open them and try to find the check that failed.
      • Please state explicitly in your elog post whether you agree with how the run was marked, and why.
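The run-selection criteria from step 5 can be sketched as a simple filter. This is only an illustrative sketch: the record layout (`number`, `sequence`, `events`) and the example run numbers are invented, not the actual WBM or Run Registry schema; on shift the selection is done by eye from the run list page.

```python
# Hypothetical sketch of the step-5 run-selection criteria.
# The field names and run numbers below are invented for illustration.

def select_runs(runs, certified=None):
    """Keep physics runs that match the shifter selection criteria."""
    certified = set() if certified is None else set(certified)
    selected = []
    for run in runs:
        if "GLOBAL-RUN" not in run["sequence"]:
            continue                      # sequence must contain GLOBAL-RUN
        if run["events"] <= 100_000:
            continue                      # needs more than 100k events
        if certified and run["number"] not in certified:
            continue                      # must be in the DQM certification list
        selected.append(run["number"])
    return selected

runs = [
    {"number": 163270, "sequence": "GLOBAL-RUN", "events": 2_500_000},
    {"number": 163271, "sequence": "GLOBAL-RUN", "events": 40_000},   # too few events
    {"number": 163272, "sequence": "LOCAL-TEST", "events": 900_000},  # not a global run
]
print(select_runs(runs))  # -> [163270]
```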

Instructions on how to perform the shifter analyses

Checks related to L1 trigger information

  1. In the window you opened in step 5 above, select a run and click on the run number.
  2. Check that the Trigger Key has no mistakes by clicking on L1Key. A new window opens.
  3. Using the same page, check that the overall L1 rate is stable.
    • The overall L1 rate against time can be displayed by clicking on the blue rate value (right column) for "L1APhysics".
    • Check that there are no spikes or discontinuities larger than 15% along the run.
  4. Using the same page, check the rates of individual triggers.
    • The rate vs. time for individual bits is shown by clicking the links (blue numbers) to the right of the trigger algorithm names.
    • If the run is of type cosmics or collisions, check the rates for the following algorithms and technical triggers, but only if they are active (shown in bold):
      • L1_SingleMuOpen
      • L1_SingleMu7_Barrel
      • L1Tech_RPC_TTU_pointing_Cosmics.v0
    • If the run is of type collisions, check also the rates for the following algorithms and technical triggers:
      • L1_SingleMuBeamHalo
      • L1_SingleJet52
      • L1_SingleJet80_Central
      • L1_SingleEG12
      • L1_BscMinBiasOR_BptxPlusANDMinus
      • L1Tech_BPTX_plus_OR_minus.v0
      • L1Tech_HCAL_HF_coincidence_PM.v1
      • L1Tech_BSC_minBias_threshold2.v0
    • For Heavy-Ion running at the end of 2010, a few other good triggers to check are:
      • L1_SingleMu3_BptxAND
      • L1_SingleJet30U_BptxAND
      • L1_BptxXOR_BscMinBiasOR
      • L1_SingleEG5_BptxAND
    • Note which individual triggers show spikes or discontinuities similar to those observed in the L1APhysics rate, or whether all of them change at the same time.
  5. Check data-emulator comparison
    • Currently in Online DQM: open the GUI and keep the window open for subsequent runs.
      • Click Summary and select the Summary workspace. Check the L1TEMU plot. Ignore systems in white; note any that are red or yellow.
      • If the run is of type cosmics, check the following DQM plots:
        • L1T/L1TGMT/GMT_etaphi: the GMT occupancy should look like the reference histogram (GMTOccupancy.png). Note that the barrel has bottom sectors only, while the endcaps have full disks.
        • L1T/L1TGMT/bx_DT_vs_RPC: GMT synchronization between the DTs and RPCs. The maximum should be at BX = (0,0); some spread into neighbouring BXs is allowed (GMTTiming.png).
  6. Only if requested by the on-call expert, produce L1 ntuples for a particular run/dataset.
    • You can find the instructions here.
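The ">15% spike or discontinuity" criterion from step 3 can be sketched as a check on a rate-vs-time series. This is a hypothetical illustration: the rate values below are invented, and on shift the check is done by eye on the WBM rate plot rather than programmatically.

```python
# Hypothetical sketch of the ">15% spike or discontinuity" rate check.
# The rate series below is invented for illustration.

def find_rate_jumps(rates, threshold=0.15):
    """Return the indices where the rate changes by more than
    `threshold` (fractionally) relative to the previous point."""
    jumps = []
    for i in range(1, len(rates)):
        prev = rates[i - 1]
        if prev > 0 and abs(rates[i] - prev) / prev > threshold:
            jumps.append(i)
    return jumps

# A stable rate of ~100 Hz with one spike: the check flags the jump up
# (index 3) and the jump back down (index 4).
rates = [100.0, 101.0, 99.5, 130.0, 100.5, 100.0]
print(find_rate_jumps(rates))  # -> [3, 4]
```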

Checks related to HLT information

  1. Check the stability of the HLT rates using online DQM (opened above for the L1 checks).
    • HLT counts as a function of lumi section (LS): workspace "HLT", "(Top) /HLT /FourVector /PathsSummary / HLT LS". The colors should be uniform as a function of LS for each trigger path, indicating that the rate of the path was stable. Look for abrupt changes in the total HLT rate (Any HLT) and in individual HLT rates.
    • Detailed information for each path as a 1D histogram can be found in "Paths".
    • Identify any paths that caused an unstable rate and note them in your shift summary.
  2. Check the stability of rates in the ExpressStream using Offline DQM, currently in
    • The link "Data" in the top left corner should read "/StreamExpress/XXXX/YYYY". If not, select the appropriate dataset.
    • In the workspace "HLT", "(Top) /HLT /FourVector /PathsSummary / HLT LS/", perform the same check as above.
    • Look in "Paths" at the histogram "HLT_Any". It is a histogram of the total HLT accepts in bins of LS. In your shift summary, record the average or stable value and the trend of this histogram. (Example: the average HLT_Any count is 350, decreasing smoothly from 6000 to 350 from the beginning of the run to LS = 35, with spikes at LS = 25, 35, and 125.)
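The HLT_Any summary asked for in the last step (stable value plus spike lumi sections) can be sketched as follows. The per-LS counts below are invented to mimic the example in the text; on shift they are read off the DQM histogram, and the spike criterion here (more than twice the stable level) is an assumption, not an official threshold.

```python
# Hypothetical sketch of summarizing the HLT_Any per-LS histogram:
# estimate the stable level and flag outlier lumi sections.
# The counts and the 2x spike criterion are assumptions for illustration.
import statistics

def summarize_hlt_any(counts_per_ls, spike_factor=2.0):
    """Return (stable_level, spike_ls_numbers) for per-LS HLT accepts."""
    stable = statistics.median_low(counts_per_ls)   # robust "stable value"
    spikes = [ls for ls, n in enumerate(counts_per_ls, start=1)
              if n > spike_factor * stable]         # LS numbering starts at 1
    return stable, spikes

counts = [350] * 40
counts[24] = 900    # spike at LS 25
counts[34] = 1200   # spike at LS 35
stable, spikes = summarize_hlt_any(counts)
print(stable, spikes)  # -> 350 [25, 35]
```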


During and at the end of your shift, please write a shift summary in the e-log.

Your shift summary should start with

    • the period of data taking you considered: 10.05 18:00 - 11.05 15:00;
    • the range of runs during this period: XXXXX - YYYYY;
    • a general summary: describe what kind of data was taken (colliding beams, single-beam running, splashes, cosmics, etc.) and summarize your findings from other e-logs (trigger, DQM, or general CMS problems during the runs that you analyzed);
    • a list of runs, or periods of runs with different conditions: XXXX-XXXX cosmics, XXXX-XXXX colliding beams, etc. You get this information from the general shift reports and DQM reports; if any run was marked BAD, also note it here;
    • after that, start a detailed description on a run-by-run basis. List the runs you analyzed and the problems you observed in these runs. You can find an example of a shift summary here.

Please notify the primary TPG On-Call person by email if you find something that requires an immediate fix, and put a message in the corresponding ELOG (Trigger or DQM).


The DQM checks

-- IoannisPapadopoulos - 2011-07-07
