Simulations

The EIC task force has a large number of simulation tools available for investigating different types of physics processes. Unless noted otherwise, these can be accessed from /afs/rhic.bnl.gov/eic/PACKAGES.

Event Generators

The following event generators are available:

  • DJANGOH: A DIS generator with QED and QCD radiative effects.
  • DPMJet: A generator for very low Q2/real-photon physics in eA.
  • gmc_trans: A generator for semi-inclusive DIS with transverse-spin- and transverse-momentum-dependent distributions.
  • LEPTO: A leptoproduction generator, used as the basis for PEPSI and DJANGOH.
  • LEPTO-PHI: A version of LEPTO with the "Cahn effect" (azimuthal asymmetry) implemented.
  • MILOU: A generator for deeply virtual Compton scattering (DVCS), the Bethe-Heitler process and their interference.
  • RAPGAP: A generator for deeply inelastic scattering (DIS) and diffractive e + p events.
  • PYTHIA: A general-purpose high-energy physics event generator.
  • PEPSI: A generator for polarised leptoproduction.

There is code provided to convert the output from most of these generators into a ROOT format. It is distributed as part of eic-smear, the Monte Carlo smearing package.

Detector simulations

The following programmes are available for simulating detector geometry and response:

  • eic-smear: A package for applying very fast detector smearing to Monte Carlo events.
  • ESIM: An eRHIC detector simulation using GEANT3.
  • Geant4: A C++ detector simulation toolkit, the successor to GEANT3.
  • FLUKA: A Fortran-based transport code, including a graphical interface (Flair).

Manuals

See the pages of the programmes listed above for their documentation. Other useful references are:

  • BASES/SPRING v1 and v5.1: Cross section integration and Monte Carlo event generation. Used in RAPGAP and MILOU.

Helpful/Important Links

The following pages provide useful general information for Monte Carlo simulations:

  • MC programs:
    • A list of Monte Carlo programmes
    • HepForge, high-energy physics development environment, which includes many Monte Carlo generators.
    • Lecture slides from a course on QCD and Monte Carlos

  • Parton Distribution Function Interfaces:
    • LHAPDF, the Les Houches Accord PDF interface (a short usage sketch follows this list). The currently installed version is 5.8.6. The 32-bit libraries are at /afs/rhic.bnl.gov/eic/bin32/LHAPDF-5.8.6/lib, the 64-bit libraries are at /afs/rhic.bnl.gov/eic/bin/LHAPDF-5.8.6/lib, and the PDF grids can be found in /direct/eic+data/LHAPDF-5.8.6/lhapdf/PDFsets
    • The users' manual of the CERN PDFLIB
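
For orientation, here is a minimal sketch of the LHAPDF v5 C++ interface. The set name "CT10.LHgrid" is only an example and must be present in the PDFsets directory listed above; link against the library with -lLHAPDF.

// Minimal sketch of the LHAPDF v5 C++ interface. The PDF set name is an
// example; any set present in the PDFsets directory can be used.
#include "LHAPDF/LHAPDF.h"
#include <iostream>

int main() {
   LHAPDF::initPDFSetByName( "CT10.LHgrid" );  // load a PDF set by file name
   LHAPDF::initPDF( 0 );                       // member 0 = central fit

   const double x = 0.01, Q = 10.;             // Q in GeV
   // xfx returns the momentum density x*f(x, Q); flavour 0 is the gluon.
   std::cout << "x*g(x, Q) = " << LHAPDF::xfx( x, Q, 0 ) << std::endl;
   return 0;
}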

MC Analysis Techniques

How to get a cross section

To normalize your counts to a cross section you need two pieces of information:

  • the total number of trials, which is printed to the screen/log file when the MC finishes
  • the total integrated cross section (in general in microbarn), which is also printed to the screen/log file when the MC finishes

Counts = Luminosity x Cross Section

so the cross section corresponding to your counts is

cross section = counts * total integrated cross section / total number of trials

and the corresponding MC luminosity is

MC luminosity = total number of trials / total integrated cross section

For example, 10^8 trials with a total integrated cross section of 50 microbarn correspond to an MC luminosity of 10^8 / 50 ub = 2x10^6 ub^-1 = 2 pb^-1.


There are some handy ROOT objects available to get the total number of trials, the total integrated MC cross section and the total number of events in the tree. They are stored in the event-wise ROOT trees for PYTHIA, PEPSI, DJANGOH and MILOU:

  • total number of trials:
TObjString* nTrialsString( NULL );
file.GetObject( "nTrials", nTrialsString );
  • total integrated MC cross section:
TObjString* crossSectionString( NULL );
file.GetObject( "crossSection", crossSectionString );
  • total number of events in the tree:
TTree* tree( NULL );
file.GetObject( "EICTree", tree );
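
Putting these pieces together, the normalization can be read back in a short ROOT macro. The following is a sketch only: the input file name pythia.ep.root is a hypothetical example, and the cross section is assumed to be in microbarn as described above.

// Minimal sketch: read the normalization information from an event-wise
// ROOT file and compute the MC luminosity and the cross section per count.
#include "TFile.h"
#include "TObjString.h"
#include "TTree.h"
#include <iostream>

void mcNormalization() {
   TFile file( "pythia.ep.root", "READ" );  // hypothetical input file

   TObjString* nTrialsString( NULL );
   file.GetObject( "nTrials", nTrialsString );
   TObjString* crossSectionString( NULL );
   file.GetObject( "crossSection", crossSectionString );
   TTree* tree( NULL );
   file.GetObject( "EICTree", tree );

   const double nTrials = nTrialsString->GetString().Atof();
   const double crossSection = crossSectionString->GetString().Atof(); // microbarn

   // Counts = Luminosity x Cross Section, therefore:
   const double luminosity = nTrials / crossSection;     // microbarn^-1
   const double sigmaPerCount = crossSection / nTrials;  // microbarn per count

   std::cout << "events in tree:              " << tree->GetEntries() << "\n"
             << "MC luminosity [ub^-1]:       " << luminosity << "\n"
             << "cross section/count [ub]:    " << sigmaPerCount << std::endl;
}
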
How to scale the MC luminosity to the luminosity we want for the measurement

Very often it is impossible to generate so many events that the MC luminosity corresponds to, say, one month of eRHIC running. In this case we generate enough MC events that all distributions are smooth, and scale the statistical uncertainties. The factor needed is the ratio lumi-scale-factor = eRHIC luminosity / generated MC luminosity. Given this factor, there are two ways to scale (a sketch of both follows this list).

  • scaling the counts in the histogram by
h11->Scale( lumiScaleFactor );

This will scale the number of counts in each bin of the histogram to what you would get for the eRHIC luminosity. Statistical uncertainties can then be calculated simply as sqrt(counts).

  • scaling the statistical uncertainties only:
sqrt(counts) / sqrt(lumiScaleFactor)
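
The two options can be written as small helper functions. This is a sketch only; lumiScaleFactor is assumed to be the ratio defined above, with both luminosities in the same units.

// Sketch of the two scaling options for a counts histogram.
#include "TH1D.h"
#include "TMath.h"

// Option 1: scale the bin contents; sqrt(bin content) then gives the
// statistical uncertainty expected at the eRHIC luminosity.
void scaleCounts( TH1D* h11, double lumiScaleFactor ) {
   h11->Scale( lumiScaleFactor );
}

// Option 2: leave the counts untouched and scale only the uncertainties,
// delta = sqrt(counts) / sqrt(lumiScaleFactor), stored bin by bin.
void scaleErrorsOnly( TH1D* h11, double lumiScaleFactor ) {
   for ( int i = 1; i <= h11->GetNbinsX(); ++i ) {
      const double counts = h11->GetBinContent( i );
      h11->SetBinError( i, TMath::Sqrt( counts ) / TMath::Sqrt( lumiScaleFactor ) );
   }
}
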
Example: reduced cross section

This example shows how to calculate the reduced cross section needed to extract F_2 and F_L, and how to scale the statistical uncertainties to a certain integrated luminosity.

sigma_reduced = prefactor * dsigma/dx/dQ2 with prefactor = Q^4 * x / (2*pi*alpha_em^2*(1+(1-y)^2))

This cross section would have the unit barn * GeV^2. To make it dimensionless you need a conversion factor from barn to 1/GeV^2 ((hbar*c)^2 = 0.3894 millibarn * GeV^2).

sigma_reduced = counts(x,Q^2) * prefactor * total integrated MC cross section / total number of trials / conversion-barn-to-GeV / x-binsize / Q2-binsize

If the ROOT function Scale was used (first option above), the statistical uncertainty is

delta sigma_reduced = sqrt(counts(x,Q^2)) * prefactor * total integrated MC cross section / total number of trials / conversion-barn-to-GeV / x-binsize / Q2-binsize

otherwise it is

delta sigma_reduced = sqrt(counts(x,Q^2)) * prefactor * total integrated MC cross section / total number of trials / conversion-barn-to-GeV / x-binsize / Q2-binsize / sqrt(lumi-scale-factor)

Attention: all luminosities and cross sections must be given in the same unit (pb or fb or ...), and the conversion factor must match that unit.
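
As a sketch of how the pieces fit together, the following hypothetical helper computes sigma_reduced for one (x, Q^2) bin of a counts histogram. The histogram and parameter names are assumptions; the cross section is taken in millibarn so that it matches the 0.3894 mb * GeV^2 conversion, and y is computed as Q^2/(x*s) neglecting masses.

// Hypothetical helper: reduced cross section for one (x, Q2) bin.
// hCounts: counts per (x, Q2) bin; crossSection: total integrated MC
// cross section in millibarn (to match the conversion factor below);
// nTrials: total number of trials; s: squared centre-of-mass energy in GeV^2.
#include "TH2D.h"
#include "TMath.h"

double reducedCrossSection( const TH2D* hCounts, int ix, int iQ2,
                            double crossSection, double nTrials, double s ) {
   const double alphaEM = 1. / 137.036;
   const double conversion = 0.3894;  // (hbar*c)^2 in mb * GeV^2
   const double x   = hCounts->GetXaxis()->GetBinCenter( ix );
   const double Q2  = hCounts->GetYaxis()->GetBinCenter( iQ2 );
   const double dx  = hCounts->GetXaxis()->GetBinWidth( ix );
   const double dQ2 = hCounts->GetYaxis()->GetBinWidth( iQ2 );
   const double y   = Q2 / ( x * s );  // inelasticity, neglecting masses

   const double prefactor = Q2 * Q2 * x
      / ( 2 * TMath::Pi() * alphaEM * alphaEM * ( 1 + ( 1 - y ) * ( 1 - y ) ) );

   return hCounts->GetBinContent( ix, iQ2 ) * prefactor
      * crossSection / nTrials / conversion / dx / dQ2;
}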
