General purpose analysis software for (SI)DIS at the EIC
This repository provides a set of common tools for the analysis of both full and fast simulations, including the following features:
- General event loops for reading upstream data structures; for example, `src/AnalysisDelphes.cxx` for reading Delphes trees
- Kinematics reconstruction methods (e.g., leptonic, hadronic, Jacquet-Blondel, etc.)
  - see Kinematics Documentation for more information
  - see Jet Kinematics Documentation for jet kinematics
- Calculations of SIDIS variables, such as `PhiH` and `qT`, for single particles, as well as jet variables
- Automation for downloading or streaming simulation data from S3, along with the capability to combine data from varying Q2 ranges using weights
- Ability to specify arbitrary multi-dimensional binning schemes and cuts using Adage
- Output data structures include multi-dimensionally binned histogram sets, tables, and `TTree`s
- An analysis is primarily driven by macros, used to set up the binning and other settings
If you prefer to use your own analysis code, but would still like to make use of the common tools provided in this repository (e.g., kinematics reconstruction), this is also possible; you only need to stream the data structure you need, most likely within the event loop. In this situation, it is recommended you fork the repository (pull requests are also welcome).
Here is a flowchart showing the main classes (underlined) and the connections to upstream simulation output:
First, clone this `epic-analysis` Github repository:

```bash
git clone git@github.com:eic/epic-analysis.git     # if you have SSH permission
git clone https://github.com/eic/epic-analysis.git # if you do not have SSH permission
```

This will create the directory `epic-analysis`, which you can then `cd` into.
These are common dependencies used for the upstream simulation, some of which
are needed for `epic-analysis` as well.
Follow the EIC Software Environment Setup Guide to obtain and install the EIC software image.
- The `eic-shell` script is used to start a container shell
- This image contains all the upstream dependencies needed for EIC simulations
- All documentation below assumes you are running in `eic-shell`

If you upgrade your image (`eic-shell --upgrade`), you may need to clean build everything: `make all-clean && make`
These are additional dependencies needed by `epic-analysis`; they will be built locally and stored in the `deps/` directory (see `deps/README.md` for more details). This section documents how to obtain and build the local dependencies.

Delphes is the only local dependency that is not mirrored in `deps/`, so you must download and build it first:

```bash
deps/install_delphes.sh
```

- Alternatively, if you already have a `delphes` build elsewhere, symlink `deps/delphes` to it
- All other dependencies in `deps/` are mirrors, and are already included with `epic-analysis`; they will be built automatically later
While you are waiting for Delphes to build, you may want to:
- Prepare to analyze some data from S3, following s3tools documentation
- Read through the `Kinematics` class header and source, along with its documentation, to see what physics reconstruction methods are available
- Try the tutorial macros in the `tutorial/` directory, to learn how to run `epic-analysis`
First, set environment variables:

```bash
source environ.sh
```

Then compile `epic-analysis` (and some other local dependencies):

```bash
make
```

- We have not yet upgraded to `cmake` in this repository, and still use `Makefile`s
- Build target locations are not yet configurable, and everything will stay within `epic-analysis` (e.g., libraries will be installed in `lib/`)
- Additional `make` targets are available (see `Makefile`), for more control during development:

```bash
make                      # builds dependencies, then `epic-analysis` (equivalent to `make all`)
make release              # build with optimization enabled
make debug                # build with debugging symbols
make clean                # clean `epic-analysis` (but not dependencies)
make deps                 # builds only dependencies
make deps-clean           # clean dependencies
make all-clean            # clean `epic-analysis` and dependencies
make <dependency>         # build a particular `<dependency>`
make <dependency>-clean   # clean a particular `<dependency>`
```

Additional build options are available:

```bash
INCLUDE_CENTAURO=1 make   # build with fastjet plugin Centauro (not included in Delphes by default!)
EXCLUDE_DELPHES=1 make    # build without Delphes support; primarily used to expedite CI workflows
INCLUDE_PODIO=1 make      # build with support for reading data with PODIO
```

If you're ready to try the software hands-on, follow the tutorials in the `tutorial/` directory. Otherwise, continue reading below.
- for convenience, the wrapper script `deps/run_delphes.sh` is provided, which runs `delphes` on a given `hepmc` or `hepmc.gz` file, and sets the output file names and the appropriate configuration card
  - configuration cards are stored in the `deps/delphes_EIC/` directory, a mirror of `eic/delphes_EIC`
  - environment must be set first (`source environ.sh`)
  - run `deps/run_delphes.sh` with no arguments for a usage guide
  - in the script, you may need to change `exeDelphes` to the proper executable, e.g., `DelphesHepMC2` or `DelphesHepMC3`, depending on the format of your generator input
  - if reading a gzipped file (`*.hepmc.gz`), this script will automatically stream it through `gunzip`, so there is no need to decompress beforehand
  - there are some `hepmc` files on S3; follow s3tools documentation for scripts and guidance
- the output will be a `TTree`, stored in a `root` file
  - output files will be placed in `datarec/`
  - input `hepmc(.gz)` files can be kept in `datagen/`
- The class `AnalysisDelphes` contains the event loop for reading Delphes trees
  - There are several classes which derive from the base `Analysis` class; `Analysis` handles common setup and final output, whereas the derived classes are tuned to read the upstream data formats
- See the event loop in `src/AnalysisDelphes.cxx` for details of how the fast simulation data are read
- Full simulation files are stored on S3; follow s3tools documentation for scripts and guidance
- In general, everything that can be done in fast simulation can also be done in full simulation; just replace your usage of `AnalysisDelphes` with `AnalysisEpic`
  - In practice, implementations may sometimes be a bit out of sync, where some features that exist in fast simulation do not exist in full simulation, or vice versa
- See the event loop in `src/AnalysisEpic.cxx` for details of how the full simulation data are read
- For ECCE or ATHENA full simulations, the implementation is similar to that for ePIC, but use `AnalysisEcce` or `AnalysisAthena`
After simulation, this repository separates the analysis procedure into two stages: (1) the Analysis stage includes the event loop, which processes either fast or full simulation output, kinematics reconstruction, and your specified binning scheme, while (2) the Post-processing stage includes histogram drawing, comparisons, table printouts, and any feature you would like to add.
The two stages are driven by macros. See examples in the tutorial directory,
and follow the README.
- Note: most macros stored in this repository must be executed from the `epic-analysis` top directory, not from within their subdirectory, e.g., run `root -b -q tutorial/analysis_template.C`; this is because certain library and data directory paths are given as relative paths

In general, these macros will run single-threaded. See HPC documentation for guidance on how to run multi-threaded or on a High Performance Computing (HPC) cluster.
- the `Analysis` class is the main class that performs the analysis; it is controlled at the macro level
  - a typical analysis macro must do the following:
    - instantiate an `Analysis`-derived class (e.g., `AnalysisDelphes`)
    - set up bin schemes and bins (arbitrary specification, see below)
    - set any other settings (e.g., a maximum number of events to process, useful for quick tests)
    - execute the analysis
- the input is a config file, which contains a list of files to analyze, together with settings such as beam energy and Q2 ranges; see doc/example.config for an example config file and more details
- the output will be a `root` file, filled with `TObjArray`s of histograms
  - each `TObjArray` can be for a different subset of events (bin), e.g., different minimum `y` cuts, so that their histograms can be compared and divided; you can open the `root` file in a `TBrowser` to browse the histograms
  - the `Histos` class is a container for the histograms, and instances of `Histos` will also be streamed to `root` files, along with the binning scheme (handled by the Adage `BinSet` class); downstream post-processing code makes use of these streamed objects, rather than the `TObjArray`s
- derived classes are specific to upstream data structures:
  - `AnalysisDelphes` for Delphes trees (fast simulations)
  - `AnalysisAthena` for trees from the DD4hep+Juggler stack (ATHENA full simulations)
  - `AnalysisEcce` for trees from the Fun4all+EventEvaluator stack (ECCE full simulations)
- the `Kinematics` class is used to calculate all kinematics
  - `Analysis`-derived classes have one instance of `Kinematics` for generated variables, and another for reconstructed variables, to allow quick comparison (e.g., for resolutions)
  - calculations are called by `Analysis`-derived classes, event-by-event or particle-by-particle or jet-by-jet
  - see Kinematics Documentation for details of `Kinematics`
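The macro structure described above can be sketched as follows. This is only an illustration based on this README: the constructor arguments and method names (e.g., `maxEvents`, `AddBinScheme`, `Execute`) are assumptions, so check the working macros in `tutorial/` for the actual interface.

```cpp
// sketch of an analysis macro (run from the epic-analysis top directory);
// class and method names below are assumptions -- see tutorial/ for
// actual, working examples
void analysis_sketch(TString configFile="tutorial/delphes.config") {
  // instantiate an Analysis-derived class (here: fast simulation via Delphes)
  AnalysisDelphes *A = new AnalysisDelphes(configFile, "out.sketch");
  // optional settings, e.g., limit the number of events for a quick test
  A->maxEvents = 10000;
  // set up a bin scheme with bins (arbitrary specification; see below)
  A->AddBinScheme("q2");
  A->BinScheme("q2")->BuildBin("Min", 1.0); // Q2 > 1 GeV2
  // execute the analysis (event loop, kinematics reconstruction, output)
  A->Execute();
}
```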
- The bins may be specified arbitrarily, using the Adage `BinSet` and `CutDef` classes
  - see example `analysis_*.C` macros in `tutorial/`
  - `CutDef` can store and apply an arbitrary cut for a single variable, such as:
    - ranges: `a<x<b` or `|x-a|<b`
    - minimum or maximum: `x>a` or `x<a`
    - no cut (useful for "full" bins)
- The set of bins for a variable is defined by `BinSet`, a set of bins
  - These bins can be defined arbitrarily, with the help of the `CutDef` class; you can either:
    - Automatically define a set of bins, e.g., `N` bins between `a` and `b`
      - Equal width in linear scale
      - Equal width in log scale (useful for `x` and `Q2`)
      - Any custom `TAxis`
    - Manually define each bin
      - example: specific bins in `z` and `pT`:
        - `|z-0.3|<0.1` and `|pT-0.2|<0.05`
        - `|z-0.7|<0.1` and `|pT-0.5|<0.05`
      - example: 3 different `y` minima:
        - `y>0.05`
        - `y>0.03`
        - `y>0` (no cut)
      - note that the arbitrary specification permits bins to overlap, e.g., an event with `y=0.1` will appear in all three bins
- Multi-dimensional binning
  - Binning in multiple dimensions is allowed, e.g., 3D binning in `x`, `Q2`, `z`
  - See Adage documentation for more information on how multi-dimensional binning is handled, as well as the Adage syntax reference
  - Be careful of the curse of dimensionality
- The `Analysis` class is also capable of producing a simple `TTree`, handled by the `SidisTree` class, which can also be useful for analysis
  - As the name suggests, it is a flat tree with a minimal set of variables, specifically those needed for SIDIS spin asymmetry analysis
  - The tree branches are configured to be compatible with asymmetry analysis code built on the BruFit framework
  - There is a switch in `Analysis` to enable/disable whether this tree is written
- results processing is handled by the `PostProcessor` class, which does tasks such as printing tables of average values and drawing ratios of histograms
  - this class is steered by `postprocess_*.C` macros, which do the following:
    - instantiate `PostProcessor`, with the specified `root` file that contains output from the analysis macro
    - loop over bins and perform actions, using Adage
  - see `src/PostProcessor.h` and `src/PostProcessor.cxx` for available post-processing routines; you are welcome to add your own
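Following the structure described above, a post-processing macro might look like the sketch below; the method names here are guesses for illustration only, so consult the `postprocess_*.C` macros in `tutorial/` for the real interface:

```cpp
// sketch of a post-processing macro; method names are assumptions --
// see tutorial/postprocess_*.C for working examples
void postprocess_sketch(TString infile="out/sketch.root") {
  // instantiate PostProcessor with the analysis-stage output file
  PostProcessor *P = new PostProcessor(infile);
  // loop over bins (via Adage) and perform an action on each Histos object
  P->Op()->Payload([&P](Histos *H) { P->DrawSingle(H, "Q2", "hist"); });
  P->Execute(); // run the loop over bins
  P->Finish();  // write the output
}
```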
- Add your own analysis scripts (macros, etc.) in `macro/`, either in the main directory or in a subdirectory of `macro/`
  - The `macro/ci` directory is for scripts used by the CI (see `.github/workflows/ci.yml`); you are welcome to add new analysis scripts to the CI
  - Make changes in classes such as `PostProcessor` as needed
- Git workflow:
  - Contributions are welcome via pull requests and issue reports; it is recommended to fork this repository or ask to be a contributor
  - Continuous Integration (CI) will trigger on pull requests, which will build and test your contribution
    - see the `Actions` tab for workflow details
    - many CI jobs will not work properly from forks (for security), but you may ask to be a contributor
  - It is recommended to keep up to date with developments by browsing the pull requests and issues, and viewing the latest commits by going to the `Insights` tab and clicking `Network` to show the commit graph
