- ePIC_preTDR_nHCal_DIS_MCPart
Overview
What is ePIC_preTDR_nHCal_DIS_MCPart?
ePIC_preTDR_nHCal_DIS_MCPart is an analysis framework for reconstructed data from the ePIC detector. It covers the data-processing and analysis workflow, with a focus on Monte Carlo particle data.
Use cases
The framework is used for detailed analyses of ePIC-generated data, including the processing of reconstructed high-energy physics events to study particle interactions. It is useful for researchers studying detector performance and particle behavior.
How to use
Users can set up their environment by creating directories for job submissions, configuring S3 access keys, and submitting jobs through Condor. Data can then be processed in an eic-shell environment, followed by merging results and executing macros to generate plots.
Key features
Key features include job submission via Condor for parallel processing, S3 storage access for data handling, and macros for generating visualizations of particle distributions. The toolkit facilitates efficient data analysis workflows.
Where to use
This framework is intended for use in high-energy physics research, particularly in experimental particle physics settings, such as those conducted at electron-ion colliders and similar experimental facilities.
Content
ePIC_preTDR_nHCal_DIS_MCPart
Analysis of ePIC reconstructed data. Just follow these steps:
Create directories for condor jobs:
mkdirCondor.sh
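The contents of mkdirCondor.sh are not reproduced here; a minimal sketch of what such a setup script typically does is shown below (condor/output/ is the directory used later in this README, while log/ and error/ are assumptions):
#!/bin/bash
# Hypothetical sketch: create the working directories the Condor jobs write into.
mkdir -p condor/output condor/log condor/error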
Create S3setup.sh to enable access to S3 storage and fill in the access keys:
#!/bin/bash
export S3_ACCESS_KEY=<S3_ACCESS_KEY>
export S3_SECRET_KEY=<S3_SECRET_KEY>
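The jobs presumably read the keys from the environment, so the file is sourced in the shell from which jobs are submitted (an assumption about the intended usage):
source S3setup.sh   # exports S3_ACCESS_KEY and S3_SECRET_KEY into the current shell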
Submit jobs:
condor_submit submitRecoAnalysis.job
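submitRecoAnalysis.job is the HTCondor submit description provided with the repository and is not reproduced here. The jobs can be followed with standard HTCondor commands, for example:
condor_q          # show the status of queued and running jobs
condor_history    # inspect jobs that have already left the queue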
Get the container and run eic-shell:
curl --location https://get.epic-eic.org | bash -s -- -c jug_xl -v 22.11-main-stable ./eic-shell
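The installer fetches the container image and creates the ./eic-shell launcher script. Start the container from the installation directory; the merging and ROOT macro steps below are presumably executed inside it:
./eic-shell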
When the jobs finish, merge the results:
cd condor/output/
chmod u+x haddmerge.pl
./haddmerge.pl . | tee hadd.log
mkdir ../../output/data
cp <merged_output.root> ../../output/data/output_primary_full.root
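haddmerge.pl is assumed to be a wrapper around ROOT's hadd over the per-job output files. If needed, the merge can also be done directly with hadd (the input file list below is a placeholder):
hadd merged_output.root <per-job .root outputs>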
Run macros:
cd output/data/
root -l -b -q drawMCpartEtaE.C+
root -l -b -q drawMCpartEtaMom.C+
The resulting plots are written to:
ls output/outputSpectra/eta_primary/