ERSAP-CODA provides integration between the ERSAP streaming framework and the CODA (CEBAF Online Data Acquisition) system at Jefferson Lab.
It enables real-time, distributed data processing by connecting ERSAP Java/C++ services into CODA workflows.
To build and run ersap-coda, you must install:
- Java JDK 8 or higher
- C++14-compliant compiler (e.g., GCC 5+, Clang 3.4+)
- CMake 3.5 or newer
- ZeroMQ (v4.x)
- Protocol Buffers (v3.x)
- Access to the Maven local repository (~/.m2/repository)
Set the ERSAP installation path before building:
export ERSAP_HOME=$HOME/ersap
export PATH=$ERSAP_HOME/bin:$PATH
export LD_LIBRARY_PATH=$ERSAP_HOME/lib:$LD_LIBRARY_PATH
This variable is critical: all ERSAP components deploy into $ERSAP_HOME, and ersap-coda relies on this layout.
On Ubuntu/Debian:
sudo apt update
sudo apt install build-essential cmake \
    libzmq3-dev libprotobuf-dev protobuf-compiler \
    openjdk-11-jdk
On macOS:
xcode-select --install
brew install cmake zeromq protobuf openjdk@11
Make sure Java is available:
export JAVA_HOME=$(/usr/libexec/java_home)
export PATH=$JAVA_HOME/bin:$PATH
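As a quick check that the JDK on your PATH is the one you expect (standard JDK tooling, nothing ERSAP-specific):
java -version
javac -version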
Build and publish the xMsg Java messaging library:
git clone https://github.com/JeffersonLab/xmsg-java.git
cd xmsg-java
./gradlew publishToMavenLocal
./gradlew deploy
Build and deploy the ERSAP Java framework:
git clone https://github.com/JeffersonLab/ersap-java.git
cd ersap-java
./gradlew publishToMavenLocal
./gradlew deploy
Optional IDE integration:
./gradlew cleanIdea idea # IntelliJ
./gradlew cleanEclipse eclipse # Eclipse
Build and install the ERSAP C++ framework:
git clone https://github.com/JeffersonLab/ersap-cpp.git
cd ersap-cpp
./configure --prefix="$ERSAP_HOME"
make
make install
This installs the ERSAP C++ libraries and services into $ERSAP_HOME.
From the top of the ersap-coda source directory:
./gradlew publishToMavenLocal
./gradlew deploy
cd main/cpp
mkdir build && cd build
cmake ..
make
make install
This builds the native C++ integration components and installs them into $ERSAP_HOME.
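If the install step does not pick up $ERSAP_HOME automatically, you can pass the install prefix explicitly; this uses the standard CMake option and is a suggestion, not a required step:
cmake .. -DCMAKE_INSTALL_PREFIX="$ERSAP_HOME"
make && make install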
Ensure binaries and libraries are correctly deployed:
ls $ERSAP_HOME/bin
ls $ERSAP_HOME/lib
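As an additional sanity check, verify that the shell launcher used later in this guide is present and executable (the path assumes the standard $ERSAP_HOME layout shown above):
test -x "$ERSAP_HOME/bin/ersap_shell" && echo "ersap_shell deployed" \
  || echo "ersap_shell missing - check the deploy step"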
You are now ready to configure and run ersap-coda within your DAQ workflows.
The remainder of this document provides technical instructions for initializing and managing a real-time CODA data processing pipeline using the ERSAP framework. The pipeline includes integration with the Event Transfer (ET) system, CODA DAQ, and an ERSAP-based stream processing chain.
In a new terminal window, start the ET system which acts as a buffer between the DAQ and ERSAP:
et_start -f /tmp/et_SRO_ERSAP -v -d -n 1000 -s 1000000 -p 23911
Parameters:
- -f: Path to the ET file (ET system name)
- -n: Number of events in memory
- -s: Event size in bytes
- -p: TCP port number for the ET server
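To confirm the ET server is actually listening on the chosen port before attaching ERSAP, you can use standard networking tools (not part of ET itself):
# Reports success only if something is accepting connections on port 23911
nc -z localhost 23911 && echo "ET server is listening on 23911"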
Use the CODA Run Control GUI or terminal interface to:
- Select the spPilot configuration.
- Execute the standard run sequence: Configure → Download → Prestart → Go.
In a new terminal window:
cd SRO/ersap
. set_env.sh # Run this once per terminal session
$ERSAP_HOME/bin/ersap_shell
Once inside the ERSAP shell, start the local processing pipeline:
ersap> set threads 1
ersap> run_local
Graceful Exit: Press CTRL+C and wait for a clean shutdown. ERSAP ensures all threads terminate properly.
Forced Termination: Use the following command to forcefully stop the pipeline:
/home/hatdaq/SRO/ersap/kill_ersap
You may stop and restart ERSAP independently of CODA:
If you performed a hard kill, reinitialize using:
ersap_shell
run_local
If you exited gracefully, start a new pipeline session by repeating only:
run_local
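Putting the two cases together, a typical recovery after a hard kill looks like this (paths follow the example setup above):
/home/hatdaq/SRO/ersap/kill_ersap      # force-stop any leftover pipeline processes
cd SRO/ersap
. set_env.sh                           # once per terminal session
$ERSAP_HOME/bin/ersap_shell            # then, inside the shell: run_local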
To edit the processing pipeline configuration, launch the ERSAP shell:
ersap> edit services
This opens a YAML file describing the service composition:
---
io-services:
  reader:
    class: org.jlab.ersap.actor.coda.engine.CodaEtSourceEngine
    name: Source
  writer:
    class: org.jlab.ersap.actor.coda.engine.binary.CodaSinkBinaryEngine
    name: Sink
services:
  - class: org.jlab.ersap.actor.coda.engine.binary.CodaHitFinderBinaryEngine
    name: HitFinder
  - class: SROPrinterService
    name: SoftTrig
    lang: cpp
  - class: org.jlab.ersap.actor.coda.engine.binary.MultiChannelDigitizerDisplayBinary
    name: Histogram
configuration:
  io-services:
    reader:
      et_name: "/tmp/et_SRO_ERSAP"
      et_port: 23911
      et_station: "ersap"
      fifo_capacity: 128
    writer:
      output_file: "/tmp/output_sro_data.bin"
      frames_per_file: 1000
  services:
    HitFinder:
      stream_source: "et"
      verbose: "no"
    SoftTrig:
      max_hits_to_show: 100
    Histogram:
      hist_bins: 100
      hist_min: 100
      hist_max: 8000
      roc_id: 2
      slot: 15
mime-types:
  - binary/data-evio
  - binary/sro-data
To control histogram rendering in accumulation mode, adjust the parameters under the Histogram service in the configuration section:
Histogram:
  hist_bins: 100
  hist_min: 100
  hist_max: 8000
  roc_id: 2
  slot: 15
- hist_bins: Number of bins
- hist_min, hist_max: Range for binning
- hist_titles: Histogram channel identifiers
- grid_size: Layout matrix (e.g., 4 for a 4x4 visualization)
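A sketch of how the optional hist_titles and grid_size keys might be added to the Histogram block; the value format shown for hist_titles (a comma-separated string) is an assumption, so consult the MultiChannelDigitizerDisplayBinary source for the authoritative syntax:
Histogram:
  hist_bins: 100
  hist_min: 100
  hist_max: 8000
  hist_titles: "adc_ch0, adc_ch1, adc_ch2, adc_ch3"   # assumed format for channel identifiers
  grid_size: 4                                        # assumed: 4 => a 4x4 display grid
  roc_id: 2
  slot: 15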
The CODA DAQ system can be restarted independently without affecting the ERSAP pipeline.
It is recommended to monitor pipeline behavior during long-running sessions to ensure data integrity and thread consistency.
The CodaSinkBinaryEngine now supports automatic file splitting based on the number of frames written:
- output_file: The base path for the output binary file (e.g., /tmp/output_sro_data.bin).
- frames_per_file: The maximum number of frames to write to each file before rolling over to a new file. When the limit is reached, a new file is created with a numeric postfix (e.g., output_sro_data-1.bin, output_sro_data-2.bin, etc.).
Example configuration:
writer:
  output_file: "/tmp/output_sro_data.bin"
  frames_per_file: 1000
This ensures that large data sets are automatically split into manageable files for easier handling and post-processing.
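After a long run you can list the resulting sequence of numbered files (the wildcard simply matches the naming rule described above):
ls /tmp/output_sro_data*.bin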
The CodaSinkFileEngine writes CSV files to the $ERSAP_USER_DATA/data/output directory.
By default, each file name uses the prefix out_.
To customize the output file prefix, use the following command in the ersap-shell CLI:
set outputFilePrefix
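For example, inside the ERSAP shell (the prefix value here is purely illustrative):
ersap> set outputFilePrefix myrun_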
To replay a stored EVIO file in streaming mode, you must use the CodaFileSourceEngine instead of the CodaEtSourceEngine.
Below is an example YAML configuration:
reader:
  class: org.jlab.ersap.actor.coda.engine.CodaFileSourceEngine
  name: Source
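The file-based reader also needs to be pointed at the EVIO file to replay in the configuration section. The key name below (file_name) is a hypothetical placeholder, not confirmed by this document; check the CodaFileSourceEngine source for the actual parameter name:
configuration:
  io-services:
    reader:
      file_name: "/tmp/output_sro_data.bin"   # hypothetical key; replace with the engine's actual option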