Setting the num_of_obs_per_detector according to the length of the noise correlation #421

anand-avinash started this conversation in General
In the simulations that @martamonelli has been doing, as well as for some future simulations, we are assuming that the 1/f noise has a finite correlation length. To enforce this, we need to manually create timeline chunks of the appropriate size and supply the number of chunks as a parameter to `num_of_obs_per_detector` or `n_blocks_time` in `sim.create_observations()`. This approach ensures that the timeline associated with a given observation instance contains exactly one correlation length of time-ordered data (TOD).

Should we consider implementing a feature in LBS that automates this segmentation of timelines? For example, we could add a parameter `noise_corr_len_s` (`s` for seconds) to `create_observations()`. Using this parameter, the entire timeline could be divided into the appropriate number of chunks, and the parameters `num_of_obs_per_detector` (and/or `n_blocks_time`) could be set accordingly. We might also need this for our e2e pipeline.
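To make the proposal concrete, here is a minimal sketch of what the automation could look like. The `num_obs_for_corr_len` helper is hypothetical (it is roughly what a `noise_corr_len_s` argument would do inside `create_observations()`), and the `Simulation`/`DetectorInfo` calls just follow the usual LBS tutorial pattern, so the exact constructor arguments may differ between versions:

```python
# Hypothetical sketch of the proposed behaviour: the helper below is what
# a `noise_corr_len_s` parameter would effectively do inside
# `create_observations()`. It is not part of litebird_sim.
import math

import litebird_sim as lbs


def num_obs_for_corr_len(duration_s: float, noise_corr_len_s: float) -> int:
    """Number of observations such that each one spans at most one
    noise correlation length. Hypothetical helper, not part of LBS."""
    if noise_corr_len_s <= 0:
        raise ValueError("noise_corr_len_s must be positive")
    # Round up so that no chunk is longer than one correlation length
    return max(1, math.ceil(duration_s / noise_corr_len_s))


# Example: a one-day simulation with a one-hour noise correlation length
duration_s = 86_400.0
noise_corr_len_s = 3_600.0

sim = lbs.Simulation(
    start_time=0.0,
    duration_s=duration_s,
    random_seed=12345,  # required by recent LBS versions
)

det = lbs.DetectorInfo(name="det_A", sampling_rate_hz=19.0)

# Today this number has to be computed by hand; the proposal is that
# create_observations() could accept noise_corr_len_s and set
# num_of_obs_per_detector (and/or n_blocks_time) from it internally.
num_obs = num_obs_for_corr_len(duration_s, noise_corr_len_s)  # -> 24

sim.create_observations(
    detectors=[det],
    num_of_obs_per_detector=num_obs,
)
```

Rounding up keeps every chunk at or below one correlation length; how to treat the final, shorter chunk (pad it, merge it, or just accept it) is probably the main design question for an automated version.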