Replies: 2 comments · 2 replies
- Hi @TomMBIO, I don't think you need …
- Hi Erik, when I didn't do this, the particle files appeared to be overwritten with each iteration. But I will double check this …
Hi everyone,
I have made some progress with trying to run multiple particle sets, each beginning on a subsequent week:
```python
import datetime

import numpy as np
from parcels import ParticleSet

iterations = range(26)  # 26 weekly release events (one cohort per week)
psets = []
cohort = 0
release_time = datetime.datetime(2022, 3, 1, 0, 0, 0)

for i in iterations:
    # Array of cohort IDs, one per release site
    # (n_sites, release_lat, release_lon, fieldset and the custom Particle
    # class with its 'cohort' Variable are defined earlier in the script)
    cohort_array = np.full(n_sites, cohort)

    # Create a new ParticleSet for each iteration with updated release time and cohort ID
    pset = ParticleSet(
        fieldset=fieldset,
        pclass=Particle,
        lat=release_lat,
        lon=release_lon,
        depth=None,
        time=release_time,
        cohort=cohort_array,
    )
    psets.append(pset)

    # Move on to the next weekly release
    cohort += 1
    release_time += datetime.timedelta(weeks=1)
```
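For context, each new set then gets advected and its output written inside the same loop, roughly along these lines (the kernel, runtime, timesteps and per-cohort file naming here are placeholders rather than my exact setup):

```python
import datetime

from parcels import AdvectionRK4

# Placeholder kernel and output settings; one output store per cohort
output_file = pset.ParticleFile(
    name=f"output/cohort_{i:02d}.zarr",
    outputdt=datetime.timedelta(days=1),
)
pset.execute(
    AdvectionRK4,
    runtime=datetime.timedelta(weeks=8),
    dt=datetime.timedelta(minutes=30),
    output_file=output_file,
)
```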
Just to clarify, I am doing this instead of using `repeatdt` because I wish to assign each release event a cohort number.
It appears that `write_mode='append'` is not supported in Parcels, and I am a bit stuck. One solution could be to assign each output its own file and combine them in post-processing, but this feels a bit unsophisticated. Is there a way to append trajectories to the same zarr output file automatically? I assume using multiple ParticleSets in one experiment is common practice, but I am not sure.
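For reference, the post-processing I had in mind would look something like this (a rough sketch assuming one zarr store per cohort, as in the placeholder naming above, with the usual trajectory/obs dimensions of the Parcels output):

```python
import glob

import xarray as xr

# Open every per-cohort store (placeholder file pattern)
paths = sorted(glob.glob("output/cohort_*.zarr"))
datasets = [xr.open_zarr(p) for p in paths]

# Pad the 'obs' dimension to a common length, in case the cohorts were run
# for different numbers of output steps, then stack all trajectories into
# one dataset. Particle IDs may repeat across cohorts, but the 'cohort'
# variable distinguishes them.
max_obs = max(ds.sizes["obs"] for ds in datasets)
datasets = [ds.pad(obs=(0, max_obs - ds.sizes["obs"])) for ds in datasets]
combined = xr.concat(datasets, dim="trajectory")

# 'combined' can then be analysed directly or written back out to a single file
```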
Thank you,
Tom