Dear all, I am using BRAN2020 data (B-grid) to run OceanParcels, and I have run into two questions:
Question 1. The vertical velocity is defined at the lower edge of the grid cell, but Parcels only takes the upper edge. My workaround is to put all the data into an xarray Dataset and set the 'interp_method' of each field after calling 'FieldSet.from_xarray_dataset'. Please see the script below for how I handled that. BUT I strongly suspect it is not correct, because of my second question.
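For reference, this is the shape of an alternative I am considering: passing the interpolation methods when the FieldSet is constructed, instead of overwriting them afterwards. This is only a sketch; I am assuming (but have not verified) that FieldSet.from_xarray_dataset forwards an interp_method dictionary to each Field the way from_netcdf does. The names ds, variables and dimensions_xarray are the ones defined in the full script further down.

import parcels

# Sketch only: pass the B-grid interpolation methods at construction time.
# ASSUMPTION (unverified): from_xarray_dataset accepts an interp_method dict
# keyed by field name, like from_netcdf does.
interp_method = {
    'U': 'bgrid_velocity',
    'V': 'bgrid_velocity',
    'W': 'bgrid_w_velocity',
    'MLD': 'bgrid_tracer',
    'T': 'bgrid_tracer',
    'S': 'bgrid_tracer',
}
fieldset = parcels.FieldSet.from_xarray_dataset(
    ds, variables, dimensions_xarray, interp_method=interp_method
)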
Question 2. Everything (FieldSet, ParticleSet, Kernel) is set up successfully and no errors pop up. BUT when I run pset.execute(...), the model freezes at the very beginning and never continues running (please see the console output below). I also don't know where the number 41 comes from.
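One thing I want to rule out is whether the "freeze" is really just the remote THREDDS/OPeNDAP transfer being slow, since the arrays opened from the URLs are lazy and only come over the network when they are actually needed. A small timing check I could run (a sketch, using the same URL and subset as in the script below):

import time
import xarray as xr

# Sketch: time how long a single remote time slice takes to download.
# If this alone takes minutes, pset.execute() is probably not frozen but
# waiting on OPeNDAP transfers for the full month of 3D fields.
u_url = 'https://thredds.nci.org.au/thredds/dodsC/gb6/BRAN/BRAN2020/daily/ocean_u_1998_01.nc'
ds_u = xr.open_dataset(u_url, engine='netcdf4')

t0 = time.time()
one_slice = ds_u['u'].isel(Time=0).sel(xu_ocean=slice(0, 35), yu_ocean=slice(-45, -30)).load()
print(f'loaded one u slice of shape {one_slice.shape} in {time.time() - t0:.1f} s')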
I would appreciate any suggestions and help!
Supporting code/error messages
Please find all my code below
import parcels
import xarray as xr
import numpy as np
from parcels import (FieldSet, ParticleSet, Variable,
                     JITParticle, AdvectionRK4_3D)
from datetime import timedelta as delta
import math
import argparse
import subprocess
import warnings
warnings.simplefilter('ignore')

homedir = 'E:/langrangian/BRAN2020'
# outfile_name = 'experiment3'

# extract the data and make an xarray dataset
t_files = ['https://thredds.nci.org.au/thredds/dodsC/gb6/BRAN/BRAN2020/daily/ocean_temp_1998_01.nc']
s_files= ['https://thredds.nci.org.au/thredds/dodsC/gb6/BRAN/BRAN2020/daily/ocean_salt_1998_01.nc']
mld_files= ['https://thredds.nci.org.au/thredds/dodsC/gb6/BRAN/BRAN2020/daily/ocean_mld_1998_01.nc']
u_files= ['https://thredds.nci.org.au/thredds/dodsC/gb6/BRAN/BRAN2020/daily/ocean_u_1998_01.nc']
v_files= ['https://thredds.nci.org.au/thredds/dodsC/gb6/BRAN/BRAN2020/daily/ocean_v_1998_01.nc']
w_files= ['https://thredds.nci.org.au/thredds/dodsC/gb6/BRAN/BRAN2020/daily/ocean_w_1998_01.nc']
ds_t=xr.open_mfdataset(t_files, engine="netcdf4")
ds_s=xr.open_mfdataset(s_files, engine="netcdf4")
ds_mld=xr.open_mfdataset(mld_files, engine="netcdf4")
ds_u=xr.open_mfdataset(u_files, engine="netcdf4")
ds_v=xr.open_mfdataset(v_files, engine="netcdf4")
ds_w=xr.open_mfdataset(w_files, engine="netcdf4")
print(ds_mld)
x0, x1 = 0, 35
y0, y1 = -45, -30

# Extract data
my_t = ds_t['temp'].sel(xt_ocean=slice(x0, x1), yt_ocean=slice(y0, y1))
my_s = ds_s['salt'].sel(xt_ocean=slice(x0, x1), yt_ocean=slice(y0, y1))
my_mld = ds_mld['mld'].sel(xt_ocean=slice(x0, x1), yt_ocean=slice(y0, y1))
my_u = ds_u['u'].sel(xu_ocean=slice(x0, x1), yu_ocean=slice(y0, y1))
my_v = ds_v['v'].sel(xu_ocean=slice(x0, x1), yu_ocean=slice(y0, y1))
my_w = ds_w['w'].sel(xt_ocean=slice(x0, x1), yt_ocean=slice(y0, y1))

# Make xarray.Dataset
ds = xr.Dataset(
{
'temp': (["Time", "st_ocean", "yt_ocean", "xt_ocean"], my_t.data),
'salt': (["Time", "st_ocean", "yt_ocean", "xt_ocean"], my_s.data),
'mld': (["Time", "yt_ocean", "xt_ocean"], my_mld.data),
'u': (["Time", "st_ocean", "yu_ocean", "xu_ocean"], my_u.data),
'v': (["Time", "st_ocean", "yu_ocean", "xu_ocean"], my_v.data),
'w': (["Time", "sw_ocean", "yt_ocean", "xt_ocean"], my_w.data),
},
coords={
"Time": my_t["Time"].data,
"st_ocean": my_t["st_ocean"].data,
"xt_ocean": my_t["xt_ocean"].data,
"yt_ocean": my_t["yt_ocean"].data,
"xu_ocean": my_u["xu_ocean"].data,
"yu_ocean": my_u["yu_ocean"].data,
"sw_ocean": my_w["sw_ocean"].data,
}
)
variables= {'U': 'u',
'V': 'v',
'W': 'w',
'MLD': 'mld',
'T':'temp',
'S':'salt'
}
dimensions_xarray= {'U': {'lon': 'xu_ocean', 'lat': 'yu_ocean', 'depth': 'st_ocean','time':'Time'},
'V': {'lon': 'xu_ocean', 'lat': 'yu_ocean', 'depth': 'st_ocean','time':'Time'},
'W': {'lon': 'xt_ocean', 'lat': 'yt_ocean', 'depth': 'sw_ocean','time':'Time'},
'MLD': {'lon': 'xt_ocean', 'lat': 'yt_ocean','time':'Time'},
'T':{'lon': 'xt_ocean', 'lat': 'yt_ocean', 'depth': 'st_ocean','time':'Time'},
'S':{'lon': 'xt_ocean', 'lat': 'yt_ocean', 'depth': 'st_ocean','time':'Time'}
}
fieldset = parcels.FieldSet.from_xarray_dataset(ds, variables, dimensions_xarray)

####### from_b_grid ######
# set interp method to bgrid
fieldset.W.interp_method = 'bgrid_w_velocity'
fieldset.U.interp_method = 'bgrid_velocity'
fieldset.V.interp_method = 'bgrid_velocity'
fieldset.MLD.interp_method = 'bgrid_tracer'
fieldset.T.interp_method = 'bgrid_tracer'
fieldset.S.interp_method = 'bgrid_tracer'

# Temporary coordinates
xt_ocean = my_t.xt_ocean.values
target = 30
yt_ocean = my_t.yt_ocean.values
target2 = -42
st_ocean = my_u.st_ocean.values
target3 = 15

# Find the index of the nearest value
nearest_index = np.abs(xt_ocean - target).argmin()
nearest_index2 = np.abs(yt_ocean - target2).argmin()
nearest_index3 = np.abs(st_ocean - target3).argmin()

# Get the nearest value
lon = xt_ocean[nearest_index]
lat = yt_ocean[nearest_index2]
depth = st_ocean[nearest_index3]
# ------------- Kernel setting ------------ #
class SampleParticle(JITParticle):
    """Define a new particle class that samples T and S"""
    T = Variable('T', dtype=np.float32, initial=np.nan)
    S = Variable('S', dtype=np.float32, initial=np.nan)

# Define the kernel function (NO DECORATOR NEEDED)
def Sample_T_S(particle, fieldset, time):
    """Custom kernel that samples T and S at the particle location"""
    particle.T = fieldset.T[particle.time, particle.depth, particle.lat, particle.lon]
    particle.S = fieldset.S[particle.time, particle.depth, particle.lat, particle.lon]

# Define the delete function
def DeleteParticle(particle, fieldset, time):
    """Delete the particle when conditions are met"""
    particle.delete()

# ----------------------------------------------------- #
pset = parcels.ParticleSet(
    fieldset=fieldset, pclass=SampleParticle, lon=lon, lat=lat, depth=depth)

outfile_name = 'group2_15m_depth'
pfile = pset.ParticleFile(name=homedir + outfile_name + '.zarr', outputdt=delta(hours=24))  # output trajectory positions every 24 hours
from my_kernels import (Sample_T_S, DeleteParticle)
kernels = pset.Kernel(AdvectionRK4_3D) + pset.Kernel(Sample_T_S) + pset.Kernel(DeleteParticle)  # choose advection scheme and include kernels
pset.execute(
kernels,
runtime=delta(days=30, hours=12),
dt=delta(minutes=5),
output_file=pfile,
delete_cfiles=True
)
outfile=xr.open_zarr(homedir+outfile_name+'.zarr')
outfile.to_netcdf(homedir+outfile_name+'.nc')
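And this is the change I am thinking of trying to rule out slow lazy loading during pset.execute(): force the subsetted data into local memory (or write it to a local NetCDF once) before building the FieldSet. Again only a sketch, assuming the one-month subset fits in memory; ds, variables and dimensions_xarray are as defined above, and the local filename is just a placeholder.

# Sketch: load the remote subset once, so execute() does not trigger
# OPeNDAP reads while the particles are being advected.
ds_local = ds.load()                # assumes the ~1 month subset fits in RAM
# ...or write it to disk once and reuse it across runs (placeholder filename):
# ds.to_netcdf('E:/langrangian/BRAN2020/bran2020_subset_1998_01.nc')

fieldset = parcels.FieldSet.from_xarray_dataset(ds_local, variables, dimensions_xarray)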
This is the console output. The run freezes at '0%'. There actually is an output file, but it only contains one set of values, all at the release point (the check I ran on the zarr file is shown after the console output below).
>>> outfile=xr.open_zarr(homedir+outfile_name+'.zarr')
INFO: Output files are stored in E:/langrangian/BRAN2020group2_15m_depth.zarr.
0%| | 0/2635200.0 [02:57<?, ?it/s]
41
>>>
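To confirm that only the release position was written, I inspected the partial zarr output roughly like this (the path is the one from the INFO line above):

import xarray as xr

# With one particle and outputdt = 24 hours I would expect roughly 30 'obs'
# per trajectory, but only the release point seems to be there.
outfile = xr.open_zarr('E:/langrangian/BRAN2020group2_15m_depth.zarr')
print(outfile.sizes)       # e.g. {'trajectory': 1, 'obs': ...}
print(outfile.lon.values)
print(outfile.time.values)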