T&S | Surface and Bottom Errors
Python file: `validate_ts_en4_hourly.py`
For looking directly at surface and bottom errors and CRPS, these scripts use hourly model output and compare it to EN4 profile data.
Required Observations: EN4 profile data in monthly files
Required Model Output: Instant hourly temperature and salinity in monthly files
These scripts will:
- Extract nearest model SST, SSS, SBT and SBS at EN4 locations and save to file.
- Define observed surface and bottom variables as means over some depth.
- Calculate surface and bottom errors.
- Calculate surface and bottom CRPS.
- Average errors and CRPS in regional bins.
- Smooth spatial data using a radial mean method.
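The surface/bottom averaging step above can be sketched in plain Python. This is a minimal illustration of the idea (not the package's implementation); the profile values, depths and helper name are all hypothetical:

```python
# Sketch: define 'surface' and 'bottom' observed values as means over a
# depth range, as the scripts do with surface_def and bottom_def.
# All data and names here are illustrative only.

def depth_mean(values, depths, d_min, d_max):
    """Mean of profile values whose depth lies in [d_min, d_max] metres."""
    selected = [v for v, d in zip(values, depths) if d_min <= d <= d_max]
    return sum(selected) / len(selected) if selected else None

# Example EN4-style profile: temperature (degC) at increasing depth (m)
temps = [15.0, 14.8, 14.5, 11.0, 10.5, 10.2]
depths = [0.5, 1.5, 5.0, 40.0, 45.0, 48.0]
bathy = 50.0  # water depth at this location (m)

sst = depth_mean(temps, depths, 0, 2)               # surface_def = 2 m
sbt = depth_mean(temps, depths, bathy - 10, bathy)  # bottom_def = 10 m
print(sst, sbt)
```

With these numbers, the "surface" temperature is the mean of the two samples in the top 2 m and the "bottom" temperature is the mean of the three samples within 10 m of the seabed.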
An example script can be used as a template: `sample_analyse_ts_en4_hourly.py`
- Start by importing the python file:

```python
import validate_ts_en4_hourly
```
- Define your file paths, regional masks (if using) and model run name (see sample script).
- Run `analyse_and_extract()`:
```python
class analyse_and_extract():
    def __init__(self, fn_nemo_data, fn_nemo_domain, fn_en4, fn_out,
                 surface_def=2, bottom_def=10,
                 nemo_chunks={'time_counter': 50},
                 bathymetry=None):
        '''
        Analysis of hourly model temperature and salinity output with EN4 profiles.

        INPUT
         fn_nemo_data (str)   : Absolute path to output files containing hourly data
         fn_nemo_domain (str) : Absolute path to nemo domain file
         fn_en4 (str)         : Absolute path to EN4 profile files
         fn_out (str)         : Absolute path to desired output file (1 file)
         surface_def (float)  : Definition of the 'surface' in metres - for averaging
         bottom_def (float)   : Definition of the 'bottom' in metres - for averaging
         bathymetry (2Darray) : Bathymetry data to use for the bottom definition.
                                If not supplied, only the surface will be analysed.
        '''
```

This does the extraction, error calculation and surface/bottom averaging of the EN4 data. The new derived variables and extracted values are saved to an analysis file (`fn_out`).
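The "extract nearest model point" part of this step can be sketched on its own. Below is a stand-alone, pure-Python illustration of nearest-neighbour matching by great-circle distance; the grid points and variable names are hypothetical, and the real routine works on the NEMO grid read from `fn_nemo_domain`:

```python
import math

def haversine_km(lon1, lat1, lon2, lat2):
    """Great-circle distance between two (lon, lat) points, in km."""
    lon1, lat1, lon2, lat2 = map(math.radians, (lon1, lat1, lon2, lat2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * math.asin(math.sqrt(a))

# Toy flattened model grid: (lon, lat, SST) tuples -- illustrative data only
model_points = [(-4.0, 50.0, 12.1), (-3.5, 50.0, 12.3), (-4.0, 50.5, 11.9)]
obs_lon, obs_lat = -3.6, 50.1  # EN4 profile location

# Pick the model point closest to the observation location
nearest = min(model_points,
              key=lambda p: haversine_km(p[0], p[1], obs_lon, obs_lat))
print(nearest)  # the model value nearest[2] is what gets saved to file
```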
- Do regional averaging by calling `analyse_regional()`:
```python
def analyse_regional(fn_stats, fn_nemo_domain, fn_out,
                     regional_masks=[], region_names=[]):
    '''
    A routine for averaging the hourly analysis into regional subdomains.
    This routine will also average into seasons as well as over all time.
    The resulting output dataset will have dimensions (region, season).

    INPUTS
     fn_stats (str)        : Absolute path to analysis output from analyse_ts_hourly_en4()
     fn_nemo_domain (str)  : Absolute path to NEMO domain_cfg.nc
     fn_out (str)          : Absolute path to desired output file
     regional_masks (list) : List of 2D boolean arrays. Where true, this is the region
                             used for averaging. The whole domain will be added to the
                             list, or will be the only domain if left as the default.
     region_names (list)   : List of strings, the names used for each region.
    '''
```
Given some region definitions (boolean arrays), this calculates average errors and CRPS for the specified regions. COAsT already contains routines to help with the creation of these masks, in the MASK_MAKER class. See the COAsT website for more information.
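The regional-binning idea can be shown with a small, self-contained sketch. This is an illustration in plain Python, not the routine itself, and all data and names (`errors`, `in_region`) are hypothetical:

```python
# Sketch: average per-profile errors into regional bins using boolean masks,
# mirroring how regional_masks selects profiles for averaging.
# All data here is illustrative only.

errors = [0.5, -0.2, 0.3, 1.0, -0.4]  # per-profile SST errors (degC)
in_region = [
    [True, True, False, False, False],  # mask for a "north" region
    [False, False, True, True, True],   # mask for a "south" region
]
region_names = ["north", "south"]

means = {}
for name, mask in zip(region_names, in_region):
    selected = [e for e, m in zip(errors, mask) if m]
    means[name] = sum(selected) / len(selected)

print(means)
```

In the real routine the masks are 2D arrays over the model domain; here they are flattened to one entry per profile to keep the sketch short.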
- If spatial smoothing is desired, the `radius_means` routine can be used on the analysis file:
```python
class radius_means():
    def __init__(self, fn_stats, fn_out, grid_lon, grid_lat,
                 radius=10):
        '''
        For a set of specified longitudes and latitudes, will calculate the mean of all
        statistics within a specified radius. This has a smoothing effect on
        profile data (horizontal smoothing).

        INPUTS
         fn_stats (str)  : Absolute path to statistics/extracted data file
         fn_out (str)    : Absolute path to desired output file
         grid_lon (array): 1D array containing longitude values of grid
         grid_lat (array): 1D array containing latitude values of grid
         radius (float)  : Radius over which to average, in km
        '''
```
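The radial-mean method can be illustrated with a short, self-contained sketch: for each target grid point, average every statistic lying within `radius` km. This is a pure-Python illustration under assumed data, not the package's implementation:

```python
import math

def haversine_km(lon1, lat1, lon2, lat2):
    """Great-circle distance between two (lon, lat) points, in km."""
    lon1, lat1, lon2, lat2 = map(math.radians, (lon1, lat1, lon2, lat2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * math.asin(math.sqrt(a))

# Hypothetical scattered statistics: (lon, lat, error) per profile
stats = [(-4.0, 50.0, 0.5), (-3.9, 50.05, 0.7), (-2.0, 51.0, 2.0)]
grid = [(-4.0, 50.0)]  # target grid point(s)
radius = 25.0          # averaging radius in km

smoothed = []
for glon, glat in grid:
    near = [v for lon, lat, v in stats
            if haversine_km(lon, lat, glon, glat) <= radius]
    smoothed.append(sum(near) / len(near) if near else None)

print(smoothed)
```

The first two profiles fall within 25 km of the grid point and are averaged; the distant third profile is excluded, which is what produces the horizontal smoothing.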
- Model output files should contain variables named: votemper_top, votemper_bot, vosaline_top, vosaline_bot.
- These scripts use the time variable `time_instant`, not `time_counter`.