
Fix sphinx warnings/errors #531

Merged
merged 1 commit into from
Jun 16, 2025
2 changes: 1 addition & 1 deletion Recipes/Along-slope-velocities.ipynb
@@ -5543,7 +5543,7 @@
"metadata": {},
"source": [
"### Map of along-slope velocity with bathymetry contours. \n",
"#### On a Large ARE Instance, this should take ~45 seconds"
"**On a Large ARE Instance, this should take ~45 seconds**"
]
},
{
4 changes: 2 additions & 2 deletions Recipes/Geostrophic_Velocities_from_Sea_Level.ipynb
@@ -1678,10 +1678,10 @@
"id": "d4520ed1-f69e-460e-8a7c-0abb0589387f",
"metadata": {},
"source": [
"\\begin{eqnarray}\n",
"$$\n",
" u_{g,s} = -\\frac{g}{f}\\frac{\\partial \\eta}{\\partial y} \\quad \\textrm{and} \\quad\n",
" v_{g,s} = \\frac{g}{f}\\frac{\\partial \\eta}{\\partial x}\n",
"\\end{eqnarray}"
"$$"
]
},
{
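The equation above gives the surface geostrophic velocities directly from sea level. As a rough illustration only (not code from this PR or the recipe), here is a minimal xarray sketch, assuming `eta` is a sea-level DataArray with horizontal coordinates `x` and `y` in metres and `lat` holds latitude in degrees; all of these names are hypothetical placeholders.

```python
import numpy as np
import xarray as xr

G = 9.81           # gravitational acceleration (m s^-2)
OMEGA = 7.2921e-5  # Earth's rotation rate (rad s^-1)

def surface_geostrophic(eta: xr.DataArray, lat: xr.DataArray):
    """u_g = -(g/f) d(eta)/dy and v_g = (g/f) d(eta)/dx (hypothetical coordinate names)."""
    f = 2 * OMEGA * np.sin(np.deg2rad(lat))   # Coriolis parameter
    u_g = -(G / f) * eta.differentiate("y")   # along-x surface geostrophic velocity
    v_g = (G / f) * eta.differentiate("x")    # along-y surface geostrophic velocity
    return u_g, v_g
```

Real model grids usually require the grid-cell metrics rather than simple coordinate derivatives, so treat this as a sketch of the equation, not of the recipe's actual implementation.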
2 changes: 1 addition & 1 deletion Recipes/Nearest_Neighbour_Distance.ipynb
@@ -1432,7 +1432,7 @@
"id": "ce17902e-7eae-4a71-aa8a-94703a1bb4ad",
"metadata": {},
"source": [
"The sea ice outputs need some processing before we can start our calculations. You can check this [example](IcePlottingExample.ipynb) for a guide on how to load and plot sea ice data. \n",
"The sea ice outputs need some processing before we can start our calculations. You can check this [example](Sea_Ice_Coordinates.ipynb) for a guide on how to load and plot sea ice data. \n",
" \n",
"We will follow these processing steps:\n",
"1. Correct time dimension values by subtracting 12 hours,\n",
6 changes: 2 additions & 4 deletions Tutorials/Make_Your_Own_Intake_Datastore.ipynb
@@ -89,8 +89,7 @@
"id": "bc200ca9-5cba-4413-a550-bd0a4f1b54bd",
"metadata": {},
"source": [
"# Building the datastore\n",
"___"
"## Building the datastore"
]
},
{
@@ -233,7 +232,7 @@
"tags": []
},
"source": [
"# Using your datastore"
"## Using your datastore"
]
},
{
@@ -448,7 +447,6 @@
"metadata": {},
"source": [
"# 2. The convenience method: `use_datastore`\n",
"___\n",
"\n",
"\n",
"With the `access-nri-intake` v1.1.1 release, it is now possible to build and load datastores, all in a single step.\n",
8 changes: 4 additions & 4 deletions Tutorials/intake_to_dask_efficiently_chunking.ipynb
@@ -4446,7 +4446,7 @@
"source": [
"## So even with optimised chunks that are about the right size, we still didn't really improve things a great deal.\n",
"\n",
"#### Sometimes, getting the chunks right can be more of an art than a science.\n",
"**Sometimes, getting the chunks right can be more of an art than a science.**\n",
"\n",
"- We tried to follow the 300MiB chunk rule of thumb above, and slowed down loading our dataset by 50% - so the warnings about degrading performance were right. This is because the chunks we chose weren't integer multiples of the disk chunks. However, without `validate_chunkspec`, we would have had no (easy) way of knowing this!\n",
"- If we wanted to throw away a large fraction of a dimension - for example, if we were only interested in data in the Southern Ocean, we could instead have tried to split our chunks up on latitude. That way, when we select a subset of data, we can throw away a lot of chunks - without having to extract a subset of their data first.\n",
@@ -10398,12 +10398,12 @@
"___\n",
"# Part 2: Combining coordinates\n",
"\n",
"### Unfortunately, that didn't seem to help much - it might have even made things a bit slower. \n",
"**Unfortunately, that didn't seem to help much - it might have even made things a bit slower.**\n",
"- So what is the issue?\n",
"\n",
"It turns our that xarray is checking that all our coordinates are consistent. Doing that with the 2D arrays `(ni,nj)` can be really quite slow. Fortunately, we have options to turn these checks off too, if we are confident we don't need them. In this instance, they come from a consistent model grid, so we know we can get rid of them.\n",
"\n",
"#### We don't use `xarray_open_kwargs` for this: we use `xarray_combine_by_kwargs`\n",
"**We don't use** `xarray_open_kwargs` **for this: we use** `xarray_combine_by_kwargs`\n",
"\n",
"Lets see if we can beat four minutes...\n",
"___\n",
@@ -11293,7 +11293,7 @@
"id": "940268d8-4f00-41e9-b3cd-ab041ef186b5",
"metadata": {},
"source": [
"#### So this actually slowed things down pretty substantially - that's not ideal!\n",
"**So this actually slowed things down pretty substantially - that's not ideal!**\n",
"\n",
"Step 2: Let's set the `compat` flag to `override`. This skips a bunch of checks that slow things down a bunch.\n",
"Note however: if we don't set `'datavars' : 'minimal'` and `'coords' : 'minimal'`, this can throw an error.\n"