
Commit 7ac3158

fix broken links
1 parent 5584f95 commit 7ac3158

File tree

7 files changed: +97 −97 lines changed


README.md

Lines changed: 12 additions & 12 deletions
@@ -32,15 +32,15 @@ data using remote clusters.
 `conda install -c conda-forge mlforecast`
 
 For more detailed instructions you can refer to the [installation
-page](https://nixtla.github.io/mlforecast/docs/getting-started/install.html).
+page](https://nixtlaverse.nixtla.io/mlforecast/docs/getting-started/install).
 
 ## Quick Start
 
 **Get Started with this [quick
-guide](https://nixtla.github.io/mlforecast/docs/getting-started/quick_start_local.html).**
+guide](https://nixtlaverse.github.io/mlforecast/docs/getting-started/quick_start_local.html).**
 
 **Follow this [end-to-end
-walkthrough](https://nixtla.github.io/mlforecast/docs/getting-started/end_to_end_walkthrough.html)
+walkthrough](https://nixtlaverse.nixtla.io/mlforecast/docs/getting-started/end_to_end_walkthrough)
 for best practices.**
 
 ### Videos
@@ -61,7 +61,7 @@ for best practices.**
 Current Python alternatives for machine learning models are slow,
 inaccurate and don’t scale well. So we created a library that can be
 used to forecast in production environments.
-[`MLForecast`](https://Nixtla.github.io/mlforecast/forecast.html#mlforecast)
+[`MLForecast`](https://nixtlaverse.nixtla.io/mlforecast/forecast#class-mlforecast)
 includes efficient feature engineering to train any machine learning
 model (with `fit` and `predict` methods such as
 [`sklearn`](https://scikit-learn.org/stable/)) to fit millions of time
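The hunk above repoints the `MLForecast` class reference. As a quick orientation, here is a minimal sketch of the fit/predict workflow that reference describes; the toy data, column names and frequency are illustrative, and the `fit`/`predict(h)` signatures are assumed from the current mlforecast API rather than taken from this commit.

```python
# Minimal sketch (illustrative, not part of this commit): train an sklearn-style
# model on a long-format series with columns unique_id, ds, y.
import pandas as pd
from sklearn.linear_model import LinearRegression
from mlforecast import MLForecast

series = pd.DataFrame({
    'unique_id': ['id_0'] * 60,
    'ds': pd.date_range('2000-01-01', periods=60, freq='D'),
    'y': range(60),
})

fcst = MLForecast(models=[LinearRegression()], freq='D', lags=[1, 7])
fcst.fit(series)           # builds lag features and fits every model
preds = fcst.predict(h=7)  # one row per series and horizon step, one column per model
```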
@@ -83,36 +83,36 @@ Missing something? Please open an issue or write us in
 ## Examples and Guides
 
 📚 [End to End
-Walkthrough](https://nixtla.github.io/mlforecast/docs/getting-started/end_to_end_walkthrough.html):
+Walkthrough](https://nixtlaverse.nixtla.io/mlforecast/docs/getting-started/end_to_end_walkthrough):
 model training, evaluation and selection for multiple time series.
 
 🔎 [Probabilistic
-Forecasting](https://nixtla.github.io/mlforecast/docs/how-to-guides/prediction_intervals.html):
+Forecasting](https://nixtlaverse.nixtla.io/mlforecast/docs/tutorials/prediction_intervals_in_forecasting_models):
 use Conformal Prediction to produce prediciton intervals.
 
 👩‍🔬 [Cross
-Validation](https://nixtla.github.io/mlforecast/docs/how-to-guides/cross_validation.html):
+Validation](https://nixtlaverse.nixtla.io/mlforecast/docs/how-to-guides/cross_validation):
 robust model’s performance evaluation.
 
 🔌 [Predict Demand
-Peaks](https://nixtla.github.io/mlforecast/docs/tutorials/electricity_peak_forecasting.html):
+Peaks](https://nixtlaverse.nixtla.io/mlforecast/docs/tutorials/electricity_peak_forecasting):
 electricity load forecasting for detecting daily peaks and reducing
 electric bills.
 
 📈 [Transfer
-Learning](https://nixtla.github.io/mlforecast/docs/how-to-guides/transfer_learning.html):
+Learning](https://nixtlaverse.nixtla.io/mlforecast/docs/how-to-guides/transfer_learning):
 pretrain a model using a set of time series and then predict another one
 using that pretrained model.
 
 🌡️ [Distributed
-Training](https://nixtla.github.io/mlforecast/docs/getting-started/quick_start_distributed.html):
+Training](https://nixtlaverse.nixtla.io/mlforecast/docs/getting-started/quick_start_distributed):
 use a Dask, Ray or Spark cluster to train models at scale.
 
 ## How to use
 
 The following provides a very basic overview, for a more detailed
 description see the
-[documentation](https://nixtla.github.io/mlforecast/).
+[documentation](https://nixtlaverse.nixtla.io/mlforecast/).
 
 ### Data setup
 
@@ -165,7 +165,7 @@ models = [
 ### Forecast object
 
 Now instantiate an
-[`MLForecast`](https://Nixtla.github.io/mlforecast/forecast.html#mlforecast)
+[`MLForecast`](https://nixtlaverse.nixtla.io/mlforecast/forecast#class-mlforecast)
 object with the models and the features that you want to use. The
 features can be lags, transformations on the lags and date features. You
 can also define transformations to apply to the target before fitting,
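The "Forecast object" paragraph in this hunk mentions lags, transformations on the lags, date features and target transformations. A hedged configuration sketch of those knobs follows; the specific values are illustrative and not taken from the README.

```python
# Illustrative MLForecast configuration (example values, not from the README).
from lightgbm import LGBMRegressor
from mlforecast import MLForecast
from mlforecast.target_transforms import Differences

fcst = MLForecast(
    models=[LGBMRegressor()],
    freq='D',
    lags=[1, 7, 14],                       # raw lags of the target
    date_features=['dayofweek', 'month'],  # calendar features derived from ds
    target_transforms=[Differences([1])],  # applied to the target before fitting, restored on predict
    # lag_transforms={...} can also be passed for rolling/expanding features on the lags
)
```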

nbs/docs/how-to-guides/cross_validation.ipynb

Lines changed: 13 additions & 13 deletions
@@ -17,7 +17,7 @@
 "\n",
 "## Prerequesites\n",
 "\n",
-"This tutorial assumes basic familiarity with `MLForecast`. For a minimal example visit the [Quick Start](quick_start_local.html) \n",
+"This tutorial assumes basic familiarity with `MLForecast`. For a minimal example visit the [Quick Start](https://nixtlaverse.nixtla.io/mlforecast/docs/getting-started/quick_start_local)\n",
 ":::"
 ]
 },
@@ -36,7 +36,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"[MLForecast](https://nixtla.github.io/mlforecast/) has an implementation of time series cross-validation that is fast and easy to use. This implementation makes cross-validation a efficient operation, which makes it less time-consuming. In this notebook, we'll use it on a subset of the [M4 Competition](https://www.sciencedirect.com/science/article/pii/S0169207019301128) hourly dataset. "
+"[MLForecast](https://nixtlaverse.nixtla.io/mlforecast/) has an implementation of time series cross-validation that is fast and easy to use. This implementation makes cross-validation a efficient operation, which makes it less time-consuming. In this notebook, we'll use it on a subset of the [M4 Competition](https://www.sciencedirect.com/science/article/pii/S0169207019301128) hourly dataset. "
 ]
 },
 {
@@ -72,7 +72,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"We assume that you have `MLForecast` already installed. If not, check this guide for instructions on [how to install MLForecast](../getting-started/install.html)."
+"We assume that you have `MLForecast` already installed. If not, check this guide for instructions on [how to install MLForecast](https://nixtlaverse.nixtla.io/mlforecast/docs/getting-started/install)"
 ]
 },
 {
@@ -88,11 +88,11 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"import pandas as pd \n",
+"import pandas as pd\n",
 "\n",
 "from utilsforecast.plotting import plot_series\n",
 "\n",
-"from mlforecast import MLForecast # required to instantiate MLForecast object and use cross-validation method "
+"from mlforecast import MLForecast # required to instantiate MLForecast object and use cross-validation method"
 ]
 },
 {
@@ -190,8 +190,8 @@
 }
 ],
 "source": [
-"Y_df = pd.read_csv('https://datasets-nixtla.s3.amazonaws.com/m4-hourly.csv') # load the data \n",
-"Y_df.head() "
+"Y_df = pd.read_csv('https://datasets-nixtla.s3.amazonaws.com/m4-hourly.csv') # load the data\n",
+"Y_df.head()"
 ]
 },
 {
@@ -252,9 +252,9 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"For this example, we'll use LightGBM. We first need to import it and then we need to instantiate a new [MLForecast](../../forecast.html#mlforecast) object. \n",
+"For this example, we'll use LightGBM. We first need to import it and then we need to instantiate a new [MLForecast](https://nixtlaverse.nixtla.io/mlforecast/forecast#class-mlforecast) object. \n",
 "\n",
-"In this example, we are only using `differences` and `lags` to produce features. See [the full documentation](https://nixtla.github.io/mlforecast) to see all available features.\n",
+"In this example, we are only using `differences` and `lags` to produce features. See [the full documentation](https://nixtlaverse.nixtla.io/mlforecast) to see all available features.\n",
 "\n",
 "Any settings are passed into the constructor. Then you call its `fit` method and pass in the historical data frame `df`. "
 ]
@@ -278,8 +278,8 @@
 "models = [lgb.LGBMRegressor(verbosity=-1)]\n",
 "\n",
 "mlf = MLForecast(\n",
-" models=models, \n",
-" freq=1,# our series have integer timestamps, so we'll just add 1 in every timeste, \n",
+" models=models,\n",
+" freq=1,# our series have integer timestamps, so we'll just add 1 in every timeste,\n",
 " target_transforms=[Differences([24])],\n",
 " lags=range(1, 25)\n",
 ")"
@@ -296,7 +296,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Once the `MLForecast` object has been instantiated, we can use the [cross_validation method](../../forecast.html#mlforecast.cross_validation)."
+"Once the `MLForecast` object has been instantiated, we can use the [cross_validation method](https://nixtlaverse.nixtla.io/mlforecast/forecast#method-cross-validation)"
 ]
 },
 {
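Since this hunk repoints the `cross_validation` link, a short usage sketch may help readers who land here; it assumes the `mlf` and `Y_df` objects from the cells above and the current `cross_validation(df, n_windows, h)` signature, which may differ in older releases.

```python
# Hedged sketch: backtest the mlf object defined above on Y_df.
cv_df = mlf.cross_validation(
    df=Y_df,
    n_windows=4,  # number of backtest folds
    h=24,         # forecast horizon per fold
)
# cv_df has unique_id, ds, cutoff, y and one column per model
print(cv_df.head())
```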
@@ -504,7 +504,7 @@
 "outputs": [],
 "source": [
 "from utilsforecast.evaluation import evaluate\n",
-"from utilsforecast.losses import rmse "
+"from utilsforecast.losses import rmse"
 ]
 },
 {
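For completeness, a sketch of how the `evaluate`/`rmse` imports above are typically applied to the cross-validation output; the `cv_df` frame and the exact `evaluate` signature are assumptions based on the utilsforecast API, not part of this commit.

```python
# Hedged sketch: score each model with RMSE on the cross-validation results.
from utilsforecast.evaluation import evaluate
from utilsforecast.losses import rmse

scores = evaluate(
    cv_df.drop(columns='cutoff'),  # evaluate expects unique_id, ds, y plus one column per model
    metrics=[rmse],
)
print(scores.groupby('metric').mean(numeric_only=True))  # average RMSE per model
```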

nbs/docs/how-to-guides/prediction_intervals.ipynb

Lines changed: 24 additions & 24 deletions
@@ -19,7 +19,7 @@
 "\n",
 "## Prerequesites\n",
 "\n",
-"This tutorial assumes basic familiarity with MLForecast. For a minimal example visit the [Quick Start](https://nixtla.github.io/mlforecast/docs/quick_start_local.html) \n",
+"This tutorial assumes basic familiarity with MLForecast. For a minimal example visit the [Quick Start](https://nixtlaverse.nixtla.io/mlforecast/docs/getting-started/quick_start_local)\n",
 ":::"
 ]
 },
@@ -323,7 +323,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"n_series = 8 \n",
+"n_series = 8\n",
 "uids = train['unique_id'].unique()[:n_series] # select first n_series of the dataset\n",
 "train = train.query('unique_id in @uids')\n",
 "test = test.query('unique_id in @uids')"
@@ -415,7 +415,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"# Create a list of models and instantiation parameters \n",
+"# Create a list of models and instantiation parameters\n",
 "models = [\n",
 " KNeighborsRegressor(),\n",
 " Lasso(),\n",
@@ -765,11 +765,11 @@
 "outputs": [],
 "source": [
 "fig = plot_series(\n",
-" train, \n",
-" test, \n",
-" plot_random=False, \n",
-" models=['KNeighborsRegressor'], \n",
-" level=levels, \n",
+" train,\n",
+" test,\n",
+" plot_random=False,\n",
+" models=['KNeighborsRegressor'],\n",
+" level=levels,\n",
 " max_insample_length=48\n",
 ")"
 ]
@@ -809,11 +809,11 @@
 "outputs": [],
 "source": [
 "fig = plot_series(\n",
-" train, \n",
-" test, \n",
-" plot_random=False, \n",
+" train,\n",
+" test,\n",
+" plot_random=False,\n",
 " models=['Lasso'],\n",
-" level=levels, \n",
+" level=levels,\n",
 " max_insample_length=48\n",
 ")"
 ]
@@ -853,11 +853,11 @@
 "outputs": [],
 "source": [
 "fig = plot_series(\n",
-" train, \n",
-" test, \n",
-" plot_random=False, \n",
+" train,\n",
+" test,\n",
+" plot_random=False,\n",
 " models=['LinearRegression'],\n",
-" level=levels, \n",
+" level=levels,\n",
 " max_insample_length=48\n",
 ")"
 ]
@@ -897,11 +897,11 @@
 "outputs": [],
 "source": [
 "fig = plot_series(\n",
-" train, \n",
-" test, \n",
-" plot_random=False, \n",
+" train,\n",
+" test,\n",
+" plot_random=False,\n",
 " models=['MLPRegressor'],\n",
-" level=levels, \n",
+" level=levels,\n",
 " max_insample_length=48\n",
 ")"
 ]
@@ -941,11 +941,11 @@
 "outputs": [],
 "source": [
 "fig = plot_series(\n",
-" train, \n",
-" test, \n",
-" plot_random=False, \n",
+" train,\n",
+" test,\n",
+" plot_random=False,\n",
 " models=['Ridge'],\n",
-" level=levels, \n",
+" level=levels,\n",
 " max_insample_length=48\n",
 ")"
 ]
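These cells only trim trailing whitespace, but for context, the intervals plotted with `level=levels` come from a conformal-prediction fit. A hedged sketch of that step follows; the frequency, lags and window settings are illustrative, and the `PredictionIntervals` helper and its arguments are assumed from the mlforecast API this guide describes.

```python
# Hedged sketch: produce conformal prediction intervals, then plot them with plot_series.
from mlforecast import MLForecast
from mlforecast.utils import PredictionIntervals

levels = [80, 95]  # interval levels to request

mlf = MLForecast(models=models, freq='H', lags=[1, 24])  # models list from the cell above
mlf.fit(train, prediction_intervals=PredictionIntervals(n_windows=3, h=48))
forecasts = mlf.predict(h=48, level=levels)  # adds <model>-lo-80, <model>-hi-80, ... columns
```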

nbs/docs/how-to-guides/transfer_learning.ipynb

Lines changed: 3 additions & 3 deletions
@@ -153,7 +153,7 @@
 "- `differences`: Differences to take of the target before computing the features. These are restored at the forecasting step.\n",
 "- `lags`: Lags of the target to use as features.\n",
 "\n",
-"In this example, we are only using `differences` and `lags` to produce features. See [the full documentation](https://nixtla.github.io/mlforecast/forecast.html) to see all available features.\n",
+"In this example, we are only using `differences` and `lags` to produce features. See [the full documentation](https://nixtlaverse.nixtla.io/mlforecast/forecast) to see all available features.\n",
 "\n",
 "Any settings are passed into the constructor. Then you call its `fit` method and pass in the historical data frame `Y_df_M3`. "
 ]
@@ -165,7 +165,7 @@
 "outputs": [],
 "source": [
 "fcst = MLForecast(\n",
-" models=models, \n",
+" models=models,\n",
 " lags=range(1, 13),\n",
 " freq='MS',\n",
 " target_transforms=[Differences([1, 12])],\n",
@@ -190,7 +190,7 @@
 "source": [
 "Y_df = pd.read_csv('https://datasets-nixtla.s3.amazonaws.com/air-passengers.csv', parse_dates=['ds'])\n",
 "\n",
-"# We define the train df. \n",
+"# We define the train df.\n",
 "Y_train_df = Y_df[Y_df.ds<='1959-12-31'] # 132 train\n",
 "Y_test_df = Y_df[Y_df.ds>'1959-12-31'] # 12 test"
 ]
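As a complement to these cells, a hedged sketch of the transfer step itself: the `fcst` object pretrained on `Y_df_M3` is reused to forecast a different series by passing its history at predict time. The `new_df` argument is assumed from the transfer-learning guide's API; the horizon matches the 12 test months above.

```python
# Hedged sketch: reuse the pretrained pipeline on an unseen series (AirPassengers).
fcst.fit(Y_df_M3)                              # pretrain on the source dataset
preds = fcst.predict(h=12, new_df=Y_train_df)  # forecast the new series from its own history
```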
