
Commit b285702

Update docs links; Remove ModelSchema usage (#316) (#320)
1 parent 004d62a commit b285702

19 files changed: +33 −139 lines

batch-ai-systems/fraud_batch/1_fraud_batch_feature_pipeline.ipynb

Lines changed: 1 addition & 1 deletion
@@ -363,7 +363,7 @@
 "source": [
 "### <span style=\"color:#ff5f27;\"> 🪄 Creating Feature Groups </span>\n",
 "\n",
-"A [feature group](https://docs.hopsworks.ai/3.0/concepts/fs/feature_group/fg_overview/) can be seen as a collection of conceptually related features. In this case, you will create a feature group for the transaction data and a feature group for the windowed aggregations on the transaction data. Both will have `cc_num` as primary key, which will allow you to join them when creating a dataset in the next tutorial.\n",
+"A [feature group](https://docs.hopsworks.ai/latest/concepts/fs/feature_group/fg_overview/) can be seen as a collection of conceptually related features. In this case, you will create a feature group for the transaction data and a feature group for the windowed aggregations on the transaction data. Both will have `cc_num` as primary key, which will allow you to join them when creating a dataset in the next tutorial.\n",
 "\n",
 "Feature groups can also be used to define a namespace for features. For instance, in a real-life setting you would likely want to experiment with different window lengths. In that case, you can create feature groups with identical schema for each window length. \n",
 "\n",

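The feature-group pattern described in this notebook translates roughly into the sketch below. It is not taken from the tutorial itself: the group names, the `datetime` event-time column, the `amount` feature and the dummy row are assumptions made purely for illustration; only the shared `cc_num` primary key comes from the text above.

```python
import pandas as pd
import hopsworks

# Connect to the project's feature store (prompts for an API key if none is configured).
project = hopsworks.login()
fs = project.get_feature_store()

# Tiny stand-in for the transaction DataFrame the tutorial prepares earlier.
trans_df = pd.DataFrame({
    "cc_num": [4473593503484549],
    "datetime": [pd.Timestamp("2024-01-01 12:00:00")],
    "amount": [42.0],
})

# Feature group for the raw transaction data.
trans_fg = fs.get_or_create_feature_group(
    name="transactions",        # hypothetical name
    version=1,
    primary_key=["cc_num"],     # shared key that enables the later join
    event_time="datetime",      # assumed event-time column
    description="Raw transaction features",
)
trans_fg.insert(trans_df)

# The windowed-aggregation feature group follows the same pattern with its own
# schema but the same `cc_num` primary key, e.g. name="transactions_4h_aggs".
```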
integrations/big_query/big_query_feature_pipeline_external.ipynb

Lines changed: 2 additions & 2 deletions
@@ -7,9 +7,9 @@
 "source": [
 "# <span style='color:#ff5f27'> 👨🏻‍🏫 BigQuery External Feature Group Creation</span>\n",
 "\n",
-"Follow this [guide](https://docs.hopsworks.ai/3.0/user_guides/fs/storage_connector/creation/bigquery/) to set up a connection to BigQuery.\n",
+"Follow this [guide](https://docs.hopsworks.ai/latest/user_guides/fs/storage_connector/creation/bigquery/) to set up a connection to BigQuery.\n",
 "\n",
-"In addition, you can read about [External Feature Groups](https://docs.hopsworks.ai/3.0/user_guides/fs/feature_group/create_external/)."
+"In addition, you can read about [External Feature Groups](https://docs.hopsworks.ai/latest/user_guides/fs/feature_group/create_external/)."
 ]
 },
 {

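As a rough, non-authoritative sketch of what this notebook sets up: once a BigQuery storage connector exists, a table can be registered as an external feature group. The connector name, query, primary key and feature group name below are assumptions, not values from the notebook.

```python
import hopsworks

project = hopsworks.login()
fs = project.get_feature_store()

# Retrieve the storage connector configured by following the linked guide.
bq_connector = fs.get_storage_connector("bigquery_connector")    # assumed connector name

# Register the BigQuery table as an external feature group (metadata only;
# the data stays in BigQuery).
external_fg = fs.create_external_feature_group(
    name="transactions_bigquery",                                # hypothetical name
    version=1,
    description="Transactions read directly from BigQuery",
    primary_key=["cc_num"],                                      # assumed key column
    query="SELECT * FROM `my_project.my_dataset.transactions`",  # placeholder query
    storage_connector=bq_connector,
)
external_fg.save()
```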
integrations/big_query/big_query_feature_pipeline_online_external.ipynb

Lines changed: 2 additions & 2 deletions
@@ -7,9 +7,9 @@
 "source": [
 "# <span style='color:#ff5f27'> 👨🏻‍🏫 BigQuery Online External Feature Group Creation</span>\n",
 "\n",
-"Follow this [guide](https://docs.hopsworks.ai/3.0/user_guides/fs/storage_connector/creation/bigquery/) to set up a connection to BigQuery.\n",
+"Follow this [guide](https://docs.hopsworks.ai/latest/user_guides/fs/storage_connector/creation/bigquery/) to set up a connection to BigQuery.\n",
 "\n",
-"In addition, you can read about [External Feature Groups](https://docs.hopsworks.ai/3.0/user_guides/fs/feature_group/create_external/)."
+"In addition, you can read about [External Feature Groups](https://docs.hopsworks.ai/latest/user_guides/fs/feature_group/create_external/)."
 ]
 },
 {

integrations/bytewax/README.md

Lines changed: 1 addition & 1 deletion
@@ -36,7 +36,7 @@ Currently, bytewax support for Hopsworks feature store is experimental and only
 that Feature group metadata needs to be registered in Hopsworks Feature store before you can write real time features computed
 by Bytewax.
 
-Full documentation how to create feature group using HSFS APIs can be found [here](https://docs.hopsworks.ai/3.4/user_guides/fs/feature_group/create/).
+Full documentation how to create feature group using HSFS APIs can be found [here](https://docs.hopsworks.ai/latest/user_guides/fs/feature_group/create/).
 
 This tutorial comes with a python program to create a feature group:
 - `python ./setup/feature_group.py`

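The registration step the README refers to (and that `./setup/feature_group.py` performs) looks roughly like the sketch below; the feature group name, key and schema are illustrative assumptions rather than the actual contents of that script.

```python
import hopsworks
from hsfs.feature import Feature

project = hopsworks.login()
fs = project.get_feature_store()

# The schema must be declared up front because no DataFrame is inserted at creation time.
schema = [
    Feature(name="cc_num", type="bigint"),   # assumed feature
    Feature(name="amount", type="double"),   # assumed feature
]

fg = fs.create_feature_group(
    name="transactions_stream",   # hypothetical name
    version=1,
    primary_key=["cc_num"],       # assumed key column
    online_enabled=True,          # real-time features are served from the online store
    stream=True,                  # enables the streaming write path used by Bytewax
)
fg.save(schema)                   # registers only the metadata; Bytewax writes the rows later
```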
integrations/federated-offline-query/federated-offline-query.ipynb

Lines changed: 2 additions & 2 deletions
@@ -12,7 +12,7 @@
 "The aim of this tutorial is to create a unified view of features regarding the 100 most popular GitHub projects joining public datasets on Snowflake ([GitHub Archive](https://app.snowflake.com/marketplace/listing/GZTSZAS2KJ3/cybersyn-inc-github-archive?search=software&categorySecondary=%5B%2213%22%5D)). BigQuery ([deps.dev](https://console.cloud.google.com/marketplace/product/bigquery-public-data/deps-dev?hl=en)) and Hopsworks. We will create feature groups for each of these sources and then combine them in a unified view exposing all features together regardless of their source. We then use the view to create training data for a model predicting the code coverage of Github projects.\n",
 "\n",
 "## Prerequisites:\n",
-"* To follow this tutorial you can sign up for the [Hopsworks Free Tier](https://app.hopsworks.ai/) or use your own Hopsworks installation. You also need access to Snowflake and BigQuery, which offer free trials: [Snowflake Free Trial](https://signup.snowflake.com/?utm_source=google&utm_medium=paidsearch&utm_campaign=em-se-en-brand-trial-exact&utm_content=go-rsa-evg-ss-free-trial&utm_term=c-g-snowflake%20trial-e&_bt=591349674928&_bk=snowflake%20trial&_bm=e&_bn=g&_bg=129534995484&gclsrc=aw.ds&gad_source=1&gclid=EAIaIQobChMI0eeI-rPrggMVOQuiAx3WfgzdEAAYASAAEgIwS_D_BwE), [Google Cloud Free Tier](https://cloud.google.com/free?hl=en). If you choose to use your own Hopsworks, you should have an instance of Hopsworks version 3.5 or above and be the Data Owner/Author of a project. Furthermore, to use the Hopsworks Feature Query Service, the user has to configure the Hopsworks cluster to enable it. This can only be done during [cluster creation](https://docs.hopsworks.ai/3.5/setup_installation/common/arrow_flight_duckdb/)."
+"* To follow this tutorial you can sign up for the [Hopsworks Free Tier](https://app.hopsworks.ai/) or use your own Hopsworks installation. You also need access to Snowflake and BigQuery, which offer free trials: [Snowflake Free Trial](https://signup.snowflake.com/?utm_source=google&utm_medium=paidsearch&utm_campaign=em-se-en-brand-trial-exact&utm_content=go-rsa-evg-ss-free-trial&utm_term=c-g-snowflake%20trial-e&_bt=591349674928&_bk=snowflake%20trial&_bm=e&_bn=g&_bg=129534995484&gclsrc=aw.ds&gad_source=1&gclid=EAIaIQobChMI0eeI-rPrggMVOQuiAx3WfgzdEAAYASAAEgIwS_D_BwE), [Google Cloud Free Tier](https://cloud.google.com/free?hl=en). If you choose to use your own Hopsworks, you should have an instance of Hopsworks version 3.5 or above and be the Data Owner/Author of a project. Furthermore, to use the Hopsworks Feature Query Service, the user has to configure the Hopsworks cluster to enable it. This can only be done during [cluster creation](https://docs.hopsworks.ai/latest/setup_installation/common/arrow_flight_duckdb/)."
 ]
 },
 {
@@ -33,7 +33,7 @@
 "source": [
 "## <span style='color:#ff5f27'> Set up the Snowflake and BigQuery in Hopsworks\n",
 "\n",
-"Hopsworks manages the connection to Snowflake and BigQuery through storage connectors. Follow the [Storage Connector Guides](https://docs.hopsworks.ai/3.5/user_guides/fs/storage_connector/) to configure storage connectors for Snowflake and BigQuery and name them **Snowflake** and **BigQuery**."
+"Hopsworks manages the connection to Snowflake and BigQuery through storage connectors. Follow the [Storage Connector Guides](https://docs.hopsworks.ai/latest/user_guides/fs/storage_connector/) to configure storage connectors for Snowflake and BigQuery and name them **Snowflake** and **BigQuery**."
 ]
 },
 {

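A rough sketch of the flow this notebook builds, under stated assumptions: the connector names **Snowflake** and **BigQuery** come from the text above, while the table references, the `repo_name` join key and the feature view name are made up here purely for illustration.

```python
import hopsworks

project = hopsworks.login()
fs = project.get_feature_store()

# External feature group backed by the Snowflake connector.
snowflake_fg = fs.create_external_feature_group(
    name="github_archive",                                      # hypothetical name
    version=1,
    primary_key=["repo_name"],                                   # assumed join key
    query="SELECT * FROM GITHUB_REPOS",                          # placeholder query
    storage_connector=fs.get_storage_connector("Snowflake"),
)
snowflake_fg.save()

# External feature group backed by the BigQuery connector.
bigquery_fg = fs.create_external_feature_group(
    name="deps_dev",                                             # hypothetical name
    version=1,
    primary_key=["repo_name"],
    query="SELECT * FROM `bigquery-public-data.deps_dev_v1.Projects`",  # placeholder
    storage_connector=fs.get_storage_connector("BigQuery"),
)
bigquery_fg.save()

# Unified view over both sources; the Hopsworks Feature Query Service federates
# the reads regardless of where each feature physically lives.
query = snowflake_fg.select_all().join(bigquery_fg.select_all())
feature_view = fs.create_feature_view(name="github_projects", version=1, query=query)
```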
integrations/gcs/gcs_feature_pipeline_external.ipynb

Lines changed: 2 additions & 2 deletions
@@ -7,9 +7,9 @@
 "source": [
 "# <span style='color:#ff5f27'> 👨🏻‍🏫 GCS External Feature Group Creation</span>\n",
 "\n",
-"Follow this [guide](https://docs.hopsworks.ai/3.1/user_guides/fs/storage_connector/creation/gcs/) to set up a connection to GCS.\n",
+"Follow this [guide](https://docs.hopsworks.ai/latest/user_guides/fs/storage_connector/creation/gcs/) to set up a connection to GCS.\n",
 "\n",
-"In addition, you can read about [External Feature Groups](https://docs.hopsworks.ai/3.0/user_guides/fs/feature_group/create_external/)."
+"In addition, you can read about [External Feature Groups](https://docs.hopsworks.ai/latest/user_guides/fs/feature_group/create_external/)."
 ]
 },
 {

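For the GCS case the external feature group is file-based rather than query-based, roughly as sketched below; the connector name, bucket path, file format and key are assumptions for illustration only.

```python
import hopsworks

project = hopsworks.login()
fs = project.get_feature_store()

# Storage connector created by following the linked GCS guide.
gcs_connector = fs.get_storage_connector("gcs_connector")   # assumed connector name

# File-based external feature group: declared with a path and data format
# instead of a SQL query.
gcs_fg = fs.create_external_feature_group(
    name="transactions_gcs",          # hypothetical name
    version=1,
    primary_key=["cc_num"],           # assumed key column
    data_format="parquet",            # assumed file format
    path="transactions/",             # path inside the connector's bucket (assumed)
    storage_connector=gcs_connector,
)
gcs_fg.save()
```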
integrations/gcs/gcs_feature_pipeline_online_external.ipynb

Lines changed: 2 additions & 2 deletions
@@ -7,9 +7,9 @@
 "source": [
 "# <span style='color:#ff5f27'> 👨🏻‍🏫 GCS Online External Feature Group Creation</span>\n",
 "\n",
-"Follow this [guide](https://docs.hopsworks.ai/3.1/user_guides/fs/storage_connector/creation/gcs/) to set up a connection to GCS.\n",
+"Follow this [guide](https://docs.hopsworks.ai/latest/user_guides/fs/storage_connector/creation/gcs/) to set up a connection to GCS.\n",
 "\n",
-"In addition, you can read about [External Feature Groups](https://docs.hopsworks.ai/3.0/user_guides/fs/feature_group/create_external/)."
+"In addition, you can read about [External Feature Groups](https://docs.hopsworks.ai/latest/user_guides/fs/feature_group/create_external/)."
 ]
 },
 {

integrations/java/beam/README.md

Lines changed: 2 additions & 2 deletions
@@ -32,7 +32,7 @@ Currently, Beam support for Hopsworks feature store is experimental and only wri
 that Feature group metadata needs to be registered in Hopsworks Feature store before you can write real time features computed
 by Bytewax.
 
-Full documentation how to create feature group using HSFS APIs can be found [here](https://docs.hopsworks.ai/3.4/user_guides/fs/feature_group/create/).
+Full documentation how to create feature group using HSFS APIs can be found [here](https://docs.hopsworks.ai/latest/user_guides/fs/feature_group/create/).
 
 This tutorial comes with a python program to create a feature group:
 - `python ./setup/feature_group.py`
@@ -46,7 +46,7 @@ service account has the Pub/Sub Admin role.
 
 ### Google Cloud Pub/Sub to Google Cloud Storage
 Now you ready to run a streaming pipeline using Beam and Google Cloud Dataflow. For this you need to
-have Hopsworks cluster host address, hopsworks project name and [api key](https://docs.hopsworks.ai/3.3/user_guides/projects/api_key/create_api_key/)
+have Hopsworks cluster host address, hopsworks project name and [api key](https://docs.hopsworks.ai/latest/user_guides/projects/api_key/create_api_key/)
 
 Once you have the above define environment variables:

integrations/java/java/README.md

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@ mvn clean package
 
 ## Execute java application:
 Now you will create [connection](https://docs.hopsworks.ai/hopsworks-api/3.3/generated/api/connection/) with
-your Hopsworks cluster. For this you need to have Hopsworks cluster host address and [api key](https://docs.hopsworks.ai/3.3/user_guides/projects/api_key/create_api_key/)
+your Hopsworks cluster. For this you need to have Hopsworks cluster host address and [api key](https://docs.hopsworks.ai/latest/user_guides/projects/api_key/create_api_key/)
 
 Then define environment variables

integrations/neo4j/2_training_pipeline.ipynb

Lines changed: 2 additions & 38 deletions
@@ -132,7 +132,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"###### <span style=\"color:#ff5f27;\"> 🤖 Transformation Functions </span>\n",
+"### <span style=\"color:#ff5f27;\"> 🤖 Transformation Functions </span>\n",
 "\n",
 "Transformation functions are a mathematical mapping of input data that may be stateful - requiring statistics from the partent feature view (such as number of instances of a category, or mean value of a numerical feature)\n",
 "\n",
@@ -438,42 +438,6 @@
 "}"
 ]
 },
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-"### <span style=\"color:#ff5f27;\">⚙️ Model Schema</span>\n",
-"\n",
-"The model needs to be set up with a [Model Schema](https://docs.hopsworks.ai/3.0/user_guides/mlops/registry/model_schema/), which describes the inputs and outputs for a model.\n",
-"\n",
-"A Model Schema can be automatically generated from training examples, as shown below."
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"from hsml.schema import Schema\n",
-"from hsml.model_schema import ModelSchema\n",
-"\n",
-"# Define the input schema using the values of X_train\n",
-"input_schema = Schema(X_train)\n",
-"\n",
-"# Define the output schema using y_train\n",
-"output_schema = Schema(y_train)\n",
-"\n",
-"# Create a ModelSchema object specifying the input and output schemas\n",
-"model_schema = ModelSchema(\n",
-"    input_schema=input_schema, \n",
-"    output_schema=output_schema,\n",
-")\n",
-"\n",
-"# Convert the model schema to a dictionary for further inspection or serialization\n",
-"model_schema.to_dict()"
-]
-},
 {
 "cell_type": "markdown",
 "metadata": {},
@@ -511,9 +475,9 @@
 "mr_model = mr.tensorflow.create_model(\n",
 "    name=\"aml_model\",  # Specify the model name\n",
 "    metrics=metrics,  # Include model metrics\n",
-"    model_schema=model_schema,  # Include model schema\n",
 "    description=\"Adversarial anomaly detection model.\",  # Model description\n",
 "    input_example=[\"70408aef\"],  # Input example\n",
+"    feature_view=feature_view,\n",
 ")\n",
 "\n",
 "# Save the registered model to the model registry\n",

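After this change the registration cell links the model to its feature view instead of attaching a hand-built `ModelSchema`. A minimal sketch of the resulting pattern is below; it assumes `feature_view`, `metrics` and `model_dir` were produced in earlier cells of the notebook, and adds `hopsworks.login()` only to make the snippet self-contained.

```python
import hopsworks

project = hopsworks.login()
mr = project.get_model_registry()

# Register the model, linking it to the feature view it was trained from
# instead of attaching an explicit ModelSchema built from X_train/y_train.
mr_model = mr.tensorflow.create_model(
    name="aml_model",
    metrics=metrics,                       # metrics dict computed during evaluation
    description="Adversarial anomaly detection model.",
    input_example=["70408aef"],
    feature_view=feature_view,             # replaces model_schema=model_schema
)

# Save the registered model (model_dir: directory containing the exported model).
mr_model.save(model_dir)
```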