Commit d70369c

Bedrock Runtime: Remove explicit references to Llama 3 (#7444)
1 parent ac0d841 commit d70369c

7 files changed: +71 -23 lines changed

.doc_gen/metadata/bedrock-runtime_metadata.yaml

Lines changed: 6 additions & 6 deletions

@@ -1001,9 +1001,9 @@ bedrock-runtime_InvokeModel_CohereCommandR:
     bedrock-runtime: {InvokeModel}
 
 bedrock-runtime_InvokeModel_MetaLlama3:
-  title: Invoke Meta Llama 3 on &BR; using the Invoke Model API
-  title_abbrev: "InvokeModel: Llama 3"
-  synopsis: send a text message to Meta Llama 3, using the Invoke Model API.
+  title: Invoke Meta Llama on &BR; using the Invoke Model API
+  title_abbrev: "InvokeModel"
+  synopsis: send a text message to Meta Llama, using the Invoke Model API.
   category: Meta Llama
   languages:
     Java:
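For reference, the call this entry describes (sending a text message to Meta Llama with the InvokeModel API) looks roughly like the following boto3 sketch. The model ID and the Llama-style request fields are assumptions for illustration, not part of this change:

```python
# Minimal sketch of "send a text message to Meta Llama" with InvokeModel (boto3).
# The model ID and request-body fields are illustrative assumptions.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Wrap the user message in a Llama-style instruction template.
prompt = "Explain what Amazon Bedrock is in one sentence."
formatted_prompt = (
    "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n"
    f"{prompt}\n"
    "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n"
)

response = client.invoke_model(
    modelId="meta.llama3-8b-instruct-v1:0",  # assumed model ID
    body=json.dumps({"prompt": formatted_prompt, "max_gen_len": 512, "temperature": 0.5}),
)

# The native response carries the completion in the "generation" field.
print(json.loads(response["body"].read())["generation"])
```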
@@ -1233,9 +1233,9 @@ bedrock-runtime_InvokeModelWithResponseStream_CohereCommandR:
     bedrock-runtime: {InvokeModel}
 
 bedrock-runtime_InvokeModelWithResponseStream_MetaLlama3:
-  title: Invoke Meta Llama 3 on &BR; using the Invoke Model API with a response stream
-  title_abbrev: "InvokeModelWithResponseStream: Llama 3"
-  synopsis: send a text message to Meta Llama 3, using the Invoke Model API, and print the response stream.
+  title: Invoke Meta Llama on &BR; using the Invoke Model API with a response stream
+  title_abbrev: "InvokeModelWithResponseStream"
+  synopsis: send a text message to Meta Llama, using the Invoke Model API, and print the response stream.
   category: Meta Llama
   languages:
     Java:
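The streaming entry describes the same call through InvokeModelWithResponseStream, which returns the completion in chunks. A hedged boto3 sketch, again with an assumed model ID and request format:

```python
# Minimal sketch of the streaming variant with InvokeModelWithResponseStream (boto3).
# Model ID and body fields are illustrative assumptions.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = json.dumps({
    "prompt": "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n"
              "Write a haiku about response streams.\n"
              "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n",
    "max_gen_len": 256,
})

response = client.invoke_model_with_response_stream(
    modelId="meta.llama3-8b-instruct-v1:0",  # assumed model ID
    body=body,
)

# Each event carries a JSON chunk with a partial "generation" string; print it as it arrives.
for event in response["body"]:
    chunk = json.loads(event["chunk"]["bytes"])
    print(chunk.get("generation", ""), end="", flush=True)
```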

.gitignore

Lines changed: 1 addition & 0 deletions

@@ -37,3 +37,4 @@ kotlin/services/**/gradle/
 kotlin/services/**/gradlew
 kotlin/services/**/gradlew.bat
 kotlin/services/**/.kotlin/
+/.local/

dotnetv3/Bedrock-runtime/README.md

Lines changed: 2 additions & 2 deletions

@@ -77,8 +77,8 @@ functions within the same service.
 
 - [Converse](Models/MetaLlama/Converse/Converse.cs#L4)
 - [ConverseStream](Models/MetaLlama/ConverseStream/ConverseStream.cs#L4)
-- [InvokeModel: Llama 3](Models/MetaLlama/Llama3_InvokeModel/InvokeModel.cs#L4)
-- [InvokeModelWithResponseStream: Llama 3](Models/MetaLlama/Llama3_InvokeModelWithResponseStream/InvokeModelWithResponseStream.cs#L4)
+- [InvokeModel](Models/MetaLlama/Llama3_InvokeModel/InvokeModel.cs#L4)
+- [InvokeModelWithResponseStream](Models/MetaLlama/Llama3_InvokeModelWithResponseStream/InvokeModelWithResponseStream.cs#L4)
 
 ### Mistral AI
 

javascriptv3/example_code/bedrock-runtime/README.md

Lines changed: 2 additions & 7 deletions

@@ -46,11 +46,6 @@ functions within the same service.
 - [Invoke multiple foundation models on Amazon Bedrock](scenarios/cli_text_playground.js)
 - [Tool use with the Converse API](scenarios/converse_tool_scenario/converse-tool-scenario.js)
 
-### AI21 Labs Jurassic-2
-
-- [Converse](models/ai21LabsJurassic2/converse.js#L4)
-- [InvokeModel](models/ai21LabsJurassic2/invoke_model.js)
-
 ### Amazon Nova
 
 - [Converse](models/amazonNovaText/converse.js#L4)

@@ -83,8 +78,8 @@ functions within the same service.
 
 - [Converse](models/metaLlama/converse.js#L4)
 - [ConverseStream](models/metaLlama/converseStream.js#L4)
-- [InvokeModel: Llama 3](models/metaLlama/llama3/invoke_model_quickstart.js#L4)
-- [InvokeModelWithResponseStream: Llama 3](models/metaLlama/llama3/invoke_model_with_response_stream_quickstart.js#L4)
+- [InvokeModel](models/metaLlama/llama3/invoke_model_quickstart.js#L4)
+- [InvokeModelWithResponseStream](models/metaLlama/llama3/invoke_model_with_response_stream_quickstart.js#L4)
 
 ### Mistral AI
 

javav2/example_code/bedrock-runtime/README.md

Lines changed: 37 additions & 2 deletions

@@ -34,6 +34,15 @@ For prerequisites, see the [README](../../README.md#Prerequisites) in the `javav
 > see [Model access](https://docs.aws.amazon.com/bedrock/latest/userguide/model-access.html).
 >
 <!--custom.prerequisites.end-->
+
+### Scenarios
+
+Code examples that show you how to accomplish a specific task by calling multiple
+functions within the same service.
+
+- [Generate videos from text prompts using Amazon Bedrock](../../usecases/video_generation_bedrock_nova_reel/src/main/java/com/example/novareel/VideoGenerationService.java)
+- [Tool use with the Converse API](src/main/java/com/example/bedrockruntime/scenario/BedrockScenario.java)
+
 ### AI21 Labs Jurassic-2
 
 - [Converse](src/main/java/com/example/bedrockruntime/models/ai21LabsJurassic2/Converse.java#L6)

@@ -43,6 +52,7 @@ For prerequisites, see the [README](../../README.md#Prerequisites) in the `javav
 
 - [Converse](src/main/java/com/example/bedrockruntime/models/amazon/nova/text/ConverseAsync.java#L6)
 - [ConverseStream](src/main/java/com/example/bedrockruntime/models/amazon/nova/text/ConverseStream.java#L6)
+- [Scenario: Tool use with the Converse API](src/main/java/com/example/bedrockruntime/scenario/BedrockScenario.java#L15)
 
 ### Amazon Nova Canvas
 

@@ -85,8 +95,8 @@ For prerequisites, see the [README](../../README.md#Prerequisites) in the `javav
 
 - [Converse](src/main/java/com/example/bedrockruntime/models/metaLlama/Converse.java#L6)
 - [ConverseStream](src/main/java/com/example/bedrockruntime/models/metaLlama/ConverseStream.java#L6)
-- [InvokeModel: Llama 3](src/main/java/com/example/bedrockruntime/models/metaLlama/Llama3_InvokeModel.java#L6)
-- [InvokeModelWithResponseStream: Llama 3](src/main/java/com/example/bedrockruntime/models/metaLlama/Llama3_InvokeModelWithResponseStream.java#L6)
+- [InvokeModel](src/main/java/com/example/bedrockruntime/models/metaLlama/Llama3_InvokeModel.java#L6)
+- [InvokeModelWithResponseStream](src/main/java/com/example/bedrockruntime/models/metaLlama/Llama3_InvokeModelWithResponseStream.java#L6)
 
 ### Mistral AI
 
@@ -111,7 +121,32 @@ For prerequisites, see the [README](../../README.md#Prerequisites) in the `javav
 <!--custom.instructions.start-->
 <!--custom.instructions.end-->
 
+#### Generate videos from text prompts using Amazon Bedrock
+
+This example shows you how to build a Spring Boot app that generates videos from text prompts using Amazon Bedrock and the
+Nova-Reel model.
+
+
+<!--custom.scenario_prereqs.bedrock-runtime_Scenario_GenerateVideos_NovaReel.start-->
+<!--custom.scenario_prereqs.bedrock-runtime_Scenario_GenerateVideos_NovaReel.end-->
+
+
+<!--custom.scenarios.bedrock-runtime_Scenario_GenerateVideos_NovaReel.start-->
+<!--custom.scenarios.bedrock-runtime_Scenario_GenerateVideos_NovaReel.end-->
+
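The video-generation scenario described above rests on Bedrock's asynchronous invocation flow. As a rough Python illustration (the linked scenario itself is a Java/Spring Boot service), the underlying calls might look like this; the model ID, input fields, and S3 bucket are assumptions rather than values taken from this commit:

```python
# Rough sketch of text-to-video generation with Nova Reel via asynchronous invocation.
# Model ID, modelInput fields, and the S3 bucket are illustrative assumptions.
import time
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Kick off an asynchronous video-generation job; the output is written to S3.
job = client.start_async_invoke(
    modelId="amazon.nova-reel-v1:0",  # assumed model ID
    modelInput={
        "taskType": "TEXT_VIDEO",
        "textToVideoParams": {"text": "A drone shot of waves rolling onto a beach at sunrise"},
        "videoGenerationConfig": {"durationSeconds": 6, "fps": 24, "dimension": "1280x720"},
    },
    outputDataConfig={"s3OutputDataConfig": {"s3Uri": "s3://amzn-s3-demo-bucket"}},  # assumed bucket
)

# Poll until the job finishes; the video file lands in the configured bucket.
invocation_arn = job["invocationArn"]
while True:
    status = client.get_async_invoke(invocationArn=invocation_arn)["status"]
    if status != "InProgress":
        print("Job finished with status:", status)
        break
    time.sleep(15)
```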
+#### Tool use with the Converse API
+
+This example shows you how to build a typical interaction between an application, a generative AI model, and connected
+tools or APIs to mediate interactions between the AI and the outside world. It uses the example of connecting an
+external weather API to the AI model so it can provide real-time weather information based on user input.
+
+
+<!--custom.scenario_prereqs.bedrock-runtime_Scenario_ToolUse.start-->
+<!--custom.scenario_prereqs.bedrock-runtime_Scenario_ToolUse.end-->
+
 
+<!--custom.scenarios.bedrock-runtime_Scenario_ToolUse.start-->
+<!--custom.scenarios.bedrock-runtime_Scenario_ToolUse.end-->
 
 
 ### Tests
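The tool-use scenario described above follows the usual Converse API pattern: the application registers a tool definition, the model answers with a toolUse request when it needs outside data, and the application runs the tool and sends the result back. A minimal Python sketch of the first leg of that exchange, with an assumed tool name, schema, and model ID:

```python
# Minimal sketch of tool use with the Converse API (boto3).
# Tool name, input schema, and model ID are illustrative assumptions.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

tool_config = {
    "tools": [{
        "toolSpec": {
            "name": "get_current_weather",
            "description": "Returns the current weather for a named city.",
            "inputSchema": {"json": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            }},
        }
    }]
}

response = client.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # assumed model ID
    messages=[{"role": "user", "content": [{"text": "What's the weather in Seattle?"}]}],
    toolConfig=tool_config,
)

# When stopReason is "tool_use", the message content includes the tool request
# that the application would fulfill (for example, by calling a real weather API).
if response["stopReason"] == "tool_use":
    for block in response["output"]["message"]["content"]:
        if "toolUse" in block:
            print("Model requested tool:", block["toolUse"]["name"], block["toolUse"]["input"])
```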

kotlin/services/bedrock-runtime/README.md

Lines changed: 0 additions & 4 deletions

@@ -35,10 +35,6 @@ For prerequisites, see the [README](../../README.md#Prerequisites) in the `kotli
 - [Converse](src/main/kotlin/com/example/bedrockruntime/models/amazon/nova/text/Converse.kt#L6)
 - [ConverseStream](src/main/kotlin/com/example/bedrockruntime/models/amazon/nova/text/ConverseStream.kt#L6)
 
-### Amazon Nova Canvas
-
-- [InvokeModel](src/main/kotlin/com/example/bedrockruntime/models/amazon/nova/canvas/InvokeModel.kt#L6)
-
 ### Amazon Titan Text
 
 - [InvokeModel](src/main/kotlin/com/example/bedrockruntime/models/amazon/titan/text/InvokeModel.kt#L6)

python/example_code/bedrock-runtime/README.md

Lines changed: 23 additions & 2 deletions

@@ -48,6 +48,7 @@ python -m pip install -r requirements.txt
 Code examples that show you how to accomplish a specific task by calling multiple
 functions within the same service.
 
+- [Create and invoke a managed prompt](../bedrock-agent/prompts/scenario_get_started_with_prompts.py)
 - [Tool use with the Converse API](cross-model-scenarios/tool_use_demo/tool_use_demo.py)
 
 ### AI21 Labs Jurassic-2

@@ -105,8 +106,8 @@ functions within the same service.
 
 - [Converse](models/meta_llama/converse.py#L4)
 - [ConverseStream](models/meta_llama/converse_stream.py#L4)
-- [InvokeModel: Llama 3](models/meta_llama/llama3_invoke_model.py#L4)
-- [InvokeModelWithResponseStream: Llama 3](models/meta_llama/llama3_invoke_model_with_response_stream.py#L4)
+- [InvokeModel](models/meta_llama/llama3_invoke_model.py#L4)
+- [InvokeModelWithResponseStream](models/meta_llama/llama3_invoke_model_with_response_stream.py#L4)
 
 ### Mistral AI
 
@@ -153,6 +154,26 @@ This example shows you how to get started using Amazon Bedrock Runtime.
 python hello/hello_bedrock_runtime_invoke.py
 ```
 
+#### Create and invoke a managed prompt
+
+This example shows you how to do the following:
+
+- Create a managed prompt.
+- Create a version of the prompt.
+- Invoke the prompt using the version.
+- Clean up resources (optional).
+
+<!--custom.scenario_prereqs.bedrock-agent_GettingStartedWithBedrockPrompts.start-->
+<!--custom.scenario_prereqs.bedrock-agent_GettingStartedWithBedrockPrompts.end-->
+
+Start the example by running the following at a command prompt:
+
+```
+python ../bedrock-agent/prompts/scenario_get_started_with_prompts.py
+```
+
+<!--custom.scenarios.bedrock-agent_GettingStartedWithBedrockPrompts.start-->
+<!--custom.scenarios.bedrock-agent_GettingStartedWithBedrockPrompts.end-->
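The managed-prompt scenario above chains a few Amazon Bedrock prompt-management calls. A rough boto3 sketch of that flow follows; the prompt name, template shape, and model ID are assumptions, and the linked scenario script remains the reference implementation:

```python
# Rough sketch of the managed-prompt flow: create a prompt, version it, invoke
# the version, and clean up. Names and parameter shapes are illustrative assumptions.
import boto3

agent_client = boto3.client("bedrock-agent", region_name="us-east-1")
runtime_client = boto3.client("bedrock-runtime", region_name="us-east-1")

# 1. Create a managed prompt with a single text variant.
prompt = agent_client.create_prompt(
    name="playlist-prompt",  # assumed name
    variants=[{
        "name": "default",
        "templateType": "TEXT",
        "modelId": "amazon.nova-lite-v1:0",  # assumed model ID
        "templateConfiguration": {"text": {
            "text": "Suggest three {{genre}} songs for a playlist.",
            "inputVariables": [{"name": "genre"}],
        }},
    }],
)

# 2. Create an immutable version of the prompt.
version = agent_client.create_prompt_version(promptIdentifier=prompt["id"])

# 3. Invoke the prompt version through Converse, filling in its variables.
response = runtime_client.converse(
    modelId=version["arn"],
    promptVariables={"genre": {"text": "jazz"}},
)
print(response["output"]["message"]["content"][0]["text"])

# 4. (Optional) Clean up the prompt resource.
agent_client.delete_prompt(promptIdentifier=prompt["id"])
```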
 
 #### Tool use with the Converse API
 