This repository was archived by the owner on Jul 9, 2025. It is now read-only.

Commit cb970f9

silv-io and HarshCasper authored
Add Bedrock service documentation (#1523)
Co-authored-by: Harsh Mishra <erbeusgriffincasper@gmail.com>
1 parent ddc475c commit cb970f9

File tree

2 files changed: +85 −0 lines changed


content/en/references/configuration.md

Lines changed: 6 additions & 0 deletions

@@ -90,6 +90,12 @@ This section covers configuration options that are specific to certain AWS services
 | - | - | - |
 | `BATCH_DOCKER_FLAGS` | `-e TEST_ENV=1337` | Additional flags provided to the batch container. Same restrictions as `LAMBDA_DOCKER_FLAGS`. |
 
+### Bedrock
+
+| Variable | Example Values | Description |
+| - | - | - |
+| `LOCALSTACK_ENABLE_BEDROCK` | `1` | Use the Bedrock provider |
+
 ### BigData (EMR, Athena, Glue)
 
 | Variable | Example Values | Description |
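The flag added above would typically be set when starting the LocalStack container. A minimal sketch, assuming the standard LocalStack Pro docker-compose setup (Bedrock is tagged for the Enterprise image, so an auth token is assumed to be required); this compose file is not part of the commit:

```yaml
# docker-compose.yml sketch (assumption: standard LocalStack Pro setup)
services:
  localstack:
    image: localstack/localstack-pro
    ports:
      - "4566:4566"
    environment:
      - LOCALSTACK_AUTH_TOKEN=${LOCALSTACK_AUTH_TOKEN:?}  # Enterprise image needs a token
      - LOCALSTACK_ENABLE_BEDROCK=1                       # enable the Bedrock provider
```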
Lines changed: 79 additions & 0 deletions (new file)
---
title: "Bedrock"
linkTitle: "Bedrock"
description: Use foundation models running on your device with LocalStack!
tags: ["Enterprise image"]
---

## Introduction

Bedrock is a fully managed service provided by Amazon Web Services (AWS) that makes foundation models from various LLM providers accessible via an API.
LocalStack allows you to use the Bedrock APIs to test and develop AI-powered applications in your local environment.
The supported APIs are available on our [API Coverage Page](https://docs.localstack.cloud/references/coverage/coverage_bedrock/), which provides information on the extent of Bedrock's integration with LocalStack.

## Getting started

This guide is designed for users new to AWS Bedrock and assumes basic knowledge of the AWS CLI and our `awslocal` wrapper script.

Start your LocalStack container using your preferred method, with the `LOCALSTACK_ENABLE_BEDROCK=1` configuration variable set.
We will demonstrate how to use Bedrock by following these steps:

1. Listing available foundation models
2. Invoking a model for inference
3. Using the conversation API

### List available foundation models

You can view all available foundation models using the [`ListFoundationModels`](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_ListFoundationModels.html) API.
This will show you which models are available for use in your local environment.

Run the following command:

{{< command >}}
$ awslocal bedrock list-foundation-models
{{< / command >}}
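If you are scripting against the API rather than the CLI, the same call can be made with boto3. This sketch is not part of the commit: the `extract_model_ids` helper assumes the standard `modelSummaries` shape of the `ListFoundationModels` response, and the client wiring against LocalStack's default `localhost:4566` gateway is likewise an assumption.

```python
def extract_model_ids(response: dict) -> list[str]:
    """Pull the model IDs out of a ListFoundationModels response."""
    return [summary["modelId"] for summary in response.get("modelSummaries", [])]

def list_models_locally() -> list[str]:
    # boto3 is imported lazily so extract_model_ids runs without the SDK installed.
    import boto3
    client = boto3.client(
        "bedrock",  # control-plane client; model invocation uses "bedrock-runtime"
        endpoint_url="http://localhost:4566",  # assumption: default LocalStack gateway
        region_name="us-east-1",
    )
    return extract_model_ids(client.list_foundation_models())

# Response fragment in the AWS wire shape, for illustration:
sample = {"modelSummaries": [{"modelId": "meta.llama3-8b-instruct-v1:0"}]}
```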

### Invoke a model

You can use the [`InvokeModel`](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModel.html) API to send requests to a specific model.
In this example, we'll use the Llama 3 model to process a simple prompt.

Run the following command:

{{< command >}}
$ awslocal bedrock-runtime invoke-model \
    --model-id "meta.llama3-8b-instruct-v1:0" \
    --body '{
        "prompt": "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\nSay Hello!\n<|eot_id|>\n<|start_header_id|>assistant<|end_header_id|>",
        "max_gen_len": 2,
        "temperature": 0.9
    }' --cli-binary-format raw-in-base64-out outfile.txt
{{< / command >}}

The output will be available in `outfile.txt`.
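The same request body can be assembled and sent from Python. A sketch under stated assumptions, not part of the commit: the prompt template and parameters mirror the CLI example above, while the endpoint URL and region are assumed LocalStack defaults.

```python
import json

MODEL_ID = "meta.llama3-8b-instruct-v1:0"

def build_llama3_body(user_prompt: str, max_gen_len: int = 2, temperature: float = 0.9) -> str:
    """Wrap a user prompt in the Llama 3 instruct template and JSON-encode the request body."""
    prompt = (
        "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n"
        f"{user_prompt}\n<|eot_id|>\n"
        "<|start_header_id|>assistant<|end_header_id|>"
    )
    return json.dumps({"prompt": prompt, "max_gen_len": max_gen_len, "temperature": temperature})

def invoke_locally(body: str) -> dict:
    # boto3 is imported lazily so the payload helper above runs without the SDK installed.
    import boto3
    client = boto3.client(
        "bedrock-runtime",
        endpoint_url="http://localhost:4566",  # assumption: default LocalStack gateway
        region_name="us-east-1",
    )
    response = client.invoke_model(modelId=MODEL_ID, body=body)
    return json.loads(response["body"].read())
```

Calling `invoke_locally(build_llama3_body("Say Hello!"))` with LocalStack running would return the decoded model response instead of writing it to a file.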

### Use the conversation API

Bedrock provides a higher-level conversation API that makes it easier to maintain context in a chat-like interaction, using the [`Converse`](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_Converse.html) API.
You can specify both system prompts and user messages.

Run the following command:

{{< command >}}
$ awslocal bedrock-runtime converse \
    --model-id "meta.llama3-8b-instruct-v1:0" \
    --messages '[{
        "role": "user",
        "content": [{
            "text": "Say Hello!"
        }]
    }]' \
    --system '[{
        "text": "You'\''re a chatbot that can only say '\''Hello!'\''"
    }]'
{{< / command >}}
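The same conversation can be driven from Python; the helper below just assembles the `messages` structure the Converse API expects. The boto3 wiring against `localhost:4566` is an assumption and not part of the commit.

```python
def user_message(text: str) -> dict:
    """Build a single user turn in the Converse message shape."""
    return {"role": "user", "content": [{"text": text}]}

def converse_locally(messages: list[dict], system_text: str) -> dict:
    # boto3 is imported lazily so the message builder runs without the SDK installed.
    import boto3
    client = boto3.client(
        "bedrock-runtime",
        endpoint_url="http://localhost:4566",  # assumption: default LocalStack gateway
        region_name="us-east-1",
    )
    return client.converse(
        modelId="meta.llama3-8b-instruct-v1:0",
        messages=messages,
        system=[{"text": system_text}],
    )

msgs = [user_message("Say Hello!")]
```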

## Limitations

* The LocalStack Bedrock implementation is mock-only and does not run any LLM locally.
* GPU models are currently not supported by the LocalStack Bedrock implementation.

0 commit comments
