Bedrock Runtime: Remove explicit references to Llama 3 #7444

Merged: 4 commits, May 20, 2025
12 changes: 6 additions & 6 deletions .doc_gen/metadata/bedrock-runtime_metadata.yaml
@@ -1001,9 +1001,9 @@ bedrock-runtime_InvokeModel_CohereCommandR:
         bedrock-runtime: {InvokeModel}

 bedrock-runtime_InvokeModel_MetaLlama3:
-  title: Invoke Meta Llama 3 on &BR; using the Invoke Model API
-  title_abbrev: "InvokeModel: Llama 3"
-  synopsis: send a text message to Meta Llama 3, using the Invoke Model API.
+  title: Invoke Meta Llama on &BR; using the Invoke Model API
+  title_abbrev: "InvokeModel"
+  synopsis: send a text message to Meta Llama, using the Invoke Model API.
   category: Meta Llama
   languages:
     Java:
@@ -1233,9 +1233,9 @@ bedrock-runtime_InvokeModelWithResponseStream_CohereCommandR:
         bedrock-runtime: {InvokeModel}

 bedrock-runtime_InvokeModelWithResponseStream_MetaLlama3:
-  title: Invoke Meta Llama 3 on &BR; using the Invoke Model API with a response stream
-  title_abbrev: "InvokeModelWithResponseStream: Llama 3"
-  synopsis: send a text message to Meta Llama 3, using the Invoke Model API, and print the response stream.
+  title: Invoke Meta Llama on &BR; using the Invoke Model API with a response stream
+  title_abbrev: "InvokeModelWithResponseStream"
+  synopsis: send a text message to Meta Llama, using the Invoke Model API, and print the response stream.
   category: Meta Llama
   languages:
     Java:
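The metadata above describes examples that send a text prompt to Meta Llama through the &BR; InvokeModel API. As a rough, language-agnostic sketch of that call (Python with boto3; the model ID `meta.llama3-8b-instruct-v1:0`, the region, and the exact prompt wrapper are assumptions, not part of this PR):

```python
import json


def build_llama_request(prompt: str, max_gen_len: int = 512, temperature: float = 0.5) -> str:
    """Build the JSON body that Meta Llama models expect from InvokeModel."""
    # Llama's instruction format wraps the user prompt in header tokens
    # (assumed here; check the model card for the exact template).
    formatted = (
        "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n"
        f"{prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n"
    )
    return json.dumps(
        {"prompt": formatted, "max_gen_len": max_gen_len, "temperature": temperature}
    )


def invoke_llama(prompt: str, model_id: str = "meta.llama3-8b-instruct-v1:0") -> str:
    # Requires AWS credentials and model access in the chosen region.
    import boto3

    client = boto3.client("bedrock-runtime", region_name="us-west-2")
    response = client.invoke_model(modelId=model_id, body=build_llama_request(prompt))
    # Llama responses carry the completion under the "generation" key.
    return json.loads(response["body"].read())["generation"]
```

Because the request body carries the model family's parameters rather than the model version, dropping "Llama 3" from the titles matches the code: the same example works across Llama releases by swapping the model ID.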
1 change: 1 addition & 0 deletions .gitignore
@@ -37,3 +37,4 @@ kotlin/services/**/gradle/
 kotlin/services/**/gradlew
 kotlin/services/**/gradlew.bat
 kotlin/services/**/.kotlin/
+/.local/
4 changes: 2 additions & 2 deletions dotnetv3/Bedrock-runtime/README.md
@@ -77,8 +77,8 @@ functions within the same service.

 - [Converse](Models/MetaLlama/Converse/Converse.cs#L4)
 - [ConverseStream](Models/MetaLlama/ConverseStream/ConverseStream.cs#L4)
-- [InvokeModel: Llama 3](Models/MetaLlama/Llama3_InvokeModel/InvokeModel.cs#L4)
-- [InvokeModelWithResponseStream: Llama 3](Models/MetaLlama/Llama3_InvokeModelWithResponseStream/InvokeModelWithResponseStream.cs#L4)
+- [InvokeModel](Models/MetaLlama/Llama3_InvokeModel/InvokeModel.cs#L4)
+- [InvokeModelWithResponseStream](Models/MetaLlama/Llama3_InvokeModelWithResponseStream/InvokeModelWithResponseStream.cs#L4)

 ### Mistral AI

9 changes: 2 additions & 7 deletions javascriptv3/example_code/bedrock-runtime/README.md
@@ -46,11 +46,6 @@ functions within the same service.
 - [Invoke multiple foundation models on Amazon Bedrock](scenarios/cli_text_playground.js)
 - [Tool use with the Converse API](scenarios/converse_tool_scenario/converse-tool-scenario.js)

-### AI21 Labs Jurassic-2
-
-- [Converse](models/ai21LabsJurassic2/converse.js#L4)
-- [InvokeModel](models/ai21LabsJurassic2/invoke_model.js)
-
 ### Amazon Nova

 - [Converse](models/amazonNovaText/converse.js#L4)
@@ -83,8 +78,8 @@ functions within the same service.

 - [Converse](models/metaLlama/converse.js#L4)
 - [ConverseStream](models/metaLlama/converseStream.js#L4)
-- [InvokeModel: Llama 3](models/metaLlama/llama3/invoke_model_quickstart.js#L4)
-- [InvokeModelWithResponseStream: Llama 3](models/metaLlama/llama3/invoke_model_with_response_stream_quickstart.js#L4)
+- [InvokeModel](models/metaLlama/llama3/invoke_model_quickstart.js#L4)
+- [InvokeModelWithResponseStream](models/metaLlama/llama3/invoke_model_with_response_stream_quickstart.js#L4)

 ### Mistral AI

39 changes: 37 additions & 2 deletions javav2/example_code/bedrock-runtime/README.md
@@ -34,6 +34,15 @@ For prerequisites, see the [README](../../README.md#Prerequisites) in the `javav
 > see [Model access](https://docs.aws.amazon.com/bedrock/latest/userguide/model-access.html).
 >
 <!--custom.prerequisites.end-->
+
+### Scenarios
+
+Code examples that show you how to accomplish a specific task by calling multiple
+functions within the same service.
+
+- [Generate videos from text prompts using Amazon Bedrock](../../usecases/video_generation_bedrock_nova_reel/src/main/java/com/example/novareel/VideoGenerationService.java)
+- [Tool use with the Converse API](src/main/java/com/example/bedrockruntime/scenario/BedrockScenario.java)
+
 ### AI21 Labs Jurassic-2

 - [Converse](src/main/java/com/example/bedrockruntime/models/ai21LabsJurassic2/Converse.java#L6)
@@ -43,6 +52,7 @@ For prerequisites, see the [README](../../README.md#Prerequisites) in the `javav

 - [Converse](src/main/java/com/example/bedrockruntime/models/amazon/nova/text/ConverseAsync.java#L6)
 - [ConverseStream](src/main/java/com/example/bedrockruntime/models/amazon/nova/text/ConverseStream.java#L6)
+- [Scenario: Tool use with the Converse API](src/main/java/com/example/bedrockruntime/scenario/BedrockScenario.java#L15)

 ### Amazon Nova Canvas

@@ -85,8 +95,8 @@ For prerequisites, see the [README](../../README.md#Prerequisites) in the `javav

 - [Converse](src/main/java/com/example/bedrockruntime/models/metaLlama/Converse.java#L6)
 - [ConverseStream](src/main/java/com/example/bedrockruntime/models/metaLlama/ConverseStream.java#L6)
-- [InvokeModel: Llama 3](src/main/java/com/example/bedrockruntime/models/metaLlama/Llama3_InvokeModel.java#L6)
-- [InvokeModelWithResponseStream: Llama 3](src/main/java/com/example/bedrockruntime/models/metaLlama/Llama3_InvokeModelWithResponseStream.java#L6)
+- [InvokeModel](src/main/java/com/example/bedrockruntime/models/metaLlama/Llama3_InvokeModel.java#L6)
+- [InvokeModelWithResponseStream](src/main/java/com/example/bedrockruntime/models/metaLlama/Llama3_InvokeModelWithResponseStream.java#L6)

 ### Mistral AI

@@ -111,7 +121,32 @@ For prerequisites, see the [README](../../README.md#Prerequisites) in the `javav
 <!--custom.instructions.start-->
 <!--custom.instructions.end-->

+#### Generate videos from text prompts using Amazon Bedrock
+
+This example shows you how to build a Spring Boot app that generates videos from text prompts using Amazon Bedrock and the
+Nova-Reel model.
+
+
+<!--custom.scenario_prereqs.bedrock-runtime_Scenario_GenerateVideos_NovaReel.start-->
+<!--custom.scenario_prereqs.bedrock-runtime_Scenario_GenerateVideos_NovaReel.end-->
+
+
+<!--custom.scenarios.bedrock-runtime_Scenario_GenerateVideos_NovaReel.start-->
+<!--custom.scenarios.bedrock-runtime_Scenario_GenerateVideos_NovaReel.end-->
+
+#### Tool use with the Converse API
+
+This example shows you how to build a typical interaction between an application, a generative AI model, and connected
+tools or APIs that mediate interactions between the model and the outside world. It uses the example of connecting an
+external weather API to the AI model so it can provide real-time weather information based on user input.
+
+
+<!--custom.scenario_prereqs.bedrock-runtime_Scenario_ToolUse.start-->
+<!--custom.scenario_prereqs.bedrock-runtime_Scenario_ToolUse.end-->
+
+
+<!--custom.scenarios.bedrock-runtime_Scenario_ToolUse.start-->
+<!--custom.scenarios.bedrock-runtime_Scenario_ToolUse.end-->
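The tool-use loop described above — the model requests a tool, the application runs it and returns the result — can be sketched in a few lines. The dispatcher below is a hypothetical helper, not the scenario's actual code, and `get_weather` is a stand-in for the external weather API:

```python
def dispatch_tool_use(message: dict, tools: dict) -> list:
    """Run each tool the model requested in a Converse response and build
    the toolResult content blocks to send back in the next user message."""
    results = []
    for block in message.get("content", []):
        if "toolUse" in block:
            use = block["toolUse"]
            # Look up and invoke the local function the model asked for.
            output = tools[use["name"]](**use["input"])
            results.append({
                "toolResult": {
                    "toolUseId": use["toolUseId"],
                    "content": [{"json": output}],
                }
            })
    return results


def get_weather(latitude: float, longitude: float) -> dict:
    # Stand-in for the external weather API used by the scenario.
    return {"latitude": latitude, "longitude": longitude, "temperature_c": 21.0}
```

In the full loop, the application calls Converse with a `toolConfig` describing the tool, passes the assistant message it gets back (when the stop reason is tool use) to a dispatcher like this one, appends the returned `toolResult` blocks as a user message, and calls Converse again so the model can phrase its final answer.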

### Tests

4 changes: 0 additions & 4 deletions kotlin/services/bedrock-runtime/README.md
@@ -35,10 +35,6 @@ For prerequisites, see the [README](../../README.md#Prerequisites) in the `kotli
 - [Converse](src/main/kotlin/com/example/bedrockruntime/models/amazon/nova/text/Converse.kt#L6)
 - [ConverseStream](src/main/kotlin/com/example/bedrockruntime/models/amazon/nova/text/ConverseStream.kt#L6)

-### Amazon Nova Canvas
-
-- [InvokeModel](src/main/kotlin/com/example/bedrockruntime/models/amazon/nova/canvas/InvokeModel.kt#L6)
-
 ### Amazon Titan Text

 - [InvokeModel](src/main/kotlin/com/example/bedrockruntime/models/amazon/titan/text/InvokeModel.kt#L6)
25 changes: 23 additions & 2 deletions python/example_code/bedrock-runtime/README.md
@@ -48,6 +48,7 @@ python -m pip install -r requirements.txt
 Code examples that show you how to accomplish a specific task by calling multiple
 functions within the same service.

+- [Create and invoke a managed prompt](../bedrock-agent/prompts/scenario_get_started_with_prompts.py)
 - [Tool use with the Converse API](cross-model-scenarios/tool_use_demo/tool_use_demo.py)

 ### AI21 Labs Jurassic-2
@@ -105,8 +106,8 @@ functions within the same service.

 - [Converse](models/meta_llama/converse.py#L4)
 - [ConverseStream](models/meta_llama/converse_stream.py#L4)
-- [InvokeModel: Llama 3](models/meta_llama/llama3_invoke_model.py#L4)
-- [InvokeModelWithResponseStream: Llama 3](models/meta_llama/llama3_invoke_model_with_response_stream.py#L4)
+- [InvokeModel](models/meta_llama/llama3_invoke_model.py#L4)
+- [InvokeModelWithResponseStream](models/meta_llama/llama3_invoke_model_with_response_stream.py#L4)

 ### Mistral AI

@@ -153,6 +154,26 @@ This example shows you how to get started using Amazon Bedrock Runtime.
 python hello/hello_bedrock_runtime_invoke.py
 ```

+#### Create and invoke a managed prompt
+
+This example shows you how to do the following:
+
+- Create a managed prompt.
+- Create a version of the prompt.
+- Invoke the prompt using the version.
+- Clean up resources (optional).
+
+<!--custom.scenario_prereqs.bedrock-agent_GettingStartedWithBedrockPrompts.start-->
+<!--custom.scenario_prereqs.bedrock-agent_GettingStartedWithBedrockPrompts.end-->
+
+Start the example by running the following at a command prompt:
+
+```
+python ../bedrock-agent/prompts/scenario_get_started_with_prompts.py
+```
+
+<!--custom.scenarios.bedrock-agent_GettingStartedWithBedrockPrompts.start-->
+<!--custom.scenarios.bedrock-agent_GettingStartedWithBedrockPrompts.end-->

#### Tool use with the Converse API
