If you are using the Anthropic Python SDK, you can use the AsyncAnthropicBedrock client (install it with the bedrock extra: pip install -U "anthropic[bedrock]"). Note that await must run inside a coroutine (or a notebook REPL):

import asyncio

from anthropic import AsyncAnthropicBedrock

model_id = "my_model_id"  # replace with a Bedrock model ID you have access to
user_message = "Hello Claude!"

async def main():
    client = AsyncAnthropicBedrock()
    message = await client.messages.create(
        model=model_id,
        max_tokens=1024,
        messages=[
            {"role": "user", "content": user_message}
        ],
    )
    print(message.content)

asyncio.run(main())
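The Bedrock client resolves AWS credentials the same way the official AWS SDKs do (environment variables, shared credentials file, instance profile). If you prefer to configure the region and credentials explicitly, a configuration sketch with placeholder values:

```python
from anthropic import AsyncAnthropicBedrock

# Explicit configuration; the values below are placeholders, not real
# credentials. If these keywords are omitted, the client falls back to the
# standard AWS credential chain and the AWS_REGION environment variable.
client = AsyncAnthropicBedrock(
    aws_region="us-east-1",
    aws_access_key="<access-key-id>",
    aws_secret_key="<secret-access-key>",
)
```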
Answer from Willem on Stack Overflow

GitHub
github.com › anthropics › anthropic-sdk-python
GitHub - anthropics/anthropic-sdk-python
For a more fully fledged example see examples/bedrock.py. This library also provides support for the Anthropic Vertex API if you install this library with the vertex extra, e.g. pip install -U anthropic[vertex]. You can then import and instantiate a separate AnthropicVertex/AsyncAnthropicVertex class, which has the same API as the base Anthropic/AsyncAnthropic class.
Starred by 2.6K users
Forked by 415 users
Languages Python
PyPI
pypi.org › project › anthropic-bedrock
Anthropic Bedrock Python API library
Videos
27:16 · Building AI agents with Claude in Amazon Bedrock | Code w/ Claude ...
02:27 · How do I access Anthropic Claude models on Amazon Bedrock? - YouTube
05:18 · Claude API on AWS Bedrock - YouTube
20:03 · Will Anthropic's MCP work with other LLMs? - YES, with Amazon Bedrock.
AWS | AI | Amazon Bedrock | How To Use Anthropic Claude 3 ...
10:32 · How to Use Claude 3 Haiku Locally Using Amazon Bedrock in Boto3 ...
Top answer (1 of 2) · score 2
Answer 2 of 2 · score 0
You can find an example with asyncio in the AWS documentation's code examples repository:
Link: https://github.com/awsdocs/aws-doc-sdk-examples/blob/main/python/example_code/bedrock-runtime/models/anthropic_claude/converse_async.py
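The benefit of the async client is fanning several prompts out concurrently. A minimal sketch of that pattern with asyncio.gather; here `ask` is a hypothetical stub standing in for an awaitable Bedrock call, so the snippet runs without AWS access:

```python
import asyncio

async def ask(prompt: str) -> str:
    # Placeholder for the network round-trip of a real
    # client.messages.create(...) call.
    await asyncio.sleep(0)
    return f"answer to: {prompt}"

async def main() -> list:
    prompts = ["Hello Claude!", "Name three AWS regions."]
    # gather preserves input order while the calls run concurrently.
    return await asyncio.gather(*(ask(p) for p in prompts))

results = asyncio.run(main())
```

With a real client, each `ask` would await `client.messages.create(...)` instead of the stub.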
npm
npmjs.com › package › @anthropic-ai › bedrock-sdk
anthropic-ai/bedrock-sdk
import { AnthropicBedrock } from '@anthropic-ai/bedrock-sdk';

// Note: this assumes you have configured AWS credentials in a way
// that the AWS Node SDK will recognise, typically a shared `~/.aws/credentials`
// file or `AWS_ACCESS_KEY_ID` & `AWS_SECRET_ACCESS_KEY` environment variables.
//
// https://docs.aws.amazon.com/sdk-for-javascript/v3/developer-guide/setting-credentials-node.html
const client = new AnthropicBedrock();

async function main() {
  const message = await client.messages.create({
    model: 'anthropic.claude-3-5-sonnet-20241022-v2:0',
    messages: [
      {
        role: 'user',
        content: 'Hello!',
      },
    ],
    max_tokens: 1024,
  });
  console.log(message);
}

main();
» npm install @anthropic-ai/bedrock-sdk
Published Nov 18, 2025
Version 0.26.0
Author Anthropic
AWS
docs.aws.amazon.com › amazon bedrock › user guide › code examples for amazon bedrock using aws sdks › code examples for amazon bedrock runtime using aws sdks › anthropic claude for amazon bedrock runtime › invoke anthropic claude on amazon bedrock using the invoke model api
Invoke Anthropic Claude on Amazon Bedrock using the Invoke Model API - Amazon Bedrock
/**
 * To learn more about the Anthropic Messages API, go to:
 * https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-anthropic-claude-messages.html
 *
 * @param {string} prompt - The input text prompt for the model to complete.
 * @param {string} [modelId] - The ID of the model to use. Defaults to "anthropic.claude-3-haiku-20240307-v1:0".
 */
export const invokeModel = async (
  prompt,
  modelId = "anthropic.claude-3-haiku-20240307-v1:0",
) => {
  // Create a new Bedrock Runtime client instance.
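The Invoke Model API takes the Anthropic Messages request as a raw JSON body. A stdlib-only sketch of assembling that body in Python; the `anthropic_version` value `bedrock-2023-05-31` is the one the Bedrock user guide documents for the Messages API, and the actual `invoke_model` call is only indicated in a comment:

```python
import json

# Request body for InvokeModel with an Anthropic Claude model on Bedrock.
# "anthropic_version" must be "bedrock-2023-05-31" for the Messages API.
body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "messages": [
        {"role": "user", "content": "Hello Claude!"}
    ],
})

# With boto3, this string would be passed as the body argument:
# bedrock_runtime.invoke_model(modelId="anthropic.claude-3-haiku-20240307-v1:0", body=body)
decoded = json.loads(body)  # round-trip to confirm the body is valid JSON
```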
AWS
docs.aws.amazon.com › amazon bedrock › user guide › code examples for amazon bedrock using aws sdks › code examples for amazon bedrock runtime using aws sdks › anthropic claude for amazon bedrock runtime › invoke anthropic claude models on amazon bedrock using the invoke model api with a response stream
Invoke Anthropic Claude models on Amazon Bedrock using the Invoke Model API with a response stream - Amazon Bedrock
AWS
docs.aws.amazon.com › amazon bedrock › user guide › code examples for amazon bedrock using aws sdks › code examples for amazon bedrock runtime using aws sdks › anthropic claude for amazon bedrock runtime › invoke anthropic claude on amazon bedrock using bedrock's converse api
Invoke Anthropic Claude on Amazon Bedrock using Bedrock's Converse API - Amazon Bedrock
Send a text message to Anthropic Claude, using Bedrock's Converse API.

#[tokio::main]
async fn main() -> Result<(), BedrockConverseError> {
    tracing_subscriber::fmt::init();
    let sdk_config = aws_config::defaults(BehaviorVersion::latest())
        .region(CLAUDE_REGION)
        .load()
        .await;
    let client = Client::new(&sdk_config);
    let response = client
        .converse()
        .model_id(MODEL_ID)
        .messages(
            Message::builder()
                .role(ConversationRole::User)
                .content(ContentBlock::Text(USER_MESSAGE.to_string()))
                .build()
                .map_err(|_| "failed to build message")?,
        )
        .send()
        .await;
    match response {
        Ok(output) => {
            let text = ge
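Unlike the Anthropic-native Messages API, the Converse API uses a model-agnostic message shape in which content is a list of blocks like `{"text": ...}`. A stdlib-only Python sketch of the same user message as a Converse payload; the boto3 call itself is only indicated in a comment:

```python
# Converse API request shape: each message carries a list of content blocks.
messages = [
    {
        "role": "user",
        "content": [{"text": "Hello Claude!"}],
    }
]

# With boto3 this would be sent as (not executed here):
# client.converse(modelId="anthropic.claude-3-haiku-20240307-v1:0", messages=messages)
```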
PyPI
pypi.org › project › anthropic › 0.4.1
anthropic · PyPI
The Anthropic Python library provides ... definitions for all request params and response fields, and offers both synchronous and asynchronous clients powered by httpx. For the AWS Bedrock API, see ...
» pip install anthropic
GitHub
github.com › anthropics › anthropic-bedrock-python
GitHub - anthropics/anthropic-bedrock-python
Starred by 50 users
Forked by 7 users
Claude
docs.claude.com › en › api › client-sdks
Client SDKs - Claude Docs
Additional configuration is needed to use Anthropic’s Client SDKs through a partner platform. If you are using Amazon Bedrock, see this guide; if you are using Google Cloud Vertex AI, see this guide; if you are using Microsoft Foundry, see this guide.
npm
npmjs.com › package › @anthropic-ai › sdk
anthropic-ai/sdk
List methods in the Anthropic API are paginated. You can use the for await … of syntax to iterate through items across all pages:

async function fetchAllMessageBatches(params) {
  const allMessageBatches = [];
  // Automatically fetches more pages as needed.
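What the SDK automates here is cursor pagination: fetch a page, follow its cursor, stop when there is none. A language-agnostic sketch of that loop in Python; `fetch_page` is a hypothetical stub returning canned pages, not the real API:

```python
def fetch_page(cursor):
    # Hypothetical stub: maps a cursor to (items, next_cursor),
    # standing in for one HTTP request to a paginated list endpoint.
    pages = {None: ([1, 2], "a"), "a": ([3], None)}
    return pages[cursor]

def iterate_all():
    # Follow cursors until the server reports no further page.
    cursor, out = None, []
    while True:
        items, cursor = fetch_page(cursor)
        out.extend(items)
        if cursor is None:
            return out

all_items = iterate_all()
```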
» npm install @anthropic-ai/sdk
Published Dec 06, 2025
Version 0.71.2
Author Anthropic
GitHub
github.com › mustafaaljadery › anthropic-bedrock
GitHub - mustafaaljadery/anthropic-bedrock: NodeJS and Python library to interact with Anthropic's models via AWS Bedrock.
import Anthropic from "anthropic-bedrock";

const anthropic = new Anthropic({
  access_key: process.env["AWS_ACCESS_KEY"],
  secret_key: process.env["AWS_SECRET_KEY"],
});

async function main() {
  const completion = await anthropic.Completion.create({
    model: "anthropic.claude-v2",
    prompt: "In one sentence, what is good about the color blue?",
Author mustafaaljadery
GitHub
github.com › pipecat-ai › pipecat › issues › 1141
Documentation for implementation of Anthropic via Amazon Bedrock · Issue #1141 · pipecat-ai/pipecat
December 9, 2024 - From this pull request, my understanding is that you can switch the client to the Anthropic Bedrock client.
Published Feb 05, 2025
GitHub
microsoft.github.io › autogen › stable › reference › python › autogen_ext.models.anthropic.html
autogen_ext.models.anthropic — AutoGen
Required if using a model from AWS Bedrock. To use this client, you must install the Anthropic extension: ...

import asyncio
from autogen_ext.models.anthropic import AnthropicBedrockChatCompletionClient, BedrockInfo
from autogen_core.models import UserMessage, ModelInfo

async def main():
    anthropic_client = AnthropicBedrockChatCompletionClient(
        model="anthropic.claude-3-5-sonnet-20240620-v1:0",
        temperature=0.1,
        model_info=ModelInfo(
            vision=False,
            function_calling=True,
            json_output=False,
            family="unknown",
            structured_output=True,
        ),
        bedrock_info=BedrockInfo(
            aws_access_key="<aws_access_key>",
            aws_secret_key="<aws_secret_key>",
            aws_session_token="<aws_session_token>",
            aws_region="<aws_region>",
        ),
    )
    result = await anthropic_client.create(
        [UserMessage(content="What is the capital of France?", source="user")]
    )  # type: ignore
    print(result)

if __name__ == "__main__":
    asyncio.run(main())
GitHub
github.com › anthropics › anthropic-sdk-typescript › issues › 399
Bedrock: empty content for certain prompts with max_tokens 1 · Issue #399 · anthropics/anthropic-sdk-typescript
April 24, 2024 -

import AnthropicBedrock from '@anthropic-ai/bedrock-sdk';

const client = new AnthropicBedrock();

async function main(prompt) {
  const message = await client.messages.create({
    max_tokens: 1,
    stream: false,
    messages: [{role: 'user', content: prompt}],
    model: 'anthropic.claude-3-sonnet-20240229-v1:0',
  });
  console.log('Response content: ', message.content);
}

await main('Explain quantum mechanics to a five year old');
// [] <-- ERROR
await main('Translate Hello into French and Spanish');
// [ { type: 'text', text: 'Here' } ]

dmellott ·
Published Apr 24, 2024
Instructor
python.useinstructor.com › integrations › bedrock
Structured Outputs with AWS Bedrock and Pydantic - Instructor
import instructor

# Auto client with model specification
client = instructor.from_provider("bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0")

# The auto client automatically handles:
# - AWS credential detection from environment
# - Region configuration (defaults to us-east-1)
# - Mode selection based on model (Claude models use BEDROCK_TOOLS)

...

The _async argument to instructor.from_bedrock is deprecated.
AWS
docs.aws.amazon.com › amazon bedrock › user guide › amazon bedrock foundation model information › inference request parameters and response fields for foundation models › anthropic claude models › anthropic claude messages api
Anthropic Claude Messages API - Amazon Bedrock
The timeout period for inference calls to Anthropic Claude 3.7 Sonnet and Claude 4 models is 60 minutes. By default, AWS SDK clients timeout after 1 minute. We recommend that you increase the read timeout period of your AWS SDK client to at least 60 minutes.
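Following that advice, a configuration sketch for raising the read timeout on a boto3 Bedrock Runtime client (assumes boto3/botocore are installed and AWS credentials are configured; the values shown are just the guide's 60-minute recommendation, not a tested production setting):

```python
import boto3
from botocore.config import Config

# Raise the read timeout from the 1-minute default to 60 minutes so that
# long inference calls to Claude models are not cut off client-side.
config = Config(read_timeout=3600, connect_timeout=60)
bedrock_runtime = boto3.client("bedrock-runtime", config=config)
```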
GitHub
github.com › anthropics › anthropic-sdk-python › blob › main › README.md
anthropic-sdk-python/README.md at main · anthropics/anthropic-sdk-python
For a more fully fledged example see examples/bedrock.py. This library also provides support for the Anthropic Vertex API if you install this library with the vertex extra, e.g. pip install -U anthropic[vertex]. You can then import and instantiate a separate AnthropicVertex/AsyncAnthropicVertex class, which has the same API as the base Anthropic/AsyncAnthropic class.
Author anthropics
Philschmid
philschmid.github.io › easyllm › examples › bedrock-chat-completion-api
How to use Chat Completion clients with Amazon Bedrock - EasyLLM
We avoid blocking code execution by using asynchronous functions. Any questions on this? ...

# example without a system message and debug flag on:
response = bedrock.ChatCompletion.create(
    model="anthropic.claude-v2",
    messages=[
        {"role": "user", "content": "Explain asynchronous programming in the style of the pirate Blackbeard."},
    ]
)
print(response['choices'][0]['message']['content'])
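The indexing at the end of that snippet relies on the OpenAI-style response shape EasyLLM returns. A stdlib-only sketch of that shape (the response text here is invented purely to illustrate the structure):

```python
# Minimal mock of an OpenAI-style chat-completion response dict, showing
# the structure that response['choices'][0]['message']['content'] walks.
response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Arr, matey."}}
    ]
}
content = response["choices"][0]["message"]["content"]
```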