GitHub
github.com › anthropics › anthropic-sdk-python
GitHub - anthropics/anthropic-sdk-python
By default, the async client uses httpx for HTTP requests. However, for improved concurrency performance you may also use aiohttp as the HTTP backend. ...
```python
import asyncio

from anthropic import DefaultAioHttpClient
from anthropic import AsyncAnthropic


async def main() -> None:
    async with AsyncAnthropic(
        api_key="my-anthropic-api-key",
        http_client=DefaultAioHttpClient(),
    ) as client:
        message = await client.messages.create(
            max_tokens=1024,
            messages=[
                {
                    "role": "user",
                    "content": "Hello, Claude",
                }
            ],
            model="claude-sonnet-4-5-20250929",
        )
        print(message.content)


asyncio.run(main())
```
Starred by 2.6K users
Forked by 415 users
Languages Python
PyPI
pypi.org › project › anthropic › 0.3.9
anthropic · PyPI
```diff
- client = anthropic.Client(os.environ["ANTHROPIC_API_KEY"])
+ client = anthropic.AsyncAnthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

- await client.acompletion(**params)
+ await client.completions.create(**params)
```
» pip install anthropic
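A rough end-to-end sketch of the migrated call, hedged: the migration diff above only shows the changed lines, and the model name, prompt constants (HUMAN_PROMPT / AI_PROMPT), and max_tokens_to_sample below are assumptions taken from the legacy Completions API of that era, not from the snippet itself.
```python
import asyncio
import os

from anthropic import AsyncAnthropic, HUMAN_PROMPT, AI_PROMPT


async def main() -> None:
    # New-style construction: api_key is now a keyword argument.
    client = AsyncAnthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
    # acompletion(**params) becomes completions.create(**params).
    completion = await client.completions.create(
        model="claude-2.1",  # assumed legacy completions-era model
        max_tokens_to_sample=300,
        prompt=f"{HUMAN_PROMPT} Hello, Claude{AI_PROMPT}",
    )
    print(completion.completion)
    await client.close()


asyncio.run(main())
```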
GitHub
github.com › anthropics › anthropic-sdk-python › blob › main › helpers.md
anthropic-sdk-python/helpers.md at main · anthropics/anthropic-sdk-python
The synchronous client has the same interface, just without async/await. Provides an iterator over just the text deltas in the stream:
```python
async for text in stream.text_stream:
    print(text, end="", flush=True)
print()
```
The events listed here are just the event types that the SDK extends; for a full list of the events returned by the API, see these docs.
```python
from anthropic import AsyncAnthropic

client = AsyncAnthropic()

async with client.messages.stream(
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Say hello there!",
        }
    ],
    model="claude-3-5-sonnet-latest",
) as stream:
    async for event in stream:
        ...
```
Author anthropics
Claude
docs.claude.com › en › api › client-sdks
Client SDKs - Claude Docs
We provide client libraries in a number of popular languages that make it easier to work with the Claude API. ... This page includes brief installation instructions and links to the open-source GitHub repositories for Anthropic’s Client SDKs.
GitHub
github.com › anthropics › anthropic-sdk-java
GitHub - anthropics/anthropic-sdk-java
To switch to asynchronous execution, call the async() method: import com.anthropic.client.AnthropicClient; import com.anthropic.client.okhttp.AnthropicOkHttpClient; import com.anthropic.models.messages.Message; import com.anthropic.models.m...
Starred by 191 users
Forked by 37 users
Languages Kotlin
CodeSignal
codesignal.com › learn › courses › parallelizing-claude-agentic-systems-in-python › lessons › concurrent-agent-conversations
Going Async with Claude Agents | CodeSignal Learn
This single-line change tells your agent to use the async version of the Anthropic client. The AsyncAnthropic client has the same interface as the regular client, but its methods return "awaitables" instead of immediate results.
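A minimal sketch of that one-line swap (the model name and prompts here are placeholders, not from the lesson): both clients expose the same messages.create signature, but only the async one is awaited.
```python
import asyncio

from anthropic import Anthropic, AsyncAnthropic


def ask_sync(prompt: str) -> str:
    # Synchronous client: the call blocks until the response arrives.
    client = Anthropic()
    message = client.messages.create(
        model="claude-3-5-sonnet-latest",
        max_tokens=256,
        messages=[{"role": "user", "content": prompt}],
    )
    return message.content[0].text


async def ask_async(prompt: str) -> str:
    # Same method names and arguments, but the client is AsyncAnthropic
    # and the call returns an awaitable instead of an immediate result.
    client = AsyncAnthropic()
    message = await client.messages.create(
        model="claude-3-5-sonnet-latest",
        max_tokens=256,
        messages=[{"role": "user", "content": prompt}],
    )
    return message.content[0].text


if __name__ == "__main__":
    print(ask_sync("Name one moon of Jupiter."))
    print(asyncio.run(ask_async("Name one moon of Saturn.")))
```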
npm
npmjs.com › package › @anthropic-ai › sdk
anthropic-ai/sdk
You can access most beta API features through the beta property of the client. To enable a particular beta feature, you need to add the appropriate beta header to the betas field when creating a message. ...
```typescript
import Anthropic from 'npm:@anthropic-ai/sdk';

const client = new Anthropic();

const response = await client.beta.messages.create({
  max_tokens: 1024,
  model: 'claude-sonnet-4-5-20250929',
  messages: [
    {
      role: 'user',
      content: [
        {
          type: 'text',
          text: "What's 4242424242 * 4242424242?",
        },
      ],
    },
  ],
  tools: [
    {
      name: 'code_execution',
      type: 'code_execution_20250522',
    },
  ],
  betas: ['code-execution-2025-05-22'],
});
```
» npm install @anthropic-ai/sdk
Published Dec 06, 2025
Version 0.71.2
Author Anthropic
Stevanfreeborn
anthropicclient.stevanfreeborn.com
AnthropicClient | AnthropicClient
If you encounter any issues while using this library, please open an issue here. This library is licensed under the MIT License and is free to use and modify. If you would like to contribute to this library, please open a pull request here. ... Used to support async interfaces when streaming messages.
Argilla-io
argilla-io.github.io › distilabel › 1.2.1 › api › llm › anthropic
Anthropic - Distilabel Docs
Generate text:
```python
from distilabel.llms import AnthropicLLM

llm = AnthropicLLM(model="claude-3-opus-20240229", api_key="api.key")
llm.load()

# Synchronous request
output = llm.generate(inputs=[[{"role": "user", "content": "Hello world!"}]])

# Asynchronous request
output = await llm.agenerate(input=[{"role": "user", "content": "Hello world!"}])
```
Generate structured data:
```python
from pydantic import BaseModel
from distilabel.llms import AnthropicLLM


class User(BaseModel):
    name: str
    last_name: str
    id: int


llm = AnthropicLLM(
    model="claude-3-opus-20240229",
    api_key="api.key",
    structured_output={"schema": User},
)
llm.load()

output = llm.generate(inputs=[[{"role": "user", "content": "Create a user profile for the following marathon"}]])
```
Daniel Liden
danliden.com › notes › 20240418-instructor-async.html
Asynchronous Instructor
April 18, 2024 -
```python
import instructor
from anthropic import Anthropic, AsyncAnthropic
from dotenv import load_dotenv
from pydantic import BaseModel, Field

# I had ANTHROPIC_API_KEY in a .env file
load_dotenv()

# set up the async client
aclient = instructor.from_anthropic(AsyncAnthropic())

# set up pydantic schema
class Topics(BaseModel):
    candidates: list = Field(
        description=(
            "List of topics that might be considered among the main topics of the talk."
```
SourceForge
sourceforge.net › projects › anthropic-sdk-python.mirror
Anthropic SDK Python download | SourceForge.net
The SDK supports both synchronous and asynchronous usage (via async/await) depending on context. Importantly, it also supports streaming responses via Server-Sent Events (SSE) so that large outputs can be consumed incrementally rather than waiting ...
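A short sketch of consuming a streamed response incrementally with the async client (model name and prompt are illustrative assumptions; the stream helper mirrors the helpers.md snippet above):
```python
import asyncio

from anthropic import AsyncAnthropic


async def main() -> None:
    client = AsyncAnthropic()
    # messages.stream() opens an SSE connection; text_stream yields
    # text deltas as they arrive instead of waiting for the full reply.
    async with client.messages.stream(
        model="claude-3-5-sonnet-latest",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Write a haiku about streams."}],
    ) as stream:
        async for text in stream.text_stream:
            print(text, end="", flush=True)
    print()
    await client.close()


asyncio.run(main())
```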
PydanticAI
ai.pydantic.dev › models › anthropic
Anthropic - Pydantic AI
from httpx import AsyncClient from pydantic_ai import Agent from pydantic_ai.models.anthropic import AnthropicModel from pydantic_ai.providers.anthropic import AnthropicProvider custom_http_client = AsyncClient(timeout=30) model = AnthropicModel( 'claude-sonnet-4-5', provider=AnthropicProvider(api_key='your-api-key', http_client=custom_http_client), ) agent = Agent(model) ...
Top answer 1 of 2
3
The Anthropic client needs to be closed:
```python
import asyncio
from time import perf_counter

import numpy as np

from anthropic import Anthropic, AsyncAnthropic


async def coroutine1():
    prompt = "Who discovered gravity. Answer in 10 words."
    client = AsyncAnthropic()
    message = await client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    )
    await client.close()
    return message.content[0].text


async def coroutine2():
    prompt = "Who discovered radioactivity. Answer in 10 words."
    client = AsyncAnthropic()
    message = await client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    )
    await client.close()
    return message.content[0].text


async def main():
    async with asyncio.TaskGroup() as tg:
        t0 = tg.create_task(coroutine1())
        t1 = tg.create_task(coroutine2())
    print(t0.result())
    print(t1.result())


if __name__ == "__main__":
    for i in range(10):
        tic = perf_counter()
        asyncio.run(main())
        toc = perf_counter()
        print(f"Elapsed time = {toc - tic:.3f}")
```
2 of 2
0
I think this is a library problem, since the following code works fine:
```python
import asyncio
import aiohttp
from time import perf_counter


async def coroutine1():
    url = "https://www.google.com"
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            content = await response.text()
            return content[:100]  # return a snippet of the content for demonstration


async def coroutine2():
    url = "https://www.google.com"
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            content = await response.text()
            return content[:100]  # return a snippet of the content for demonstration


async def main():
    async with asyncio.TaskGroup() as tg:
        t0 = tg.create_task(coroutine1())
        t1 = tg.create_task(coroutine2())
    print(t0.result())
    print(t1.result())


if __name__ == "__main__":
    for i in range(10):
        tic = perf_counter()
        asyncio.run(main())
        toc = perf_counter()
        print(f"Elapsed time = {toc - tic:.3f}")
```
Instructor
python.useinstructor.com › blog › category › anthropic
anthropic - Instructor
A special shoutout to Shreya for her contributions to the anthropic support. As of now, all features are operational with the exception of streaming support. For those eager to experiment, simply patch the client with ANTHROPIC_JSON, which will enable you to leverage the anthropic client for making requests.
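A hedged sketch of what that patching might look like (the from_anthropic entry point and Mode.ANTHROPIC_JSON names are assumptions based on the Daniel Liden note above; the exact API has shifted across instructor versions):
```python
import instructor
from anthropic import Anthropic
from pydantic import BaseModel


class Country(BaseModel):
    name: str
    capital: str


# Wrap the Anthropic client so responses are parsed into the pydantic model.
# Mode.ANTHROPIC_JSON asks the model to reply as JSON matching the schema.
client = instructor.from_anthropic(Anthropic(), mode=instructor.Mode.ANTHROPIC_JSON)

country = client.messages.create(
    model="claude-3-5-sonnet-latest",
    max_tokens=256,
    messages=[{"role": "user", "content": "Tell me about France."}],
    response_model=Country,
)
print(country)
```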