You don't want to use either of these approaches any more, as they're only valid for older (2.x) models.

Instead, if you need to get token counts for a message before generating a response, you will want to use the count_tokens endpoint (that link points at the Python SDK, but the endpoint is also available in the TypeScript SDK and via plain HTTP requests).

The other, more accurate option is to generate your message and then read the exact number of input and output tokens from the resulting Message object via message.usage.input_tokens and message.usage.output_tokens.

Answer from Kyle on Stack Overflow
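
A minimal sketch of both options described in the answer above, using the anthropic Python SDK; the model name and prompt are placeholders, and an ANTHROPIC_API_KEY environment variable is assumed:

    import anthropic

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

    messages = [{"role": "user", "content": "Hello, Claude"}]
    model = "claude-3-5-sonnet-latest"  # placeholder; use a model you have access to

    # Option 1: count the input tokens before generating anything
    count = client.messages.count_tokens(model=model, messages=messages)
    print("input tokens (pre-flight):", count.input_tokens)

    # Option 2: generate the message and read the exact usage from the response
    message = client.messages.create(model=model, max_tokens=256, messages=messages)
    print("input tokens:", message.usage.input_tokens)
    print("output tokens:", message.usage.output_tokens)
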
🌐
GitHub
github.com › anthropics › anthropic-sdk-python
GitHub - anthropics/anthropic-sdk-python
The Anthropic Python library provides convenient access to the Anthropic REST API from any Python 3.9+ application. It includes type definitions for all request params and response fields, and offers both synchronous and asynchronous clients ...
Starred by 2.6K users
Forked by 415 users
Languages   Python
🌐
PyPI
pypi.org › project › anthropic › 0.3.9
Anthropic Python API Library
The Anthropic Python library provides convenient access to the Anthropic REST API from any Python 3.7+ application. It includes type definitions for all request params and response fields, and offers both synchronous and asynchronous clients ...
      » pip install anthropic
    
Published   Aug 12, 2023
Version   0.3.9
Discussions

Anthropic released an official Python SDK for Claude Code
What are some use cases for this?
🌐 r/ClaudeAI
June 14, 2025
python - Best way to count tokens for Anthropic Claude Models using the API? - Stack Overflow
I'm working with Anthropic's Claude models and need to accurately count the number of tokens in my prompts and responses. I'm using the anthropic_bedrock Python client but recently came across an alternative method using the anthropic client.
🌐 stackoverflow.com
Ask HN: Python Meta-Client for OpenAI, Anthropic, Gemini and other LLM API-s?
Simon Willison llm seems close but seems to focus much on the CLI and not the library, also doesn't seem to give precise control for non-OpenAI models (might be wrong here) · context - i'm the repo maintainer
🌐 news.ycombinator.com
December 31, 2023
Anthropic's Python SDK (safety-first language model APIs)
Released yesterday by Mike Lambert. Example call:

    # imports needed for the snippet to run
    import os
    import anthropic

    def main(max_tokens_to_sample: int = 200):
        c = anthropic.Client(os.environ['ANTHROPIC_API_KEY'])
        response = c.completion_stream(
            prompt=f"{anthropic.HUMAN_PROMPT} How many toes do dogs have?\n{anthropic.AI_PROMPT}",
            stop_sequences=[anthropic.HUMAN_PROMPT],
            max_tokens_to_sample=max_tokens_to_sample,
            model='claude-v0',
            stream=True,
        )
        for data in response:
            print(data)

(skippable) Some background on Anthropic

OpenAI has ~375 employees, Anthropic has ~45. OpenAI median comp is ~$620k, which is top of market, and the people at Anthropic (which split off from OpenAI) are just as talented. In my opinion their best work is the Transformer Circuits Thread Project. Anthropic raised $124 million in a Series A round in 2021, then a $580 million Series B. But the Series B was led by SBF, and it's unclear how the FTX crash affected them. OpenAI's latest funding round details are here: https://fortune.com/2023/01/11/structure-openai-investment-microsoft/. For more info, try reading all posts with the "Anthropic" tag at the AI alignment forum.

Docs coming soon

This snippet is a confirmation that Anthropic is going to monetize API access to Claude, which I don't think anyone doubted, but it's evidence that keys will be sent out sooner rather than later.

    # NOTE: disabling_checks can lead to very poor sampling quality from our API.
    # _Please_ read the docs on "Claude instructions when using the API" before disabling this
    _validate_prompt(params['prompt'])

Differences from OpenAI's API

The OpenAI equivalent of this can be found at https://github.com/openai/openai-python. It's a lot more fleshed out from an engineering POV, but in principle the gap shouldn't be too hard to close rapidly, so I'd wait for the SDK to mature up to parity before passing high-level judgements. I'm interested in seeing what decisions they make in their API design because that could provide insights into what they're thinking on a strategic/philosophical/cultural level. I think in general you can tell a lot about a team by reading their code. With the above caveat in mind, to level the playing field a bit, let's look at the initial commit for OpenAI and put it next to the initial commit by Anthropic. Noting down a few observations I found interesting (plus some commentary):

  • OpenAI's library was forked from Stripe's. Anthropic's library seems to have been made from scratch. Of the 527 people on LinkedIn who have OpenAI listed as their current company (not sure if all 375 people who actually work at OpenAI are represented in that), only 11 worked previously at Stripe, which is a smaller proportion than I expected! However, one of those 11 people is Greg Brockman, President and Co-Founder of OpenAI, who is also the author of this commit! OpenAI was founded in 2015, and this commit is from 2020. It'd be cool to see where it diverged and what changes the OpenAI fork made to Stripe's code.

  • Snooping the gitignore is kinda mandatory at this point. Anthropic initial: .env, __pycache__, .DS_Store, **/.DS_Store. Hah! They're using mac :D OpenAI initial: *.egg-info, __pycache__, /public/dist. OpenAI's gitignore contains a few more things now, but git blame shows most of them are due to Azure endpoints for finetuning. Also one line that's dead code leftover from an old thing that got removed, but I'm not going to talk about the present version too much since we're comparing initial commits.

  • Simplicity: OpenAI's initial commit is 52 files and 8,074 lines; Anthropic's is 9 files and 241 lines.
🌐 r/mlscaling
May 25, 2022
🌐
Claude
docs.claude.com › en › api › client-sdks
Client SDKs - Claude Docs
Additional configuration is needed to use Anthropic’s Client SDKs through a partner platform. If you are using Amazon Bedrock, see this guide; if you are using Google Cloud Vertex AI, see this guide; if you are using Microsoft Foundry, see this guide. Python library GitHub repo Requirements: Python 3.8+ Installation:
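
As a rough illustration of the installation path the docs describe, here is a minimal quickstart sketch for direct (non-partner-platform) use; the model name is a placeholder and an ANTHROPIC_API_KEY environment variable is assumed:

    # pip install anthropic
    import anthropic

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

    # Basic call (model name is a placeholder)
    message = client.messages.create(
        model="claude-3-5-sonnet-latest",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Hello, Claude"}],
    )
    print(message.content[0].text)

    # Streaming variant using the SDK's context-manager helper
    with client.messages.stream(
        model="claude-3-5-sonnet-latest",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Write a one-line poem"}],
    ) as stream:
        for text in stream.text_stream:
            print(text, end="", flush=True)
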
🌐
Reddit
reddit.com › r/claudeai › anthropic released an official python sdk for claude code
r/ClaudeAI on Reddit: Anthropic released an official Python SDK for Claude Code
June 14, 2025 -

Anthropic has officially released a Python SDK for Claude Code, and it’s built specifically with developers in mind. This makes it way easier to bring Claude’s code generation and tool use capabilities into your own Python projects

What it offers:

  • Tool use support

  • Streaming output

  • Async & sync support

  • File support

  • Built-in chat structure

GitHub repo: https://github.com/anthropics/claude-code-sdk-python

I'd love to hear your ideas on how you plan to put this to use
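
For a sense of what driving the SDK looks like, here is a hedged sketch based on the linked repo's README; the query() async generator and the anyio runner are assumptions to verify against the repo, and the Claude Code CLI must be installed separately.

    # pip install claude-code-sdk  (the Claude Code CLI is a separate prerequisite)
    import anyio
    from claude_code_sdk import query  # assumed entry point; see the repo README

    async def main():
        # query() streams messages produced while Claude Code works on the prompt
        async for message in query(prompt="Write a haiku about foo.py"):
            print(message)

    anyio.run(main)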

🌐
Anthropic
anthropic.com › news › agent-capabilities-api
New capabilities for building agents on the Anthropic API
May 22, 2025 - We're introducing a code execution tool on the Anthropic API, giving Claude the ability to run Python code in a sandboxed environment to produce computational results and data visualizations.
🌐
DEV Community
dev.to › thomastaylor › anthropic-claude-with-tools-using-python-sdk-2fio
Anthropic Claude with tools using Python SDK - DEV Community
April 21, 2024 - It will not provide any other information about the stock or company.", "input_schema": { "type": "object", "properties": { "symbol": { "type": "string", "description": "The stock ticker symbol, e.g. AAPL for Apple Inc.", } }, "required": ["symbol"], }, } ] response = talk( client, tools, "claude-3-haiku-20240307", "What is the price of Apple?" ) print(response) ... The current stock price for Apple (ticker symbol AAPL) is $150.00. ... The purpose of this post was to get you started by providing a foundational understanding of Anthropic tool usage using the Python SDK.
🌐
DataCamp
datacamp.com › tutorial › claude-sonnet-api-anthropic
Claude Sonnet 3.5 API Tutorial: Getting Started With Anthropic's API | DataCamp
June 26, 2024 - To connect through the Claude 3.5 Sonnet API, obtain your API key from Anthropic, install the anthropic Python library, and use it to send requests and receive responses from Claude 3.5 Sonnet.
Find elsewhere
🌐
PyPI
pypi.org › project › anthropic › 0.2.8
Anthropic Python SDK
This python repo provides access to Anthropic's safety-first language model APIs. For more information on our APIs, please check out Anthropic's documentation. ...

    import anthropic
    client = anthropic.Client(api_key=<insert token here>)
    client.XXX  # look to examples/ directory for code demonstrations
      » pip install anthropic
    
Published   May 08, 2023
Version   0.2.8
🌐
Instructor
python.useinstructor.com › integrations › anthropic
Anthropic Claude Tutorial: Structured Outputs with Instructor - Instructor
    import asyncio
    import instructor  # needed for the snippet to run

    # `User` is a pydantic model defined earlier in the tutorial (not shown in this snippet)
    async_client = instructor.from_provider(
        "anthropic/claude-3-5-haiku-latest",
        async_client=True,
        mode=instructor.Mode.ANTHROPIC_TOOLS,
    )

    async def extract_user():
        return await async_client.create(
            messages=[{"role": "user", "content": "Extract: Jason is 25 years old"}],
            response_model=User,
        )

    user = asyncio.run(extract_user())
    print(user)
🌐
AWS
docs.aws.amazon.com › amazon bedrock › user guide › code examples for amazon bedrock using aws sdks › code examples for amazon bedrock runtime using aws sdks › anthropic claude for amazon bedrock runtime › invoke anthropic claude on amazon bedrock using bedrock's converse api
Invoke Anthropic Claude on Amazon Bedrock using Bedrock's Converse API - Amazon Bedrock
SDK for Python (Boto3) There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository · . Send a text message to Anthropic Claude, using Bedrock's Converse API. # Use the Conversation API to send a text message to Anthropic Claude. import boto3 from botocore.exceptions import ClientError # Create a Bedrock Runtime client in the AWS Region you want to use.
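
The snippet above cuts off before the actual call; here is a minimal, hedged sketch of the Converse flow with boto3. The region, model ID, and prompt are placeholders, and AWS credentials with Bedrock model access are assumed.

    import boto3
    from botocore.exceptions import ClientError

    # Create a Bedrock Runtime client in the AWS Region you want to use (placeholder region)
    client = boto3.client("bedrock-runtime", region_name="us-east-1")

    model_id = "anthropic.claude-3-haiku-20240307-v1:0"  # placeholder Bedrock model ID
    messages = [{"role": "user", "content": [{"text": "How many toes do dogs have?"}]}]

    try:
        response = client.converse(
            modelId=model_id,
            messages=messages,
            inferenceConfig={"maxTokens": 256, "temperature": 0.5},
        )
        print(response["output"]["message"]["content"][0]["text"])
    except ClientError as err:
        print(f"Could not invoke {model_id}: {err}")
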
🌐
Anthropic
anthropic.com › api
Claude Developer Platform
Connect Claude to any remote MCP server without writing client code. ... Run Python code, create visualizations, and analyze data directly within API calls.
🌐
DataCamp
datacamp.com › tutorial › getting-started-with-claude-3-and-the-claude-3-api
Getting Started with Claude 3 and the Claude 3 API | DataCamp
March 13, 2024 - Follow the simple steps to sign ... model. ... The third way to access the Claude 3 models is through the API. Anthropic offers client Software Development Kits (SDKs) for Python and TypeScript....