🌐
GitHub
github.com › anthropics › claude-code › issues › 760
[BUG] API Error: 400 Input is too long for requested model. · Issue #760 · anthropics/claude-code
April 10, 2025 - Environment Platform (select one): Anthropic API AWS Bedrock Google Vertex AI Other: Claude CLI version: 0.2.67 Operating System: macOS 15.3.2 Terminal: Warp Bug Description Getting this error after attempting to read a large file, ...
🌐
Stack Overflow
stackoverflow.com › questions › 78836086 › why-do-i-get-an-error-saying-input-is-too-long-for-requested-model-with-my-cla
python - Why do I get an error saying 'Input is too long for requested model' with my Claude API call on AWS Bedrock? - Stack Overflow
response = client_bedrock.invoke_model(modelId=anthropic_modelid, contentType='application/json', accept='application/json', body=json.dumps(anthropic_payload)) Where "promptMessages" is an array of messages, with the first message including a "<document> ....</document>" block that is a text representation of a number of different file formats uploaded to S3. I tried a truncated string and it worked; the full text fails. ...
🌐
GitHub
github.com › RooCodeInc › Roo-Code › issues › 772
Claude 3.5 Token Limit Error (400) Prevents Workflow Continuation · Issue #772 · RooCodeInc/Roo-Code
February 4, 2025 - This issue is also present in the original Cline project (cline/cline#1275). When an API request exceeds the maximum token limit (200k tokens), the system returns a 400 error as expected.
🌐
AWS re:Post
repost.aws › questions › QUshd0uzCZRAy1TbudkUKhww › claude-on-bedrock-giving-input-is-too-long-for-requested-model-for-10k-token-inputs-edit-broken-in-eu-central-1-working-in-other-regions
Claude on Bedrock giving "Input is too long for requested model" for ~10k token inputs (edit: broken in eu-central-1, working in other regions) | AWS re:Post
October 27, 2023 - Ideally that error message would be a bit better, like "Your input is 70,000 tokens and you've asked for 90,000 back; this exceeds the model's capability." ... So I re-tested this last night in eu-central-1 and managed to send about 70k tokens through and it worked, so I think someone might have fixed it. Let me know if someone else is able to double-check this. ... @DaveM Reproduction case in the first post still fails for me in Frankfurt using Claude v2 via the Chat Playground in the Bedrock UI.
🌐
Claude
docs.claude.com › en › api › errors
Errors - Claude Docs
400 - invalid_request_error: There was an issue with the format or content of your request. We may also use this error type for other 4XX status codes not listed below. 401 - authentication_error: There’s an issue with your API key.
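The error body shape documented above can be handled generically. A minimal sketch, using only the stdlib and the JSON error shape from the docs; the function name and the returned strings are my own, not part of the API:

```python
import json

def classify_error(status: int, body: str) -> str:
    """Map an Anthropic API error response to a short diagnosis.

    Error bodies look like:
    {"type": "error", "error": {"type": "...", "message": "..."}}
    """
    err = json.loads(body)["error"]
    if status == 400 and err["type"] == "invalid_request_error":
        return f"bad request: {err['message']}"
    if status == 401:
        return f"auth problem: {err['message']}"
    return f"{status} {err['type']}: {err['message']}"

sample = '{"type":"error","error":{"type":"invalid_request_error","message":"Input is too long for requested model."}}'
print(classify_error(400, sample))  # → bad request: Input is too long for requested model.
```

Note that, per the docs, `invalid_request_error` also covers other 4XX conditions, so the message text is the only way to distinguish an oversized input from, say, a bad model name.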
🌐
GitHub
github.com › anthropics › claude-code › issues › 558
API Error: 400 while using the Claude Code in my Terminal · Issue #558 · anthropics/claude-code
March 19, 2025 - Hi there I have been facing the same error "API Error: 400" for the last three days. ⎿ API Error: 400 I have taken every single step to solve this issue, but nothing has worked. My Anthro...
🌐
GitHub
github.com › anthropics › claude-code › issues › 5220
Low Context Management in Claude Code CLI · Issue #5220 · anthropics/claude-code
August 6, 2025 - Please use pagination, filtering, or limit parameters to reduce the response size.\n at zN0 (file:///Users/user/.claude/local/node_modules/@anthropic-ai/claude-code/cli.js:1295:7872)\n at process.processTicksAndRejections (node:internal/pro...
🌐
n8n
community.n8n.io › help me build my workflow
Input is too long for requested model error - Help me Build my Workflow - n8n Community
August 14, 2025 - The model returned the following errors: Input is too long for requested model. i am getting this error when the MCP client node is trying to fetch some detail and the output is too long thus the bedrock is unable to pro…
🌐
X
x.com › gecko655 › status › 1945443019383333165
gecko655 on X: "Claude Code, I wish you'd somehow retry "API Error: 400 Input is too long for requested model." automatically…" / X
Claude Code, I wish you'd somehow retry "API Error: 400 Input is too long for requested model." automatically…
🌐
Portkey
portkey.ai › error-library › input-length-error-10153
[Solved] Prompt is too long: the number of tokens exceeds the maximum allowed limit.
Token Limit Exceeded: The prompt contains more tokens than the maximum allowed by the Anthropic API. Each API has a specified limit on the number of tokens that can be processed in a single request, and exceeding this limit results in a 400 error.
🌐
Stack Overflow
stackoverflow.com › questions › 79683907 › error-in-the-name-of-claude-4-sonnet-while-using-in-cursor-as-custom-model
Error in the name of Claude 4 sonnet while using in cursor as custom model - Stack Overflow
Since you are getting a 400, it means the provider Cursor is pointing at does not recognise the model string, which is claude-4-sonnet-20250522 in your case. Figure out the correct model string for your provider and try again.
🌐
Stack Overflow
stackoverflow.com › questions › 79632861 › api-error-400-invalid-beta-flag-when-trying-to-use-claude-code-with-bedrock-u
"API Error: 400 invalid beta flag" when trying to use Claude Code with Bedrock using claude 3.5 haiku - Stack Overflow
$ ANTHROPIC_MODEL='anthropic.claude-3-5-haiku-20241022-v1:0' claude -p hi API Error: 400 invalid beta flag $ ANTHROPIC_MODEL='us.anthropic.claude-3-5-haiku-20241022-v1:0' claude -p hi API Error: 400 invalid beta flag ... The claude CLI always tacks that beta header on every request because it’s tuned for Sonnet 3.7, which does support the “token-efficient-tools” beta.
🌐
X
x.com › headinthebox › status › 1894235434202665104
Erik Meijer on X: "If you are using Claude Code, make sure to hit /clear early and often. Otherwise you will be burning a lot of tokens. API Error: 400 {"type":"error","error":{"type":"invalid_request_error","message":"input length and `max_tokens` exceed context limit: 185054 + 20000 > 204798," / X
If you are using Claude Code, make sure to hit /clear early and often. Otherwise you will be burning a lot of tokens. API Error: 400 {"type":"error","error":{"type":"invalid_request_error","message":"input length and `max_tokens` exceed context ...
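The arithmetic in that error is simple: input tokens plus `max_tokens` must fit inside the context window. A minimal sketch of capping `max_tokens` to whatever room is left (the 200,000 limit matches the other threads here; the tweet's instance reports 204,798, so the limit varies by model):

```python
CONTEXT_LIMIT = 200_000  # tokens; per the error messages quoted in these threads

def safe_max_tokens(input_tokens: int, desired_output: int,
                    limit: int = CONTEXT_LIMIT) -> int:
    """Shrink max_tokens so input + output fits the context window."""
    available = limit - input_tokens
    if available <= 0:
        raise ValueError(
            f"input alone ({input_tokens}) exceeds the context limit ({limit})")
    return min(desired_output, available)

# The failing case from the tweet: 185054 input + 20000 requested output.
print(safe_max_tokens(185054, 20000))  # → 14946: output capped so the request fits
```

This only helps when the input itself is under the limit; once the conversation alone fills the window, the fix is `/clear` or compaction, as the tweet suggests.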
🌐
Portkey
portkey.ai › error-library › input-length-error-10001
Portkey | Control Panel for Production AI
With 3 lines of code, Portkey empowers AI teams to observe, govern, and optimize your apps across the entire org.
🌐
GitHub
github.com › anthropics › claude-code › issues › 476
[BUG] API Error: 400 'input length and `max_tokens` exceed context limit' occurs when close to compaction · Issue #476 · anthropics/claude-code
March 14, 2025 - API Error: 400 {"type":"error","error":{"type":"invalid_request_error","message":"input length and max_tokens exceed context limit: 186433 + 20000 > 200000, decrease input ... Error not thrown and handled gracefully, either with meaningful error message with user next steps, or transparently. I think i would also expect that claude code also prompts at suitable times to compact.
Top answer (1 of 3 · 11 votes)

As the error says, for this particular model you must provide the ID of an inference profile, not the model ID itself. The easiest way is to use the ID of a system-defined inference profile for this model. You can find it by invoking this awscli command with the correct credentials defined in the environment (or set via the standard flags):

aws bedrock list-inference-profiles

You will see this one in the JSON list:

{
  "inferenceProfileName": "US Anthropic Claude 3.5 Sonnet v2",
  "description": "Routes requests to Anthropic Claude 3.5 Sonnet v2 in us-west-2, us-east-1 and us-east-2.",
  "inferenceProfileArn": "arn:aws:bedrock:us-east-1:381492273274:inference-profile/us.anthropic.claude-3-5-sonnet-20241022-v2:0",
  "models": [
    {
      "modelArn": "arn:aws:bedrock:us-west-2::foundation-model/anthropic.claude-3-5-sonnet-20241022-v2:0"
    },
    {
      "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-5-sonnet-20241022-v2:0"
    },
    {
      "modelArn": "arn:aws:bedrock:us-east-2::foundation-model/anthropic.claude-3-5-sonnet-20241022-v2:0"
    }
  ],
  "inferenceProfileId": "us.anthropic.claude-3-5-sonnet-20241022-v2:0",
  "status": "ACTIVE",
  "type": "SYSTEM_DEFINED"
}

Modify the invoke_model line in your code to specify the ID or ARN of the inference profile instead:

response = bedrock_runtime.invoke_model(
  body=body,
  modelId="us.anthropic.claude-3-5-sonnet-20241022-v2:0",
)
Answer 2 of 3 (0 votes)

You can also pass the ARN of an inference profile as the modelId itself when invoking the model.

response = bedrock_client.invoke_model(
    modelId="arn:aws:bedrock:us-east-1::model/your-bedrock-model-arn",
    body=body,  # invoke_model takes a JSON request body, not a bare prompt
)
🌐
Reddit
reddit.com › r/claudecode › using claude code via litellm? here’s how to fix common 400 api errors
r/ClaudeCode on Reddit: Using Claude Code via LiteLLM? Here’s How to Fix Common 400 API Errors
3 weeks ago - ⎿ API Error: 400 {"error":{"message":"{\"message\":\"tools.3.custom.input_examples: Extra inputs are not permitted\"}"}} The root cause is LiteLLM automatically enabling Anthropic's experimental betas, which your Claude Code version may not support. This causes LiteLLM to inject a header (anthropic-beta: tool-examples-2025-10-29) and sometimes additional tool metadata, both of which trigger 400 errors.
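If you run your own proxy layer in front of the API, the offending opt-in header can simply be dropped before forwarding. A minimal sketch; LiteLLM itself may expose a config option for this instead, and the function name is my own:

```python
def strip_beta_headers(headers: dict[str, str]) -> dict[str, str]:
    """Remove Anthropic beta opt-in headers before forwarding a request.

    Header names are matched case-insensitively, since proxies may
    normalize them either way.
    """
    return {k: v for k, v in headers.items() if k.lower() != "anthropic-beta"}

headers = {
    "content-type": "application/json",
    "anthropic-beta": "tool-examples-2025-10-29",
}
print(strip_beta_headers(headers))  # → {'content-type': 'application/json'}
```

This addresses the injected header; the "additional tool metadata" mentioned in the post would still need to be stripped from the request body separately.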
🌐
Reddit
reddit.com › r/claude › api error 400: input length and `max_tokens` exceed context limit: 199926 + 21333 > 200000, decrease input length or `max_tokens` and try again"
r/claude on Reddit: API Error 400: input length and `max_tokens` exceed context limit: 199926 + 21333 > 200000, decrease input length or `max_tokens` and try again"
August 3, 2025 -

Has anyone seen this error before?

API Error: 400 {"type":"error","error":{"type":"invalid_request_error","message":"input length and `max_tokens` exceed context limit: 199926 + 21333 > 200000, decrease input length or `max_tokens` and try again"}}

What's strange is that I'm not one of those heavy Claude Code users running it in multiple terminals, etc. I use a single terminal and give it simple instructions one after the other.

What I had told it was:

Let's now add integration tests. They should sit under the following directory: <directory-path>

We use Test Containers for integration tests. Take a look at existing tests for conventions and libraries used.

And I got that error.