🌐
Reddit
reddit.com › r/claudeai › how to make claude code use other models
r/ClaudeAI on Reddit: How to Make Claude Code Use Other Models
July 17, 2025 - How to Make Claude Code Use Other Models · Related discussions: The stupidest thing about Claude Code is probably this... · Claude Code is Too Great · Made a tool to run Claude Code with other models (including free ones)
🌐
GitHub
github.com › ruvnet › claude-flow › wiki › Using-Claude-Code-with-Open-Models
Using Claude Code with Open Models
Claude Code now believes it is talking to Anthropic yet routes to your open model because TGI mirrors the OpenAI schema. Test: ... Streaming works—TGI returns token streams under /v1/chat/completions just like the real OpenAI API. (Hugging Face) HF Inference Endpoints auto-scales, so watch credit burn. (Hugging Face) If you need local control, run TGI in Docker with docker run --name tgi -p 8080:80 ...
Author   ruvnet
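
The docker run command in the snippet is cut off; purely as an illustration of what a typical TGI invocation can look like (the image tag and model ID below are assumptions, not the wiki's exact values):

# Rough sketch only - image tag and model ID are placeholders, not the wiki's exact command.
docker run --name tgi -p 8080:80 \
  --gpus all --shm-size 1g \
  ghcr.io/huggingface/text-generation-inference:latest \
  --model-id <your-model-id>

With the 8080:80 mapping above, the OpenAI-style chat endpoint is then reachable on localhost:8080.
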
🌐
Medium
medium.com › @302.AI › 302-ai-claude-compatible-let-claude-code-support-any-third-party-models-8e6dc96d0310
302.AI Claude Compatible: Let Claude Code support any third-party models | by 302.AI | Medium
July 28, 2025 - However, the significant pain point of Claude Code is that it can only call Anthropic’s own Claude model. Even if users attempt to modify the base URL and connect to other AI model services, they will only be able to use the Claude model.
🌐
Claude
support.claude.com › en › articles › 11940350-claude-code-model-configuration
Claude Code Model Configuration | Claude Help Center
Use the --model flag when starting Claude Code. Start a fresh Terminal session. Enter the following commands (depending on the model you’d like to use for that session): For Opus 4.5: claude --model claude-opus-4-5-20251101 · For Haiku 4.5: ...
🌐
PixelsTech
pixelstech.net › home › articles › how to make claude code use other models
How to Make Claude Code Use Other Models | PixelsTech
June 27, 2025 - But there is a workaround—just set up an apiKeyHelper. First, add the following to ~/.claude/settings.json (create the file if it doesn’t exist): ... Replace the parameters with your target provider and model.
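
The article's JSON is elided above; as a rough sketch of the shape such a setup can take (every path, URL, and model name here is a placeholder, not the article's values, and any existing settings.json should be backed up first):

# Illustrative only - placeholder paths, URL, and model name.
cat > ~/.claude/get_api_key.sh <<'EOF'
#!/bin/sh
echo "$OTHER_PROVIDER_API_KEY"
EOF
chmod +x ~/.claude/get_api_key.sh

# Back up any existing settings.json before overwriting it.
cat > ~/.claude/settings.json <<'EOF'
{
  "apiKeyHelper": "~/.claude/get_api_key.sh",
  "env": {
    "ANTHROPIC_BASE_URL": "https://api.example-provider.com",
    "ANTHROPIC_MODEL": "example-model-name"
  }
}
EOF
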
🌐
Reddit
reddit.com › r/claudeai › made a tool to run claude code with other models (including free ones)
r/ClaudeAI on Reddit: Made a tool to run Claude Code with other models (including free ones)
1 month ago -

Got tired of being locked to Anthropic models in Claude Code. Built a proxy that lets you use 580+ models via OpenRouter while keeping the full Claude Code experience.

What it does:

  • Use Gemini, GPT, Grok, DeepSeek, Llama — whatever — inside Claude Code

  • Works with your existing Claude subscription (native passthrough, no markup)

  • Or run completely free using OpenRouter's free tier (actual good models, not garbage)

  • Multi-agent setup: map different models to opus/sonnet/haiku/subagent roles

Install:

npm install -g claudish
claudish --free

That's it. No config.

How it works:

Sits between Claude Code and the API. Translates Anthropic's tool format to OpenAI/Gemini JSON and back. Zero patches to the Claude Code binary, so it doesn't break when Anthropic pushes updates.

Everything still works — thinking modes, MCP servers, /commands, the lot.

Links:

  • Site: https://claudish.com

  • GitHub: https://github.com/MadAppGang/claude-code/tree/main/mcp/claudish

Open source, MIT license. Built by MadAppGang.

What models are people wanting to try with Claude Code's architecture? Curious what combos work well.

🌐
Builder.io
builder.io › blog › claude-code
How I use Claude Code (+ my best tips)
September 29, 2025 - Claude is also exceptionally good ... like that. It's honestly kind of incredible. Think about it: Cursor built a general-purpose agent that supports multiple models....
🌐
Claude
code.claude.com › docs › en › model-config
Model configuration - Claude Code Docs
At startup - Launch with claude --model <alias|name> · Environment variable - Set ANTHROPIC_MODEL=<alias|name> · Settings - Configure permanently in your settings file using the model field. Example usage: ... The behavior of default depends on your account type. For certain Max users, Claude Code will automatically fall back to Sonnet if you hit a usage threshold with Opus.
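
In shell terms, the three options from the docs look roughly like this (using the sonnet alias as an example):

# 1. At startup, via flag
claude --model sonnet

# 2. Via environment variable
export ANTHROPIC_MODEL=sonnet
claude

# 3. Permanently, via the "model" field in your settings file (e.g. ~/.claude/settings.json)
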
🌐
Z
docs.z.ai › devpack › tool › claude
Claude Code - Overview - Z.AI DEVELOPER DOCUMENT
Mapping between Claude Code internal model environment variables and GLM models, with the default configuration as follows: ... ANTHROPIC_DEFAULT_HAIKU_MODEL: GLM-4.5-Air. If adjustments are needed, you can directly modify the configuration file (for example, ~/.claude/settings.json in Claude Code) to switch to other models.
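
Pulling the pieces together from this page (GLM-4.5-Air is the default named in the snippet; the base URL and token variable come from the Z.AI function further down), the same mapping can be set as environment variables, or placed in the env block of ~/.claude/settings.json as the docs suggest:

# Sketch assembled from examples elsewhere on this page - verify against Z.AI's own defaults.
export ANTHROPIC_BASE_URL="https://api.z.ai/api/anthropic"
export ANTHROPIC_AUTH_TOKEN="$ZAI_API_KEY"
export ANTHROPIC_DEFAULT_HAIKU_MODEL="GLM-4.5-Air"
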
🌐
Justin Searls
justin.searls.co › posts › how-to-run-claude-code-against-a-free-local-model
How to run Claude Code against a free local model
After onboarding, click the search icon (or hit Command-Shift-M) and install an appropriate model (I started with "Qwen2.5 Coder 14B", as it can fit comfortably in 48GB) Once downloaded, click the "My Models" icon in the sidebar (Command-3), then click the settings gear button and set the context length to 8192 (this is Anon Kode's default token limit and it currently doesn't seem to respect other values, so increasing the token limit in LM Studio to match is the easiest workaround)
🌐
ClaudeLog
claudelog.com › home › faqs › change model
How to Change Claude Code Model | ClaudeLog
Command Line Flag - Specify model when starting Claude Code:

# Start with Sonnet 4.5 (recommended for main development)
claude --model claude-sonnet-4-5-20250929

# Start with Haiku 4.5 (lightweight agents, 90% capability, 3x cost savings)
claude --model claude-haiku-4-5

# Start with Opus 4.5 (state-of-the-art software engineering)
claude --model claude-opus-4-5-20251101
🌐
GitHub
github.com › 1rgs › claude-code-proxy
GitHub - 1rgs/claude-code-proxy: Run Claude Code on OpenAI models
Use Anthropic clients (like Claude Code) with Gemini, OpenAI, or direct Anthropic backends. 🤝 · A proxy server that lets you use Anthropic clients with Gemini, OpenAI, or Anthropic models themselves (a transparent proxy of sorts), all via LiteLLM.
Starred by 2.7K users
Forked by 369 users
Languages   Python 99.7% | Dockerfile 0.3%
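
The general pattern with proxies like this one is to start the proxy locally and point Claude Code's base URL at it; a minimal sketch, assuming the proxy is already running on localhost:8082 (check the repo's README for the actual port and startup command):

# Assumption: the proxy is already running locally on port 8082.
export ANTHROPIC_BASE_URL="http://localhost:8082"
claude
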
🌐
Reddit
reddit.com › r/claudecode › how to set up claude code with multiple ai models
r/ClaudeCode on Reddit: How to Set Up Claude Code with Multiple AI Models
November 20, 2025 -

This guide provides a lightweight approach to setting up your terminal, allowing you to easily switch between different AI models when using Claude Code.

What This Does

Instead of being limited to one AI model, you'll be able to run commands like:

  • claude - Uses the default Claude AI

  • claudekimi - Uses Kimi For Coding

  • claudeglm - Uses Z.AI's GLM models

  • claudem2 - Uses MiniMax M2

  • claude kimi or claude glm or claude m2 - Alternative way to switch models

Before You Start

You'll need:

  1. Claude Code installed on your computer (the CLI version)

  2. API keys for the AI services you want to use

  3. Access to your terminal configuration file (usually ~/.zshrc on Mac)

Step 1: Get Your API Keys

Sign up for accounts with the AI services you want to use and get your API keys:

  • Kimi For Coding: Get your key from Kimi's developer portal

  • Z.AI (for GLM models): Get your key from Z.AI

  • MiniMax: Get your key from MiniMax

Keep these keys somewhere safe - you'll need them in the next step.

Step 2: Open Your Terminal Configuration File

  1. Open Terminal

  2. Type: open ~/.zshrc

  3. This opens your configuration file in a text editor

Step 3: Add Your API Keys

Add these lines to your configuration file, replacing the placeholder text with your actual API keys:

# API Keys for different AI services
export KIMI_API_KEY="your-kimi-api-key-here"
export ZAI_API_KEY="your-zai-api-key-here"
export MINIMAX_API_KEY="your-minimax-api-key-here"

Step 4: Add the Model Configurations

Copy and paste these sections into your configuration file. These tell Claude Code how to connect to each AI service.

For Kimi For Coding:

claudekimi() {
  # Check if API key exists
  if [[ -z "$KIMI_API_KEY" ]]; then
    echo "Error: KIMI_API_KEY is not set. Please add it to ~/.zshrc."
    return 1
  fi

  # Clear any existing Anthropic key
  unset ANTHROPIC_API_KEY

  # Configure for Kimi
  export ANTHROPIC_BASE_URL="https://api.kimi.com/coding/"
  export ANTHROPIC_AUTH_TOKEN="$KIMI_API_KEY"
  export ANTHROPIC_MODEL="kimi-for-coding"
  export ANTHROPIC_SMALL_FAST_MODEL="kimi-for-coding"

  # Run Claude Code
  /Users/yourusername/.claude/local/claude "$@"
}

For Z.AI GLM Models:

claudeglm() {
  # Check if API key exists
  if [[ -z "$ZAI_API_KEY" ]]; then
    echo "Error: ZAI_API_KEY is not set. Please add it to ~/.zshrc."
    return 1
  fi

  # Clear any existing Anthropic key
  unset ANTHROPIC_API_KEY

  # Configure for Z.AI
  export ANTHROPIC_BASE_URL="https://api.z.ai/api/anthropic"
  export ANTHROPIC_AUTH_TOKEN="$ZAI_API_KEY"
  export ANTHROPIC_DEFAULT_OPUS_MODEL="glm-4.6"
  export ANTHROPIC_DEFAULT_SONNET_MODEL="glm-4.6"
  export ANTHROPIC_DEFAULT_HAIKU_MODEL="glm-4.5-air"

  # Run Claude Code
  /Users/yourusername/.claude/local/claude "$@"
}

For MiniMax M2:

claudem2() {
  # Check if API key exists
  if [ -z "$MINIMAX_API_KEY" ]; then
    echo "Error: MINIMAX_API_KEY is not set. Please add it to ~/.zshrc"
    return 1
  fi

  # Clear any existing Anthropic key
  unset ANTHROPIC_API_KEY

  # Configure for MiniMax
  export ANTHROPIC_BASE_URL="https://api.minimax.io/anthropic"
  export ANTHROPIC_AUTH_TOKEN="$MINIMAX_API_KEY"
  export API_TIMEOUT_MS="3000000"
  export CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC=1

  export ANTHROPIC_MODEL="MiniMax-M2"
  export ANTHROPIC_SMALL_FAST_MODEL="MiniMax-M2"
  export ANTHROPIC_DEFAULT_SONNET_MODEL="MiniMax-M2"
  export ANTHROPIC_DEFAULT_OPUS_MODEL="MiniMax-M2"
  export ANTHROPIC_DEFAULT_HAIKU_MODEL="MiniMax-M2"

  # Run Claude Code
  /Users/yourusername/.claude/local/claude "$@"
}

Optional: Add a Dispatcher Function

This lets you type claude kimi instead of claudekimi:

claude() {
  case "$1" in
    m2|M2|minimax)
      shift
      claudem2 "$@"
      ;;
    kimi|K2)
      shift
      claudekimi "$@"
      ;;
    glm|GLM)
      shift
      claudeglm "$@"
      ;;
    *)
      # Default to regular Claude
      /Users/yourusername/.claude/local/claude "$@"
      ;;
  esac
}
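
With the dispatcher in place, the first word selects the configuration and everything after it is forwarded untouched, so normal Claude Code flags still work:

# After reloading your shell:
claude               # plain Claude, untouched
claude glm           # same as running claudeglm
claude m2 --resume   # remaining arguments pass straight through to Claude Code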

Step 5: Update the Path to Claude Code

In all the code above, you'll see /Users/yourusername/.claude/local/claude. You need to change this to match where Claude Code is installed on your computer.

To find the correct path:

  1. In Terminal, type: which claude

  2. Copy the path it shows

  3. Replace /Users/yourusername/.claude/local/claude with your path in all the functions above (see the example after this list)
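
For example (the path below is hypothetical): if which claude prints /usr/local/bin/claude, the last line of each function becomes:

# Hypothetical path - use whatever `which claude` prints on your machine.
/usr/local/bin/claude "$@"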

Step 6: Reload Your Configuration

After saving your changes, tell your terminal to use the new configuration:

source ~/.zshrc

Step 7: Test It Out

Try running one of your new commands:

claudekimi

or

claude glm

If everything is set up correctly, Claude Code will launch using your chosen AI model!

Troubleshooting

"Command not found"

  • Make sure you reloaded your configuration with source ~/.zshrc

  • Check that the path to Claude Code is correct

"API key is not set"

  • Double-check that you added your API keys to ~/.zshrc

  • Make sure there are no typos in the variable names

  • Reload your configuration with source ~/.zshrc

"Connection error"

  • Verify your API key is valid and active

  • Check that you have internet connection

  • Make sure the API service URL is correct (a quick check is sketched below)
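
For any of the errors above, a quick way to confirm what Claude Code will actually see is to print the variables in the same terminal session:

# Quick sanity check - run in the same session you launch Claude Code from.
echo "Base URL:  ${ANTHROPIC_BASE_URL:-<not set>}"
echo "Token set: ${ANTHROPIC_AUTH_TOKEN:+yes}"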

How It Works (Optional Reading)

Each function you added does three things:

  1. Checks for the API key - Makes sure you've set it up

  2. Configures the connection - Tells Claude Code where to connect and which model to use

  3. Runs Claude Code - Launches the program with your settings

The dispatcher function (claude) is just a shortcut that looks at the first word you type and picks the right configuration automatically.

Adding More AI Models

Want to add another AI service? Follow this pattern (a template is sketched after the list):

  1. Get the API key and add it to your ~/.zshrc

  2. Create a new function (like claudenewservice)

  3. Set the ANTHROPIC_BASE_URL to the service's API endpoint

  4. Set the ANTHROPIC_AUTH_TOKEN to your API key

  5. Configure which models to use

  6. Add it to the dispatcher function if you want the claude shortcut
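
Purely as a template (every NEWSERVICE name, URL, and model string below is a placeholder), a new entry follows the same shape as the functions above:

claudenewservice() {
  # Check if API key exists
  if [[ -z "$NEWSERVICE_API_KEY" ]]; then
    echo "Error: NEWSERVICE_API_KEY is not set. Please add it to ~/.zshrc."
    return 1
  fi

  # Clear any existing Anthropic key
  unset ANTHROPIC_API_KEY

  # Configure for the new service (use the provider's documented Anthropic-compatible endpoint)
  export ANTHROPIC_BASE_URL="https://api.newservice.example/anthropic"
  export ANTHROPIC_AUTH_TOKEN="$NEWSERVICE_API_KEY"
  export ANTHROPIC_MODEL="newservice-model-name"
  export ANTHROPIC_SMALL_FAST_MODEL="newservice-model-name"

  # Run Claude Code (use your path from Step 5)
  /Users/yourusername/.claude/local/claude "$@"
}

For step 6, the dispatcher just needs a new newservice) case that shifts and calls claudenewservice, mirroring the existing entries.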

That's it! You now have a flexible setup that lets you switch between different AI models with simple commands. If you run a different shell, just ask Claude to make a version of this for your setup.

🌐
Hacker News
news.ycombinator.com › item
you can use any model with Claude code thanks to https://github.com/musistudio/c... | Hacker News
September 7, 2025 - but in my testing other models do not work well; it looks like the prompts are either heavily optimized for Claude, or other models just aren't great yet in such an agentic environment · I was especially disappointed with Grok Code: it is very fast as advertised, but in generating spaces and new lines ...
🌐
Reddit
reddit.com › r/claudeai › running claude code with a local model or groq.
r/ClaudeAI on Reddit: Running Claude Code with a local model or groq.
March 9, 2025 -

I've been absolutely amazed by Claude Code, it's like travelling to the future.

But the price is insane: their claim of $100/day is not a lie, and once you get going, the cost can be crazy.

Has anyone figured out a way to get it to talk to a local model (and which would work well), or with the Groq API?

I tried searching Reddit and Google, and asking Perplexity, and asking OAI Deep Research, and so far nothing, so I don't hold much hope, but asking just in case.

Thanks!

🌐
Eesel AI
eesel.ai › blog › model-configuration-claude-code
A complete guide to model configuration in Claude Code - eesel AI
A complete guide to model configuration in Claude Code. Learn the differences between Opus, Sonnet, and Haiku, and explore advanced strategies and limitations.
🌐
Hacker News
news.ycombinator.com › item
Is there anything like Claude code for other models such as gemini? | Hacker News
May 6, 2025 - Drop me a line (see profile) if you're interested in beta testing it when it's out · Currently Claude Code is a big value-add for Claude. Google has nothing equivalent; aider requires far more manual work
🌐
Milvus
milvus.io › ai-quick-reference › is-claude-code-available-in-all-claude-models
Is Claude Code available in all Claude models?
For users who require different model characteristics for specific use cases, they can still access other Claude models through the regular web interface or API, while using Claude Code for intensive development work that benefits from the advanced ...
🌐
Reddit
reddit.com › r/claudecode › how to use claude code with gpt 5 models
r/ClaudeCode on Reddit: How to use Claude code with GPT 5 models
September 7, 2025 -

I fell down a rabbit hole after finding Z.ai. Loved Claude Code’s UX in the terminal, but I wanted OpenAI’s latest brains behind it. Instead of forking anything, I hacked a tiny middle layer that “speaks Anthropic” on one side and “speaks OpenAI” on the other. Big hurdles:

  • Endpoints didn’t match (404s).

  • Token limits were different (had to cap).

  • Tool calls were strict about IDs and ordering (got a bunch of “must follow tool_calls” errors until I mapped IDs perfectly).

  • Auth conflicts

Once the translation clicked, everything - messages, tools, models, and logs - just worked. Same CLI flow, different model under the hood. No code here because I’m packaging it up, but if you care about model choice without changing your tools, this is the move.