Got tired of being locked to Anthropic models in Claude Code. Built a proxy that lets you use 580+ models via OpenRouter while keeping the full Claude Code experience.
What it does:
Use Gemini, GPT, Grok, DeepSeek, Llama — whatever — inside Claude Code
Works with your existing Claude subscription (native passthrough, no markup)
Or run completely free using OpenRouter's free tier (actual good models, not garbage)
Multi-agent setup: map different models to opus/sonnet/haiku/subagent roles
Install:
npm install -g claudish
claudish --free
That's it. No config.
How it works:
Sits between Claude Code and the API. Translates Anthropic's tool format to OpenAI/Gemini JSON and back. Zero patches to the Claude Code binary, so it doesn't break when Anthropic pushes updates.
Everything still works — thinking modes, MCP servers, /commands, the lot.
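To make that concrete, here's roughly what the two tool-call shapes look like. This is an illustration based on the public Anthropic and OpenAI message formats, not claudish's internal code, and the tool name is made up:
# Anthropic-style tool call (what Claude Code expects back from the model):
cat <<'EOF'
{"role": "assistant", "content": [
  {"type": "tool_use", "id": "toolu_01", "name": "read_file",
   "input": {"path": "src/main.ts"}}
]}
EOF
# OpenAI-style tool call (what an OpenRouter model actually returns);
# note the arguments arrive as a JSON-encoded string, not an object:
cat <<'EOF'
{"role": "assistant", "tool_calls": [
  {"id": "call_01", "type": "function",
   "function": {"name": "read_file", "arguments": "{\"path\": \"src/main.ts\"}"}}
]}
EOF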
Links:
Site: https://claudish.com
GitHub: https://github.com/MadAppGang/claude-code/tree/main/mcp/claudish
Open source, MIT license. Built by MadAppGang.
What models are people wanting to try with Claude Code's architecture? Curious what combos work well.
This guide provides a lightweight approach to setting up your terminal, allowing you to easily switch between different AI models when using Claude Code.
What This Does
Instead of being limited to one AI model, you'll be able to run commands like:
claude - Uses the default Claude AI
claudekimi - Uses Kimi For Coding
claudeglm - Uses Z.AI's GLM models
claudem2 - Uses MiniMax M2
claude kimi or claude glm or claude m2 - Alternative way to switch models
Before You Start
You'll need:
Claude Code installed on your computer (the CLI version)
API keys for the AI services you want to use
Access to your terminal configuration file (usually ~/.zshrc on Mac)
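A quick sanity check for the CLI and the config file (this assumes the default zsh setup on a Mac; the first command only looks, the second creates an empty ~/.zshrc if you don't have one yet):
# Check that the Claude Code CLI is on your PATH
which claude || echo "Claude Code CLI not found"
# Create an empty ~/.zshrc if it doesn't exist yet
[ -f ~/.zshrc ] || touch ~/.zshrc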
Step 1: Get Your API Keys
Sign up for accounts with the AI services you want to use and get your API keys:
Kimi For Coding: Get your key from Kimi's developer portal
Z.AI (for GLM models): Get your key from Z.AI
MiniMax: Get your key from MiniMax
Keep these keys somewhere safe - you'll need them in the next step.
Step 2: Open Your Terminal Configuration File
Open Terminal
Type:
open ~/.zshrc
This opens your configuration file in a text editor.
Step 3: Add Your API Keys
Add these lines to your configuration file, replacing the placeholder text with your actual API keys:
# API Keys for different AI services
export KIMI_API_KEY="your-kimi-api-key-here"
export ZAI_API_KEY="your-zai-api-key-here"
export MINIMAX_API_KEY="your-minimax-api-key-here"
Step 4: Add the Model Configurations
Copy and paste these sections into your configuration file. These tell Claude Code how to connect to each AI service.
For Kimi For Coding:
claudekimi() {
# Check if API key exists
if [[ -z "$KIMI_API_KEY" ]]; then
echo "Error: KIMI_API_KEY is not set. Please add it to ~/.zshrc."
return 1
fi
# Clear any existing Anthropic key
unset ANTHROPIC_API_KEY
# Configure for Kimi
export ANTHROPIC_BASE_URL="https://api.kimi.com/coding/"
export ANTHROPIC_AUTH_TOKEN="$KIMI_API_KEY"
export ANTHROPIC_MODEL="kimi-for-coding"
export ANTHROPIC_SMALL_FAST_MODEL="kimi-for-coding"
# Run Claude Code
/Users/yourusername/.claude/local/claude "$@"
}

For Z.AI GLM Models:
claudeglm() {
# Check if API key exists
if [[ -z "$ZAI_API_KEY" ]]; then
echo "Error: ZAI_API_KEY is not set. Please add it to ~/.zshrc."
return 1
fi
# Clear any existing Anthropic key
unset ANTHROPIC_API_KEY
# Configure for Z.AI
export ANTHROPIC_BASE_URL="https://api.z.ai/api/anthropic"
export ANTHROPIC_AUTH_TOKEN="$ZAI_API_KEY"
export ANTHROPIC_DEFAULT_OPUS_MODEL="glm-4.6"
export ANTHROPIC_DEFAULT_SONNET_MODEL="glm-4.6"
export ANTHROPIC_DEFAULT_HAIKU_MODEL="glm-4.5-air"
# Run Claude Code
/Users/yourusername/.claude/local/claude "$@"
}

For MiniMax M2:
claudem2() {
# Check if API key exists
if [ -z "$MINIMAX_API_KEY" ]; then
echo "Error: MINIMAX_API_KEY is not set. Please add it to ~/.zshrc"
return 1
fi
# Clear any existing Anthropic key
unset ANTHROPIC_API_KEY
# Configure for MiniMax
export ANTHROPIC_BASE_URL="https://api.minimax.io/anthropic"
export ANTHROPIC_AUTH_TOKEN="$MINIMAX_API_KEY"
export API_TIMEOUT_MS="3000000"
export CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC=1
export ANTHROPIC_MODEL="MiniMax-M2"
export ANTHROPIC_SMALL_FAST_MODEL="MiniMax-M2"
export ANTHROPIC_DEFAULT_SONNET_MODEL="MiniMax-M2"
export ANTHROPIC_DEFAULT_OPUS_MODEL="MiniMax-M2"
export ANTHROPIC_DEFAULT_HAIKU_MODEL="MiniMax-M2"
# Run Claude Code
/Users/yourusername/.claude/local/claude "$@"
}

Optional: Add a Dispatcher Function
This lets you type claude kimi instead of claudekimi:
claude() {
case "$1" in
m2|M2|minimax)
shift
claudem2 "$@"
;;
kimi|K2)
shift
claudekimi "$@"
;;
glm|GLM)
shift
claudeglm "$@"
;;
*)
# Default to regular Claude
/Users/yourusername/.claude/local/claude "$@"
;;
esac
}

Step 5: Update the Path to Claude Code
In all the code above, you'll see /Users/yourusername/.claude/local/claude. You need to change this to match where Claude Code is installed on your computer.
To find the correct path:
In Terminal, type:
which claude
Copy the path it shows.
Replace /Users/yourusername/.claude/local/claude with your path in all the functions above.
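If you'd rather not edit every function by hand, a one-liner like this swaps the placeholder for your real path. This is just a sketch: it assumes macOS's BSD sed and that the placeholder appears exactly as written above, so back up ~/.zshrc first if in doubt.
# Substitute your actual Claude Code path for the placeholder everywhere in ~/.zshrc
CLAUDE_PATH="$(which claude)"
sed -i '' "s|/Users/yourusername/.claude/local/claude|$CLAUDE_PATH|g" ~/.zshrc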
Step 6: Reload Your Configuration
After saving your changes, tell your terminal to use the new configuration:
source ~/.zshrc
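To confirm the new functions were actually picked up, you can ask the shell about them (type is a standard zsh/bash builtin):
type claudekimi claudeglm claudem2
# Each should be reported as a shell function; "not found" usually means
# ~/.zshrc wasn't saved or wasn't reloaded.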
Step 7: Test It Out
Try running one of your new commands:
claudekimi
or
claude glm
If everything is set up correctly, Claude Code will launch using your chosen AI model!
Troubleshooting
"Command not found"
Make sure you reloaded your configuration with source ~/.zshrc
Check that the path to Claude Code is correct
"API key is not set"
Double-check that you added your API keys to ~/.zshrc
Make sure there are no typos in the variable names
Reload your configuration with source ~/.zshrc
"Connection error"
Verify your API key is valid and active
Check that you have an internet connection
Make sure the API service URL is correct
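If you want to test a key outside of Claude Code, a raw request against the service's Anthropic-compatible endpoint usually settles it. Below is a hedged example against the Z.AI endpoint from this guide; the /v1/messages path, headers, and bearer-token auth follow the standard Anthropic API, so check the provider's docs if it rejects the request, and adjust the URL, key, and model for the other services.
curl -s "https://api.z.ai/api/anthropic/v1/messages" \
  -H "Authorization: Bearer $ZAI_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model":"glm-4.6","max_tokens":16,"messages":[{"role":"user","content":"ping"}]}'
# A JSON reply means the key and URL work; an authentication error points at the key.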
How It Works (Optional Reading)
Each function you added does three things:
Checks for the API key - Makes sure you've set it up
Configures the connection - Tells Claude Code where to connect and which model to use
Runs Claude Code - Launches the program with your settings
The dispatcher function (claude) is just a shortcut that looks at the first word you type and picks the right configuration automatically.
Adding More AI Models
Want to add another AI service? Follow this pattern:
Get the API key and add it to your ~/.zshrc
Create a new function (like claudenewservice) following the template below
Set ANTHROPIC_BASE_URL to the service's API endpoint
Set ANTHROPIC_AUTH_TOKEN to your API key
Configure which models to use
Add it to the dispatcher function if you want the claude shortcut
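Here's a template to copy, built from the same pattern as the functions above. Everything containing NEWSERVICE is a placeholder, and the URL and model names are made up; swap in the real values from the service's documentation.
claudenewservice() {
  # 1. Check that the API key exists
  if [[ -z "$NEWSERVICE_API_KEY" ]]; then
    echo "Error: NEWSERVICE_API_KEY is not set. Please add it to ~/.zshrc."
    return 1
  fi
  # 2. Configure the connection
  unset ANTHROPIC_API_KEY
  export ANTHROPIC_BASE_URL="https://api.newservice.example/anthropic"
  export ANTHROPIC_AUTH_TOKEN="$NEWSERVICE_API_KEY"
  export ANTHROPIC_MODEL="newservice-model-name"
  export ANTHROPIC_SMALL_FAST_MODEL="newservice-small-model-name"
  # 3. Run Claude Code (use your real path from Step 5)
  /Users/yourusername/.claude/local/claude "$@"
}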
That's it! You now have a flexible setup that lets you switch between different AI models with simple commands. If you run a different shell, just ask Claude to make a version of this for your setup.
I've been absolutely amazed by Claude Code; it's like travelling to the future.
But the price is insane. Their claim of $100/day is not a lie: once you get going, the cost can be crazy.
Has anyone figured out a way to get it to talk to a local model (and which would work well), or with the Groq API?
I tried searching Reddit and Google, asking Perplexity, and asking OAI Deep Research, and so far nothing, so I don't hold much hope, but I'm asking just in case.
Thanks!
I fell down a rabbit hole after finding Z.ai. Loved Claude Code's UX in the terminal, but I wanted OpenAI's latest brains behind it. Instead of forking anything, I hacked a tiny middle layer that "speaks Anthropic" on one side and "speaks OpenAI" on the other.

Big hurdles:
Endpoints didn’t match (404s).
Token limits were different (had to cap).
Tool calls were strict about IDs and ordering (got a bunch of “must follow tool_calls” errors until I mapped IDs perfectly).
Auth conflicts.
Once the translation clicked, everything (messages, tools, models, logs) just worked. Same CLI flow, different model under the hood. No code here because I'm packaging it up, but if you care about model choice without changing your tools, this is the move.
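For anyone hitting the same "must follow tool_calls" errors: on the OpenAI side, every tool result has to come back as a role "tool" message whose tool_call_id matches an id from the assistant message immediately before it. A rough sketch of the required shape (not the author's actual code; the tool name and ids are invented):
cat <<'EOF'
[
  {"role": "assistant", "tool_calls": [
    {"id": "call_abc", "type": "function",
     "function": {"name": "list_files", "arguments": "{}"}}
  ]},
  {"role": "tool", "tool_call_id": "call_abc", "content": "README.md\nsrc/"}
]
EOF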