🌐
OpenRouter
openrouter.ai › docs › quickstart
OpenRouter Quickstart Guide | Developer Documentation | OpenRouter | Documentation
OpenRouter provides a unified API that gives you access to hundreds of AI models through a single endpoint, while automatically handling fallbacks and selecting the most cost-effective options. Get started with just a few lines of code using your preferred SDK or framework. Looking for information about free models and rate limits? Please see the FAQ · In the examples below, the OpenRouter-specific headers are optional.
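As a minimal sketch of that quickstart flow, assuming the OpenAI Python SDK and an OPENROUTER_API_KEY environment variable (the model slug is borrowed from the DataCamp result below, not from this page):

import os
from openai import OpenAI

# Point the standard OpenAI client at OpenRouter's OpenAI-compatible endpoint.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

completion = client.chat.completions.create(
    model="openai/gpt-5-mini",  # provider/model-name slug
    messages=[{"role": "user", "content": "Say hello from OpenRouter."}],
)
print(completion.choices[0].message.content)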
🌐
OpenRouter
openrouter.ai › docs › api › reference › overview
OpenRouter API Reference | Complete API Documentation | OpenRouter | Documentation
At a high level, OpenRouter normalizes the schema across models and providers so you only need to learn one. Here is the request schema as a TypeScript type. This will be the body of your POST request to the /api/v1/chat/completions endpoint (see the quick start above for an example).
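A hedged sketch of that POST body using the requests library; only model and messages are shown, and the field names simply follow the OpenAI-style schema the snippet describes:

import os
import requests

# Minimal OpenAI-style request body sent directly to the chat completions endpoint.
response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "openai/gpt-5-mini",
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=30,
)
print(response.json()["choices"][0]["message"]["content"])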
🌐
DataCamp
datacamp.com › tutorial › openrouter
OpenRouter: A Guide With Practical Examples | DataCamp
August 14, 2025 - First, the base_url parameter redirects your requests to OpenRouter's servers instead of OpenAI's. Second, the model name follows a provider/model-name format - openai/gpt-5-mini instead of just gpt-5-mini. This tells OpenRouter which provider's version you want while keeping the familiar interface. Here are some common models that you can plug into the above example without any errors:
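The snippet's model list is cut off above; as a separate illustration of the provider/model-name convention, only the model string changes between calls (the two slugs below are the ones that appear elsewhere on this page; check the live model list for current identifiers):

import os
from openai import OpenAI

client = OpenAI(base_url="https://openrouter.ai/api/v1", api_key=os.environ["OPENROUTER_API_KEY"])

# Only the provider-prefixed model slug changes between calls.
for model in ("openai/gpt-5-mini", "meta-llama/llama-3-70b-instruct"):
    completion = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "One sentence on why the sky is blue."}],
    )
    print(model, "->", completion.choices[0].message.content)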
🌐
OpenRouter
openrouter.ai › docs › api › reference › authentication
API Authentication | OpenRouter OAuth and API Keys | OpenRouter | Documentation
If you’re using the OpenAI TypeScript SDK, set the api_base to https://openrouter.ai/api/v1 and the apiKey to your API key. To stream with Python, see this ...
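The streaming pointer above is truncated; as a hedged sketch (not the code the linked page shows), streaming against the same base URL with the OpenAI Python SDK could look like this, with the model slug reused from the DataCamp result:

import os
from openai import OpenAI

client = OpenAI(base_url="https://openrouter.ai/api/v1", api_key=os.environ["OPENROUTER_API_KEY"])

# Streaming: iterate over server-sent chunks and print the text deltas as they arrive.
stream = client.chat.completions.create(
    model="openai/gpt-5-mini",
    messages=[{"role": "user", "content": "Stream a short haiku."}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)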
🌐
Relevance AI
relevanceai.com › llm-models › set-up-and-use-openrouter-auto-llm-for-ai-applications
OpenRouter Auto llm - Relevance AI
At its core, OpenRouter provides an OpenAI-compatible completion API that developers can leverage either through direct calls or by utilizing the OpenAI SDK. The base URL for all API interactions is 'https://openrouter.ai/api/v1'. When making requests, developers can enhance their application's ...
🌐
LibreChat
librechat.ai › docs › configuration › librechat_yaml › ai_endpoints › openrouter
Openrouter
# recommended environment variables:
apiKey: "${OPENROUTER_KEY}" # NOT OPENROUTER_API_KEY
baseURL: "https://openrouter.ai/api/v1"
models:
  default: ["meta-llama/llama-3-70b-instruct"]
  fetch: true
titleConvo: true
titleModel: "meta-llama/llam...
🌐
Cline
docs.cline.bot › provider-config › openrouter
OpenRouter - Cline
Enter API Key: Paste your OpenRouter API key into the “OpenRouter API Key” field. Select Model: Choose your desired model from the “Model” dropdown. (Optional) Custom Base URL: If you need to use a custom base URL for the OpenRouter API, check “Use custom base URL” and enter the URL.
🌐
Roo Code
docs.roocode.com › openrouter
Using OpenRouter With Roo Code | Roo Code Documentation
Enter API Key: Paste your OpenRouter API key into the "OpenRouter API Key" field. Select Model: Choose your desired model from the "Model" dropdown. (Optional) Custom Base URL: If you need to use a custom base URL for the OpenRouter API, check "Use custom base URL" and enter the URL.
🌐
OpenRouter
openrouter.ai › docs › use-cases › oauth-pkce
OAuth PKCE | Secure Authentication for OpenRouter | OpenRouter ...
To start the PKCE flow, send your user to OpenRouter’s /auth URL with a callback_url parameter pointing back to your site:
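A sketch of building that /auth redirect in Python; only the callback_url parameter comes from the snippet, while the code_challenge parameter names are assumptions based on how PKCE generally works and should be checked against the full PKCE doc:

import base64
import hashlib
import secrets
from urllib.parse import urlencode

# PKCE verifier/challenge pair (S256), generated client-side.
code_verifier = secrets.token_urlsafe(48)
code_challenge = (
    base64.urlsafe_b64encode(hashlib.sha256(code_verifier.encode()).digest())
    .rstrip(b"=")
    .decode()
)

params = {
    "callback_url": "https://example.com/callback",  # from the snippet
    "code_challenge": code_challenge,                 # assumed PKCE parameter name
    "code_challenge_method": "S256",                  # assumed PKCE parameter name
}
auth_url = "https://openrouter.ai/auth?" + urlencode(params)
print(auth_url)  # redirect the user here to start the flow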
🌐
Kilo
kilo.ai › openrouter
Using OpenRouter With Kilo Code | Kilo Code Docs
Enter API Key: Paste your OpenRouter API key into the "OpenRouter API Key" field. Select Model: Choose your desired model from the "Model" dropdown. (Optional) Custom Base URL: If you need to use a custom base URL for the OpenRouter API, check "Use custom base URL" and enter the URL.
🌐
OpenRouter
openrouter.ai › docs › guides › overview › multimodal › images
OpenRouter Image Inputs | Complete Documentation | OpenRouter | Documentation
The image_url can either be a URL or a base64-encoded image. Note that multiple images can be sent in separate content array entries. The number of images you can send in a single request varies per provider and per model. Due to how the content is parsed, we recommend sending the text prompt first, then the images. If the images must come first, we recommend putting them in the system prompt. OpenRouter supports both direct URLs and base64-encoded data for images:
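The snippet's own example is cut off here; a hedged sketch of the content-array shape it describes, sending the text part first and then one image_url part (the nested image_url structure follows the OpenAI-style schema; a base64 image would use a data: URL instead, and the model slug is only a placeholder, assumed to be image-capable):

import os
from openai import OpenAI

client = OpenAI(base_url="https://openrouter.ai/api/v1", api_key=os.environ["OPENROUTER_API_KEY"])

completion = client.chat.completions.create(
    model="openai/gpt-5-mini",  # placeholder; use any image-capable model slug
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},  # text prompt first
                {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}},
            ],
        }
    ],
)
print(completion.choices[0].message.content)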
🌐
OpenRouter
openrouter.ai
OpenRouter
250k+ apps using OpenRouter with 4.2M+ users globally
🌐
OpenRouter
openrouter.ai › docs › faq
OpenRouter FAQ | Developer Documentation | OpenRouter | Documentation
OpenRouter implements the OpenAI API specification for /completions and /chat/completions endpoints, allowing you to use any model with the same request/response format. Additional endpoints like /api/v1/models are also available. See our API documentation for detailed specifications. The API supports text, images, and PDFs. Images can be passed as URLs or base64 encoded images.
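For the /api/v1/models endpoint mentioned above, a minimal sketch; the response field names used here ("data", "id") are assumptions to verify against the API documentation:

import requests

# List model metadata; an API key is typically not needed for this public listing (assumption).
resp = requests.get("https://openrouter.ai/api/v1/models", timeout=30)
resp.raise_for_status()
models = resp.json().get("data", [])
print(len(models), "models available")
print([m.get("id") for m in models[:5]])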
🌐
Reddit
reddit.com › r/chatgptcoding › eli5: how does openrouter work?
r/ChatGPTCoding on Reddit: ELI5: how does Openrouter work?
September 10, 2024 -

https://openrouter.ai/

How does it work? Is it spammy/legit? I only ask because with all my recent comments about my workflow and tools I use, I have been getting unsolicited DMs, inviting me to "join, we have room". Just seems spammy to me.

My bill this month for ChatGPT Pro + API, Claude Sonnet + API, and Cursor will probably be over $60 easy. I'm okay with that.

BUT if this OpenRouter service is cheaper? why not, right?

I just don't get it.

ELI5?

🌐
LiteLLM
docs.litellm.ai › supported models & providers › openrouter
OpenRouter | liteLLM
For production environments, you can dynamically configure the base_url using environment variables:

import os
from litellm import completion

# Configure with environment variables
OPENROUTER_API_KEY = os.getenv("OPENROUTER_API_KEY")
OPENROUTER_BASE_URL = os.getenv("OPENROUTER_API_BASE", "https://openrouter.ai/api/v1")

# Set environment for LiteLLM
os.environ["OPENROUTER_API_KEY"] = OPENROUTER_API_KEY
os.environ["OPENROUTER_API_BASE"] = OPENROUTER_BASE_URL

messages = [{"role": "user", "content": "Hello"}]  # any OpenAI-style message list

response = completion(
    model="openrouter/google/palm-2-chat-bison",
    messages=messages,
    base_url=OPENROUTER_BASE_URL,  # Explicitly pass base_url for clarity
)
🌐
OpenRouter
openrouter.ai › docs › use-cases › byok
BYOK | Use Your Own Provider Keys with OpenRouter | OpenRouter | Documentation
Make sure to append /chat/completions to the base URL. You can read more in the Azure Foundry documentation. api_key: In the same “Overview” section of your Azure AI Services resource, you can find your API key under “Keys and Endpoint”. model_id: This is the name of your model deployment in Azure AI Services. model_slug: This is the OpenRouter model identifier you want to use this key for.
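Purely as an illustration of the four fields the snippet names, before entering them in the BYOK settings (every value below is a hypothetical placeholder, not a real Azure endpoint or key):

# Hypothetical placeholders for the Azure BYOK fields named in the snippet.
azure_byok = {
    "base_url": "<Azure endpoint from the Overview section>/chat/completions",  # append /chat/completions
    "api_key": "<key from Keys and Endpoint>",
    "model_id": "<your model deployment name in Azure AI Services>",
    "model_slug": "<the OpenRouter model identifier to use this key for>",
}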
🌐
Cloudflare
developers.cloudflare.com › directory › ai gateway › using ai gateway › provider native › openrouter
OpenRouter · Cloudflare AI Gateway docs
When making requests to OpenRouter ↗, replace https://openrouter.ai/api/v1/chat/completions in the URL you are currently using with https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/openrouter/chat/completions.
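A sketch of that URL swap; account_id and gateway_id are placeholders from your Cloudflare dashboard, and the assumption that you still authenticate with your OpenRouter key is worth verifying in the AI Gateway docs:

import os
import requests

ACCOUNT_ID = "<account_id>"  # placeholder
GATEWAY_ID = "<gateway_id>"  # placeholder

# Same request body as before; only the URL changes to route through AI Gateway.
url = f"https://gateway.ai.cloudflare.com/v1/{ACCOUNT_ID}/{GATEWAY_ID}/openrouter/chat/completions"
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={"model": "openai/gpt-5-mini", "messages": [{"role": "user", "content": "Hello"}]},
    timeout=30,
)
print(resp.json())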
🌐
GitHub
github.com › cline › cline › pull › 5636
feat(openrouter): Support Custom Base URL for OpenRouter API by xiaoguomeiyitian · Pull Request #5636 · cline/cline
Uses BaseUrlField component to handle custom URL input. feat(openrouter): Allow custom base URL for OpenRouter …
🌐
OpenRouter
openrouter.ai › docs › api › reference › parameters
API Parameters | Configure OpenRouter API Requests | OpenRouter | Documentation
OpenRouter will default to the values listed below if certain parameters are absent from your request (for example, temperature to 1.0).
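To override those defaults explicitly rather than relying on them, pass the sampling parameters in the request. Only the temperature default of 1.0 comes from the snippet; top_p and max_tokens are standard OpenAI-style parameters included here as assumed examples:

import os
from openai import OpenAI

client = OpenAI(base_url="https://openrouter.ai/api/v1", api_key=os.environ["OPENROUTER_API_KEY"])

completion = client.chat.completions.create(
    model="openai/gpt-5-mini",
    messages=[{"role": "user", "content": "Name three prime numbers."}],
    temperature=1.0,  # matches the documented default
    top_p=1.0,        # standard OpenAI-style sampling parameter (assumed default)
    max_tokens=256,   # cap the completion length
)
print(completion.choices[0].message.content)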