108 changes: 108 additions & 0 deletions README.md
@@ -21,6 +21,7 @@

- **Model Routing**: Route requests to different models based on your needs (e.g., background tasks, thinking, long context).
- **Multi-Provider Support**: Supports various model providers like OpenRouter, DeepSeek, Ollama, Gemini, Volcengine, and SiliconFlow.
- **Multiple Instances**: Run unlimited CCR instances simultaneously with different configurations.
- **Request/Response Transformation**: Customize requests and responses for different providers using transformers.
- **Dynamic Model Switching**: Switch models on-the-fly within Claude Code using the `/model` command.
- **CLI Model Management**: Manage models and providers directly from the terminal with `ccr model`.
@@ -413,6 +414,56 @@ The `Router` object defines which model to use for different scenarios:
`/model provider_name,model_name`
Example: `/model openrouter,anthropic/claude-3.5-sonnet`

#### Configuration Examples

This repository includes several ready-to-use configuration examples in the root directory:

- **`config.example.json`** - Comprehensive example with all major providers (OpenRouter, DeepSeek, Gemini, Ollama, Groq, Copilot API)
- **`config.copilot.example.json`** - GitHub Copilot API integration example (see below)
- **`config.multimodel.example.json`** - Multi-model strategy for cost optimization

Copy one of these examples to `~/.claude-code-router/config.json` and customize it to get started quickly.

#### GitHub Copilot API Integration

Claude Code Router supports integration with [copilot-api](https://github.com/ericc-ch/copilot-api), a local API wrapper that provides access to multiple AI models through a unified interface.

**Setup Steps:**

1. Install and start copilot-api (default port: 4141)
2. Configure CCR to use the Copilot API endpoint:

```json
{
  "Providers": [
    {
      "name": "copilot",
      "api_base_url": "http://localhost:4141/v1/messages",
      "api_key": "dummy-key-for-local",
      "models": [
        "claude-sonnet-4.5",
        "gpt-5.1",
        "gpt-5.1-codex"
      ],
      "transformer": {
        "use": ["anthropic"]
      }
    }
  ],
  "Router": {
    "default": "copilot,claude-sonnet-4.5",
    "think": "copilot,gpt-5.1"
  }
}
```

**Important:** Use the `anthropic` transformer for proper message format compatibility. See `config.copilot.example.json` for a complete working example.

- **Working Models:** Claude Sonnet 4.5, GPT-5.1, GPT-5.1-Codex
- **Known Issues:** Gemini 3 Pro Preview may have format incompatibility

See [Issue #1021](https://github.com/musistudio/claude-code-router/issues/1021) for more details.

#### Custom Router

For more advanced routing logic, you can specify a custom router script via the `CUSTOM_ROUTER_PATH` in your `config.json`. This allows you to implement complex routing rules beyond the default scenarios.
@@ -463,6 +514,63 @@ For routing within subagents, you must specify a particular provider and model b
Please help me analyze this code snippet for potential optimizations...
```

## 🔀 Multiple Instances Support

Run multiple CCR instances simultaneously with different configurations! Perfect for:
- Different projects requiring different model setups
- Testing configurations without affecting your main setup
- Separating work contexts (e.g., personal vs work)

### Usage

**Start a custom instance:**
```bash
ccr start --config /path/to/custom-config.json
```

The port will be auto-allocated if the specified port is busy. Each instance runs independently with its own PID and configuration.

**Use a specific instance:**
```bash
ccr code --config /path/to/custom-config.json "your task"
```

**Check all running instances:**
```bash
ccr status # Show all instances
ccr status --config /path/to/config.json # Show specific instance
```

**Stop a specific instance:**
```bash
ccr stop --config /path/to/custom-config.json
```

### Example Scenario

```bash
# Default instance (production setup)
ccr start
# -> Runs on port 3456

# Project-specific instance with different models
ccr start --config ~/project-a/ccr-config.json
# -> Auto-allocates port 3457

# Another project with local models
ccr start --config ~/project-b/ccr-config.json
# -> Auto-allocates port 3458

# View all running instances
ccr status
# Shows: Default + 2 custom instances

# Use specific instance
ccr code --config ~/project-a/ccr-config.json "implement feature X"
```

See `config.multiinstance.example.json` for a complete example configuration.

## Status Line (Beta)
To help you monitor claude-code-router at runtime, version v1.0.40 includes a built-in status line tool, which you can enable in the UI.
![statusline-config.png](/blog/images/statusline-config.png)
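If you prefer editing the config file directly, the status line can also be configured with a `StatusLine` block; the fields below are taken from `config.example.json` in this PR, though the UI toggle remains the simplest path:

```json
"StatusLine": {
  "enabled": true,
  "format": "[{sessionId}] {model} | Tokens: {tokens}"
}
```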
55 changes: 55 additions & 0 deletions config.copilot.example.json
@@ -0,0 +1,55 @@
{
  "// GITHUB COPILOT API INTEGRATION": "Example configuration for copilot-api",
  "// Prerequisites 1": "Install copilot-api: https://github.com/ericc-ch/copilot-api",
  "// Prerequisites 2": "Start copilot-api server (default port 4141)",
  "// Prerequisites 3": "Use this config with CCR",

  "LOG": true,
  "PORT": 3456,
  "APIKEY": "your-api-key",

  "Providers": [
    {
      "name": "copilot",
      "api_base_url": "http://localhost:4141/v1/messages",
      "api_key": "dummy-key-for-local-copilot",
      "models": [
        "claude-sonnet-4.5",
        "gpt-5.1",
        "gpt-5.1-codex"
      ],
      "// IMPORTANT": "Use anthropic transformer for proper message format",
      "transformer": {
        "use": ["anthropic"]
      }
    },
    {
      "// Fallback provider": "In case copilot-api is down",
      "name": "gemini",
      "api_base_url": "https://generativelanguage.googleapis.com/v1beta/models/",
      "api_key": "YOUR_GEMINI_API_KEY",
      "models": [
        "gemini-2.5-flash",
        "gemini-2.5-pro"
      ],
      "transformer": {
        "use": ["gemini"]
      }
    }
  ],

  "Router": {
    "default": "copilot,claude-sonnet-4.5",
    "background": "copilot,gpt-5.1",
    "think": "copilot,gpt-5.1",
    "longContext": "gemini,gemini-2.5-pro",
    "webSearch": "gemini,gemini-2.5-flash"
  },

  "// NOTES": {
    "working_models": "Claude Sonnet 4.5, GPT-5.1, GPT-5.1-Codex work well",
    "known_issues": "Gemini 3 Pro Preview may have format incompatibility",
    "solution": "Use anthropic transformer for all Copilot models",
    "reference": "https://github.com/musistudio/claude-code-router/issues/1021"
  }
}
134 changes: 134 additions & 0 deletions config.example.json
@@ -0,0 +1,134 @@
{
  "// BASIC CONFIGURATION": "Claude Code Router - Example Configuration",
  "// Copy this file to ~/.claude-code-router/config.json and customize it": "",

  "LOG": true,
  "LOG_LEVEL": "info",
  "PORT": 3456,
  "HOST": "127.0.0.1",
  "APIKEY": "your-api-key-here",
  "API_TIMEOUT_MS": 600000,

  "// PROVIDERS CONFIGURATION": "Configure your LLM providers below",
  "Providers": [
    {
      "// OpenRouter": "Route to multiple models through OpenRouter",
      "name": "openrouter",
      "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
      "api_key": "sk-or-v1-YOUR_KEY_HERE",
      "models": [
        "anthropic/claude-sonnet-4",
        "anthropic/claude-3.5-sonnet",
        "google/gemini-2.5-pro-preview",
        "deepseek/deepseek-chat-v3-0324"
      ],
      "transformer": {
        "use": ["openrouter"],
        "// Model-specific transformers": "Add anthropic transformer for Claude models",
        "anthropic/claude-sonnet-4": {
          "use": ["anthropic", "openrouter"]
        },
        "anthropic/claude-3.5-sonnet": {
          "use": ["anthropic", "openrouter"]
        }
      }
    },
    {
      "// DeepSeek": "Direct DeepSeek API",
      "name": "deepseek",
      "api_base_url": "https://api.deepseek.com/chat/completions",
      "api_key": "sk-YOUR_KEY_HERE",
      "models": [
        "deepseek-chat",
        "deepseek-reasoner"
      ],
      "transformer": {
        "use": ["deepseek"],
        "deepseek-chat": {
          "use": ["tooluse"]
        }
      }
    },
    {
      "// Google Gemini": "Direct Gemini API",
      "name": "gemini",
      "api_base_url": "https://generativelanguage.googleapis.com/v1beta/models/",
      "api_key": "YOUR_GEMINI_API_KEY",
      "models": [
        "gemini-2.5-flash",
        "gemini-2.5-pro"
      ],
      "transformer": {
        "use": ["gemini"]
      }
    },
    {
      "// Ollama": "Local models via Ollama",
      "name": "ollama",
      "api_base_url": "http://localhost:11434/v1/chat/completions",
      "api_key": "ollama",
      "models": [
        "qwen2.5-coder:latest",
        "deepseek-coder-v2:latest"
      ]
    },
    {
      "// GitHub Copilot API": "Use copilot-api for multiple models",
      "// See": "https://github.com/ericc-ch/copilot-api",
      "name": "copilot",
      "api_base_url": "http://localhost:4141/v1/messages",
      "api_key": "dummy-key-for-local",
      "models": [
        "claude-sonnet-4.5",
        "gpt-5.1",
        "gemini-3-pro-preview"
      ],
      "transformer": {
        "use": ["anthropic"]
      }
    },
    {
      "// Groq": "Fast inference with Groq",
      "name": "groq",
      "api_base_url": "https://api.groq.com/openai/v1/chat/completions",
      "api_key": "gsk-YOUR_KEY_HERE",
      "models": [
        "llama-3.3-70b-versatile",
        "deepseek-r1-distill-llama-70b"
      ],
      "transformer": {
        "use": ["groq"]
      }
    }
  ],

  "// ROUTER CONFIGURATION": "Define which models to use for different scenarios",
  "Router": {
    "default": "openrouter,anthropic/claude-sonnet-4",
    "background": "gemini,gemini-2.5-flash",
    "think": "deepseek,deepseek-reasoner",
    "longContext": "gemini,gemini-2.5-pro",
    "longContextThreshold": 60000,
    "webSearch": "openrouter,google/gemini-2.5-pro-preview",
    "image": "gemini,gemini-2.5-flash"
  },

  "// IMAGE AGENT": "Force image agent for models without tool calling support",
  "forceUseImageAgent": false,

  "// STATUS LINE": "Custom status line configuration",
  "StatusLine": {
    "enabled": true,
    "format": "[{sessionId}] {model} | Tokens: {tokens}"
  },

  "// ADVANCED": "Advanced transformer configurations",
  "transformers": [
    {
      "path": "~/.claude-code-router/plugins/custom-transformer.js",
      "options": {
        "customOption": "value"
      }
    }
  ]
}
55 changes: 55 additions & 0 deletions config.multiinstance.example.json
@@ -0,0 +1,55 @@
{
  "// MULTI-INSTANCE EXAMPLE": "Run multiple CCR instances simultaneously",
  "// Use Case": "Different models for different projects or workflows",
  "// Note": "This is a sample custom config - save it somewhere like ~/my-project/ccr-config.json",

  "LOG": true,
  "PORT": 3457,
  "APIKEY": "custom-instance-key",

  "// CUSTOM INSTANCE SETUP": "This instance uses OpenRouter for main tasks",
  "Providers": [
    {
      "name": "openrouter",
      "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
      "api_key": "sk-or-v1-YOUR_KEY",
      "models": [
        "anthropic/claude-sonnet-4",
        "google/gemini-2.5-pro-preview"
      ],
      "transformer": {
        "use": ["openrouter"],
        "anthropic/claude-sonnet-4": {
          "use": ["anthropic", "openrouter"]
        }
      }
    },
    {
      "name": "ollama",
      "api_base_url": "http://localhost:11434/v1/chat/completions",
      "api_key": "ollama",
      "models": ["qwen2.5-coder:latest"]
    }
  ],

  "Router": {
    "default": "openrouter,anthropic/claude-sonnet-4",
    "background": "ollama,qwen2.5-coder:latest",
    "think": "openrouter,google/gemini-2.5-pro-preview"
  },

  "// USAGE": {
    "start_instance": "ccr start --config ~/my-project/ccr-config.json",
    "use_instance": "ccr code --config ~/my-project/ccr-config.json 'your task'",
    "check_status": "ccr status --config ~/my-project/ccr-config.json",
    "stop_instance": "ccr stop --config ~/my-project/ccr-config.json",
    "view_all": "ccr status # Shows all running instances"
  },

  "// NOTES": {
    "port_allocation": "Ports are auto-allocated if specified port is in use",
    "default_instance": "Default config at ~/.claude-code-router/config.json runs independently",
    "simultaneous": "You can run unlimited instances with different configs",
    "isolation": "Each instance has its own PID, port, and configuration"
  }
}