OpenRouter Integration

OpenRouter provides access to a wide variety of LLM models through a single API. It is one of two supported LLM providers (the other being Groq).

Enabling OpenRouter

  1. Open the Settings panel (click the gear icon)
  2. Navigate to LLM / OpenRouter in the AI & Language category
  3. Enter your API key and model name

OpenRouter is used automatically when:

Configuration Options

API Key

Model Selection

Primary Model: The main LLM model to use (default: stepfun/step-3.5-flash:free)

Fallback Models (up to 2):
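A primary model and its fallbacks can be combined into one request. The sketch below is illustrative, assuming OpenRouter's documented `models` array, which routes to the next entry when an earlier model fails; the helper name is hypothetical.

```javascript
// Sketch: building a chat completion request body with fallback models.
// Assumes OpenRouter's `models` array for fallback routing (tried in order).
function buildRequestBody(primaryModel, fallbackModels, messages) {
  const body = { model: primaryModel, messages };
  if (fallbackModels && fallbackModels.length > 0) {
    // Primary first, then up to 2 fallbacks, matching the limit above.
    body.models = [primaryModel, ...fallbackModels.slice(0, 2)];
  }
  return body;
}

const body = buildRequestBody(
  "stepfun/step-3.5-flash:free",
  ["example/fallback-model:free"], // placeholder fallback model
  [{ role: "user", content: "Hello" }]
);
console.log(body.models.length); // primary + 1 fallback
```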

Example Free Models

Popular free models available on OpenRouter:

Features

Streaming Mode

Both streaming and non-streaming completions are supported. Streaming provides real-time character-by-character text feedback for a more natural conversation flow.
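Streaming responses arrive as OpenAI-style Server-Sent Events: `data: {json}` lines terminated by `data: [DONE]`. A minimal sketch of parsing one such line; the function name is illustrative, not part of openrouter.js:

```javascript
// Sketch: parsing a single SSE line from a streaming completion.
function parseStreamLine(line) {
  if (!line.startsWith("data: ")) return null; // skip blank/comment lines
  const payload = line.slice("data: ".length).trim();
  if (payload === "[DONE]") return { done: true };
  const chunk = JSON.parse(payload);
  // Each chunk carries an incremental text delta for the first choice.
  return { done: false, text: chunk.choices?.[0]?.delta?.content ?? "" };
}

console.log(parseStreamLine('data: {"choices":[{"delta":{"content":"Hi"}}]}').text); // "Hi"
```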

JSON Mode

The API requests structured JSON responses for chat. Models that don't support the JSON response format are handled automatically:
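A minimal sketch of how such handling could look, assuming the OpenAI-compatible `response_format` field and a hypothetical denylist; the real list and logic in openrouter.js may differ:

```javascript
// Sketch: only request JSON mode from models known to support it.
const NO_JSON_SUPPORT = new Set(["example/model-without-json"]); // hypothetical denylist

function withResponseFormat(body, json) {
  if (json && !NO_JSON_SUPPORT.has(body.model)) {
    return { ...body, response_format: { type: "json_object" } };
  }
  return body; // fall back to plain text for unsupported models
}

const jsonBody = withResponseFormat({ model: "a/b", messages: [] }, true);
console.log(jsonBody.response_format.type); // "json_object"
```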

Request Headers

OpenRouter requests include standard headers for tracking:
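A sketch of the header set, assuming OpenRouter's documented `HTTP-Referer` and `X-Title` attribution headers; the app name and URL below are placeholders, not the values openrouter.js actually sends:

```javascript
// Sketch: assembling request headers for an OpenRouter call.
function buildHeaders(apiKey) {
  return {
    "Authorization": `Bearer ${apiKey}`,
    "Content-Type": "application/json",
    "HTTP-Referer": "https://example.app", // placeholder app URL (attribution)
    "X-Title": "Example App",              // placeholder app name (attribution)
  };
}

console.log(buildHeaders("sk-test")["Authorization"]); // "Bearer sk-test"
```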

Usage Tracking

Response metadata is logged (when debug is enabled):
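OpenAI-compatible responses carry a `usage` block with token counts. A sketch of formatting it for a debug log; the log format here is illustrative:

```javascript
// Sketch: turning response metadata into a one-line debug log entry.
function formatUsage(response) {
  const u = response.usage || {};
  return `model=${response.model} prompt=${u.prompt_tokens ?? 0} ` +
         `completion=${u.completion_tokens ?? 0} total=${u.total_tokens ?? 0}`;
}

const line = formatUsage({
  model: "stepfun/step-3.5-flash:free",
  usage: { prompt_tokens: 12, completion_tokens: 34, total_tokens: 46 },
});
console.log(line); // "model=stepfun/step-3.5-flash:free prompt=12 completion=34 total=46"
```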

API Wrapper

The OpenRouter integration is implemented as a standalone module (openrouter.js) with a consistent interface:

window.OpenRouterAPI.isConfigured()    // Returns true if API key is set
window.OpenRouterAPI.createCompletion({ messages, json })
window.OpenRouterAPI.createCompletionStream({ messages, json })
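A sketch of how a caller might use this interface. The `OpenRouterAPI` object below is a mock standing in for openrouter.js so the example is self-contained; its return values are placeholders:

```javascript
// Mock of the openrouter.js interface (illustrative only).
const OpenRouterAPI = {
  isConfigured: () => true, // real module returns true only if an API key is set
  createCompletion: async ({ messages, json }) =>
    json ? '{"reply":"hello"}' : "hello",
};

// A caller checks configuration before requesting a completion.
async function chat(messages) {
  if (!OpenRouterAPI.isConfigured()) {
    throw new Error("OpenRouter API key is not set");
  }
  return OpenRouterAPI.createCompletion({ messages, json: false });
}

chat([{ role: "user", content: "Hi" }]).then((reply) => console.log(reply)); // "hello"
```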

Troubleshooting

See troubleshooting.html for general troubleshooting.