OpenRouter provides access to a wide variety of LLM models through a single API. It is one of two supported LLM providers (the other being Groq).
OpenRouter is used automatically when:

- the configured API key is an OpenRouter key (keys start with `sk-or-v1-`)

Two model settings control which models are used:

- **Primary Model**: the main LLM model to use (default: `stepfun/step-3.5-flash:free`)
- **Fallback Models** (up to 2): alternate models tried if the primary model fails
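OpenRouter's chat API accepts a `models` array for fallback routing. A minimal sketch of assembling the primary model plus up to two fallbacks (the helper name is hypothetical, not part of `openrouter.js`):

```javascript
// Sketch: assemble the model list sent to OpenRouter.
// `buildModelList` is an illustrative helper, not from the module.
function buildModelList(primary, fallbacks = []) {
  // Primary first, at most two fallbacks, duplicates removed.
  return [...new Set([primary, ...fallbacks.slice(0, 2)])];
}

// Example: primary plus two fallbacks.
buildModelList('stepfun/step-3.5-flash:free', [
  'nvidia/nemotron-3-super-120b-a12b:free',
  'qwen/qwen3.6-plus-preview:free',
]);
```

The resulting array would go into the request body's `models` field, with the first entry also sent as `model`.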
Popular free models available on OpenRouter:

- `stepfun/step-3.5-flash:free`
- `nvidia/nemotron-3-super-120b-a12b:free`
- `qwen/qwen3.6-plus-preview:free`

Both streaming and non-streaming completions are supported. Streaming provides real-time, character-by-character text feedback for a more natural conversation flow.
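Streamed completions arrive as server-sent events. A minimal sketch of pulling text deltas out of raw SSE lines (the parser shape is assumed here, not taken from `openrouter.js`):

```javascript
// Sketch: extract a text delta from one OpenRouter-style SSE line.
// The exact parsing in openrouter.js may differ; this shows the general shape.
function extractDelta(line) {
  if (!line.startsWith('data: ')) return null;   // skip comments and blank lines
  const payload = line.slice(6).trim();
  if (payload === '[DONE]') return null;         // end-of-stream sentinel
  try {
    const chunk = JSON.parse(payload);
    return chunk.choices?.[0]?.delta?.content ?? null;
  } catch {
    return null;                                 // incomplete JSON: wait for more data
  }
}
```

A caller would append each non-null delta to the displayed message as it arrives.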
The API requests structured JSON responses for chat. Some models don't support the JSON response format and are handled automatically:

- `stepfun/step-3.5-flash:free`
- `stepfun/step-1-flash`

OpenRouter requests include standard headers for tracking:

- `HTTP-Referer: https://waifuai.com`
- `X-OpenRouter-Title: Waifu AI`
- `X-Title: Waifu AI`
- `X-OpenRouter-Categories: character-chat`

Response metadata is logged when debug is enabled.
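Putting the header and JSON-format rules together, a hedged sketch of building the request options (the helper name and structure are illustrative; `openrouter.js` may organize this differently):

```javascript
// Sketch: build fetch options for an OpenRouter chat completion.
// Models listed above as lacking JSON response-format support.
const NO_JSON_FORMAT_MODELS = new Set([
  'stepfun/step-3.5-flash:free',
  'stepfun/step-1-flash',
]);

function buildRequest(apiKey, model, messages, wantJson) {
  const body = { model, messages };
  // Skip response_format for models known not to support it.
  if (wantJson && !NO_JSON_FORMAT_MODELS.has(model)) {
    body.response_format = { type: 'json_object' };
  }
  return {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
      'HTTP-Referer': 'https://waifuai.com',
      'X-OpenRouter-Title': 'Waifu AI',
      'X-Title': 'Waifu AI',
      'X-OpenRouter-Categories': 'character-chat',
    },
    body: JSON.stringify(body),
  };
}
```

The returned object would be passed to `fetch` against OpenRouter's chat completions endpoint.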
The OpenRouter integration is implemented as a standalone module (`openrouter.js`) with a consistent interface:

```javascript
window.OpenRouterAPI.isConfigured()                       // returns true if an API key is set
window.OpenRouterAPI.createCompletion({ messages, json })
window.OpenRouterAPI.createCompletionStream({ messages, json })
```
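A caller-side sketch of that interface (the stub `window` object below exists only so the example runs outside a browser; the stubbed responses are placeholders, not the module's actual output):

```javascript
// Sketch: how a chat layer might call the module. The stub window object
// stands in for the browser global; openrouter.js itself is not loaded here.
const window = {
  OpenRouterAPI: {
    isConfigured: () => true,  // stub: pretend an API key is set
    createCompletion: async ({ messages, json }) =>
      json ? { reply: 'Hello!' } : 'Hello!',  // stubbed response shape
  },
};

async function getReply(history) {
  if (!window.OpenRouterAPI.isConfigured()) {
    throw new Error('No OpenRouter API key set');
  }
  // Request a structured JSON response, as the integration does for chat.
  return window.OpenRouterAPI.createCompletion({ messages: history, json: true });
}
```

Guarding on `isConfigured()` first lets the caller fall back to the other supported provider (Groq) when no OpenRouter key is present.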
See `troubleshooting.html` for general troubleshooting.