# llm-openrouter

LLM adapter via OpenRouter. Hundreds of hosted models behind one OpenAI-compatible API.

Package: `@openagentry/adapter-llm-openrouter` (npm)

## Install
Use it:

```sh
# Install
npm install @openagentry/adapter-llm-openrouter

# Or in a workspace registered with the CLI
npx agentry init @openagentry/adapter-llm-openrouter
```

## Default export
Pattern: lazy. The default export is always a frozen `LLMAdapter` with a static category and id. Methods read the environment on first call and cache the underlying instance, so missing configuration produces an `AgentryError` at the use-site, not at import time.
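A minimal sketch of that lazy pattern, for illustration only; this is not the package's actual source, and `makeClient` is a hypothetical stand-in for the real OpenRouter client factory:

```javascript
// Hypothetical client factory standing in for the real OpenRouter client.
const makeClient = (key) => ({
  complete: (prompt) => `stub completion for: ${prompt}`,
});

let instance = null;

// Frozen adapter object with a static category and id. Nothing reads the
// environment at import; the env lookup happens inside the method.
const llmAdapter = Object.freeze({
  category: 'llm',
  id: 'llm-openrouter',
  complete(prompt) {
    if (!instance) {
      const key = process.env.OPENROUTER_API_KEY;
      // Missing config surfaces here, at the first call, never at import.
      if (!key) throw new Error('AgentryError: OPENROUTER_API_KEY is not set');
      instance = makeClient(key); // cached for all subsequent calls
    }
    return instance.complete(prompt);
  },
});
```

Freezing the export keeps the adapter's identity stable while the mutable cached client stays hidden in module scope.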
```js
import adapter from '@openagentry/adapter-llm-openrouter';
// → llmAdapter (category: 'llm', id: 'llm-openrouter')
```

## Environment configuration
| Variable | Notes |
|---|---|
| `OPENROUTER_API_KEY` | Required. Read on first call, per the lazy pattern above. |
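A minimal way to provide the key is via the shell environment before starting your agent; the value below is a placeholder, not a real key:

```shell
# Placeholder key for illustration only.
export OPENROUTER_API_KEY="sk-or-example"
```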
## Reported capabilities

`agentry capabilities --json` reports these flags from the package's `openagentry.capabilities` manifest block:

| Flag | Value |
|---|---|
| transport | https |
| streaming | true |
| manyProviders | true |
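If you gate adapter selection on these flags, a small helper can check a capabilities object before wiring the adapter in. This is an illustrative sketch: `hasCapabilities` is a hypothetical helper, and the sample object simply mirrors the flags reported above:

```javascript
// Hypothetical helper: true only if every required flag is truthy
// on the capabilities object.
function hasCapabilities(caps, required) {
  return required.every((flag) => Boolean(caps[flag]));
}

// Sample object mirroring this adapter's reported manifest block.
const caps = { transport: 'https', streaming: true, manyProviders: true };

console.log(hasCapabilities(caps, ['streaming', 'manyProviders'])); // true
console.log(hasCapabilities(caps, ['toolCalling'])); // false
```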
## Failure modes

This adapter throws six stable error codes:

| Code | Meaning |
|---|---|
| `E_OPENROUTER_AUTH` | `OPENROUTER_API_KEY` missing or rejected. |
| `E_OPENROUTER_NOT_FOUND` | Model not found on OpenRouter. |
| `E_OPENROUTER_RATE_LIMIT` | OpenRouter rate limit reached. |
| `E_OPENROUTER_NETWORK` | Network failure reaching OpenRouter. |
| `E_OPENROUTER_UPSTREAM` | OpenRouter returned a non-2xx response. |
| `E_OPENROUTER_CONFIG_INVALID` | Adapter config failed validation. |
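Stable codes make retry policy easy to express. A hedged sketch, assuming thrown errors carry a `code` property with one of the values above (the property name is an assumption, not confirmed by this page):

```javascript
// Transient failures worth retrying; auth and config errors are not,
// since retrying them cannot succeed without operator intervention.
const RETRYABLE = new Set([
  'E_OPENROUTER_RATE_LIMIT',
  'E_OPENROUTER_NETWORK',
  'E_OPENROUTER_UPSTREAM',
]);

// Assumes the error object exposes the stable code as `err.code`.
function shouldRetry(err) {
  return RETRYABLE.has(err.code);
}

console.log(shouldRetry({ code: 'E_OPENROUTER_RATE_LIMIT' })); // true
console.log(shouldRetry({ code: 'E_OPENROUTER_AUTH' }));       // false
```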