LLM · Adapter
llm-aigateway
via Vercel AI Gateway
A unified gateway for hosted models, addressed with plain "provider/model" strings.
@openagentry/adapter-llm-aigateway

Install
Use it
# Install
$ npm install @openagentry/adapter-llm-aigateway
# Or in a workspace registered with the CLI
$ npx agentry init @openagentry/adapter-llm-aigateway

Default export
Pattern: lazy
The default export is always a frozen LLMAdapter with a static category and id.
Methods read the environment on first call and cache the underlying instance,
so missing configuration produces an AgentryError at the use site, not at import time.
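A minimal sketch of this lazy pattern, assuming illustrative names throughout: `AgentryError`, `GatewayClient`, and `complete()` are not the package's API, only stand-ins that show how a frozen export can defer config reads to the first call.

```typescript
// Illustrative error type with a stable `code` field (an assumption,
// not the package's actual AgentryError shape).
class AgentryError extends Error {
  constructor(public code: string, message: string) {
    super(message);
  }
}

// Stand-in for the HTTP client the adapter would construct lazily.
class GatewayClient {
  constructor(public apiKey: string) {}
}

let cached: GatewayClient | undefined;

function client(): GatewayClient {
  if (!cached) {
    const key = process.env.AI_GATEWAY_API_KEY;
    if (!key) {
      // Missing config surfaces here, at the use site, not at import time.
      throw new AgentryError(
        'E_AIGATEWAY_CONFIG_INVALID',
        'AI_GATEWAY_API_KEY is not set',
      );
    }
    cached = new GatewayClient(key); // constructed once, then reused
  }
  return cached;
}

// Frozen adapter: static identity, everything else deferred.
const llmAdapter = Object.freeze({
  category: 'llm' as const,
  id: 'llm-aigateway' as const,
  complete(model: string, prompt: string) {
    // A real adapter would call the gateway here; this sketch just
    // proves the client is only built on first use.
    return { client: client(), model, prompt };
  },
});
```

Note that importing this module never throws; only the first `complete()` call without the env variable does.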
import llmAdapter from '@openagentry/adapter-llm-aigateway';
// → llmAdapter — category: 'llm', id: 'llm-aigateway'

Environment
Configuration
| Variable | Notes |
|---|---|
| AI_GATEWAY_API_KEY | Required. Read on first call; if unset, the adapter throws an AgentryError at the use site. |
Reported capabilities
What the manifest says
agentry capabilities --json reports these flags from
the package's openagentry.capabilities manifest block.
- transport: https
- streaming: true
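For reference, the manifest block those flags come from might look like the following inside package.json; the exact shape is an assumption based on the block name and the flags above, not a copy of the package's manifest.

```json
{
  "name": "@openagentry/adapter-llm-aigateway",
  "openagentry": {
    "capabilities": {
      "transport": "https",
      "streaming": true
    }
  }
}
```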
Failure modes
Error codes
Six stable error codes this adapter throws, each with a one-line description.
- E_AIGATEWAY_AUTH: Vercel AI Gateway key missing or rejected.
- E_AIGATEWAY_NOT_FOUND: Requested model not found at the gateway.
- E_AIGATEWAY_RATE_LIMIT: AI Gateway rate limit reached.
- E_AIGATEWAY_NETWORK: Network failure reaching the gateway.
- E_AIGATEWAY_UPSTREAM: Gateway returned a non-2xx response.
- E_AIGATEWAY_CONFIG_INVALID: Adapter config failed validation.
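Because the codes are stable, callers can branch on them rather than on message text. A sketch of retry classification under one assumption: errors carry a `code` field (the `AgentryError` class below is illustrative, not the package's actual error type).

```typescript
// Illustrative error type with a stable `code` field (an assumption,
// not the package's actual AgentryError shape).
class AgentryError extends Error {
  constructor(public code: string, message: string) {
    super(message);
  }
}

// Classify a failed call as retryable or not, by stable code.
function isRetryable(err: unknown): boolean {
  if (!(err instanceof AgentryError)) return false;
  switch (err.code) {
    case 'E_AIGATEWAY_RATE_LIMIT': // back off, then retry
    case 'E_AIGATEWAY_NETWORK':    // transient transport failure
    case 'E_AIGATEWAY_UPSTREAM':   // a gateway 5xx may clear on retry
      return true;
    case 'E_AIGATEWAY_AUTH':           // fix the key; retrying won't help
    case 'E_AIGATEWAY_NOT_FOUND':      // fix the "provider/model" string
    case 'E_AIGATEWAY_CONFIG_INVALID': // fix the adapter config
    default:
      return false;
  }
}
```

Treating E_AIGATEWAY_UPSTREAM as retryable is a policy choice, not a guarantee; the code covers any non-2xx response, including client errors that will not clear on retry.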