openagentry

LLM · Adapter

llm-aigateway

via Vercel AI Gateway

Unified gateway for hosted models. Models are addressed with plain "provider/model" strings.

@openagentry/adapter-llm-aigateway npm →

— · Install

Use it

# Install
$ npm install @openagentry/adapter-llm-aigateway

# Or in a workspace registered with the CLI
$ npx agentry init @openagentry/adapter-llm-aigateway

— · Default export

Pattern: lazy

The default export is always a frozen LLMAdapter with a static category and id. Methods read the environment on first call and cache the underlying instance; missing configuration produces an AgentryError at the use-site, not at import time.

import default_ from '@openagentry/adapter-llm-aigateway';
// → llmAdapter — category: 'llm', id: 'llm-aigateway'
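
A minimal usage sketch of the lazy pattern. The complete method and the code property on the error are assumptions for illustration; the real LLMAdapter method names may differ.

import llmAdapter from '@openagentry/adapter-llm-aigateway';

// Importing reads nothing from the environment. The first call resolves
// AI_GATEWAY_API_KEY and caches the underlying client.
try {
  // `complete` is a hypothetical method name, used for illustration only
  const reply = await llmAdapter.complete({
    model: 'openai/gpt-4o-mini', // plain "provider/model" string
    prompt: 'Say hello.',
  });
  console.log(reply);
} catch (err) {
  // With AI_GATEWAY_API_KEY unset, the AgentryError surfaces here, at the
  // use-site, not at the import above. Assumption: errors expose a stable
  // `code` string (see Failure modes below).
  console.error((err as { code?: string }).code ?? err);
}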

— · Environment

Configuration

Variable             Notes
AI_GATEWAY_API_KEY   Required. Read on the first adapter call, not at import.
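
For example, in a POSIX shell (placeholder value):

# Provide the key before the first adapter call
$ export AI_GATEWAY_API_KEY=<your gateway key>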

— · Reported capabilities

What the manifest says

agentry capabilities --json reports these flags from the package's openagentry.capabilities manifest block.

Flag        Value
transport   https
streaming   true
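
As a sketch, and assuming the flags above map one-to-one onto the manifest block, the relevant part of package.json could look like this (the field layout is an assumption, not the package's published manifest):

{
  "name": "@openagentry/adapter-llm-aigateway",
  "openagentry": {
    "capabilities": {
      "transport": "https",
      "streaming": true
    }
  }
}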

— · Failure modes

Error codes

This adapter throws 6 stable error codes. Each has a one-line resolution.