
Make create-onchain-agent templates LLM provider-agnostic. Closes #1106 (#1108)

Open
0xshae wants to merge 4 commits into coinbase:main from 0xshae:multi-model-support

Conversation

@0xshae 0xshae commented Apr 14, 2026

Description

Make create-onchain-agent templates LLM provider-agnostic.

Closes #1106

The templates currently hardcode openai("gpt-4o-mini") and OPENAI_API_KEY. This change replaces the hardcoded singleton with createOpenAI({ apiKey, baseURL }) / new ChatOpenAI({ apiKey, configuration: { baseURL } }), configured via environment variables.

Since most major LLM providers now expose an OpenAI-compatible endpoint, developers can point the scaffolded agent at any of them via .env.local without editing the generated code.

Env vars added:

  • AI_API_KEY (Falls back to OPENAI_API_KEY)
  • AI_BASE_URL (Provider's endpoint, defaults to OpenAI)
  • AI_MODEL (Defaults to gpt-4o-mini)
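
The fallback order those variables imply can be sketched as below. The helper name resolveConfig is hypothetical (not part of this PR); the actual SDK wiring with createOpenAI is shown in comments since it needs @ai-sdk/openai installed.

```typescript
// Hypothetical helper illustrating the documented env-var fallbacks.
interface AIConfig {
  apiKey?: string;
  baseURL: string;
  model: string;
}

function resolveConfig(env: Record<string, string | undefined>): AIConfig {
  return {
    apiKey: env.AI_API_KEY ?? env.OPENAI_API_KEY,            // AI_API_KEY falls back to OPENAI_API_KEY
    baseURL: env.AI_BASE_URL ?? "https://api.openai.com/v1", // defaults to OpenAI's endpoint
    model: env.AI_MODEL ?? "gpt-4o-mini",                    // defaults to gpt-4o-mini
  };
}

// With @ai-sdk/openai (already a template dependency), the resolved
// values feed the provider factory:
// import { createOpenAI } from "@ai-sdk/openai";
// const cfg = resolveConfig(process.env);
// const provider = createOpenAI({ apiKey: cfg.apiKey, baseURL: cfg.baseURL });
// const model = provider(cfg.model);
```

Existing scaffolds that only set OPENAI_API_KEY keep working because every new variable has a default or a legacy fallback.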

Files changed:

  • templates/createAgent/framework/vercelAISDK/createAgent.ts
  • templates/createAgent/framework/langchain/createAgent.ts
  • templates/next/.../vercel-ai-sdk/create-agent.ts
  • templates/next/.../langchain/create-agent.ts
  • src/common/utils.ts (Generates .env.local containing universal provider examples)
  • Zero new dependencies added to package.json

Tests

Since these changes modify the scaffolding templates rather than runtime code, I tested by scaffolding a new project, replacing the generated code with the modified templates, and routing requests through OpenRouter.

Chatbot: test-agent-vercel (Vercel AI SDK Next.js Scaffold)
Network: Base Sepolia
Setup: Configured AI_BASE_URL to OpenRouter and AI_MODEL to google/gemini-2.5-flash. Fauceted with test ETH.
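
For reference, a .env.local along these lines reproduces that setup (values illustrative; the OpenRouter base URL is its publicly documented OpenAI-compatible endpoint):

```dotenv
# Route requests through OpenRouter instead of OpenAI
AI_API_KEY=sk-or-...
AI_BASE_URL=https://openrouter.ai/api/v1
AI_MODEL=google/gemini-2.5-flash
```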

Prompt: what can you do?

-------------------
I can perform a variety of tasks onchain, including:

  • Wrap and unwrap ETH to WETH.
  • Fetch cryptocurrency prices from Pyth (e.g., BTC, ETH, SOL).
  • Get your wallet details, including address, network, and balances.
  • Transfer native tokens (like ETH or SOL) to another address.
  • Interact with ERC20 tokens:
      • Get token balances.
      • Transfer tokens.
      • Approve token spending for other addresses.
      • Get allowance amounts for token spending.
      • Get ERC20 token addresses for common symbols.
  • Request test funds from a faucet on 'base-sepolia' or 'solana-devnet'.
  • Manage smart wallet spend permissions:
      • List existing spend permissions.
      • Use a spend permission to spend tokens on behalf of a smart account.
  • Swap tokens using the CDP Swap API.
  • Interact with X402 services:
      • Discover available X402 services.
      • Make HTTP requests to X402 endpoints, handling payments if required.

What would you like me to do?
-------------------

Checklist

A couple of things to include in your PR for completeness:

  • Added documentation to all relevant README.md files
  • Added a changelog entry

0xshae added 4 commits April 8, 2026 18:22
replace hardcoded OpenAI imports and OPENAI_API_KEY with configurable
env vars (AI_API_KEY, AI_PROVIDER_URL, AI_MODEL) that support any
OpenAI-compatible provider including OpenRouter, Groq, and Together.

both Vercel AI SDK and Langchain templates now use createOpenAI/ChatOpenAI
with configurable baseURL. backward compatible with existing OPENAI_API_KEY.
…c and google support

add getModel/getLLM factory pattern that supports four provider modes:
- openai: native OpenAI API (default)
- anthropic: native Anthropic/Claude API
- google: native Google Gemini API
- custom: any OpenAI-compatible API (OpenRouter, Groq, Ollama, etc.)

controlled via env vars: AI_PROVIDER, AI_API_KEY, AI_MODEL, AI_PROVIDER_URL.
adds @ai-sdk/anthropic, @ai-sdk/google, @langchain/anthropic,
@langchain/google-genai as template dependencies.
…nfig

leverage the fact that all major LLM providers (Anthropic, Google,
OpenRouter, Groq, Ollama) now support the OpenAI chat completions
API format. instead of importing each provider's SDK, use the existing
createOpenAI/ChatOpenAI with a configurable baseURL.

this means:
- zero new dependencies (uses @ai-sdk/openai and @langchain/openai already installed)
- zero factory pattern or switch statements
- switch providers by setting AI_BASE_URL in .env
- backward compatible with OPENAI_API_KEY

env vars: AI_API_KEY, AI_BASE_URL, AI_MODEL
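
On the LangChain side, the same env vars thread through ChatOpenAI's configuration.baseURL. A hedged sketch: the helper buildChatOpenAIOptions is hypothetical, and the real constructor call (which needs @langchain/openai installed) is shown in comments.

```typescript
// Shape of the options ChatOpenAI accepts for this use case; the
// `configuration` object is forwarded to the underlying OpenAI client.
interface ChatOpenAIOptions {
  apiKey?: string;
  model?: string;
  configuration?: { baseURL?: string };
}

// Hypothetical helper assembling those options from the PR's env vars.
function buildChatOpenAIOptions(env: Record<string, string | undefined>): ChatOpenAIOptions {
  return {
    apiKey: env.AI_API_KEY ?? env.OPENAI_API_KEY,
    model: env.AI_MODEL ?? "gpt-4o-mini",
    // Only override the endpoint when AI_BASE_URL is set, so plain
    // OpenAI usage is untouched.
    configuration: env.AI_BASE_URL ? { baseURL: env.AI_BASE_URL } : undefined,
  };
}

// import { ChatOpenAI } from "@langchain/openai";
// const llm = new ChatOpenAI(buildChatOpenAIOptions(process.env));
```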
@0xshae 0xshae requested a review from murrlincoln as a code owner April 14, 2026 23:33

cb-heimdall commented Apr 14, 2026

🟡 Heimdall Review Status

Requirement Status More Info
Reviews 🟡 0/1

@github-actions github-actions bot added the documentation and typescript labels Apr 14, 2026
Comment thread typescript/create-onchain-agent/src/common/utils.ts

Labels

  • documentation — Improvements or additions to documentation
  • typescript

Development

Successfully merging this pull request may close these issues.

Feature: Make create-onchain-agent templates LLM provider-agnostic

3 participants