docs(llms): fix OpenRouter code example — NameError, redundant base_url, add thinking note#5722

Open
Beandon13 wants to merge 1 commit into crewAIInc:main from Beandon13:docs/crewai-openrouter-llm-example-fix
Conversation

@Beandon13

Summary

  • The OpenRouter accordion in concepts/llms.mdx (all four language versions: en, ar, pt-BR, ko) contained api_key=OPENROUTER_API_KEY, a bare Python identifier that is never defined, so any user who copies the snippet verbatim hits an immediate NameError
  • The hardcoded base_url="https://openrouter.ai/api/v1" was redundant: OpenAICompatibleCompletion already configures this automatically when provider="openrouter" is inferred from the model prefix
  • Added a note and example showing that thinking={"type": "enabled", ...} now works for Anthropic models routed via OpenRouter (the underlying _prepare_completion_params forwarding fix is in the accompanying code PR)
  • Expanded the model list to include openrouter/anthropic/claude-sonnet-4-5 and openrouter/anthropic/claude-opus-4-5

Testing

Documentation change only — visual review of the corrected code blocks.

…dd thinking note

The OpenRouter accordion in all four language versions of `concepts/llms.mdx`
contained `api_key=OPENROUTER_API_KEY` — a bare Python variable reference that
is never imported, causing a `NameError` for any user who copies the snippet
verbatim.  The `base_url` parameter was also hardcoded to the default value
that the provider already sets automatically.

Changes across docs/en, docs/ar, docs/pt-BR, docs/ko:
- Replace `api_key=OPENROUTER_API_KEY` with comments explaining the auto-resolve
  behavior (reads from `OPENROUTER_API_KEY` env var; `base_url` auto-configured)
- Add `from crewai import LLM` import to the example
- Add extended-thinking example for OpenRouter + Anthropic models, noting the
  `thinking` param works for any LiteLLM-routed provider (now that the
  `_prepare_completion_params` forwarding bug is fixed)
- Expand the model list to include `openrouter/anthropic/claude-sonnet-4-5`

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>