Description
Originally raised in this issue:

> Oh, sorry, looks like I can just pass baseURL for this

_Originally posted by @beautyfree in #392 (comment)_

> Could be worth improving the docs around this potentially

_Originally posted by @0xdevalias in #392 (comment)_
At a minimum, we should more explicitly document how to achieve this sort of thing using `humanify openai --baseURL`.
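For example, pointing humanify at OpenRouter's OpenAI-compatible endpoint would presumably look something like the sketch below. Only `--baseURL` is confirmed here; the other flag names, the model slug, and the filename are assumptions that should be checked against `humanify openai --help`:

```sh
# Sketch: use humanify's OpenAI provider against OpenRouter's OpenAI-compatible endpoint.
# --baseURL is referenced in this issue; --apiKey, --model, and the model slug are assumptions.
humanify openai \
  --baseURL https://openrouter.ai/api/v1 \
  --apiKey "$OPENROUTER_API_KEY" \
  --model anthropic/claude-3.5-sonnet \
  obfuscated.js
```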
There may also be benefits to adding more explicit internal support for it. Need to read/think more deeply about it:
- https://openrouter.ai/
    - > The Unified Interface For LLMs
    - > Better prices, better uptime, no subscription.
- https://openrouter.ai/rankings?view=trending
    - > LLM Rankings
- https://openrouter.ai/models?order=pricing-low-to-high
    - > Models
- https://openrouter.ai/docs/quickstart
    - > Quickstart
    - > Get started with OpenRouter
    - > OpenRouter provides a unified API that gives you access to hundreds of AI models through a single endpoint, while automatically handling fallbacks and selecting the most cost-effective options. Get started with just a few lines of code using your preferred SDK or framework.
    - https://openrouter.ai/docs/quickstart#using-the-openai-sdk
        - > Using the OpenAI SDK
        - (an SDK usage sketch follows this list)
    - https://openrouter.ai/docs/quickstart#using-the-openrouter-api-directly
        - > Using the OpenRouter API directly
    - https://openrouter.ai/docs/quickstart#using-third-party-sdks
        - > Using third-party SDKs
        - > For information about using third-party SDKs and frameworks with OpenRouter, please see our frameworks documentation.
- https://openrouter.ai/docs/community/frameworks
    - > Frameworks
    - > Using OpenRouter with Frameworks
    - It lists a few examples, including:
        - https://openrouter.ai/docs/community/frameworks#using-the-openai-sdk
            - > Using the OpenAI SDK
        - https://openrouter.ai/docs/community/frameworks#using-langchain
            - > Using LangChain
        - https://openrouter.ai/docs/community/frameworks#vercel-ai-sdk
            - > Vercel AI SDK
- https://openrouter.ai/docs/features/model-routing
    - > Model Routing
    - > Dynamically route requests to models
    - > The `models` parameter lets you automatically try other models if the primary model's providers are down, rate-limited, or refuse to reply due to content moderation.
    - > If the model you selected returns an error, OpenRouter will try to use the fallback model instead. If the fallback model is down or returns an error, OpenRouter will return that error.
    - (a model routing sketch follows this list)
- https://openrouter.ai/docs/features/provider-routing
    - > Provider Routing
    - > Route requests to the best provider
    - > OpenRouter routes requests to the best available providers for your model. By default, requests are load balanced across the top providers to maximize uptime.
- https://openrouter.ai/docs/features/structured-outputs
    - > Structured Outputs
    - > Return structured data from your models
    - > OpenRouter supports structured outputs for compatible models, ensuring responses follow a specific JSON Schema format. This feature is particularly useful when you need consistent, well-formatted responses that can be reliably parsed by your application.
    - (a structured outputs sketch follows this list)
- https://github.com/OpenRouterTeam/openrouter-examples
    - > Examples of integrating the OpenRouter API
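For a quick sense of the quickstart's "Using the OpenAI SDK" approach, here is a minimal sketch based on OpenRouter's quickstart docs (the model slug and prompt are placeholders):

```typescript
import OpenAI from "openai";

// Sketch: the standard OpenAI SDK pointed at OpenRouter's OpenAI-compatible endpoint.
const openai = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,
});

const completion = await openai.chat.completions.create({
  model: "openai/gpt-4o", // placeholder OpenRouter model slug
  messages: [{ role: "user", content: "Suggest a readable name for the variable `a1`" }],
});

console.log(completion.choices[0].message.content);
```

This is essentially what humanify's existing `--baseURL` option should already let us piggyback on.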
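The model routing / fallback behaviour is driven by the request body, so calling the API directly shows it most clearly. A rough sketch (request shape per the model routing doc; the model slugs are placeholders):

```typescript
// Sketch: calling the OpenRouter API directly with a `models` fallback list.
// If the first model is down, rate-limited, or errors, OpenRouter tries the next one.
const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    models: ["anthropic/claude-3.5-sonnet", "openai/gpt-4o-mini"], // placeholder slugs
    messages: [{ role: "user", content: "Suggest a readable name for the variable `a1`" }],
  }),
});

const data = await res.json();
console.log(data.choices[0].message.content);
```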
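Structured outputs could be interesting for getting reliably parseable rename suggestions back. A sketch of the request shape (per the structured outputs doc; the `renames` schema below is purely illustrative and not anything humanify currently uses):

```typescript
// Sketch: requesting a JSON-Schema-constrained response from a compatible model.
const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "openai/gpt-4o", // must be a model that supports structured outputs
    messages: [{ role: "user", content: "Suggest readable names for: a1, b2" }],
    response_format: {
      type: "json_schema",
      json_schema: {
        name: "renames",
        strict: true,
        schema: {
          type: "object",
          properties: {
            renames: {
              type: "array",
              items: {
                type: "object",
                properties: {
                  original: { type: "string" },
                  suggested: { type: "string" },
                },
                required: ["original", "suggested"],
                additionalProperties: false,
              },
            },
          },
          required: ["renames"],
          additionalProperties: false,
        },
      },
    },
  }),
});

console.log(await res.json());
```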
See Also
- Add ability to use openrouter #392
- Add an API provider abstraction layer to easily support more services/models/APIs #400
- Parallel renames #167
- [Error] Too many requests #2
- Suggestions : Alternative Models, Batch and Auto-renaming #84
- [Idea] Adding Anthropic as a provider #213
- Add better local model support for LM Studio + document current workaround using `humanify openai --baseURL` #419
- etc