Do you need to file a feature request?
- I have searched the existing feature requests and this feature request is not already filed.
- I believe this is a legitimate feature request, not just a question or bug.
Feature Request Description
Add first-class support in `lightrag/llm/ollama.py` for Ollama "cloud" models so the library can transparently call either a local Ollama instance or Ollama's cloud service. Cloud models are identified by names that include `-cloud` or `:cloud` (e.g. `gpt-oss:120b-cloud`). When a cloud model is detected, the code should:
- default the request host to `https://ollama.com` when no explicit `host` is provided
- accept an API key provided through kwargs (`api_key`) or fall back to the `OLLAMA_API_KEY` environment variable
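A minimal sketch of what this could look like, assuming the standard `ollama` Python client (whose `Client`/`AsyncClient` forward extra kwargs such as `headers` to the underlying `httpx` client). The helper names `is_cloud_model` and `resolve_ollama_client_args` are hypothetical, and the `Authorization: Bearer` header is an assumption about how Ollama cloud authenticates:

```python
import os

import ollama

OLLAMA_CLOUD_HOST = "https://ollama.com"


def is_cloud_model(model: str) -> bool:
    # Heuristic from this request: cloud models carry a -cloud or :cloud suffix.
    return "-cloud" in model or ":cloud" in model


def resolve_ollama_client_args(
    model: str, host: str | None = None, api_key: str | None = None
) -> dict:
    """Pick the host and auth headers for a local or cloud Ollama model.

    Hypothetical helper for illustration only, not an existing lightrag API.
    """
    if is_cloud_model(model):
        # Default to the cloud host only when the caller gave no explicit host.
        host = host or OLLAMA_CLOUD_HOST
        # kwargs api_key wins; otherwise fall back to OLLAMA_API_KEY.
        key = api_key or os.environ.get("OLLAMA_API_KEY")
        headers = {"Authorization": f"Bearer {key}"} if key else {}
        return {"host": host, "headers": headers}
    return {"host": host, "headers": {}}


# The same call path then works for local and cloud models alike:
client = ollama.AsyncClient(**resolve_ollama_client_args("gpt-oss:120b-cloud"))
```

With this shape, the rest of the Ollama call path would not need to change; only client construction branches on the model name.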
Why
Ollama added cloud-hosted models that run on ollama.com. Users should be able to keep using the same client code whether the model runs locally or on the Ollama cloud, and the SDK should choose the appropriate host and authentication automatically.
Additional Context
No response