
[Feature Request]: Add support for Ollama cloud models #2347

@LacombeLouis

Description

Do you need to file a feature request?

  • I have searched the existing feature requests and this feature request is not already filed.
  • I believe this is a legitimate feature request, not just a question or bug.

Feature Request Description

Add first-class support in lightrag/llm/ollama.py for Ollama "cloud" models so the library can transparently call either local Ollama instances or Ollama’s cloud service. Cloud models are identified by names that include -cloud or :cloud (e.g. gpt-oss:120b-cloud). When a cloud model is detected, the code should (see the sketch after this list):

  • default the request host to https://ollama.com when no explicit host is provided
  • accept an API key provided through kwargs (api_key) or fall back to the OLLAMA_API_KEY environment variable
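A minimal sketch of this detection and fallback logic follows. The helper name `_resolve_ollama_target` and the constant `OLLAMA_CLOUD_HOST` are hypothetical, not part of the current lightrag code; the usage at the bottom assumes the standard `ollama` Python SDK, whose client forwards extra keyword arguments such as `headers` to the underlying HTTP client.

```python
import os

import ollama

# Assumed default endpoint for Ollama's cloud service, per this request.
OLLAMA_CLOUD_HOST = "https://ollama.com"


def _resolve_ollama_target(model: str, host: str | None = None, **kwargs) -> tuple[str | None, dict]:
    """Hypothetical helper: pick host and auth headers for local vs. cloud Ollama models."""
    # A model counts as a cloud model when its name carries a -cloud or :cloud
    # marker, e.g. "gpt-oss:120b-cloud".
    is_cloud = "-cloud" in model or ":cloud" in model

    headers = dict(kwargs.get("headers") or {})
    if is_cloud:
        # Default to the cloud host only when no explicit host was provided.
        host = host or OLLAMA_CLOUD_HOST
        # API key: explicit kwarg first, then the OLLAMA_API_KEY environment variable.
        api_key = kwargs.get("api_key") or os.environ.get("OLLAMA_API_KEY")
        if api_key:
            headers["Authorization"] = f"Bearer {api_key}"
    return host, headers


# Usage sketch: the wrapper in lightrag/llm/ollama.py could call the helper before
# constructing the client, so local and cloud models share one code path.
host, headers = _resolve_ollama_target("gpt-oss:120b-cloud", api_key=os.environ.get("OLLAMA_API_KEY"))
client = ollama.AsyncClient(host=host, headers=headers)
```

With this shape, a local model name (no -cloud/:cloud marker) leaves the host and headers untouched, so existing callers keep working unchanged.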

Why

Ollama added cloud-hosted models that run on ollama.com. Users should be able to keep using the same client code whether the model runs locally or on the Ollama cloud, and the SDK should choose the appropriate host and authentication automatically.

Additional Context

No response

Labels

enhancement (New feature or request)
