
Drop the Ollama built-in provider? #182

@jtpio

Description


Problem

Currently jupyterlite-ai includes explicit, built-in support for Ollama.

However, Ollama is compatible with the OpenAI API: https://ollama.com/blog/openai-compatibility

So maybe it would make sense to drop support for Ollama as a built-in provider, and instead document how to use the generic OpenAI-compatible provider?

This would also mean dropping the dependency on the third-party provider:

"ollama-ai-provider-v2": "^1.3.1",

Proposed Solution

(Screenshot attached in the original issue.)
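
For context, a minimal sketch of what the documentation could show instead of the built-in provider, assuming the generic OpenAI-compatible provider is built on the AI SDK's @ai-sdk/openai-compatible package and that a local Ollama server exposes its OpenAI-compatible endpoint at the default http://localhost:11434/v1 (the base URL and model name are illustrative assumptions, not values confirmed in this issue):

import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { generateText } from 'ai';

// Point the generic OpenAI-compatible provider at a local Ollama server.
// The base URL and model name are assumptions for illustration.
const ollama = createOpenAICompatible({
  name: 'ollama',
  baseURL: 'http://localhost:11434/v1',
  // Ollama does not check the API key, but some clients require a non-empty value.
  apiKey: 'ollama'
});

const { text } = await generateText({
  model: ollama('llama3.2'),
  prompt: 'Say hello from JupyterLite!'
});

console.log(text);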

Additional context

The main downside is that users would have to provide the base URL for Ollama manually when using the generic provider.

Unless the UI could suggest some common base URLs, for example for Ollama, OpenRouter and a LiteLLM proxy.
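
As a sketch of that suggestion, the settings UI could ship a short list of well-known defaults. The URLs below are the commonly documented defaults for each tool and are assumptions to verify, not values taken from this issue:

// Hypothetical list of base URLs the provider settings UI could suggest.
// A self-hosted LiteLLM proxy in particular may run on a different host/port.
const SUGGESTED_BASE_URLS: Record<string, string> = {
  'Ollama': 'http://localhost:11434/v1',
  'OpenRouter': 'https://openrouter.ai/api/v1',
  'LiteLLM proxy': 'http://localhost:4000'
};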
