Problem
Currently, jupyterlite-ai includes explicit, built-in support for Ollama.
However, Ollama is compatible with the OpenAI API: https://ollama.com/blog/openai-compatibility
It may therefore make sense to drop Ollama as a built-in provider and instead document how to use the generic OpenAI-compatible provider.
This would also mean dropping the dependency on the third-party provider (line 86 in f9186f0):

```json
"ollama-ai-provider-v2": "^1.3.1",
```
Proposed Solution
Drop Ollama as a built-in provider, remove the ollama-ai-provider-v2 dependency, and document how to use Ollama through the generic OpenAI-compatible provider instead.
Additional context
The main downside would be having to manually provide the base URL for Ollama when using the generic provider, unless the UI could suggest some common base URLs, for example for Ollama, OpenRouter, and a LiteLLM proxy.
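If the UI were to suggest base URLs, a hypothetical lookup table could be as simple as the sketch below. The values are typical defaults rather than verified endpoints; in particular, the LiteLLM entry assumes the proxy's default local port:

```ts
// Hypothetical suggestions for the base-URL field in the provider settings UI.
// Values are common defaults and may differ per deployment.
const SUGGESTED_BASE_URLS: ReadonlyMap<string, string> = new Map([
  ['Ollama', 'http://localhost:11434/v1'],
  ['OpenRouter', 'https://openrouter.ai/api/v1'],
  ['LiteLLM proxy', 'http://localhost:4000']
]);
```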