[FEAT]: Integration of Litellm as model proxy such as Ollama #1154

@flefevre

Description

What would you like to see?

It would be good to integrate the LiteLLM proxy (https://github.com/BerriAI/litellm) in addition to Ollama, so that AnythingLLM could be compatible with multiple API model-serving backends.

I have seen issue #271, but perhaps things have evolved in the right direction since then?
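For context, LiteLLM's proxy exposes an OpenAI-compatible endpoint, so an integration could be as simple as pointing an OpenAI-style client at the proxy's base URL. Below is a minimal sketch using the `openai` Python SDK; it assumes a LiteLLM proxy running on its default port (4000) and a hypothetical `ollama/llama2` model entry in the proxy's config, neither of which reflects AnythingLLM's actual implementation:

```python
# Minimal sketch: calling a LiteLLM proxy through its OpenAI-compatible API.
# Assumptions (illustrative only): proxy runs locally on the default port 4000,
# and a model named "ollama/llama2" is defined in the proxy's config.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # LiteLLM proxy endpoint
    api_key="sk-anything",             # placeholder; only enforced if the proxy sets a master key
)

response = client.chat.completions.create(
    model="ollama/llama2",  # LiteLLM routes this to the configured Ollama backend
    messages=[{"role": "user", "content": "Hello from AnythingLLM"}],
)
print(response.choices[0].message.content)
```

Since the proxy speaks the OpenAI wire format, AnythingLLM's existing OpenAI connector could presumably be reused just by making the base URL configurable.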
