
[FEAT]: Add auto-detection for Ollama and LM Studio context limits #4469

@shatfield4

Description

What would you like to see?

We should use the REST APIs exposed by Ollama and LM Studio to automatically fetch each model's maximum context window (max tokens) from the backend. This will improve the UX of using these LLM providers.
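A minimal sketch of how such auto-detection could work. The endpoint paths and response field names below (Ollama's `POST /api/show` with its `model_info` map, and LM Studio's beta `GET /api/v0/models` listing with `max_context_length`) are assumptions based on each provider's public API docs, not part of this issue or any existing AnythingLLM code:

```javascript
// Ollama: POST /api/show returns a model_info object whose context window
// is keyed by architecture, e.g. "llama.context_length" (assumed shape).
function ollamaContextLength(showResponse) {
  const info = showResponse.model_info || {};
  const key = Object.keys(info).find((k) => k.endsWith(".context_length"));
  return key ? info[key] : null;
}

// LM Studio (beta REST API): GET /api/v0/models lists models, each with a
// max_context_length field (assumed shape).
function lmStudioContextLength(modelsResponse, modelId) {
  const model = (modelsResponse.data || []).find((m) => m.id === modelId);
  return model ? model.max_context_length ?? null : null;
}

// Hypothetical fetch wrapper; baseUrl and model name come from user settings.
async function fetchOllamaContext(baseUrl, model) {
  const res = await fetch(`${baseUrl}/api/show`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model }),
  });
  return ollamaContextLength(await res.json());
}
```

Returning `null` when the field is absent lets the UI fall back to the current manually entered max-tokens value instead of guessing.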
