[FEAT]: Disable streaming for LLM Provider(s) with ENV setting #4063

@timothycarambat

Description

What would you like to see?

Is there a way to tell AnythingLLM that my Generic OpenAI provider does not support streaming?

We should have a way to specify or override whether a provider is allowed to stream. All LLM providers currently support both chat and streaming, so disabling streaming should be done via a flag.

For now, we can apply this to the Generic OpenAI provider only, since the other providers are pinned to a known service and there is no reason to disable streaming for them, as streaming is the better experience.
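As a rough sketch of what such a flag could look like, the provider might read an environment variable at startup and fall back to a buffered (non-streaming) completion when it is set. The variable name `GENERIC_OPENAI_STREAMING_DISABLED` and the `stream`/`chat` client methods below are hypothetical, not part of any confirmed AnythingLLM API:

```python
import os


def streaming_enabled(env=os.environ):
    """Return True unless the (hypothetical) disable flag is set.

    Accepts common truthy spellings so ENV files like
    GENERIC_OPENAI_STREAMING_DISABLED=true or =1 both work.
    Streaming stays on by default.
    """
    raw = env.get("GENERIC_OPENAI_STREAMING_DISABLED", "false")
    return raw.strip().lower() not in ("1", "true", "yes")


def send_chat(client, messages, env=os.environ):
    # Dispatch to a streamed or buffered completion based on the flag.
    # `client.stream` / `client.chat` are placeholder method names.
    if streaming_enabled(env):
        return client.stream(messages)  # incremental token delivery
    return client.chat(messages)  # single complete response
```

The key design point is that the flag only disables streaming; it never enables it for a provider that lacks support, and the default preserves today's streaming behavior.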
