From 1f13042667e9f2f19e3477bd5727526dc52dcb75 Mon Sep 17 00:00:00 2001
From: shatfield4
Date: Mon, 30 Jun 2025 18:23:22 -0700
Subject: [PATCH 1/3] add docs for disabling generic openai provider streaming

---
 pages/configuration.mdx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/pages/configuration.mdx b/pages/configuration.mdx
index 4c80bf3..eb96138 100644
--- a/pages/configuration.mdx
+++ b/pages/configuration.mdx
@@ -66,7 +66,7 @@ This feature is most useful for when you have AnythingLLM as a simple sub-servic
 **NOTE:** You should enable these configurations _after_ you have enabled multi-user mode, created at least one `admin` user, and have completed the onboarding flow
- in the AnythingLLM instance.
+ in the AnythingLLM instance.
 Do **not** enable these configurations before you have done this, or you may find yourself soft-locked out of the instance until you disable these flags.
From 18a79ba296fa59a12e6f04e0c553ae096f783894 Mon Sep 17 00:00:00 2001
From: shatfield4
Date: Mon, 30 Jun 2025 18:26:02 -0700
Subject: [PATCH 2/3] add docs for disabling generic openai provider streaming

---
 pages/configuration.mdx | 25 +++++++++++++++++++++++++
 1 file changed, 25 insertions(+)

diff --git a/pages/configuration.mdx b/pages/configuration.mdx
index eb96138..9fa2d80 100644
--- a/pages/configuration.mdx
+++ b/pages/configuration.mdx
@@ -195,3 +195,28 @@ COLLECTOR_ALLOW_ANY_IP="true"
 ### Disable
 
 Fully remove or comment out the `COLLECTOR_ALLOW_ANY_IP` environment variable to return to the default behavior.
+
+## Disable Streaming for Generic OpenAI Provider
+
+
+  **Note:** This setting only affects the Generic OpenAI provider and does not impact other LLM providers. Use this when your custom LLM endpoint does not support streaming responses.
+
+
+Modification of the `GENERIC_OPENAI_STREAMING_DISABLED` environment variable allows you to disable streaming responses when using the Generic OpenAI provider. This is particularly useful when you're using a custom LLM that doesn't support streaming responses.
+
+By default, AnythingLLM attempts to use streaming for a better user experience. However, some custom LLM implementations may not support this feature, resulting in errors or unexpected behavior.
+
+When this setting is enabled, all responses from your Generic OpenAI provider will be returned as complete responses rather than streamed chunks.
+
+### Enable
+
+Set the `GENERIC_OPENAI_STREAMING_DISABLED` environment variable to **_any value_** to enable.
+
+```bash
+# Must be set to a string value of "true" to be effective.
+GENERIC_OPENAI_STREAMING_DISABLED="true"
+```
+
+### Disable
+
+Fully remove or comment out the `GENERIC_OPENAI_STREAMING_DISABLED` environment variable to return to the default behavior of using streaming responses.

From b846ba3e89baa004dddde269abe756be153c3e33 Mon Sep 17 00:00:00 2001
From: shatfield4
Date: Tue, 1 Jul 2025 11:07:20 -0700
Subject: [PATCH 3/3] fix typo in docs for GENERIC_OPENAI_STREAMING_DISABLED

---
 pages/configuration.mdx | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/pages/configuration.mdx b/pages/configuration.mdx
index 9fa2d80..fd14009 100644
--- a/pages/configuration.mdx
+++ b/pages/configuration.mdx
@@ -210,7 +210,8 @@ When this setting is enabled, all responses from your Generic OpenAI provider wi
 ### Enable
 
-Set the `GENERIC_OPENAI_STREAMING_DISABLED` environment variable to **_any value_** to enable.
+Set the `GENERIC_OPENAI_STREAMING_DISABLED` environment variable to **`"true"`** to enable.
+_It must be set to a string value of `"true"` to be effective._
 
 ```bash
 # Must be set to a string value of "true" to be effective.
 GENERIC_OPENAI_STREAMING_DISABLED="true"
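
For context on what the documented flag changes at the request level, the sketch below shows one way a Node/TypeScript caller might toggle between streaming and non-streaming calls to an OpenAI-compatible `/chat/completions` endpoint based on `GENERIC_OPENAI_STREAMING_DISABLED`. This is an illustration only, not AnythingLLM's actual provider code: `CUSTOM_LLM_BASE_URL`, `CUSTOM_LLM_API_KEY`, `CUSTOM_LLM_MODEL`, and the `chat` helper are hypothetical names, and the SSE handling is intentionally minimal.

```ts
// chat-streaming-toggle.ts — illustrative sketch only, not AnythingLLM source code.
// Demonstrates branching on GENERIC_OPENAI_STREAMING_DISABLED when calling any
// OpenAI-compatible /chat/completions endpoint (Node 18+ for global fetch).
// CUSTOM_LLM_BASE_URL, CUSTOM_LLM_API_KEY, and CUSTOM_LLM_MODEL are placeholder names.

const BASE_URL = process.env.CUSTOM_LLM_BASE_URL ?? "http://localhost:8000/v1";
const API_KEY = process.env.CUSTOM_LLM_API_KEY ?? "not-needed";
const MODEL = process.env.CUSTOM_LLM_MODEL ?? "my-custom-model";

// Per the patched docs, only the literal string "true" disables streaming.
const streamingDisabled = process.env.GENERIC_OPENAI_STREAMING_DISABLED === "true";

export async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      model: MODEL,
      messages: [{ role: "user", content: prompt }],
      // stream: false asks the endpoint for one complete response body.
      stream: !streamingDisabled,
    }),
  });

  if (streamingDisabled) {
    // Non-streaming path: the whole completion arrives as a single JSON body.
    const data = await res.json();
    return data.choices?.[0]?.message?.content ?? "";
  }

  // Streaming path: the endpoint emits Server-Sent Events ("data: {...}" lines).
  // Parsing is deliberately naive (no buffering of partial lines) to keep the sketch short.
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let text = "";
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    for (const line of decoder.decode(value, { stream: true }).split("\n")) {
      if (!line.startsWith("data: ") || line.includes("[DONE]")) continue;
      try {
        const chunk = JSON.parse(line.slice("data: ".length));
        text += chunk.choices?.[0]?.delta?.content ?? "";
      } catch {
        // Ignore JSON fragments split across network chunks.
      }
    }
  }
  return text;
}
```

As in the patched docs, the check is a strict comparison against the string `"true"`; when it matches, the request sets `stream: false` so the endpoint returns one complete JSON response instead of chunked deltas.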