[BUG]: When using Gemini's embedding model in a Docker deployment of AnythingLLM, you must also use Gemini's LLM, otherwise the error "No Gemini API key was set" is reported. #2972
When I configure AnythingLLM in Docker with DeepSeek's LLM API and then select text-embedding-004 as the embedder, embedding fails. The log shows the error "No Gemini API key was set". At first I thought it was a problem with my environment, but after an afternoon of debugging I found that the embedder reads its key from the GEMINI_EMBEDDING_API_KEY environment variable. After changing the LLM to gemini-pro, everything worked normally.
Is this a bug, or is this the intended behavior? If it is intended, could a clearer error message be shown?
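For anyone debugging the same setup, here is a minimal, hypothetical sketch of the docker .env variables involved. GEMINI_EMBEDDING_API_KEY is the variable I actually found while debugging; GEMINI_API_KEY is only my assumption for the LLM-side key name, and I have not verified whether setting it by hand (without switching the LLM to Gemini) avoids the error:

```
# Hypothetical sketch, for illustration only -- variable names may differ in your deployment.

# Embedder side: the key the Gemini embedder reads (found while debugging).
GEMINI_EMBEDDING_API_KEY='your-google-ai-key'

# LLM side (assumed name): the error only went away for me after switching the
# LLM to gemini-pro, which presumably populates the LLM's own Gemini key.
GEMINI_API_KEY='your-google-ai-key'
```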
Are there known steps to reproduce?
Set the embedding engine to text-embedding-004 and the LLM to DeepSeek, then try to embed a document; the log reports "No Gemini API key was set".