How are you running AnythingLLM?
Docker (local)
What happened?
When I configure AnythingLLM in Docker with DeepSeek as the LLM provider and text-embedding-004 as the embedding model, embedding fails. The log reports the error "No Gemini API key was set". At first I thought it was a problem with my environment, and I spent an afternoon debugging it before finding that the embedder reads its key from the GEMINI_EMBEDDING_API_KEY environment variable, which is never populated when the LLM provider is DeepSeek. After changing the LLM provider to gemini-pro, everything worked normally.
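To make the behavior concrete, here is a rough sketch of what I believe is happening inside the embedder. This is my own illustration, not the actual AnythingLLM code, and the function name is made up:

```typescript
// Illustrative sketch only, not the real AnythingLLM source.
// It reproduces the behavior I observed: the Gemini embedder reads
// GEMINI_EMBEDDING_API_KEY and does not reuse the key configured for
// the LLM provider, so with DeepSeek as the LLM this variable stays unset.
function getGeminiEmbeddingKey(): string {
  const key = process.env.GEMINI_EMBEDDING_API_KEY;
  if (!key) {
    // This is the error I saw in the logs.
    throw new Error("No Gemini API key was set.");
  }
  return key;
}
```

Presumably, passing GEMINI_EMBEDDING_API_KEY directly into the container would also avoid the error, but I have only verified the workaround of switching the LLM to gemini-pro.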
Is this a bug, or is it intended to work this way? If it is intended, could the error message be made more understandable, for example by naming the missing GEMINI_EMBEDDING_API_KEY variable instead of only saying that no Gemini API key was set?
Are there known steps to reproduce?
Set the LLM provider to DeepSeek and the embedding engine to text-embedding-004; both settings save successfully, but any subsequent embedding attempt fails with the error above.