
[BUG]: AnythingLLM: when using Gemini's embedding model in a Docker deployment, you must also use Gemini's LLM, otherwise the error "No Gemini API key was set" is reported #2972

@dvchdbfhjd

Description


How are you running AnythingLLM?

Docker (local)

What happened?

When I configure AnythingLLM in Docker to use DeepSeek's LLM API and set the embedder to text-embedding-004, embedding fails. The log shows the error "No Gemini API key was set". At first I thought it was a problem with my environment, and I spent an afternoon debugging before discovering that the embedder's environment variable is GEMINI_EMBEDDING_API_KEY. After changing the LLM to gemini-pro, everything worked normally.
Is this a bug, or is it designed to work this way? If it is by design, could you provide a more understandable error message?
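For illustration, a minimal TypeScript sketch of the kind of shared key check that could produce this symptom: the embedder is gated on the LLM-side Gemini key rather than its own. GEMINI_EMBEDDING_API_KEY is the variable named in this report; the class name, GEMINI_API_KEY, and the control flow are assumptions for the sketch, not AnythingLLM's actual source.

```ts
// Hypothetical sketch, not AnythingLLM's real class: the constructor
// validates the LLM-side key first, so configuring only the embedder
// key never satisfies the check.
class GeminiEmbedder {
  private readonly apiKey: string;

  constructor() {
    const llmKey = process.env.GEMINI_API_KEY; // assumed LLM-side variable
    if (!llmKey) {
      // The error message from the report above.
      throw new Error("No Gemini API key was set.");
    }
    // The embedder-specific key mentioned in the report is only read here,
    // after the LLM-side check has already passed.
    this.apiKey = process.env.GEMINI_EMBEDDING_API_KEY ?? llmKey;
  }
}
```

If something roughly like this is happening, switching the LLM to Gemini "fixes" the problem only because the LLM-side key then exists, which matches the behaviour described above.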

Are there known steps to reproduce?

The embedding engine is successfully set to text-embedding-004 and the LLM is successfully set to DeepSeek.
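With that configuration, a clearer, provider-specific startup check, as requested above, could look roughly like this sketch. Apart from GEMINI_EMBEDDING_API_KEY, which comes from this report, all identifiers here are hypothetical.

```ts
// Hypothetical validation: check the embedder key on its own and say
// exactly which variable is missing, independent of the LLM provider.
function validateGeminiEmbedderEnv(): void {
  const llmProvider = process.env.LLM_PROVIDER ?? "unknown"; // assumed variable name
  if (!process.env.GEMINI_EMBEDDING_API_KEY) {
    throw new Error(
      `Gemini embedder selected, but GEMINI_EMBEDDING_API_KEY is not set ` +
        `(LLM provider: ${llmProvider}). Add it to the Docker container's ` +
        `environment; a Gemini LLM API key alone is not sufficient.`
    );
  }
}

validateGeminiEmbedderEnv();
```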

Metadata

Assignees: no one assigned
Labels: possible bug (bug was reported but is not confirmed or is unable to be replicated)
Projects: none
Milestone: none
Development: no branches or pull requests