How are you running AnythingLLM?
Docker (local)
What happened?
I am using the gemini-1.5-pro-latest model and uploaded a local document named "goframe-latest_compressed.pdf", then used the pin-to-workspace function to pin the document. The first question and answer in each thread works normally, but from the second question onwards I receive the following error:
Could not respond to message.
[GoogleGenerativeAI Error]: Content with role 'user' can't follow 'user'. Valid previous roles: {"user":["model"],"function":["model"],"model":["user","function"],"system":[]}
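The error suggests the chat history forwarded to Gemini contains two consecutive "user" turns, while the Gemini API requires strict user/model alternation. A minimal illustration of the rejected vs. accepted history shape (my own sketch, assuming the `Content` type exported by the @google/generative-ai Node SDK; this is not AnythingLLM's actual payload):

```ts
// Illustration only (not AnythingLLM source): the history shape Gemini rejects
// vs. one it accepts.
import type { Content } from "@google/generative-ai";

// Rejected: a "user" turn directly follows another "user" turn.
const rejectedHistory: Content[] = [
  { role: "user", parts: [{ text: "Introduce GoFrame" }] },
  { role: "user", parts: [{ text: "Write a quick-start example for GoFrame" }] },
];

// Accepted: roles alternate strictly user -> model -> user.
const acceptedHistory: Content[] = [
  { role: "user", parts: [{ text: "Introduce GoFrame" }] },
  { role: "model", parts: [{ text: "GoFrame is a modular, high-performance framework..." }] },
  { role: "user", parts: [{ text: "Write a quick-start example for GoFrame" }] },
];
```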
Are there known steps to reproduce?
- Upload a local document (the download link for my document is here)
- Start a thread and ask a question about the document content. For example:
me: Introduce GoFrame
- Receive a normal response from the model. For example:
model: Introduction to GoFrame: GoFrame is a modular, high-performance, enterprise-grade application development framework for Go....
- Ask a second question within the same thread. For example:
me: Write a quick-start example for GoFrame
- Instead of a response, the same error appears (a standalone reproduction sketch follows this list): Could not respond to message. [GoogleGenerativeAI Error]: Content with role 'user' can't follow 'user'. Valid previous roles: {"user":["model"],"function":["model"],"model":["user","function"],"system":[]}
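For reference, here is a minimal standalone sketch that, under my assumptions, should surface the same SDK error outside AnythingLLM by passing two consecutive "user" turns as chat history (GEMINI_API_KEY is a placeholder environment variable):

```ts
import { GoogleGenerativeAI } from "@google/generative-ai";

async function main() {
  // Placeholder key; replace with a real Gemini API key to run.
  const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY ?? "");
  const model = genAI.getGenerativeModel({ model: "gemini-1.5-pro-latest" });

  // Two consecutive "user" turns violate Gemini's required user/model
  // alternation and should trigger:
  // "[GoogleGenerativeAI Error]: Content with role 'user' can't follow 'user'".
  const chat = model.startChat({
    history: [
      { role: "user", parts: [{ text: "Introduce GoFrame" }] },
      { role: "user", parts: [{ text: "Write a quick-start example for GoFrame" }] },
    ],
  });

  const result = await chat.sendMessage("Continue");
  console.log(result.response.text());
}

main().catch(console.error);
```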