
[BUG]: Bad result from RAG #1406

@CharlesBdg

Description

How are you running AnythingLLM?

Docker (local)

What happened?

Hello,
For the past three weeks I have been getting really bad results from RAG.
Before around April 18th, I was able to get accurate information from my files (around 150 txt files). Now the LLM always says it didn't find any relevant context, and when I click on "Show citation", the file that contains the information does not show up.
For example:
I have a file called "How to enable Warp / Zero Trust" (from Cloudflare).
Inside the file, the text "How to enable Warp / Zero Trust" is written.
I ask the LLM "How to enable Warp / Zero Trust", and the output is "Sorry I didn't find any relevant context"; the file does not appear under "Show citation".

LLM: Ollama local / llama3, phi3, openchat, mistral (same output with all of them)
Embedding: Ollama / mxbai-embed-large
Vector database: LanceDB or Milvus (I've already tried a hard reset of the DB).

I even tested with the Desktop version (v1.5.4), in case it was a Docker issue, but I hit the same problem.

I've already tried tweaking the "Max Context Snippets" and "Document similarity threshold" settings in the "Vector Database" section of the Workspace, but with no result.
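
As a sanity check outside AnythingLLM, I can compare the query and the document text directly against the local Ollama embedding endpoint. Below is a minimal sketch of that check (it assumes the default Ollama port and the mxbai-embed-large model from my setup above; the query and snippet strings are just the example from this report). If the cosine similarity comes out well above the workspace threshold, the embedder itself is probably fine and the retrieval step looks more suspect:

```python
# Minimal sketch, assuming a local Ollama on the default port and the
# mxbai-embed-large model from my setup; the query and snippet strings
# are just the example from this report.
import math
import requests

OLLAMA_EMBED_URL = "http://localhost:11434/api/embeddings"
MODEL = "mxbai-embed-large"

def embed(text: str) -> list[float]:
    # Ask Ollama for a single embedding vector.
    resp = requests.post(OLLAMA_EMBED_URL, json={"model": MODEL, "prompt": text})
    resp.raise_for_status()
    return resp.json()["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    # Plain cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

query = "How to enable Warp / Zero Trust"
snippet = "How to enable Warp / Zero Trust"  # text copied from the source file

print(f"cosine similarity: {cosine(embed(query), embed(snippet)):.4f}")
# If this score is clearly above the workspace's "Document similarity threshold",
# the embedder looks fine and the retrieval step is the more likely culprit.
```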

I don't know if something broke with the merge of the Agent feature, the version bumps of the lancedb or langchain dependencies, or the code that performs the RAG retrieval.

Maybe add the ability to completely disable the Agent and get back the old RAG behavior.

PS: Great project, thank you.

Are there known steps to reproduce?

No response

Metadata

Labels

investigating: Core team or maintainer will or is currently looking into this issue
needs info / can't replicate: Issues that require additional information and/or cannot currently be replicated, but possible bug
