What would you like to see?
When we send a chat like *What is AnythingLLM*, the system will find relevant context, if any, and append it under the system prompt:

```
{SYSTEM}
<context>
<context>
# No history
<user prompt>
```

However, on a follow-up request where we may not get a vector search result due to ambiguity, like *tell me some features*,
we will get a flow like:

```
{SYSTEM}
# NO CONTEXT!
<reply from prompt 1 from LLM>
<user prompt>
```

This puts 100% of the responsibility for the answer on the `<reply from prompt 1 from LLM>` message, when we could instead leverage the citations already present in the history to help offload the responsibility and scope of the answer.
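A minimal sketch of the idea in plain JavaScript (all names here — `buildContext`, the `sources` field on history messages — are assumptions for illustration, not AnythingLLM's actual internals): when the fresh vector search comes back empty, fall back to the citation snippets attached to earlier assistant replies instead of sending `# NO CONTEXT!`.

```javascript
// Hypothetical helper: choose context for the next prompt.
// Assumes each assistant message in history may carry a `sources`
// array of cited snippets (an assumption, not AnythingLLM's real schema).
function buildContext(vectorResults, chatHistory) {
  if (vectorResults.length > 0) {
    // Fresh search succeeded: use it as usual.
    return vectorResults.map((r) => r.text);
  }
  // Ambiguous follow-up (e.g. "tell me some features") returned nothing:
  // reuse citations already attached to prior replies in this chat.
  return chatHistory
    .filter((msg) => msg.role === "assistant" && Array.isArray(msg.sources))
    .flatMap((msg) => msg.sources.map((s) => s.text));
}

// Example: a follow-up whose vector search came back empty.
const history = [
  { role: "user", content: "What is AnythingLLM?" },
  {
    role: "assistant",
    content: "AnythingLLM is ...",
    sources: [{ text: "AnythingLLM is an all-in-one AI application." }],
  },
];

console.log(buildContext([], history));
// falls back to the snippet cited in the first reply
```

This way the second prompt still carries the `<context>` that grounded the first answer, rather than leaning entirely on the LLM's own prior reply.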