Merged
2 changes: 2 additions & 0 deletions server/models/workspaceChats.js
@@ -7,6 +7,7 @@ const WorkspaceChats = {
     response = {},
     user = null,
     threadId = null,
+    include = true,
   }) {
     try {
       const chat = await prisma.workspace_chats.create({
@@ -16,6 +17,7 @@ const WorkspaceChats = {
           response: JSON.stringify(response),
           user_id: user?.id || null,
           thread_id: threadId,
+          include,
         },
       });
       return { chat, message: null };
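The model change above adds an `include` flag to the chat record, defaulting to `true` when callers omit it. A minimal sketch of that destructuring-default behavior, using a hypothetical in-memory `records` array in place of the Prisma model:

```javascript
// Sketch of the updated WorkspaceChats.new signature. Only the
// default-parameter behavior mirrors the real change; persistence
// here is a plain array, not prisma.workspace_chats.create.
const records = [];

async function newChat({
  workspaceId,
  prompt,
  response = {},
  user = null,
  threadId = null,
  include = true, // new flag: refusal responses pass include: false
}) {
  const chat = {
    workspaceId,
    prompt,
    response: JSON.stringify(response),
    user_id: user?.id || null,
    thread_id: threadId,
    include,
  };
  records.push(chat);
  return { chat, message: null };
}
```

Because the default lives in the destructured parameter, every existing call site keeps writing `include: true` rows without modification; only the new refusal paths opt out explicitly.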
42 changes: 36 additions & 6 deletions server/utils/chats/index.js
@@ -77,15 +77,30 @@ async function chatWithWorkspace(
   // User is trying to query-mode chat a workspace that has no data in it - so
   // we should exit early as no information can be found under these conditions.
   if ((!hasVectorizedSpace || embeddingsCount === 0) && chatMode === "query") {
+    const textResponse =
+      workspace?.queryRefusalResponse ??
+      "There is no relevant information in this workspace to answer your query.";
+
+    await WorkspaceChats.new({
+      workspaceId: workspace.id,
+      prompt: message,
+      response: {
+        text: textResponse,
+        sources: [],
+        type: chatMode,
+      },
+      threadId: thread?.id || null,
+      include: false,
+      user,
+    });
+
     return {
       id: uuid,
       type: "textResponse",
       sources: [],
       close: true,
       error: null,
-      textResponse:
-        workspace?.queryRefusalResponse ??
-        "There is no relevant information in this workspace to answer your query.",
+      textResponse,
     };
   }

@@ -172,15 +187,30 @@ async function chatWithWorkspace(
   // If in query mode and no context chunks are found from search, backfill, or pins - do not
   // let the LLM try to hallucinate a response or use general knowledge and exit early
   if (chatMode === "query" && contextTexts.length === 0) {
+    const textResponse =
+      workspace?.queryRefusalResponse ??
+      "There is no relevant information in this workspace to answer your query.";
+
+    await WorkspaceChats.new({
+      workspaceId: workspace.id,
+      prompt: message,
+      response: {
+        text: textResponse,
+        sources: [],
+        type: chatMode,
+      },
+      threadId: thread?.id || null,
+      include: false,
+      user,
+    });
+
     return {
       id: uuid,
       type: "textResponse",
       sources: [],
       close: true,
       error: null,
-      textResponse:
-        workspace?.queryRefusalResponse ??
-        "There is no relevant information in this workspace to answer your query.",
+      textResponse,
     };
   }

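Both early-exit branches above hoist the refusal message into a `textResponse` constant built with nullish coalescing. Note that `??` falls back to the stock message only when `queryRefusalResponse` is `null` or `undefined`; a workspace-configured value, even an empty string, wins. A small illustration of that semantics:

```javascript
const DEFAULT_REFUSAL =
  "There is no relevant information in this workspace to answer your query.";

// Mirrors the hoisted constant in chatWithWorkspace: prefer the
// workspace-level override, fall back to the stock message only
// when the override is null or undefined (not merely falsy).
function refusalText(workspace) {
  return workspace?.queryRefusalResponse ?? DEFAULT_REFUSAL;
}
```

Using `||` instead of `??` here would silently discard an intentionally empty override, which is why the diff keeps the nullish form.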
39 changes: 33 additions & 6 deletions server/utils/chats/stream.js
@@ -75,16 +75,29 @@ async function streamChatWithWorkspace(
   // User is trying to query-mode chat a workspace that has no data in it - so
   // we should exit early as no information can be found under these conditions.
   if ((!hasVectorizedSpace || embeddingsCount === 0) && chatMode === "query") {
+    const textResponse =
+      workspace?.queryRefusalResponse ??
+      "There is no relevant information in this workspace to answer your query.";
     writeResponseChunk(response, {
       id: uuid,
       type: "textResponse",
-      textResponse:
-        workspace?.queryRefusalResponse ??
-        "There is no relevant information in this workspace to answer your query.",
+      textResponse,
       sources: [],
       close: true,
       error: null,
     });
+    await WorkspaceChats.new({
+      workspaceId: workspace.id,
+      prompt: message,
+      response: {
+        text: textResponse,
+        sources: [],
+        type: chatMode,
+      },
+      threadId: thread?.id || null,
+      include: false,
+      user,
+    });
     return;
   }

@@ -177,16 +190,30 @@ async function streamChatWithWorkspace(
   // If in query mode and no context chunks are found from search, backfill, or pins - do not
   // let the LLM try to hallucinate a response or use general knowledge and exit early
   if (chatMode === "query" && contextTexts.length === 0) {
+    const textResponse =
+      workspace?.queryRefusalResponse ??
+      "There is no relevant information in this workspace to answer your query.";
     writeResponseChunk(response, {
       id: uuid,
       type: "textResponse",
-      textResponse:
-        workspace?.queryRefusalResponse ??
-        "There is no relevant information in this workspace to answer your query.",
+      textResponse,
       sources: [],
       close: true,
       error: null,
     });
+
+    await WorkspaceChats.new({
+      workspaceId: workspace.id,
+      prompt: message,
+      response: {
+        text: textResponse,
+        sources: [],
+        type: chatMode,
+      },
+      threadId: thread?.id || null,
+      include: false,
+      user,
+    });
     return;
   }

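With refusals now persisted as `include: false` rows, consumers that replay chat history to the LLM can presumably skip them while audit or export views still see every record. The helper below is a hypothetical consumer-side sketch of that split; it is not code from this PR, and `historyForLLM` is an invented name:

```javascript
// Hypothetical helper: keep only chats meant for the LLM context
// window. Rows written with include: false (query-mode refusals)
// are dropped; rows lacking the flag are treated as included.
function historyForLLM(chats) {
  return chats.filter((chat) => chat.include !== false);
}
```

Filtering on `!== false` rather than `=== true` keeps legacy rows (created before the column existed) in the history, matching the `include = true` default on the write path.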