
Conversation

@timothycarambat
Member

Replace stream/sync chats with Langchain interface for now
connect #499
ref: #495 (comment)

@review-agent-prime

server/utils/AiProviders/ollama/index.js

In the #convertToLangchainPrototypes method, the switch statement can be replaced with a map object for better readability and performance. This will eliminate the need for the break statements and make the code more concise.

    #convertToLangchainPrototypes(chats = []) {
      const {
        HumanMessage,
        SystemMessage,
        AIMessage,
      } = require("langchain/schema");
      const langchainChats = [];
      const roleToMessageMap = {
        system: SystemMessage,
        user: HumanMessage,
        assistant: AIMessage,
      };
      for (const chat of chats) {
        const MessageClass = roleToMessageMap[chat.role];
        if (MessageClass) {
          langchainChats.push(new MessageClass({ content: chat.content }));
        }
      }
      return langchainChats;
    }
git fetch origin && git checkout -b ReviewBot/The-c-0l2k4ac origin/ReviewBot/The-c-0l2k4ac
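The suggested role-to-class map can be exercised in isolation. Below is a minimal, self-contained sketch of the same pattern; the three Message classes are stand-ins (assumptions) rather than the real `langchain/schema` exports, so it runs without LangChain installed.

```javascript
// Stand-in message classes (assumptions) mirroring the shape of
// langchain/schema's SystemMessage / HumanMessage / AIMessage.
class SystemMessage { constructor({ content }) { this.content = content; } }
class HumanMessage { constructor({ content }) { this.content = content; } }
class AIMessage { constructor({ content }) { this.content = content; } }

const roleToMessageMap = {
  system: SystemMessage,
  user: HumanMessage,
  assistant: AIMessage,
};

function convertToLangchainPrototypes(chats = []) {
  const langchainChats = [];
  for (const chat of chats) {
    const MessageClass = roleToMessageMap[chat.role];
    // Unknown roles are skipped, matching the switch statement's default case.
    if (MessageClass) {
      langchainChats.push(new MessageClass({ content: chat.content }));
    }
  }
  return langchainChats;
}

// Example: an unrecognized role (here "tool") is silently dropped.
const messages = convertToLangchainPrototypes([
  { role: "system", content: "You are helpful." },
  { role: "user", content: "Hi!" },
  { role: "tool", content: "ignored" },
]);
console.log(messages.length); // 2
```

One design note on the lookup-map approach: adding support for a new role becomes a one-line change to the map, instead of a new three-line `case` block.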

server/utils/chats/stream.js

In the handleStreamResponses function, the async iteration over the stream has no error handling. If the iteration throws, the rejection goes unhandled and could crash the application. Wrapping the loop in a try-catch block lets any stream failure be handled gracefully.

    try {
      for await (const chunk of stream) {
        const content = chunk.hasOwnProperty("content") ? chunk.content : chunk;
        fullText += content;
        writeResponseChunk(response, {
          uuid,
          sources: [],
          type: "textResponseChunk",
          textResponse: content,
          close: false,
          error: false,
        });
      }
    } catch (error) {
      console.error(`Error handling stream: ${error}`);
      // Handle the error appropriately
    }
git fetch origin && git checkout -b ReviewBot/The-c-ikh0c0j origin/ReviewBot/The-c-ikh0c0j
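The effect of the try-catch can be demonstrated end to end. The sketch below simulates the stream with an async generator that fails mid-way; `fakeStream` and the recording `writeResponseChunk` are stand-ins (assumptions) for the real Ollama stream and SSE writer.

```javascript
// Async generator standing in for the Ollama chunk stream; it fails mid-way.
async function* fakeStream() {
  yield { content: "Hello, " };
  yield { content: "world" };
  throw new Error("connection reset"); // simulate a mid-stream failure
}

// Stand-in for the real writeResponseChunk: records payloads instead of
// writing server-sent events to an HTTP response.
const written = [];
function writeResponseChunk(_response, payload) {
  written.push(payload);
}

async function handleStreamResponses(response, stream, { uuid }) {
  let fullText = "";
  try {
    for await (const chunk of stream) {
      const content = chunk.hasOwnProperty("content") ? chunk.content : chunk;
      fullText += content;
      writeResponseChunk(response, {
        uuid,
        sources: [],
        type: "textResponseChunk",
        textResponse: content,
        close: false,
        error: false,
      });
    }
  } catch (error) {
    // Without this catch the rejection would surface as an unhandled
    // promise rejection and could crash the server process.
    console.error(`Error handling stream: ${error.message}`);
  }
  return fullText;
}

const result = handleStreamResponses(null, fakeStream(), { uuid: "demo" });
result.then((text) => console.log(text)); // "Hello, world"
```

Note that the text accumulated before the failure is preserved, so the caller can still persist the partial response.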

Comment on lines +43 to +57
    for (const chat of chats) {
      switch (chat.role) {
        case "system":
          langchainChats.push(new SystemMessage({ content: chat.content }));
          break;
        case "user":
          langchainChats.push(new HumanMessage({ content: chat.content }));
          break;
        case "assistant":
          langchainChats.push(new AIMessage({ content: chat.content }));
          break;
        default:
          break;
      }
    }


Replaced the switch statement in the #convertToLangchainPrototypes method with a map object for better readability and performance.

Suggested change
    for (const chat of chats) {
      switch (chat.role) {
        case "system":
          langchainChats.push(new SystemMessage({ content: chat.content }));
          break;
        case "user":
          langchainChats.push(new HumanMessage({ content: chat.content }));
          break;
        case "assistant":
          langchainChats.push(new AIMessage({ content: chat.content }));
          break;
        default:
          break;
      }
    }

    const roleToMessageMap = {
      system: SystemMessage,
      user: HumanMessage,
      assistant: AIMessage,
    };
    for (const chat of chats) {
      const MessageClass = roleToMessageMap[chat.role];
      if (MessageClass) {
        langchainChats.push(new MessageClass({ content: chat.content }));
      }
    }

@timothycarambat timothycarambat merged commit 2a1202d into master Dec 28, 2023
@timothycarambat timothycarambat deleted the ollama-streaming-patch branch December 28, 2023 21:59
@ThatOneCalculator

Confirmed working!


cabwds pushed a commit to cabwds/anything-llm that referenced this pull request Jul 3, 2025
Replace stream/sync chats with Langchain interface for now
connect Mintplex-Labs#499
ref: Mintplex-Labs#495 (comment)
