Description of the bug:
When using `sendMessageStream`, the `response` object's payload is not the same as the one returned by `sendMessage`. There are several odd things about it, and the end result is that the messages do not get added to the chat session's internal history. Subsequent messages sent on the same chat session therefore essentially start from scratch, which makes function calling (and anything else that depends on prior turns) unusable unless you persist your own history and pass the full history in every time you call `sendMessageStream`.
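For reference, this is roughly the workaround I mean: keep your own `Content[]` history and re-create the chat on each turn. This is just a minimal sketch assuming text-only turns; the `history` array and `ask` helper are illustrative names, not part of the SDK.

```ts
import { GoogleGenerativeAI, Content } from "@google/generative-ai"

const genAI = new GoogleGenerativeAI(process.env.GOOGLE_API_KEY!)
const model = genAI.getGenerativeModel({ model: "gemini-2.0-flash-exp" })

// Manually persisted history, since streamed turns never make it into
// the SDK's internal history.
const history: Content[] = []

async function ask(message: string): Promise<string> {
  // Re-create the chat with the full history on every call.
  const chat = model.startChat({ history })
  const result = await chat.sendMessageStream(message)

  let text = ""
  for await (const chunk of result.stream) {
    text += chunk.text()
  }

  // Append both sides of the turn ourselves.
  history.push({ role: "user", parts: [{ text: message }] })
  history.push({ role: "model", parts: [{ text }] })
  return text
}
```

It works, but it defeats the point of `startChat()` managing history for you.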
Actual vs expected behavior:
Here is a simple `ts-node` script you can run with `npx ts-node <file>.ts` (assuming you have `ts-node` installed). This minimal implementation reproduces the error I'm describing:
```ts
import { GoogleGenerativeAI } from "@google/generative-ai"

const MODEL = "gemini-2.0-flash-exp"
// const MODEL = "gemini-1.5-flash" // works

async function testSendMessage() {
  const genAI = new GoogleGenerativeAI(process.env.GOOGLE_API_KEY!)
  const model = genAI.getGenerativeModel({
    model: MODEL,
    systemInstruction: "You are a helpful assistant.",
  })
  const chat = model.startChat()
  const response = await chat.sendMessage("Hello, how are you?")
  console.log("response:", response)
  console.log("history:", (await chat.getHistory()).length) // will be "2"
}

// "works", but history is not appended
async function testSendMessageStream() {
  const genAI = new GoogleGenerativeAI(process.env.GOOGLE_API_KEY!)
  const model = genAI.getGenerativeModel({
    model: MODEL,
    systemInstruction: "You are a helpful assistant.",
  })
  const chat = model.startChat()
  const response = await chat.sendMessageStream("Hello, how are you?")
  console.log("response:", await response.response)
  console.log("history:", (await chat.getHistory()).length) // will be "0"
}

async function test() {
  await testSendMessage()
  await testSendMessageStream()
}

test()
```
The response of `sendMessage` is of the form:
```
response: {
  candidates: [ [Object] ],
  usageMetadata: { promptTokenCount: 14, totalTokenCount: 14 },
  modelVersion: 'gemini-2.0-flash-exp',
  text: [Function (anonymous)],
  functionCall: [Function (anonymous)],
  functionCalls: [Function (anonymous)]
}
```
while the response of `sendMessageStream` is:
```
{
  promptFeedback: undefined,
  candidates: [
    undefined: { // <- issue
      index: undefined,
      citationMetadata: undefined,
      groundingMetadata: undefined,
      finishReason: 'STOP',
      finishMessage: undefined,
      safetyRatings: [Array],
      content: [Object]
    }
  ],
  usageMetadata: {
    promptTokenCount: 14,
    candidatesTokenCount: 48,
    totalTokenCount: 62
  },
  text: [Function (anonymous)],
  functionCall: [Function (anonymous)],
  functionCalls: [Function (anonymous)]
}
```
For some reason, the aggregated candidate is being placed on the `undefined` key of the `candidates` array, which leaves the array's `length` at zero, so no history is added:
https://github.com/google-gemini/generative-ai-js/blob/main/src/methods/chat-session.ts#L185
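I haven't stepped through the aggregation code, but the shape above looks like what happens when the candidates array is indexed with an `undefined` candidate index (the gemini-2.0-flash-exp stream chunks appear to omit `index`). A minimal sketch of that JavaScript behavior, purely illustrative and assuming that's roughly what the aggregation step does:

```ts
// Illustrative only, not the SDK's actual code: placing a candidate into
// an array using an undefined index creates the string key "undefined"
// instead of element 0, so `candidates.length` stays 0 and nothing is
// appended to the chat history.
const aggregated: any[] = []
const candidate = { index: undefined as number | undefined, finishReason: "STOP" }

aggregated[candidate.index as any] = candidate

console.log(aggregated.length)             // 0
console.log(Object.keys(aggregated))       // [ 'undefined' ]
console.log((aggregated as any).undefined) // { index: undefined, finishReason: 'STOP' }
```

If that's what is happening, falling back to something like `candidate.index ?? 0` during aggregation would presumably fix both the response shape and the missing history.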
Any other information you'd like to share?
I've noticed this issue with both 2.0 Flash and 1.5 Flash. I feel like I must be doing something wrong, or very few people are using this SDK with streaming and function calling, as I'm surprised to be the first to notice this.
Thanks for your help in advance!