Description
How are you running AnythingLLM?
Docker (local)
What happened?
BigInt Serialization Error in AnythingLLM with OpenRouter LLMs
Bug Description
AnythingLLM crashes with BigInt serialization errors when using certain LLM providers (particularly OpenRouter with models like deepseek/deepseek-r1-distill-llama-70b:free) due to multiple JSON.stringify() calls throughout the codebase that cannot handle BigInt values.
Error Messages
[backend] error: Do not know how to serialize a BigInt
[backend] error: TypeError: Cannot read properties of null (reading 'id')
at streamChatWithWorkspace (/app/server/utils/chats/stream.js:278:20)
Environment
- AnythingLLM Version: Latest Docker (tested on commit 14fa079)
- Deployment: Docker container
- LLM Provider: OpenRouter
- Model: deepseek/deepseek-r1-distill-llama-70b:free
- Vector Database: LanceDB
- Embedder: Native embedder
Root Cause
The issue occurs because several parts of the codebase use JSON.stringify() without handling BigInt values properly. When LLM responses contain BigInt values (common with token counts, timestamps, or other numeric metadata), these calls fail and prevent chat creation, leading to null chat objects and subsequent errors.
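For reference, the underlying behavior can be reproduced in a plain Node.js REPL, independent of AnythingLLM (the field name below is only illustrative):
// Node.js refuses to serialize BigInt values by default
JSON.stringify({ total_tokens: 128n });
// TypeError: Do not know how to serialize a BigInt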
Affected Files
The following files contain JSON.stringify() calls that need BigInt handling:
- /app/server/models/workspaceChats.js - Critical: chat creation fails here
- /app/server/models/embedChats.js - Embed chat responses
- /app/server/utils/logger/index.js - Logger formatting
- /app/server/utils/helpers/chat/responses.js - Chat response handling
- /app/server/utils/chats/openaiCompatible.js - OpenAI compatibility layer
- /app/server/utils/helpers/chat/convertTo.js - Chat conversion utilities
- /app/server/utils/AiProviders/openRouter/index.js - OpenRouter provider
- /app/server/utils/chats/stream.js - Null safety for chat.id
Solution
Replace all instances of JSON.stringify(obj) with JSON.stringify(obj, (key, value) => typeof value === "bigint" ? value.toString() : value) to safely convert BigInt values to strings during serialization.
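With the replacer argument, the same value serializes without throwing; note that BigInt values come out as JSON strings rather than numbers:
JSON.stringify({ total_tokens: 128n }, (key, value) => typeof value === "bigint" ? value.toString() : value);
// '{"total_tokens":"128"}'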
Specific Fixes Applied
1. Fix Chat Creation (Critical)
File: /app/server/models/workspaceChats.js
// Before (line ~18)
response: JSON.stringify(response),
// After
response: JSON.stringify(response, (key, value) => typeof value === "bigint" ? value.toString() : value),
2. Fix Embed Chats
File: /app/server/models/embedChats.js
// Fix multiple instances (lines ~28, ~29, ~51)
response: JSON.stringify(response, (key, value) => typeof value === "bigint" ? value.toString() : value),
connection_information: JSON.stringify(connection_information, (key, value) => typeof value === "bigint" ? value.toString() : value),
response: JSON.stringify(responseRest, (key, value) => typeof value === "bigint" ? value.toString() : value)
3. Fix Logger
File: /app/server/utils/logger/index.js
// Before (line ~37)
return JSON.stringify(arg);
// After
return JSON.stringify(arg, (key, value) => typeof value === "bigint" ? value.toString() : value);
4. Fix Stream Null Safety
File: /app/server/utils/chats/stream.js
// Before (line ~278)
chatId: chat.id,
// After
chatId: chat?.id || null,
5. Fix Other Response Handlers
Similar fixes needed in:
- /app/server/utils/helpers/chat/responses.js
- /app/server/utils/chats/openaiCompatible.js
- /app/server/utils/helpers/chat/convertTo.js
- /app/server/utils/AiProviders/openRouter/index.js
Testing
After applying these fixes:
- ✅ AnythingLLM starts without errors
- ✅ Chat creation works with OpenRouter LLMs
- ✅ Streaming responses function correctly
- ✅ No BigInt serialization errors in logs
- ✅ Telemetry shows successful sent_chat events
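A quick standalone check of the replacer itself (a sketch using Node's built-in assert, not part of the repository's test suite; the object shape is made up):
const assert = require("node:assert");

const bigintSafe = (key, value) => typeof value === "bigint" ? value.toString() : value;

// An object shaped like LLM response metadata (illustrative field names)
const response = { text: "hello", usage: { total_tokens: 1234n } };

assert.strictEqual(JSON.stringify(response, bigintSafe), '{"text":"hello","usage":{"total_tokens":"1234"}}');
console.log("BigInt-safe serialization OK");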
Impact
This bug prevents AnythingLLM from working with several popular LLM providers that return BigInt values in their responses, particularly:
- OpenRouter models
- Various open-source models through compatibility layers
- Any provider returning token counts or timestamps as BigInt
Reproduction Steps
- Set up AnythingLLM with OpenRouter as LLM provider
- Configure a model like deepseek/deepseek-r1-distill-llama-70b:free
- Attempt to send a chat message
- Observe BigInt serialization errors and chat creation failures
Suggested Implementation
Consider creating a utility function for BigInt-safe JSON serialization:
// utils/json.js
function safeJsonStringify(obj, ...args) {
return JSON.stringify(obj, (key, value) => typeof value === "bigint" ? value.toString() : value, ...args);
}
module.exports = { safeJsonStringify };
Then replace all JSON.stringify() calls with safeJsonStringify() throughout the codebase.
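For illustration, a hypothetical call site after the swap (the require path and object shape are made up for the example):
const { safeJsonStringify } = require("../utils/json");

// BigInt token counts serialize as strings instead of throwing
const response = { text: "...", usage: { total_tokens: 4096n } };
console.log(safeJsonStringify(response, 2));
// usage.total_tokens appears as "4096" in the pretty-printed output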
Priority
High - This bug prevents AnythingLLM from functioning with popular LLM providers and causes complete chat failure rather than graceful degradation.
Are there known steps to reproduce?
No response