Closed
Labels
possible bug: Bug was reported but is not confirmed or is unable to be replicated.
Description
How are you running AnythingLLM?
AnythingLLM desktop app
What happened?
I'm using the AnythingLLM Desktop version on macOS and have encountered an issue where the thinking/reasoning content from the DeepSeek-R1:8b model is not displayed in the chat interface, even though it is clearly present in the raw API response from Ollama.
Setup Environment:
- macOS Tahoe 26.0.1
- AnythingLLM Desktop Version
- Ollama Version 0.12.6
- Model: deepseek-r1:8b
Configuration:
- Ollama running on http://127.0.0.1:11434
- DeepSeek-R1:8b model pulled and running correctly
- AnythingLLM workspace configured to use this model
Observed Behavior:
- In AnythingLLM chat: Only final response is displayed, thinking process is missing
- In Ollama CLI or API: Both thinking process and final response are present
Evidence & Testing
Direct API Test (Working):
```shell
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:8b",
  "prompt": "Please introduce yourself",
  "stream": false
}'
```
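For context, DeepSeek-R1 models emit their reasoning inside `<think>...</think>` tags at the start of the raw completion text, ahead of the final answer. A minimal Python sketch (using a hypothetical response string, not actual AnythingLLM code) of how a client could separate the reasoning from the visible answer:

```python
import re

# Hypothetical raw completion text from Ollama's /api/generate response.
# DeepSeek-R1 prefixes its reasoning in a <think>...</think> block.
raw_response = "<think>The user wants an introduction...</think>Hello! I'm DeepSeek-R1."

# Split the reasoning block from the final answer.
match = re.match(r"<think>(.*?)</think>(.*)", raw_response, re.DOTALL)
thinking = match.group(1).strip() if match else ""
answer = (match.group(2) if match else raw_response).strip()

print(thinking)  # the reasoning text that is missing from the chat UI
print(answer)    # the final response that AnythingLLM does display
```

If the UI only renders `answer` and discards `thinking` (or strips the tags without rendering the content), that would match the observed behavior.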
### Are there known steps to reproduce?
_No response_