
[BUG]: [Desktop] DeepSeek-R1 thinking/reasoning content not displayed in chat, but present in API response #4561

@328664440-droid

Description

How are you running AnythingLLM?

AnythingLLM desktop app

What happened?

I'm running the AnythingLLM Desktop app on macOS and have encountered an issue: the thinking/reasoning content from the deepseek-r1:8b model is not displayed in the chat interface, even though it is clearly present in the raw API response from Ollama.

  1. Setup Environment:

    • macOS Tahoe 26.0.1
    • AnythingLLM Desktop Version
    • Ollama Version 0.12.6
    • Model: deepseek-r1:8b
  2. Configuration:

    • Ollama running on http://127.0.0.1:11434
    • DeepSeek-R1:8b model pulled and running correctly
    • AnythingLLM workspace configured to use this model
  3. Observed Behavior:

    • In AnythingLLM chat: Only final response is displayed, thinking process is missing
    • In Ollama CLI or API: Both thinking process and final response are present

Evidence & Testing

Direct API Test (Working):

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:8b",
  "prompt": "Please introduce yourself",
  "stream": false
}'
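For context on what the client has to handle: in Ollama responses for DeepSeek-R1 models, the reasoning commonly arrives inline as a `<think>…</think>` block at the start of the `response` field, so a chat UI must parse it out to display it separately. The sketch below (an illustrative assumption about the response format, not AnythingLLM's actual code) shows one way to split the two parts:

```python
import re

def split_thinking(response_text):
    """Split a deepseek-r1 style response into (thinking, answer).

    Assumes the reasoning arrives inline as a <think>...</think> block
    in the "response" text, which is how deepseek-r1 output commonly
    appears; returns ("", text) if no such block is found.
    """
    match = re.search(r"<think>(.*?)</think>", response_text, re.DOTALL)
    if not match:
        return "", response_text.strip()
    thinking = match.group(1).strip()
    answer = response_text[match.end():].strip()
    return thinking, answer

raw = "<think>The user asked me to introduce myself.</think>Hello! I am DeepSeek-R1."
thinking, answer = split_thinking(raw)
print(thinking)  # → The user asked me to introduce myself.
print(answer)    # → Hello! I am DeepSeek-R1.
```

If the UI only renders the raw `response` text after stripping tags, or only the post-`</think>` portion, the reasoning would silently disappear even though the API returned it, which matches the behavior described above.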

Are there known steps to reproduce?

No response

Metadata

Labels

possible bug: Bug was reported but is not confirmed or is unable to be replicated.
