How are you running AnythingLLM?
Docker (local)
What happened?
I configured a new LLM connection via the Generic OpenAI provider, pointing it at a DeepSeek-R1 API from a third-party provider (e.g. SiliconFlow). After configuring it, I ran conversations, but the responses only show the final answer; the model's thinking/reasoning process is never displayed.
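
For context, here is a minimal sketch (not AnythingLLM code) of how the same endpoint can be queried directly with the OpenAI SDK to see where the reasoning text ends up. The base URL, model ID, and the `reasoning_content` field below are assumptions based on how DeepSeek-style providers commonly expose R1's reasoning; substitute whatever you configured under Generic OpenAI.

```python
# Minimal sketch: call the R1 provider directly through its OpenAI-compatible
# API and inspect where the reasoning appears in the response.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.siliconflow.cn/v1",  # assumed provider endpoint
    api_key="sk-...",                          # your provider API key
)

resp = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1",           # assumed model ID on the provider
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)

msg = resp.choices[0].message
# Some R1 providers return the chain of thought in a non-standard
# `reasoning_content` field (or wrap it in <think>...</think> tags inside
# `content`). If the reasoning only lives in that extra field, a client that
# reads `content` alone will show just the final answer.
print("reasoning:", getattr(msg, "reasoning_content", None))
print("answer:", msg.content)
```

If the reasoning only shows up in `reasoning_content` (or in `<think>` tags) when called directly, that would explain why only the final result is rendered in the chat.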

Are there known steps to reproduce?
No response