
Conversation

@timothycarambat (Member)

Pull Request Type

  • ✨ feat
  • πŸ› fix
  • ♻️ refactor
  • πŸ’„ style
  • πŸ”¨ chore
  • πŸ“ docs

Relevant Issues

resolves #670

What is in this change?

Appropriately handle the LangChain streaming error that occurs when the LMChain fails for any reason. Report the error back to the user's chat frontend instead of crashing fatally, which would disrupt all other client chat streams that may be pending.
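
For illustration, here is a minimal sketch of the recovery pattern described above (not the PR's actual diff). It assumes an Express-style SSE `response` object and a LangChain chat model exposing an async-iterable `.stream()`; `writeResponseChunk` and the event shapes are stand-ins for the server's streaming helpers:

```js
// Sketch only: catch a mid-stream LangChain/Ollama failure and report it to
// the one affected client instead of letting the rejection crash the server.
async function streamChatToClient(model, prompt, response) {
  try {
    // LangChain chat models return an async iterable of message chunks.
    const stream = await model.stream(prompt);
    for await (const chunk of stream) {
      writeResponseChunk(response, {
        type: "textResponseChunk",
        textResponse: chunk.content,
        close: false,
        error: false,
      });
    }
    // Normal completion: tell the frontend the stream is done.
    writeResponseChunk(response, {
      type: "textResponseChunk",
      textResponse: "",
      close: true,
      error: false,
    });
  } catch (error) {
    // Failure path: surface the error to this client's chat UI and close
    // its stream. Other clients' pending streams are unaffected because
    // nothing escapes to the process level.
    writeResponseChunk(response, {
      type: "abort",
      textResponse: null,
      close: true,
      error: error.message,
    });
  }
}
```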

Developer Validations

  • I ran yarn lint from the root of the repo & committed changes
  • Relevant documentation has been updated
  • I have tested my code functionality
  • Docker build succeeds locally

@timothycarambat changed the title from "Resolve fatal crash from Ollama failure" to "Recover from fatal Ollama crash from LangChain library" on Feb 8, 2024
@Mintplex-Labs deleted a comment from the review-agent-prime bot on Feb 8, 2024
@timothycarambat merged commit f490c35 into master on Feb 8, 2024
@timothycarambat deleted the 670-prevent-ollama-fatal-crash branch on February 8, 2024 at 00:23
@mkhludnev

Thank you @timothycarambat !

cabwds pushed a commit to cabwds/anything-llm that referenced this pull request on Jul 3, 2025


Development

Successfully merging this pull request may close this issue:

[BUG]: exception on calling LLM causes whole process to restart and hurts other clients (#670)
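
As background on why a single failed call could restart the whole process: in Node.js (v15 and later), a promise rejection that nothing handles is treated as an uncaught exception and terminates the process by default, taking every other in-flight chat stream with it. A tiny illustration, assuming no try/catch or process-level handler is in place:

```js
// Without a try/catch around the awaited LLM call (or a process-level
// "unhandledRejection" handler), Node v15+ exits on the first rejection,
// dropping every other client's connection along with it.
async function callLLM() {
  throw new Error("Ollama connection refused");
}

callLLM(); // not awaited and not caught -> unhandled rejection -> crash
```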
