
Local LLM Support #118

@timothycarambat

Description

Add support for using a locally hosted LLM.

  • The local LLM must have an API wrapper, like GPT4All does.
  • This will not allow an arbitrary LLM running inference to interact with the system directly; we will not be building API wrappers for LLMs.

References for easier integration:
https://docs.gpt4all.io/
LocalAI
Replicant
OLLama
LLMStudio
HuggingFace
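Most of the runtimes listed above can expose an OpenAI-compatible HTTP endpoint, which is what makes the "must have an API wrapper" requirement practical. A minimal sketch of talking to such a server — the base URL, port, and model name here are assumptions for illustration, not settings from this issue:

```python
import json
import urllib.request

# Hypothetical local endpoint; adjust to whatever your wrapper
# (GPT4All, LocalAI, Ollama, etc.) actually serves.
BASE_URL = "http://localhost:4891/v1"


def build_chat_request(prompt: str, model: str = "gpt4all-j") -> dict:
    """Compose an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def extract_reply(response: dict) -> str:
    """Pull the assistant's text out of an OpenAI-style response."""
    return response["choices"][0]["message"]["content"]


def chat(prompt: str) -> str:
    """Send the prompt to the local server (requires one to be running)."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_reply(json.load(resp))
```

Because the request and response shapes follow the OpenAI convention, swapping one local backend for another is mostly a matter of changing `BASE_URL` and the model name.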

Metadata


Labels: enhancement (New feature or request)


Status: Done
