ALCHEMY

This is a project I have used for learning about running LLMs locally, with the ability to extend the LLM's knowledge beyond its training cutoff. It should be fine to run on your own device as long as you choose an LLM model that fits your local specs.

This is not meant for production, but feel free to clone or fork it and make whatever adjustments you need for your own purposes.

Quick Start

Important

You need to have Elixir 1.18 and Erlang/OTP 27 installed on your machine.

# pull down the dependencies
mix deps.get
# compile the project
mix compile
# run the project
iex -S mix
# run a query (without streaming response)
ALCHEMY.LlmQueryServer.query("hello")

# run a query with added context (without streaming response)
ALCHEMY.LlmQueryServer.query_with_context("hello")

# run a query with streaming response
ALCHEMY.LlmQueryServer.stream("What is my name?")

# run a query with added context and streaming response
ALCHEMY.LlmQueryServer.stream_with_context("What is my name?")

Adding context

You do this simply by adding .txt files to the input folder. If the folder does not exist, you can create it.
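For example, from the project root you can create the folder and drop in a context file like this (the `input` folder name comes from the section above; the file name and its contents are just illustrative):

```shell
# create the context folder at the project root if it doesn't exist yet
mkdir -p input

# add any plain-text knowledge you want the LLM to see as a .txt file
cat > input/notes.txt <<'EOF'
My name is Alice and I am experimenting with local LLMs.
EOF

# confirm the file is in place
ls input
```

After that, queries made with `query_with_context/1` or `stream_with_context/1` should be able to draw on the text in those files.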

About

An Elixir local AI service
