- run `npm install`
- make a `.env` file
- add/fill `COMMAND_PATH=` in the `.env` file with the path to your executable file/build of https://github.com/antimatter15/alpaca.cpp (e.g. `path-to-alpaca.cpp/Release/chat.exe`); see the example `.env` below
- add/fill `MODEL_PATH=` in the `.env` file with the path to your model file (e.g. `path-to-model/ggml-alpaca-7b-q4.bin`)
- run `node index.js`
- open `localhost:3000` in your browser, and you will see a message telling you whether the model is loaded
- open `localhost:3000/chat?prompt=your_prompt` in your browser once the model is loaded, and you will see the response from the model; the response is streamed with `content-type: text/event-stream`
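For reference, a minimal `.env` matching the two variables above; the paths are the placeholder examples from the steps and must be adjusted for your machine:

```
COMMAND_PATH=path-to-alpaca.cpp/Release/chat.exe
MODEL_PATH=path-to-model/ggml-alpaca-7b-q4.bin
```

Because the response is `content-type: text/event-stream`, you can also watch it stream from the command line with curl's no-buffer flag:

```
curl -N "http://localhost:3000/chat?prompt=your_prompt"
```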
About

Node wrapper to run alpaca.cpp and get the result from a request URL/API (forked from bagusindrayana/alpaca_cpp_node).
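For orientation, here is a minimal sketch of how a wrapper like this can work: spawn the alpaca.cpp binary with Node's `child_process`, feed it the prompt, and forward its stdout to the HTTP client as server-sent events. This is an illustration, not the repo's actual `index.js`; the use of `dotenv` and the `-m` flag passed to the binary are assumptions.

```js
// sketch.js, illustrative only; the actual index.js may differ.
require("dotenv").config(); // assumption: COMMAND_PATH and MODEL_PATH come from .env

const http = require("http");
const { spawn } = require("child_process");

http.createServer((req, res) => {
  const url = new URL(req.url, "http://localhost:3000");

  if (url.pathname !== "/chat") {
    // Root route: report the configured paths so you can see what was loaded.
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end(`command: ${process.env.COMMAND_PATH}\nmodel: ${process.env.MODEL_PATH}\n`);
    return;
  }

  const prompt = url.searchParams.get("prompt") || "";

  // Assumption: the chat binary accepts -m for the model path and reads
  // the prompt from stdin in its interactive mode.
  const chat = spawn(process.env.COMMAND_PATH, ["-m", process.env.MODEL_PATH]);
  chat.stdin.write(prompt + "\n");

  res.writeHead(200, {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    Connection: "keep-alive",
  });

  // Forward each chunk of generated text as an SSE "data:" event.
  chat.stdout.on("data", (chunk) => {
    res.write(`data: ${chunk.toString()}\n\n`);
  });
  chat.on("close", () => res.end());
  req.on("close", () => chat.kill()); // stop generation if the client disconnects
}).listen(3000, () => console.log("listening on http://localhost:3000"));
```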