I'm using this Docker Compose file to deploy a front-end UI that is very similar to the ChatGPT interface:
```yaml
version: '3.6'

services:
  chatgpt:
    build: .
    # image: ghcr.io/mckaywrigley/chatbot-ui:main
    ports:
      - 9080:3000
    environment:
      - 'OPENAI_API_KEY='
      - 'OPENAI_API_HOST=http://api:8080'

  api:
    image: quay.io/go-skynet/llama-cli:v0.4
    volumes:
      - /Users/Shared/Models:/models
    ports:
      - 9000:8080
    environment:
      - MODEL_PATH=/models/7B/gpt4all-lora-quantized.bin
      - CONTEXT_SIZE=700
      - THREADS=4
    command: api
```
Would it be possible to add API endpoints that mimic OpenAI's output? I'm not sure whether it's easier to do that here or to add a proxy that converts the input/output of each call. But I see value in it: other tools that normally call the OpenAI APIs could simply target this local instance instead.

Your thoughts?
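To illustrate the proxy idea: the core of it is just a pair of translation functions between OpenAI's completion schema and whatever the local backend expects. This is a minimal Python sketch; the local request/response shapes used here (`prompt`/`n_predict` in, `{"text": ...}` out) are assumptions for illustration, not the actual llama-cli API contract.

```python
import time
import uuid


def openai_to_local(body: dict) -> dict:
    """Map an OpenAI-style /v1/completions request body to a
    hypothetical local-backend payload (field names are assumed)."""
    return {
        "prompt": body.get("prompt", ""),
        "n_predict": body.get("max_tokens", 128),
        "temperature": body.get("temperature", 0.7),
    }


def local_to_openai(local: dict, model: str) -> dict:
    """Wrap a hypothetical local response ({"text": ...}) in the
    OpenAI text-completion response schema."""
    return {
        "id": f"cmpl-{uuid.uuid4().hex[:12]}",
        "object": "text_completion",
        "created": int(time.time()),
        "model": model,
        "choices": [
            {"text": local.get("text", ""), "index": 0, "finish_reason": "stop"}
        ],
    }
```

A thin HTTP server in front of the `api` container could then apply these two functions to each request/response pair, so existing OpenAI clients work unchanged by pointing their base URL at the proxy.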