# AI Singapore SEA-LION model served by Text Generation Inference (TGI) with Docker Compose

## Model

## Requirements

## Quick Start

1. Start the service:

   ```shell
   docker compose up
   ```

2. TGI is deployed as a server that implements the OpenAI API protocol, so it can be queried at `http://localhost:8080` in the same format as the OpenAI API. For example:

   ```shell
   curl http://localhost:8080/v1/completions \
     -H "Content-Type: application/json" \
     -d '{
         "model": "Llama-SEA-LION-v3-8B-IT",
         "prompt": "Artificial Intelligence is",
         "max_tokens": 50,
         "temperature": 0.8,
         "repetition_penalty": 1.2
     }'
   ```
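The same completion request can also be sent from Python using only the standard library. This is a minimal sketch, assuming the compose service above is running at `http://localhost:8080`; the helper names `build_payload` and `complete` are illustrative, not part of this repository or of TGI.

```python
import json
from urllib import request

def build_payload(prompt: str, max_tokens: int = 50) -> dict:
    """Build an OpenAI-style completions request body (same fields as the curl example)."""
    return {
        "model": "Llama-SEA-LION-v3-8B-IT",
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.8,
        "repetition_penalty": 1.2,
    }

def complete(prompt: str, base_url: str = "http://localhost:8080") -> str:
    """POST to the OpenAI-compatible /v1/completions endpoint and return the generated text."""
    body = json.dumps(build_payload(prompt)).encode("utf-8")
    req = request.Request(
        f"{base_url}/v1/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())["choices"][0]["text"]
```

With the service up, `print(complete("Artificial Intelligence is"))` prints the model's continuation of the prompt.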
