
Support custom models #35

@marcorosa

Description

Support models hosted on platforms other than AI Core:

  • Azure AI Foundry
  • AWS Bedrock
  • Generic vLLM servers (generalize the existing Mistral-specific integration)
  • Custom AI Core serving executables (e.g., Ollama or vLLM via AI Core)
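One way to support all four targets is a single provider abstraction that the backend resolves from configuration. The sketch below is hypothetical, not code from this repository: `ModelProvider`, `VLLMProvider`, and `make_provider` are illustrative names, and the only platform fact assumed is that vLLM exposes an OpenAI-compatible HTTP API.

```python
from abc import ABC, abstractmethod


class ModelProvider(ABC):
    """Hypothetical common interface over model-hosting platforms."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return the model's completion for the given prompt."""


class VLLMProvider(ModelProvider):
    """Illustrative provider for a generic vLLM server.

    vLLM serves an OpenAI-compatible API, so a real implementation
    would POST to f"{base_url}/v1/completions".
    """

    def __init__(self, base_url: str, model: str):
        self.base_url = base_url
        self.model = model

    def complete(self, prompt: str) -> str:
        # Network call elided in this sketch.
        raise NotImplementedError


# Registry keyed by a config value; Bedrock/Foundry/AI Core providers
# would register here the same way.
PROVIDERS: dict[str, type[ModelProvider]] = {"vllm": VLLMProvider}


def make_provider(kind: str, **kwargs) -> ModelProvider:
    """Instantiate the provider named in the app configuration."""
    return PROVIDERS[kind](**kwargs)
```

With this shape, adding Bedrock or Foundry support is a new subclass plus one registry entry, and the rest of the backend stays unchanged.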

Metadata

Assignees: No one assigned

Labels: backend (Related to the flask backend and general Python stuff)
