Labels: enhancement (New feature or request)
Description
How are you running AnythingLLM?
AnythingLLM desktop app
What happened?
This error is encountered when attempting to use the OpenAI-compatible option in AnythingLLM with GitHub Copilot Chat's model endpoint. I believe this is related to this OpenAI Community post. Reproduction steps are included below; a minimal sketch of the equivalent direct SDK request follows them.
Are there known steps to reproduce?
- Go to https://github.com/marketplace/models/azure-openai/o3-mini/playground
- Click "Use This Model" in the top right corner
- Select "OpenAI SDK" from the SDK dropdown menu
- Enter "https://models.inference.ai.azure.com" as your endpoint in AnythingLLM's OpenAI Compatible setup screen
- Generate a Personal Access Token via GitHub and enter it as your API key
- Set your token context window & max tokens
- Set chat model name to o3-mini
- Send a message in a Workspace chat window
- Observe the error
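
For reference, here is a minimal sketch of the equivalent direct request with the OpenAI Python SDK, which is roughly what AnythingLLM's OpenAI-compatible connector should be sending to this endpoint. The use of the openai package, the GITHUB_TOKEN environment variable name, the sample message, and the max_completion_tokens value are assumptions for illustration, not taken from the issue.

```python
# Minimal sketch of a chat completion request against the GitHub Models endpoint.
# Assumes a GitHub Personal Access Token is exported as GITHUB_TOKEN and a recent
# version of the openai package (max_completion_tokens requires a newer release).
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://models.inference.ai.azure.com",  # endpoint from the steps above
    api_key=os.environ["GITHUB_TOKEN"],                # GitHub PAT used as the API key
)

response = client.chat.completions.create(
    model="o3-mini",
    messages=[{"role": "user", "content": "Hello from a Workspace chat"}],
    # o-series models expect max_completion_tokens rather than the legacy max_tokens
    max_completion_tokens=1024,
)

print(response.choices[0].message.content)
```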