
feat: Add AI/ML API (aimlapi) as new LLM provider #4156


Open
wants to merge 7 commits into master

Conversation

@D1m7asis commented Jul 15, 2025

Introduces support for the AI/ML API (aimlapi) as a new LLM provider across the frontend and backend. Adds configuration options, provider selection UI, privacy info, and model selection for aimlapi. Implements backend integration for chat completions, agent support, and custom model listing. Updates environment examples and helper utilities to support the new provider.

Pull Request Type

  • ✨ feat

Relevant Issues

resolves #3945

What is in this change?

  • Environment configuration:

    • Updated docker/.env.example and server/.env.example with AIML_API_KEY, AIML_MODEL_PREF, and embedding engine entries.
  • Frontend UI:

    • Added AimlApiOptions components under components/LLMSelection and components/EmbeddingSelection.
    • Registered “AI/ML API” in LLM and embedding provider lists, onboarding flow, and workspace settings.
    • Included provider logo (aimlapi.png) and privacy description.
  • Backend integration:

    • Created server/utils/AiProviders/aimlapi and server/utils/EmbeddingEngines/aimlapi modules for chat and embedding APIs.
    • Wired into systemSettings.js, customModels.js, and helpers/*.js for provider selection and model caching.
    • Enabled agent support in server/utils/agents for aimlapi.
  • Helper updates:

    • Extended getCustomModels, getLLMProvider, and getEmbeddingEngineSelection to include aimlapi.
    • Updated updateENV.js validation to accept aimlapi as a valid provider.
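The helper wiring described above can be sketched roughly as follows. This is a hypothetical illustration of how a selection helper might register a new `aimlapi` case; the `AimlApiLLM` class name and option shape are assumptions, not the repo's actual implementation:

```javascript
// Hypothetical sketch of a provider-selection helper registering "aimlapi".
// Class name and constructor options are illustrative only.
class AimlApiLLM {
  constructor({ apiKey, model } = {}) {
    this.apiKey = apiKey; // would come from AIML_API_KEY in the real helper
    this.model = model ?? null; // model preference, e.g. from AIML_MODEL_PREF
  }
}

function getLLMProvider(provider, opts = {}) {
  switch (provider) {
    case "aimlapi":
      return new AimlApiLLM(opts);
    // ...existing providers (openai, anthropic, etc.) keep their cases here
    default:
      throw new Error(`Unknown LLM provider: ${provider}`);
  }
}
```

The same pattern would apply to `getEmbeddingEngineSelection` and `getCustomModels`, each gaining an `aimlapi` branch.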

Additional Information

This change ensures seamless integration of AI/ML API models alongside existing providers and maintains consistent UX patterns for model selection and configuration.

Developer Validations

  • I ran yarn lint from the root of the repo & committed changes
  • Relevant documentation has been updated
  • I have tested my code functionality
  • Docker build succeeds locally

D1m7asis and others added 3 commits July 15, 2025 23:57
@shatfield4 (Collaborator) left a comment


All the code here LGTM, but I am getting 404s on some models during testing. I do see that the Anthropic API is having downtime at the moment, so this may be the cause of it.

I patched a bug here where the model and API key were not saving correctly. Other than that all looks good!

@shatfield4 shatfield4 added the PR:needs review Needs review by core team label Jul 18, 2025
@timothycarambat (Member) left a comment


I have not pulled the code to test functionality, but there are a few issues/nit-picks I can see just from reviewing the code.

Env naming convention

When `LLM_PROVIDER=aimlapi`:
AIML_API_KEY => AIML_LLM_API_KEY

When `EMBEDDING_ENGINE=aimlapi`:
AIML_API_KEY => AIML_EMBEDDER_API_KEY

This avoids a collision where someone may want to use two different API keys for the LLM and the embedder. We made this mistake a long time ago with OpenAI/Azure OpenAI, where they share a key; keeping the keys separate prevents a change to one key from impacting the other.
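Concretely, the suggested split would look something like this in the `.env` example files (the key values below are placeholders):

```shell
# When using AI/ML API as the LLM provider
LLM_PROVIDER=aimlapi
AIML_LLM_API_KEY=sk-example-llm-key

# When using AI/ML API as the embedding engine
EMBEDDING_ENGINE=aimlapi
AIML_EMBEDDER_API_KEY=sk-example-embedder-key
```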

Missing Privacy Entry for Embedder

The data handling page will crash/show an invalid state if the user has AIML set as the embedder, since there is no data-handling entry for it as an embedder. Only the LLM preference has an entry, from what I can see.

Logo formatting

  • The image has a transparent background; it should be white so it is visible in any theme.
  • The logo is also stretched vertically and compressed horizontally. The provider's logo should retain its dimensions/aspect ratio and fit in a 330x330 image. Whitespace around the icon is fine.

Comments

Good idea to cache; querying this many models on every request could be a huge mess.
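A minimal sketch of the caching idea. The TTL value and the `fetchModels` callback are illustrative assumptions, not the PR's actual implementation:

```javascript
// Hypothetical in-memory cache for a provider's model list.
// TTL and fetcher shape are assumptions for illustration.
const MODEL_CACHE_TTL_MS = 5 * 60 * 1000; // 5 minutes
let modelCache = { models: null, fetchedAt: 0 };

async function cachedModels(fetchModels, now = Date.now()) {
  const expired = now - modelCache.fetchedAt > MODEL_CACHE_TTL_MS;
  if (!modelCache.models || expired) {
    // Only hit the remote model-list endpoint when the cache is cold or stale.
    modelCache = { models: await fetchModels(), fetchedAt: now };
  }
  return modelCache.models;
}
```

A second call within the TTL returns the cached list without touching the network.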

While they belong to the same provider, rather than importing the two consts AIMLAPI_HEADERS and AIMLAPI_BASE_URL, both would probably be better moved to two class properties.
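The suggestion above could look roughly like this. The URL and header values are placeholders, not the provider's actual constants:

```javascript
// Hypothetical: holding the base URL and default headers as static class
// properties instead of importing them as standalone consts.
class AimlApiProvider {
  static BASE_URL = "https://api.example-aiml.com/v1"; // placeholder URL
  static HEADERS = { "Content-Type": "application/json" };

  constructor(apiKey) {
    this.apiKey = apiKey;
  }

  requestHeaders() {
    // Merge the shared static headers with the per-instance auth header.
    return {
      ...AimlApiProvider.HEADERS,
      Authorization: `Bearer ${this.apiKey}`,
    };
  }
}
```

This keeps everything the provider needs scoped to the class itself, so nothing has to be exported/imported separately.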

Use of Untooled for AI/ML Agent provider:

While I have not tested it, shouldn't this API work with tools via the official tool implementation? Untooled is our internal hack around non-tool-calling models for now and has its own drawbacks. If this provider handles tool calls when they are set on the request, we should stick with the OpenAI tool-call implementation instead of Untooled, since we want to kill that feature in the future anyway.


If we need to use Untooled, then no big deal; it's not a problem, just a suggestion.
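For context, the OpenAI-style tool-call path attaches tool definitions to the request body rather than prompt-engineering them. A minimal sketch; the helper name and the function schema shown are purely illustrative:

```javascript
// Hypothetical: building an OpenAI-compatible chat request with native
// tool definitions, instead of the Untooled prompt-based workaround.
function buildToolCallRequest(model, messages, tools) {
  return {
    model,
    messages,
    tools: tools.map((t) => ({
      type: "function",
      function: {
        name: t.name,
        description: t.description,
        parameters: t.parameters, // JSON Schema describing the arguments
      },
    })),
    tool_choice: "auto", // let the model decide when to call a tool
  };
}

// Illustrative usage with a made-up tool definition:
const request = buildToolCallRequest(
  "some-model",
  [{ role: "user", content: "What's the weather in Paris?" }],
  [
    {
      name: "get_weather",
      description: "Look up current weather for a city",
      parameters: { type: "object", properties: { city: { type: "string" } } },
    },
  ]
);
```

If the provider's OpenAI-compatible endpoint honors `tools`/`tool_choice`, the model's response carries structured `tool_calls` back, which is the path the core team prefers over Untooled.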

Clarification

Please don't take any of this feedback personally. Some people get upset when we suggest changes, but the main thing we try to do is make merged code easier for us to maintain, since ultimately we become the maintainers after the initial merge. So normalization and consistency across providers is very important to us, which is what most of my comments above are about.

@D1m7asis (Author) commented

Hi Timothy,

Thank you for the detailed feedback; it's incredibly helpful. I appreciate the time you took to go through the code so thoroughly. Your points all make perfect sense and will be addressed. I completely understand and value the importance of consistency and maintainability, especially when the code becomes part of a larger shared base. Again, thanks a lot for your time. I'll fix the issues ASAP.

Standardizes environment variable names for AI/ML API keys, separating LLM and embedding keys as AIML_LLM_API_KEY and AIML_EMBEDDER_API_KEY. Updates frontend, backend, and documentation to use the new variable names, improves provider integration, and adds privacy info for AI/ML API embedding. Also updates the AimlApiProvider implementation and related helper logic.
@D1m7asis D1m7asis requested a review from timothycarambat July 21, 2025 17:20
Labels
PR:needs review Needs review by core team

Successfully merging this pull request may close: [FEAT]: add AI/ML API as provider

3 participants