61 changes: 35 additions & 26 deletions server/utils/AiProviders/apipie/index.js
@@ -1,8 +1,4 @@
const { NativeEmbedder } = require("../../EmbeddingEngines/native");
const {
handleDefaultStreamResponseV2,
} = require("../../helpers/chat/responses");

const { v4: uuidv4 } = require("uuid");
const {
writeResponseChunk,
@@ -98,6 +94,24 @@ class ApiPieLLM {
);
}

chatModels() {
const allModels = this.models();
return Object.entries(allModels).reduce(
(chatModels, [modelId, modelInfo]) => {
// Filter for chat models
if (
modelInfo.subtype &&
(modelInfo.subtype.includes("chat") ||
modelInfo.subtype.includes("chatx"))
) {
chatModels[modelId] = modelInfo;
}
return chatModels;
},
{}
);
}
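Reviewer note: the new `chatModels()` reduce can be exercised standalone. The sketch below mirrors the reduce outside the class, with hypothetical sample data in the shape `fetchApiPieModels()` caches (keyed by `provider/model`, each entry carrying a `subtype`); the helper name and sample entries are illustrative, not from the PR.

```javascript
// Standalone mirror of the chatModels() reduce above, for illustration only.
// Input shape matches the cached output of fetchApiPieModels():
//   { "provider/model": { subtype, maxLength, ... }, ... }
function filterChatModels(allModels) {
  return Object.entries(allModels).reduce((chatModels, [modelId, modelInfo]) => {
    // Keep only models whose subtype marks them as chat-capable.
    if (
      modelInfo.subtype &&
      (modelInfo.subtype.includes("chat") || modelInfo.subtype.includes("chatx"))
    ) {
      chatModels[modelId] = modelInfo;
    }
    return chatModels;
  }, {});
}

// Hypothetical sample data in the cached-models shape.
const sample = {
  "openai/gpt-4o": { subtype: "chat", maxLength: 128000 },
  "openai/dall-e-3": { subtype: "image", maxLength: 4000 },
  "example/chatx-model": { subtype: "chatx", maxLength: 8192 },
};

console.log(Object.keys(filterChatModels(sample)));
// β†’ [ 'openai/gpt-4o', 'example/chatx-model' ]
```

One observation: `"chatx".includes("chat")` is already `true`, so the second `includes("chatx")` branch is redundant as written; it only matters if a future subtype should match `chatx` but not `chat`.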

streamingEnabled() {
return "streamGetChatCompletion" in this;
}
@@ -114,13 +128,13 @@ class ApiPieLLM {
}

promptWindowLimit() {
const availableModels = this.models();
const availableModels = this.chatModels();
return availableModels[this.model]?.maxLength || 4096;
}

async isValidChatCompletionModel(model = "") {
await this.#syncModels();
const availableModels = this.models();
const availableModels = this.chatModels();
return availableModels.hasOwnProperty(model);
}

@@ -189,22 +203,20 @@ class ApiPieLLM {
return result.choices[0].message.content;
}

// APIPie says it supports streaming, but it does not work across all models and providers.
// Notably, it is not working for OpenRouter models at all.
// async streamGetChatCompletion(messages = null, { temperature = 0.7 }) {
// if (!(await this.isValidChatCompletionModel(this.model)))
// throw new Error(
// `ApiPie chat: ${this.model} is not valid for chat completion!`
// );

// const streamRequest = await this.openai.chat.completions.create({
// model: this.model,
// stream: true,
// messages,
// temperature,
// });
// return streamRequest;
// }
async streamGetChatCompletion(messages = null, { temperature = 0.7 }) {
if (!(await this.isValidChatCompletionModel(this.model)))
throw new Error(
`ApiPie chat: ${this.model} is not valid for chat completion!`
);

const streamRequest = await this.openai.chat.completions.create({
model: this.model,
stream: true,
messages,
temperature,
});
return streamRequest;
}
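The restored `streamGetChatCompletion` resolves to the OpenAI SDK's async-iterable stream, whose chunks follow the OpenAI chat-completions streaming shape (`choices[0].delta.content`). A minimal consumption sketch, using a mocked stream since a live call needs an APIpie key; `collectStreamText` and `mockStream` are illustrative names, not part of the PR:

```javascript
// Accumulate the text of a streamed chat completion.
// Each chunk follows the OpenAI streaming shape:
//   { choices: [{ delta: { content: "..." } }] }
async function collectStreamText(stream) {
  let fullText = "";
  for await (const chunk of stream) {
    const token = chunk.choices?.[0]?.delta?.content;
    if (token) fullText += token;
  }
  return fullText;
}

// Mock async generator standing in for the value
// streamGetChatCompletion() resolves to.
async function* mockStream() {
  yield { choices: [{ delta: { content: "Hello" } }] };
  yield { choices: [{ delta: { content: ", " } }] };
  yield { choices: [{ delta: { content: "world" } }] };
  yield { choices: [{ delta: {} }] }; // final chunk may carry no content
}

collectStreamText(mockStream()).then((text) => console.log(text));
// logs "Hello, world"
```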

handleStream(response, stream, responseProps) {
const { uuid = uuidv4(), sources = [] } = responseProps;
@@ -264,10 +276,6 @@ class ApiPieLLM {
});
}

// handleStream(response, stream, responseProps) {
// return handleDefaultStreamResponseV2(response, stream, responseProps);
// }

// Simple wrapper for dynamic embedder & normalize interface for all LLM implementations
async embedTextInput(textInput) {
return await this.embedder.embedTextInput(textInput);
@@ -300,6 +308,7 @@ async function fetchApiPieModels(providedApiKey = null) {
id: `${model.provider}/${model.model}`,
name: `${model.provider}/${model.model}`,
organization: model.provider,
subtype: model.subtype,
maxLength: model.max_tokens,
};
});
22 changes: 15 additions & 7 deletions server/utils/helpers/customModels.js
@@ -401,13 +401,21 @@ async function getAPIPieModels(apiKey = null) {
if (Object.keys(knownModels).length === 0)
return { models: [], error: null };

const models = Object.values(knownModels).map((model) => {
return {
id: model.id,
organization: model.organization,
name: model.name,
};
});
const models = Object.values(knownModels)
.filter((model) => {
// Filter for chat models
return (
model.subtype &&
(model.subtype.includes("chat") || model.subtype.includes("chatx"))
);
})
.map((model) => {
return {
id: model.id,
organization: model.organization,
name: model.name,
};
});
return { models, error: null };
}
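The same subtype predicate now lives in two places: the `reduce` in `apipie/index.js` and the `filter` here. A possible follow-up is hoisting it into a shared helper; `isChatSubtype` below is a hypothetical name, not part of the PR:

```javascript
// Hypothetical shared predicate; not a helper that exists in the diff.
function isChatSubtype(model) {
  return Boolean(
    model.subtype &&
      (model.subtype.includes("chat") || model.subtype.includes("chatx"))
  );
}

// customModels.js-style usage: filter by subtype, then project
// down to the { id, organization, name } shape the UI expects.
const knownModels = {
  "openai/gpt-4o": {
    id: "openai/gpt-4o",
    organization: "openai",
    name: "openai/gpt-4o",
    subtype: "chat",
  },
  "openai/whisper": {
    id: "openai/whisper",
    organization: "openai",
    name: "openai/whisper",
    subtype: "audio",
  },
};

const models = Object.values(knownModels)
  .filter(isChatSubtype)
  .map(({ id, organization, name }) => ({ id, organization, name }));

console.log(models);
// β†’ [ { id: 'openai/gpt-4o', organization: 'openai', name: 'openai/gpt-4o' } ]
```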
