OpenAI-Compatible API plugin
The OpenAI-Compatible API plugin (compat_oai) provides a unified interface for accessing multiple AI providers that implement OpenAI’s API specification. This includes OpenAI, Anthropic, and other compatible services.
Overview
The compat_oai package serves as a foundation for building plugins that work with OpenAI-compatible APIs. It includes:
- Base Implementation: Common functionality for OpenAI-compatible APIs
- OpenAI Plugin: Direct access to OpenAI’s models and embeddings
- Anthropic Plugin: Access to Claude models through OpenAI-compatible endpoints
- Extensible Framework: Build custom plugins for other compatible providers
Prerequisites
Depending on which provider you use, you’ll need:
- OpenAI: API key from OpenAI API Keys page
- Anthropic: API key from Anthropic Console
- Other providers: API keys from the respective services
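For local development, one common setup is to read these keys from environment variables rather than hard-coding them. The sketch below assumes the variable names OPENAI_API_KEY (which the OpenAI plugin also reads by default, per the configuration example below) and ANTHROPIC_API_KEY (a name chosen for this example), and wires them into the plugin structs shown in the following sections:

```go
package main

import (
  "context"
  "log"
  "os"

  "github.com/firebase/genkit/go/genkit"
  "github.com/firebase/genkit/go/plugins/compat_oai/anthropic"
  "github.com/firebase/genkit/go/plugins/compat_oai/openai"
  "github.com/openai/openai-go/option"
)

func main() {
  // Fail fast if a required key is missing.
  openaiKey := os.Getenv("OPENAI_API_KEY")
  anthropicKey := os.Getenv("ANTHROPIC_API_KEY") // variable name chosen for this example
  if openaiKey == "" || anthropicKey == "" {
    log.Fatal("set OPENAI_API_KEY and ANTHROPIC_API_KEY before running")
  }

  // Pass the keys through the plugin structs documented below.
  g, err := genkit.Init(context.Background(), genkit.WithPlugins(
    &openai.OpenAI{APIKey: openaiKey},
    &anthropic.Anthropic{Opts: []option.RequestOption{option.WithAPIKey(anthropicKey)}},
  ))
  if err != nil {
    log.Fatal(err)
  }
  _ = g // use g with genkit.Generate, ai.Embed, etc.
}
```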
OpenAI Provider
Configuration

```go
import "github.com/firebase/genkit/go/plugins/compat_oai/openai"

g, err := genkit.Init(context.Background(), genkit.WithPlugins(&openai.OpenAI{
  APIKey: "YOUR_OPENAI_API_KEY", // or set OPENAI_API_KEY env var
}))
```
Supported Models
Latest Models
- gpt-4.1 - Latest GPT-4.1 with multimodal support
- gpt-4.1-mini - Faster, cost-effective GPT-4.1 variant
- gpt-4.1-nano - Ultra-efficient GPT-4.1 variant
- gpt-4.5-preview - Preview of GPT-4.5 with advanced capabilities
Production Models
- gpt-4o - Advanced GPT-4 with vision and tool support
- gpt-4o-mini - Fast and cost-effective GPT-4o variant
- gpt-4-turbo - High-performance GPT-4 with large context window
Reasoning Models
- o3-mini - Latest compact reasoning model
- o1 - Advanced reasoning model for complex problems
- o1-mini - Compact reasoning model
- o1-preview - Preview reasoning model
Legacy Models
- gpt-4 - Original GPT-4 model
- gpt-3.5-turbo - Fast and efficient language model
Embedding Models
- text-embedding-3-large - Most capable embedding model
- text-embedding-3-small - Fast and efficient embedding model
- text-embedding-ada-002 - Legacy embedding model
OpenAI Usage Example
```go
import (
  "github.com/firebase/genkit/go/plugins/compat_oai/openai"
  "github.com/firebase/genkit/go/plugins/compat_oai"
)

// Initialize OpenAI plugin
oai := &openai.OpenAI{APIKey: "YOUR_API_KEY"}
g, err := genkit.Init(ctx, genkit.WithPlugins(oai))

// Use GPT-4o for general tasks
model := oai.Model(g, "gpt-4o")
resp, err := genkit.Generate(ctx, g,
  ai.WithModel(model),
  ai.WithPrompt("Explain quantum computing."),
)

// Use embeddings
embedder := oai.Embedder(g, "text-embedding-3-large")
embeds, err := ai.Embed(ctx, embedder, ai.WithDocs("Hello, world!"))
```
Anthropic Provider
Configuration

```go
import "github.com/firebase/genkit/go/plugins/compat_oai/anthropic"

g, err := genkit.Init(context.Background(), genkit.WithPlugins(&anthropic.Anthropic{
  Opts: []option.RequestOption{
    option.WithAPIKey("YOUR_ANTHROPIC_API_KEY"),
  },
}))
```
Supported Models
- claude-3-7-sonnet-20250219 - Latest Claude 3.7 Sonnet with advanced capabilities
- claude-3-5-haiku-20241022 - Fast and efficient Claude 3.5 Haiku
- claude-3-5-sonnet-20240620 - Balanced Claude 3.5 Sonnet
- claude-3-opus-20240229 - Most capable Claude 3 model
- claude-3-haiku-20240307 - Fastest Claude 3 model
Anthropic Usage Example
```go
import (
  "github.com/firebase/genkit/go/plugins/compat_oai/anthropic"
  "github.com/openai/openai-go/option"
)

// Initialize Anthropic plugin
claude := &anthropic.Anthropic{
  Opts: []option.RequestOption{
    option.WithAPIKey("YOUR_ANTHROPIC_API_KEY"),
  },
}
g, err := genkit.Init(ctx, genkit.WithPlugins(claude))

// Use Claude for tasks requiring reasoning
model := claude.Model(g, "claude-3-7-sonnet-20250219")
resp, err := genkit.Generate(ctx, g,
  ai.WithModel(model),
  ai.WithPrompt("Analyze this complex problem step by step."),
)
```
Using Multiple Providers
You can use both providers in the same application:
```go
import (
  "github.com/firebase/genkit/go/plugins/compat_oai/openai"
  "github.com/firebase/genkit/go/plugins/compat_oai/anthropic"
)

oai := &openai.OpenAI{APIKey: "YOUR_OPENAI_KEY"}
claude := &anthropic.Anthropic{
  Opts: []option.RequestOption{
    option.WithAPIKey("YOUR_ANTHROPIC_KEY"),
  },
}

g, err := genkit.Init(ctx, genkit.WithPlugins(oai, claude))

// Use OpenAI for embeddings and tool-heavy tasks
openaiModel := oai.Model(g, "gpt-4o")
embedder := oai.Embedder(g, "text-embedding-3-large")

// Use Anthropic for reasoning and analysis
claudeModel := claude.Model(g, "claude-3-7-sonnet-20250219")
```
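With both plugins registered, requests can be routed per task using the handles above. A brief sketch, with error handling omitted and assuming the usual resp.Text() accessor on generation responses:

```go
// Ask Claude for analysis, then embed the result with OpenAI.
analysis, _ := genkit.Generate(ctx, g,
  ai.WithModel(claudeModel),
  ai.WithPrompt("Compare these two architectures and summarize the trade-offs."),
)

embeds, _ := ai.Embed(ctx, embedder, ai.WithDocs(analysis.Text()))
_ = embeds

// openaiModel remains available for tool-heavy or multimodal requests.
_ = openaiModel
```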
Advanced Features
Tool Calling

OpenAI models support tool calling:

```go
// Define a tool
weatherTool := genkit.DefineTool(g, "get_weather", "Get current weather",
  func(ctx *ai.ToolContext, input struct{ City string }) (string, error) {
    return fmt.Sprintf("It's sunny in %s", input.City), nil
  })

// Use with GPT models (tools not supported on Claude via OpenAI API)
model := oai.Model(g, "gpt-4o")
resp, err := genkit.Generate(ctx, g,
  ai.WithModel(model),
  ai.WithPrompt("What's the weather like in San Francisco?"),
  ai.WithTools(weatherTool),
)
```
Multimodal Support
Both providers support vision capabilities:
```go
// Works with GPT-4o and Claude models
resp, err := genkit.Generate(ctx, g,
  ai.WithModel(model),
  ai.WithMessages([]*ai.Message{
    ai.NewUserMessage(
      ai.WithTextPart("What do you see in this image?"),
      ai.WithMediaPart("image/jpeg", imageData),
    ),
  }),
)
```
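The imageData value in the example above is not defined there. One way it might be constructed, assuming the media part accepts a base64 data URI string (an assumption for this sketch), is:

```go
// Build a data URI for the image; adjust the path and MIME type as needed.
// Requires the "encoding/base64", "log", and "os" imports.
raw, err := os.ReadFile("photo.jpg") // hypothetical local file
if err != nil {
  log.Fatal(err)
}
imageData := "data:image/jpeg;base64," + base64.StdEncoding.EncodeToString(raw)
_ = imageData
```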
Streaming
Both providers support streaming responses:
```go
resp, err := genkit.Generate(ctx, g,
  ai.WithModel(model),
  ai.WithPrompt("Write a long explanation."),
  ai.WithStreaming(func(ctx context.Context, chunk *ai.ModelResponseChunk) error {
    for _, content := range chunk.Content {
      fmt.Print(content.Text)
    }
    return nil
  }),
)
```
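The callback receives chunks as they arrive, and Generate still returns the final aggregated response, so the full text can also be read afterwards (assuming the usual resp.Text() accessor):

```go
// After streaming completes, the aggregated response is still available.
fmt.Println()
fmt.Println(resp.Text())
```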
Configuration
Common Configuration

Both providers support OpenAI-compatible configuration:

```go
import "github.com/firebase/genkit/go/plugins/compat_oai"

config := &compat_oai.OpenAIConfig{
  Temperature:     0.7,
  MaxOutputTokens: 1000,
  TopP:            0.9,
  StopSequences:   []string{"END"},
}

resp, err := genkit.Generate(ctx, g,
  ai.WithModel(model),
  ai.WithPrompt("Your prompt here"),
  ai.WithConfig(config),
)
```
Advanced Options
```go
import "github.com/openai/openai-go/option"

// Custom base URL for OpenAI-compatible services
opts := []option.RequestOption{
  option.WithAPIKey("YOUR_API_KEY"),
  option.WithBaseURL("https://your-custom-endpoint.com/v1"),
  option.WithOrganization("your-org-id"),
  option.WithHeader("Custom-Header", "value"),
}
```
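These options take effect when passed to a plugin at initialization. A minimal sketch is shown below; note that the Opts field on openai.OpenAI is an assumption here, mirrored from the Anthropic plugin's documented Opts field:

```go
// Assumption: openai.OpenAI accepts an Opts slice, like anthropic.Anthropic above.
oai := &openai.OpenAI{Opts: opts}
g, err := genkit.Init(ctx, genkit.WithPlugins(oai))
if err != nil {
  log.Fatal(err)
}

// Model names depend on what the custom endpoint serves.
model := oai.Model(g, "gpt-4o")
_ = model
```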