
Conversation

@g-hano (Contributor) commented Feb 9, 2025

Overview

This PR adds model_kwargs support, enhancing LLM parameter customization. Fixes #12

It also adds Ollama support to ReAG by extending the ReagClient class for local model inference and improving response parsing.

Changes Made

  1. Enhanced ReagClient initialization:
    • Added a model_kwargs parameter for fine-grained LLM control

Environment

  • Python: 3.11.3
  • ReAG version: 0.0.4

Implementation Details

The implementation adds support for:

  • Custom model parameters via model_kwargs (temperature, top_p, etc.), as shown in the sketch below
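
For instance, sampling parameters can be passed alongside connection settings. A minimal sketch, assuming the backing model honors temperature and top_p (the values below are illustrative, not library defaults):

import asyncio

from reag.client import ReagClient


async def main():
    # model_kwargs are forwarded with the request; temperature/top_p
    # below are illustrative values, not defaults shipped by ReAG
    async with ReagClient(
        model="ollama/deepseek-r1:7b",
        model_kwargs={
            "api_base": "http://localhost:11434",  # local Ollama endpoint
            "temperature": 0.2,
            "top_p": 0.9,
        },
    ) as client:
        ...  # query documents as in the Example Usage below


asyncio.run(main())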

Example Usage

import asyncio

from reag.client import ReagClient, Document


async def main():
    async with ReagClient(
        model="ollama/deepseek-r1:7b",
        model_kwargs={"api_base": "http://localhost:11434"},
    ) as client:
        docs = [
            Document(
                name="Superagent",
                content="Superagent is a workspace for AI-agents that learn, perform work, and collaborate.",
                metadata={
                    "url": "https://superagent.sh",
                    "source": "web",
                },
            ),
        ]
        response = await client.query("What is Superagent?", documents=docs)
        print(response)


asyncio.run(main())
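
Note: running this example assumes a local Ollama server with the deepseek-r1:7b model already pulled (e.g., via ollama pull deepseek-r1:7b); Ollama listens on http://localhost:11434 by default.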

@homanp (Collaborator) commented Feb 9, 2025

@g-hano thanks for this! Approved, will merge and publish shortly

@homanp merged commit b3e1f90 into superagent-ai:main on Feb 9, 2025
@homanp (Collaborator) commented Feb 9, 2025

Published in v0.0.5.

@homanp added the enhancement label on Feb 9, 2025