
Conversation

@shatfield4
Collaborator

Pull Request Type

  • ✨ feat
  • 🐛 fix
  • ♻️ refactor
  • 💄 style
  • 🔨 chore
  • 📝 docs

Relevant Issues

resolves #1342

What is in this change?

  • Adds a max tokens field to the generic OpenAI LLM connector, allowing more control over response length
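
A minimal sketch of what such a change typically looks like (the function and option names here are illustrative, not the PR's actual code): when building the request body for an OpenAI-compatible `/chat/completions` call, the connector includes `max_tokens` only when the user has configured a valid value, otherwise letting the server apply its own default.

```javascript
// Hypothetical sketch -- builds the request payload for an
// OpenAI-compatible chat completion endpoint, attaching max_tokens
// only when the configured value parses to a positive integer.
function buildChatPayload({ model, messages, maxTokens }) {
  const payload = { model, messages };
  const parsed = Number.parseInt(maxTokens, 10);
  if (Number.isInteger(parsed) && parsed > 0) {
    payload.max_tokens = parsed;
  }
  return payload;
}

// Example: a configured limit is passed through...
const withLimit = buildChatPayload({
  model: "my-model",
  messages: [{ role: "user", content: "Hello" }],
  maxTokens: "1024",
});

// ...while an unset or invalid value omits the field entirely.
const withoutLimit = buildChatPayload({
  model: "my-model",
  messages: [{ role: "user", content: "Hello" }],
});
```

Omitting the field when it is unset (rather than sending `null` or `0`) matters because some OpenAI-compatible servers reject non-integer or zero values for `max_tokens`.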

Additional Information

Developer Validations

  • I ran yarn lint from the root of the repo & committed changes
  • Relevant documentation has been updated
  • I have tested my code functionality
  • Docker build succeeds locally

@shatfield4 shatfield4 linked an issue May 10, 2024 that may be closed by this pull request
@shatfield4 shatfield4 added the PR:needs review Needs review by core team label May 10, 2024
Member

@timothycarambat timothycarambat left a comment


Everything else looks good; let's just get this into the agent integration as well.

@timothycarambat timothycarambat removed the PR:needs review Needs review by core team label May 10, 2024
@timothycarambat timothycarambat merged commit 0a6a9e4 into master May 10, 2024
@timothycarambat timothycarambat deleted the 1342-feat-be-able-to-set-max-tokens-generated-in-llm-setup branch May 10, 2024 21:49
CrackerCat pushed a commit to CrackerCat/anything-llm that referenced this pull request Jul 31, 2024
…Labs#1345)

* add max tokens field to generic openai llm connector

* add max_tokens property to generic openai agent provider
cabwds pushed a commit to cabwds/anything-llm that referenced this pull request Jul 3, 2025
…Labs#1345)

* add max tokens field to generic openai llm connector

* add max_tokens property to generic openai agent provider


Development

Successfully merging this pull request may close these issues.

[FEAT]: Be able to set max tokens generated in LLM setup

3 participants