
Conversation

@elisalimli (Contributor) commented Mar 13, 2024

Summary

This PR allows you to set LLM parameters (max_tokens, temperature), both for Superagent and OSS LLMs.
Fixes #890
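For illustration, here is a minimal sketch of what a SAML workflow config using these parameters might look like. The schema and field names (`workflows`, `superagent`, `params`) are assumptions inferred from the PR description and branch name, not confirmed syntax; see the gist linked later in this thread for the author's actual example.

```yaml
# Hypothetical SAML workflow config; field names are illustrative
# assumptions and have not been verified against the merged schema.
workflows:
  - superagent:
      name: My Assistant
      llm: gpt-3.5-turbo
      prompt: You are a helpful assistant.
      params:
        temperature: 0.2   # lower values produce more deterministic output
        max_tokens: 256    # cap on tokens generated per response
```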

Depends on

Test plan


@vercel bot commented Mar 13, 2024

superagent-ui: ✅ Ready (preview updated Mar 13, 2024 4:20pm UTC)

@elisalimli elisalimli requested a review from homanp March 13, 2024 16:19
@elisalimli elisalimli marked this pull request as ready for review March 13, 2024 16:19
@homanp homanp self-assigned this Mar 13, 2024
@homanp homanp added the enhancement New feature or request label Mar 13, 2024
@homanp (Collaborator) commented Mar 13, 2024

@elisalimli can you provide an example on how to use this?

@elisalimli (Contributor, Author) commented Mar 13, 2024

> @elisalimli can you provide an example on how to use this?

@homanp I have updated it. Is there anything that prevents us from merging this PR?

https://gist.github.com/homanp/5437e5c2e99a302f023f203d139fb3b3?permalink_comment_id=4986256#gistcomment-4986256

@homanp homanp merged commit ef5c0f1 into main Mar 17, 2024
@elisalimli elisalimli deleted the feat/saml-llm-params branch March 17, 2024 10:04

Labels

enhancement (New feature or request)

Development

Successfully merging this pull request may close these issues: SAML - Setting LLM parameters.

3 participants