
Conversation

@mdwelsh (Contributor) commented Oct 15, 2024

This is a crude but serviceable LLM client that uses Bedrock. I have currently tested it only with Sonnet 3.5; different models may have different requirements and parameters, but we can add those over time as needed.

Since there are so many different models with somewhat different formats, it's a little hard to test them all.
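The per-model format differences mentioned above are mostly about the request body. A minimal sketch of what the Anthropic-family body looks like on Bedrock (the model ID constant and helper name here are illustrative, not the actual client's API; the real call would go through boto3's `bedrock-runtime` client via `invoke_model(modelId=..., body=...)`):

```python
import json

# Illustrative model ID for Claude 3.5 Sonnet on Bedrock.
CLAUDE_SONNET_3_5 = "anthropic.claude-3-5-sonnet-20240620-v1:0"


def build_anthropic_body(prompt: str, max_tokens: int = 1024) -> str:
    # Anthropic models on Bedrock expect the Messages API body format.
    # Other model families (Titan, Llama, Mistral, ...) use different
    # shapes, which is what makes testing every model tedious.
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(body)
```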

I lifted the cache logic into the LLM base class so it is reusable across subclasses, although the OpenAI variant needs to account for the response type in the cache key.
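The cache-lifting idea can be sketched roughly as follows. All names here are hypothetical (this is not the actual class hierarchy in the PR): the base class owns the key derivation and lookup, keyed on the prompt plus model parameters, and the OpenAI-style subclass overrides key derivation to also fold the response type in, since the same prompt with a different response format must not hit the same cache entry:

```python
import hashlib
import json


class LLM:
    """Hypothetical base class owning the cache logic."""

    def __init__(self, model_name: str):
        self.model_name = model_name
        self._cache: dict[str, str] = {}

    def _cache_key(self, prompt: str, **kwargs) -> str:
        # Serialize deterministically so equivalent calls map to one entry.
        payload = json.dumps(
            {"model": self.model_name, "prompt": prompt, "kwargs": kwargs},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

    def generate(self, prompt: str, **kwargs) -> str:
        key = self._cache_key(prompt, **kwargs)
        if key not in self._cache:
            self._cache[key] = self._call_model(prompt, **kwargs)
        return self._cache[key]

    def _call_model(self, prompt: str, **kwargs) -> str:
        raise NotImplementedError


class OpenAILike(LLM):
    """Hypothetical subclass: response type must be part of the key."""

    def _cache_key(self, prompt: str, response_format=None, **kwargs) -> str:
        return super()._cache_key(
            prompt, response_format=str(response_format), **kwargs
        )

    def _call_model(self, prompt: str, **kwargs) -> str:
        # Stub standing in for a real API call.
        return f"openai-response({prompt})"
```

Subclasses only implement `_call_model` (and override `_cache_key` if extra state affects the output), so the caching itself stays in one place.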

@baitsguy (Contributor) left a comment

Can you add or extend an integration test that uses this, for some level of test coverage?

@mdwelsh (Contributor, Author) commented Oct 16, 2024

Thanks! Added integration test and addressed other comments. PTAL.

@mdwelsh mdwelsh requested a review from baitsguy October 16, 2024 05:33
@baitsguy (Contributor) commented:
Changes look good, open_ai integ test is failing tho

@mdwelsh mdwelsh requested a review from baitsguy October 16, 2024 17:45
@mdwelsh mdwelsh merged commit e15529a into main Oct 16, 2024
10 of 11 checks passed
@HenryL27 HenryL27 deleted the matt/bedrock-client branch August 30, 2025 00:03