Conversation

HenryL27 (Collaborator) commented:

Implements async mode for llm_map and llm_map_elements.
Also adds asynchronous LLM implementations for Anthropic and Gemini (+ exponential backoff for OpenAI).
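
Not code from this PR, just a minimal sketch of the shape being described: an async OpenAI chat call with exponential backoff, fanned out with bounded concurrency the way an async `llm_map` might. The model name, retry policy, and helper names are all my own assumptions.

```python
import asyncio
import random

import openai
from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment


async def generate_with_backoff(messages: list[dict], model: str = "gpt-4o-mini",
                                max_retries: int = 6) -> str:
    # Retry rate-limit / connection errors with exponential backoff plus jitter.
    delay = 1.0
    for attempt in range(max_retries):
        try:
            resp = await client.chat.completions.create(model=model, messages=messages)
            return resp.choices[0].message.content
        except (openai.RateLimitError, openai.APIConnectionError):
            if attempt == max_retries - 1:
                raise
            await asyncio.sleep(delay + random.uniform(0, delay))
            delay *= 2


async def map_async(all_messages: list[list[dict]], max_concurrency: int = 8) -> list[str]:
    # Fan the calls out concurrently, bounded so a large batch doesn't
    # immediately trip the rate limiter anyway.
    sem = asyncio.Semaphore(max_concurrency)

    async def one(messages: list[dict]) -> str:
        async with sem:
            return await generate_with_backoff(messages)

    return await asyncio.gather(*(one(m) for m in all_messages))
```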

I left out Bedrock because Bedrock async looks like "start job, poll get-job until the job is complete, then download the result from S3 (oh, by the way, tell me where in S3 to put the result)" and I didn't want to deal with that. (There might be better ways: there's a converse stream client method that can give you an async iterator over the response, which might do the trick, but that's a whole 'nother API to deal with.)
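
For context (again, not part of this PR): the flow described above roughly matches Bedrock's batch "model invocation job" API in boto3. A rough sketch, with the bucket paths, role ARN, job name, and model id all invented:

```python
import time

import boto3

bedrock = boto3.client("bedrock")

# Stage input in S3 ahead of time, and tell Bedrock where in S3 to put the
# result -- exactly the dance described above. All identifiers here are
# placeholders.
job = bedrock.create_model_invocation_job(
    jobName="llm-map-batch",
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
    roleArn="arn:aws:iam::123456789012:role/bedrock-batch",
    inputDataConfig={"s3InputDataConfig": {"s3Uri": "s3://my-bucket/input/"}},
    outputDataConfig={"s3OutputDataConfig": {"s3Uri": "s3://my-bucket/output/"}},
)

# Poll until the job reaches a terminal state (status names per the boto3 docs).
while True:
    status = bedrock.get_model_invocation_job(jobIdentifier=job["jobArn"])["status"]
    if status in ("Completed", "PartiallyCompleted", "Failed", "Stopped", "Expired"):
        break
    time.sleep(30)

# ...then fetch the JSONL results yourself from s3://my-bucket/output/.
```

(`converse_stream` on the bedrock-runtime client, by contrast, streams a single response back as an event stream, which is the "whole 'nother API" trade-off mentioned above.)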

Commits:

…i llm impl
Signed-off-by: Henry Lindeman <hmlindeman@yahoo.com>
Signed-off-by: Henry Lindeman <hmlindeman@yahoo.com>

…rs in testing so I think the client handles that automatically
Signed-off-by: Henry Lindeman <hmlindeman@yahoo.com>
HenryL27 requested a review from bsowell on February 26, 2025 at 20:58.
bsowell (Contributor) left a comment:

As far as I can tell this looks fine. Is there a useful test we can write?

Signed-off-by: Henry Lindeman <hmlindeman@yahoo.com>
HenryL27 merged commit f4f3032 into main on Feb 27, 2025.
12 of 15 checks passed.
HenryL27 deleted the hml-llm-async branch on February 27, 2025 at 17:19.