Fix for streaming uploads #271
base: master
Conversation
Pull Request Overview
Adds support for streaming async-iterable request bodies and a test to validate error propagation when a generator raises.
- Introduce logic in `_request_mock` to consume async iterables in `data` and replace them with aggregated bytes.
- Add `test_async_generator_body_exception` to ensure a generator exception bubbles up.
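The aggregation logic described above might look roughly like the following. This is a sketch only: the real `_request_mock` in `aioresponses/core.py` has more context and parameters than shown here, and `consume_async_iterable` is a hypothetical helper name.

```python
import asyncio


async def consume_async_iterable(kwargs):
    """Sketch of the PR's approach: if `data` is an async iterable,
    consume it and replace it with the aggregated bytes."""
    data = kwargs.get("data")
    if hasattr(data, "__aiter__"):
        body_bytes = b""
        async for chunk in data:
            body_bytes += chunk  # the review below suggests list + join instead
        kwargs["data"] = body_bytes
    return kwargs


async def demo():
    async def gen():
        yield b"stream"
        yield b"ed"
    return await consume_async_iterable({"data": gen()})


print(asyncio.run(demo()))  # → {'data': b'streamed'}
```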
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| tests/test_async_generator_body.py | New async test asserting generator errors are propagated |
| aioresponses/core.py | Handle async-iterable data by iterating chunks into bytes |
Comments suppressed due to low confidence (1)
tests/test_async_generator_body.py:12
- [nitpick] Consider adding a complementary test case for successful streaming (no errors) to verify that chunks from an async generator are properly aggregated and sent as the request body.
```python
async def data_generator():
```
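The suggested success-path test might be sketched as below. This is a hedged sketch: a real test would exercise the request through `aioresponses`, but here a stub transport stands in for the mocked request path so the example stays self-contained.

```python
import unittest


class StreamingSuccessTest(unittest.IsolatedAsyncioTestCase):
    """Sketch of the suggested complementary test: chunks from an async
    generator are aggregated and used as the request body (transport
    stubbed; `fake_send` is a stand-in, not an aioresponses API)."""

    async def test_chunks_are_aggregated(self):
        sent = {}

        async def fake_send(data):
            # Stand-in for the consumption logic in aioresponses/core.py.
            chunks = []
            async for chunk in data:
                chunks.append(chunk)
            sent["body"] = b"".join(chunks)

        async def data_generator():
            yield b"part1-"
            yield b"part2"

        await fake_send(data_generator())
        self.assertEqual(sent["body"], b"part1-part2")


if __name__ == "__main__":
    unittest.main()
```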
```python
try:
    body_bytes = b""
    async for chunk in data:
        body_bytes += chunk
```
Copilot AI · Jun 10, 2025
Accumulating bytes by repeated concatenation can lead to quadratic copying cost for large streams; consider appending chunks to a list and doing `b''.join(chunks)` after iteration.
```python
try:
    body_bytes = b""
    async for chunk in data:
        body_bytes += chunk
    kwargs['data'] = body_bytes
except Exception:
    raise
```
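The list-and-join approach the review suggests could be sketched as follows. The `data` and `kwargs` names follow the snippet above; the surrounding `_request_mock` code is omitted, and `aggregate_chunks` is a hypothetical helper name.

```python
import asyncio


async def aggregate_chunks(data):
    """Collect chunks in a list and join once, avoiding the quadratic
    copying cost of repeated bytes concatenation on large streams."""
    chunks = []
    async for chunk in data:
        chunks.append(chunk)
    return b"".join(chunks)


async def demo():
    async def gen():
        yield b"chunk-1 "
        yield b"chunk-2"
    kwargs = {"data": gen()}
    kwargs["data"] = await aggregate_chunks(kwargs["data"])
    return kwargs["data"]


print(asyncio.run(demo()))  # → b'chunk-1 chunk-2'
```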
Copilot AI · Jun 10, 2025
The try/except that immediately re-raises the caught exception is redundant; you can remove the try/except block and let exceptions propagate naturally to simplify the code.
```diff
-try:
-    body_bytes = b""
-    async for chunk in data:
-        body_bytes += chunk
-    kwargs['data'] = body_bytes
-except Exception:
-    raise
+body_bytes = b""
+async for chunk in data:
+    body_bytes += chunk
+kwargs['data'] = body_bytes
```
fix for #270