
Conversation

@Lucaskabela (Contributor) commented Mar 26, 2025

Description

Add an LLM collector, a subset of SyncDataCollector trimmed down to core functionality in an easy-to-consume API for LLM developers.
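
For illustration, here is a rough sketch of how such a collector might be consumed. The TorchRL import paths and constructor arguments below (LLMCollector, ChatEnv, vLLMWrapper, dialog_turns_per_batch, total_dialog_turns) are assumptions made for this sketch, not the confirmed API added by this PR:

```python
# Hypothetical usage sketch -- the TorchRL import paths and argument names
# below are assumptions, not the confirmed API introduced by this PR.
from transformers import AutoTokenizer
from vllm import LLM

from torchrl.collectors.llm import LLMCollector  # assumed import path
from torchrl.envs.llm import ChatEnv             # assumed environment class
from torchrl.modules.llm import vLLMWrapper      # wrapper mentioned later in this thread

model_name = "Qwen/Qwen2.5-0.5B-Instruct"
model = LLM(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

env = ChatEnv(tokenizer=tokenizer)  # assumed constructor
policy = vLLMWrapper(model)         # assumed constructor

collector = LLMCollector(
    env,
    policy=policy,
    dialog_turns_per_batch=16,  # assumed name: rollouts returned per iteration
    total_dialog_turns=1024,    # assumed name: total collection budget
)

for data in collector:  # iterate just like a SyncDataCollector
    ...                 # hand `data` to the training loop
collector.shutdown()
```

The intent mirrors SyncDataCollector: construct once, iterate to receive batches of dialog turns, and shut down when done.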

Motivation and Context

#2872

Types of changes

What types of changes does your code introduce? Remove all that do not apply:

  • New feature (non-breaking change which adds core functionality)

Checklist

Go over all the following points, and put an x in all the boxes that apply.
If you are unsure about any of these, don't hesitate to ask. We are here to help!

  • I have read the CONTRIBUTION guide (required)
  • My change requires a change to the documentation.
  • I have updated the tests accordingly (required for a bug fix or a new feature).
  • I have updated the documentation accordingly.

Testing

python test/test_collector.py TestLLMCollector

pytorch-bot (bot) commented Mar 26, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/rl/2879

Note: Links to docs will display an error until the docs builds have been completed.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot facebook-github-bot added the CLA Signed label Mar 26, 2025
@Lucaskabela Lucaskabela requested a review from vmoens March 26, 2025 21:10
@Lucaskabela Lucaskabela changed the title Initial collector [DRAFT] Initial LLM collector Mar 26, 2025
@Lucaskabela Lucaskabela changed the title [DRAFT] Initial LLM collector Initial LLM collector Mar 31, 2025
@Lucaskabela (Contributor, Author) commented:

Originally, this code ran into the error:

Traceback (most recent call last):
  File "/home/lucaskabela/.conda/envs/cabernet/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/home/lucaskabela/.conda/envs/cabernet/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/home/lucaskabela/.conda/envs/cabernet/lib/python3.10/site-packages/vllm/v1/engine/core.py", line 429, in process_input_socket
    request = decoder.decode(data_frame.buffer)
  File "/home/lucaskabela/.conda/envs/cabernet/lib/python3.10/site-packages/vllm/v1/serial_utils.py", line 34, in decode
    return self.decoder.decode(obj)
msgspec.ValidationError: Expected `int | null`, got `bool` - at `$[6].logprobs`

This seems to have come from a versioning issue; downgrading torch and vllm with

pip install --force torch==2.5.1 vllm==0.7.3

then

python setup.py clean && python setup.py develop

did the trick and fixed the error :) Previously, I had torch==2.6.0 and vllm==0.8.2, so note that those versions may have caused this error.
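
For anyone who wants to catch this before kicking off a run, a small version check along these lines could help; this is only a sketch built around the exact versions reported above:

```python
# Sketch: warn when the installed torch/vllm versions match the ones that
# produced the msgspec.ValidationError above.
from importlib.metadata import PackageNotFoundError, version

BAD = {"torch": "2.6.0", "vllm": "0.8.2"}   # versions that hit the error
GOOD = {"torch": "2.5.1", "vllm": "0.7.3"}  # versions that fixed it

for pkg, bad in BAD.items():
    try:
        installed = version(pkg)
    except PackageNotFoundError:
        continue  # package not installed, nothing to check
    # startswith so local suffixes such as "+cu124" still match
    if installed.startswith(bad):
        print(
            f"{pkg}=={installed} matches the version that hit the error above; "
            f"consider pinning {pkg}=={GOOD[pkg]} instead."
        )
```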

@vmoens vmoens force-pushed the lucaskabela/llm_dedicated_collector branch 2 times, most recently from b133f71 to e08e2a6 Compare April 2, 2025 13:36
@vmoens (Collaborator) commented Apr 2, 2025

Amazing!
I did a bit of refactoring.
Things that are missing:

  • Tests in test/test_collectors.py. These should include the vLLMWrapper and the TransformersWrapper
  • docstrings + register in the docs directory
  • ultimately remove the if __name__ == "__main__": from the file
  • use trust_policy=True as not doing so can trigger checks that we want to avoid
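
As a rough illustration of the last point, passing trust_policy=True at construction time could look like the snippet below. Apart from trust_policy itself, every argument here is an assumption carried over from the hypothetical usage sketch in the description, not a confirmed signature:

```python
# Sketch only: `trust_policy=True` is the flag requested above; `env`, `policy`
# and the remaining arguments are assumptions reused from the earlier sketch.
collector = LLMCollector(
    env,
    policy=policy,
    dialog_turns_per_batch=16,
    total_dialog_turns=1024,
    trust_policy=True,  # skip the policy checks mentioned above
)
```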

@vmoens vmoens added the enhancement label Apr 2, 2025
@Lucaskabela (Contributor, Author) commented:

Amazing! I did a bit of refactoring. Things that are missing:

  • Tests in test/test_collectors.py. These should include the vLLMWrapper and the TransformersWrapper
  • docstrings + register in the docs directory
  • ultimately remove the if __name__ == "__main__": from the file
  • use trust_policy=True as not doing so can trigger checks that we want to avoid

Thanks for the comments here; I think I incorporated the majority of this feedback and updated the test on my end. The only thing I am not sure about is how to register the class in the docs directory. Could you provide more details on this?

@vmoens (Collaborator) left a comment:

Good progress, thanks for working on that!
You can write the class there: https://github.com/pytorch/rl/blob/main/docs/source/reference/collectors.rst

I would add a section about LLM collectors; that'll be useful in the future.

@Lucaskabela Lucaskabela force-pushed the lucaskabela/llm_dedicated_collector branch from 28c5d5c to ad98710 Compare April 3, 2025 17:45
@vmoens vmoens changed the title Initial LLM collector [Feature] LLM collector Apr 4, 2025
@vmoens vmoens force-pushed the lucaskabela/llm_dedicated_collector branch from 7ea1c0b to 4be5048 Compare April 4, 2025 10:53
@vmoens (Collaborator) left a comment:

LGTM thanks!

@vmoens vmoens force-pushed the lucaskabela/llm_dedicated_collector branch 2 times, most recently from bcb4535 to f0f48be Compare April 4, 2025 12:01
@vmoens vmoens force-pushed the lucaskabela/llm_dedicated_collector branch from f0f48be to dfe6410 Compare April 4, 2025 12:02
@vmoens vmoens merged commit 4135a83 into main Apr 4, 2025
5 checks passed
@vmoens vmoens deleted the lucaskabela/llm_dedicated_collector branch May 14, 2025 09:11