DOC: Explain how to use multiple adapters at the same time #2763
Conversation
Explain how to use multiple adapters (e.g. 2 LoRA adapters) at the same time, as the API is not quite intuitive and there are some footguns around trainable parameters. This question has come up multiple times in the past (for recent examples, check huggingface#2749 and huggingface#2756). Thus it's a good idea to properly document this.
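For context, the pattern the new docs describe looks roughly like this. This is a minimal sketch, not the PR's actual text: the base checkpoint and adapter paths are placeholders, only the adapter names and the `model.base_model.set_adapter(["default", "other"])` call come from the diff discussed below, and the comments reflect the footgun described above (assuming recent PEFT behavior, where `PeftModel.set_adapter` expects a single adapter name).

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Placeholder base model; any transformers model works the same way.
base = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

# The first adapter is registered under the name "default".
model = PeftModel.from_pretrained(base, "path/to/adapter_0", adapter_name="default")
# Further adapters are attached to the same model with load_adapter.
model.load_adapter("path/to/adapter_1", adapter_name="other")

# The non-obvious part: activating several adapters at once goes through
# model.base_model, since PeftModel.set_adapter takes a single name.
model.base_model.set_adapter(["default", "other"])

# Footgun around trainable parameters: set_adapter on the base model also
# marks the activated adapters as trainable. For inference-only use,
# switch gradients back off.
model.requires_grad_(False)
```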
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
stevhliu left a comment:
Thanks for adding new docs!
> ## Adapter handling
>
> ### Using multiple adapters at the same time
Suggested change:

> ## Adapter handling
>
> ~~### Using multiple adapters at the same time~~
> ## Using multiple adapters at the same time
It's supposed to be a subsection of "Adapter handling", so ###.
> model.base_model.set_adapter(["default", "other"])
> ```
>
> <Tip>
Minor nit, but it may be nice to reduce the number of lines by using this instead of `<Tip> ... </Tip>`:

> [!TIP]
> ...
Changed.
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
BenjaminBossan left a comment:
Thanks for the review @stevhliu