add xpu support for boft/controlnet example #2674
Conversation
@yao-matrix Is this something that you could check?

Sure, Kaixuan is from my team (the Intel HuggingFace engineering team); we are now executing the plan we discussed with you before, thank you very much.
```python
# Determine the best available device
if torch.xpu.is_available():
    device = "xpu:0"
```
@kaixuanliu, we can set the device to "xpu" here in the XPU case before we enable XPU support in face_alignment, and leave a TODO comment here.
Done
@kaixuanliu Sorry, I meant "cpu", to make sure this example can work even before face_alignment XPU support is enabled. And please mark the comment as resolved once you have fixed the comments, thanks.
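The resolved device-selection logic can be sketched as a small helper. `pick_device` and its boolean parameters are hypothetical names used here to keep the sketch free of a hard torch dependency; in the example itself the flags would come from `torch.cuda.is_available()` / `torch.xpu.is_available()`:

```python
def pick_device(cuda_available: bool, xpu_available: bool) -> str:
    """Sketch of the fallback order discussed above (hypothetical
    helper, not code from the PR): prefer CUDA, and map XPU to
    "cpu" until face_alignment gains XPU support."""
    if cuda_available:
        return "cuda:0"
    if xpu_available:
        # TODO: return "xpu:0" once face_alignment supports XPU
        return "cpu"
    return "cpu"
```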
```diff
 diffusers==0.17.1
-transformers=>4.48.0
+transformers==4.48.0
 accelerate==0.25.0
```
Could we use the latest transformers? 4.48 seems a bit outdated.
Done, updated the transformers/diffusers versions to the latest.
```python
if torch.xpu.is_available():
    torch.xpu.empty_cache()
    self.begin = torch.xpu.memory_allocated()
    self.xpu_peak_start = torch.xpu.max_memory_allocated()
```
Why does XPU need an extra xpu_peak_start here?
deleted
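With the redundant peak variable removed, the tracker logic can be sketched backend-agnostically. The injected callables below are an assumption for illustration: they stand in for `torch.xpu.empty_cache` / `torch.xpu.memory_allocated` / `torch.xpu.max_memory_allocated` (or their CUDA counterparts), and `MemoryTracker` is a hypothetical name, not the example's real class:

```python
class MemoryTracker:
    """Backend-agnostic sketch: measure memory used between start()
    and stop() using the backend's own peak-tracking counter."""

    def __init__(self, empty_cache, memory_allocated, max_memory_allocated):
        self._empty_cache = empty_cache
        self._memory_allocated = memory_allocated
        self._max_memory_allocated = max_memory_allocated
        self.begin = None
        self.peak = None

    def start(self):
        self._empty_cache()
        self.begin = self._memory_allocated()

    def stop(self):
        # max_memory_allocated already tracks the peak, so no extra
        # "peak_start" bookkeeping is needed, as the review concluded.
        self.peak = self._max_memory_allocated()
        return self.peak - self.begin


# Fake backend for demonstration only
state = {"allocated": 100, "peak": 250}
tracker = MemoryTracker(lambda: None, lambda: state["allocated"], lambda: state["peak"])
tracker.start()
used = tracker.stop()
```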
Ah I see, thanks for the info and for the review.
Signed-off-by: Liu, Kaixuan <kaixuan.liu@intel.com>
@yao-matrix @BenjaminBossan I have updated the code following the review comments; please review again.
Thanks, the PR looks good. Just a small issue with wording. Also, please run `make style` to ensure that the linter is happy.
Co-authored-by: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>
```python
if args.enable_xformers_memory_efficient_attention:
    if is_xformers_available():
        if accelerator.device.type == "xpu":
            logger.warning("XPU doesn't support xformers yet, xformers is not applied.")
```
how about we align to "XPU hasn't support xformers yet, ignore it."?
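The guard being discussed can be sketched as a standalone helper. `maybe_enable_xformers` and its parameters are hypothetical names for illustration; in the example the values would come from `args.enable_xformers_memory_efficient_attention`, `is_xformers_available()`, and `accelerator.device.type`:

```python
import logging

logger = logging.getLogger(__name__)


def maybe_enable_xformers(enable_flag: bool, xformers_available: bool, device_type: str) -> bool:
    """Sketch of the guard above: honour the xformers flag everywhere
    except on XPU, where xformers is not yet supported; return whether
    xformers should actually be enabled."""
    if not enable_flag:
        return False
    if not xformers_available:
        raise ValueError("xformers is not available; make sure it is installed correctly")
    if device_type == "xpu":
        logger.warning("XPU doesn't support xformers yet, xformers is not applied.")
        return False
    return True
```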
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
BenjaminBossan left a comment
Thanks for updating this example, LGTM.
I canceled the CI test runs, as they don't cover examples.
Add support for the Intel XPU platform in the boft_controlnet example.