
Conversation

@kaixuanliu (Contributor)

Add support for Intel XPU platform in boft_controlnet example

@BenjaminBossan (Member)

@yao-matrix Is this something that you could check?

kaixuanliu changed the title from "add xpu support for this example" to "add xpu support for boft/controlnet example" on Jul 28, 2025

@yao-matrix (Contributor)

> @yao-matrix Is this something that you could check?

Sure. Kaixuan is from my team (the Intel HuggingFace engineering team); we are now executing the plan we discussed with you before. Thanks very much.


# Determine the best available device
if torch.xpu.is_available():
    device = "xpu:0"

Contributor:

@kaixuanliu, we can set device to "xpu" here for the XPU case until we enable XPU support in face_alignment, and leave a TODO comment here.

Contributor Author:

Done

Contributor:

@kaixuanliu Sorry, I meant "cpu", to make sure this example can work even before face_alignment XPU support is enabled. And please mark the comment as resolved once you have fixed it, thanks.
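
For context, a minimal sketch of the resolved device selection (the CUDA/CPU fallback branches here are assumed, not quoted from the PR's diff):

```python
import torch

# Determine the best available device
if torch.xpu.is_available():
    # TODO: use "xpu:0" once face_alignment supports XPU
    device = "cpu"
elif torch.cuda.is_available():
    device = "cuda:0"
else:
    device = "cpu"
```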

diffusers==0.17.1
transformers=>4.48.0
accelerate==0.25.0
transformers==4.48.0

Contributor:

Could we use the latest transformers? 4.48 seems a bit outdated.

Contributor Author:

Done, updated the transformers/diffusers versions to the latest.

if torch.xpu.is_available():
    torch.xpu.empty_cache()
    self.begin = torch.xpu.memory_allocated()
    self.xpu_peak_start = torch.xpu.max_memory_allocated()

Contributor:

Why does XPU need an extra xpu_peak_start here?

Contributor Author:

Deleted.
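
For context, the example measures memory with a TorchTracemalloc-style context manager; after this change the XPU path mirrors the CUDA path, with no extra peak variable. A minimal device-agnostic sketch (assumed structure, requires a recent PyTorch with the torch.xpu memory APIs; not the PR's exact code):

```python
import gc

import torch


class TorchTracemalloc:
    """Sketch: track allocated and peak device memory around a block."""

    def __enter__(self):
        gc.collect()
        if torch.cuda.is_available():
            torch.cuda.empty_cache()
            torch.cuda.reset_peak_memory_stats()  # reset the peak gauge
            self.begin = torch.cuda.memory_allocated()
        elif torch.xpu.is_available():
            torch.xpu.empty_cache()
            torch.xpu.reset_peak_memory_stats()
            self.begin = torch.xpu.memory_allocated()
        else:
            self.begin = 0
        return self

    def __exit__(self, *exc):
        gc.collect()
        if torch.cuda.is_available():
            self.used = torch.cuda.memory_allocated() - self.begin
            self.peaked = torch.cuda.max_memory_allocated() - self.begin
        elif torch.xpu.is_available():
            self.used = torch.xpu.memory_allocated() - self.begin
            self.peaked = torch.xpu.max_memory_allocated() - self.begin
        else:
            self.used = self.peaked = 0
```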

@BenjaminBossan (Member)

> Kaixuan is from my team (Intel HuggingFace engineering team)

Ah I see, thanks for the info and for the review.

kaixuanliu marked this pull request as ready for review on Jul 29, 2025 at 14:14

@kaixuanliu (Contributor Author)

@yao-matrix @BenjaminBossan I have updated the code following the review comments; please review again.

BenjaminBossan (Member) left a review:

Thanks, the PR looks good. Just a small issue with wording. Also, please run make style to ensure that the linter is happy.

Co-authored-by: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>

if args.enable_xformers_memory_efficient_attention:
    if is_xformers_available():
        if accelerator.device.type == "xpu":
            logger.warning("XPU doesn't support xformers yet, xformers is not applied.")

Contributor:

How about we align this to "XPU doesn't support xformers yet, ignore it."?
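
For reference, a minimal sketch of the full guard with this wording (the else branches and the unet object are assumed from the surrounding diffusers training script, not quoted from the PR; args, accelerator, and logger likewise come from that script):

```python
from diffusers.utils.import_utils import is_xformers_available

if args.enable_xformers_memory_efficient_attention:
    if is_xformers_available():
        if accelerator.device.type == "xpu":
            # xformers has no XPU kernels yet, so skip it instead of failing
            logger.warning("XPU doesn't support xformers yet, ignore it.")
        else:
            unet.enable_xformers_memory_efficient_attention()
    else:
        raise ValueError("xformers is not available. Make sure it is installed correctly.")
```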

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

BenjaminBossan (Member) left a review:

Thanks for updating this example, LGTM.

I canceled the CI test runs, as they don't cover examples.

BenjaminBossan merged commit 49b29c1 into huggingface:main on Aug 4, 2025. 2 of 14 checks passed.