Add usage to GPUTextureViewDescriptor #4746
teoxoy left a comment
I think we also need to update bind group creation validation and the "renderable texture view" validation to refer to the view's usages instead of the texture's.
Good catch, thanks! Updated those two algorithms.
I feel like it would be nice to include a note about what this is trying to solve. It's still not clear to me what problem this is solving from #4426.
So Kai explained to me (which I didn't know): "views" can be actual native API objects created by the underlying API. With the change above, creating an rgba8unorm texture with storage usage and then creating an rgba8unorm-srgb view of it will fail; you're required to remove the storage usage flag. It would be nice to mention that example in the spec.
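The failing case described above can be modeled as a small sketch of the proposed createView validation. The flag values and the storage-compatible format set below are illustrative assumptions, not the spec's real tables:

```javascript
// Hypothetical usage flag values, mirroring the shape of GPUTextureUsage (illustration only).
const USAGE = { COPY_SRC: 1, COPY_DST: 2, TEXTURE_BINDING: 4, STORAGE_BINDING: 8, RENDER_ATTACHMENT: 16 };

// Formats treated as storage-compatible in this sketch; 'rgba8unorm-srgb' is not.
const STORAGE_COMPATIBLE = new Set(['rgba8unorm', 'rgba32float']);

// Sketch of the validation this PR adds: the view's usage defaults to the texture's
// full usage, must be a subset of it, and if STORAGE_BINDING is present the *view's*
// format must be storage-compatible.
function validateCreateView(texture, { format = texture.format, usage = texture.usage } = {}) {
  if ((usage & ~texture.usage) !== 0) return false; // view usage must be a subset
  if ((usage & USAGE.STORAGE_BINDING) && !STORAGE_COMPATIBLE.has(format)) return false;
  return true;
}

const texture = { format: 'rgba8unorm', usage: USAGE.TEXTURE_BINDING | USAGE.STORAGE_BINDING };

// Inherited usage includes STORAGE_BINDING, but the srgb view format is not storage-compatible:
validateCreateView(texture, { format: 'rgba8unorm-srgb' }); // → false

// Explicitly dropping STORAGE_BINDING from the view's usage makes it valid:
validateCreateView(texture, { format: 'rgba8unorm-srgb', usage: USAGE.TEXTURE_BINDING }); // → true
```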
kainino0x left a comment
Mostly LGTM
kainino0x left a comment
IIUC after the last discussion on this topic this PR is still the way we decided to go. I'm not sure what Kelsey was going to write up, but I think we have it here already (once comments are addressed).
spec/index.bs (Outdated)
::
    The allowed {{GPUTextureUsage|usage(s)}} for the texture view. Must be a subset of the
    {{GPUTexture/usage}} flags of the texture. Defaults to the full set of {{GPUTexture/usage}}
    flags of the texture.
Hmm, should we error if you request some viewFormat that has zero overlap with the usage of the texture?
Not sure I understand this?
Example GPUTextureDescriptor:
{ format: 'rgba8unorm', viewFormats: ['rgba8unorm-srgb'], usage: GPUTextureUsage.STORAGE_BINDING }
This makes it impossible to actually use that viewFormat, so should it be an error to create this texture?
Similar to how we disallow usage from being 0.
Ah, got it. So something like (obvious pseudocode) (usage & viewFormat.allowedUsages) != 0. Makes sense.
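That pseudocode overlap check could be modeled like this. The flag values and the per-format allowed-usage table are hypothetical, and note that the PR ultimately records this as a TODO rather than a hard error:

```javascript
// Hypothetical usage flags and per-format allowed usages (illustration only; the
// real allowed usages come from the spec's format tables).
const USAGE = { COPY_SRC: 1, COPY_DST: 2, TEXTURE_BINDING: 4, STORAGE_BINDING: 8, RENDER_ATTACHMENT: 16 };
const ALLOWED = {
  'rgba8unorm':      USAGE.COPY_SRC | USAGE.COPY_DST | USAGE.TEXTURE_BINDING | USAGE.STORAGE_BINDING | USAGE.RENDER_ATTACHMENT,
  'rgba8unorm-srgb': USAGE.COPY_SRC | USAGE.COPY_DST | USAGE.TEXTURE_BINDING | USAGE.RENDER_ATTACHMENT, // no storage
};

// Sketch of the discussed createTexture check: every requested viewFormat must
// share at least one usage bit with the texture's usage, otherwise that
// viewFormat can never be used.
function viewFormatsUsable(desc) {
  return (desc.viewFormats ?? []).every(f => (desc.usage & ALLOWED[f]) !== 0);
}

// The srgb view format has no overlap with a STORAGE_BINDING-only texture:
viewFormatsUsable({ format: 'rgba8unorm', viewFormats: ['rgba8unorm-srgb'], usage: USAGE.STORAGE_BINDING }); // → false

// Adding TEXTURE_BINDING gives the view format a usable usage bit:
viewFormatsUsable({ format: 'rgba8unorm', viewFormats: ['rgba8unorm-srgb'], usage: USAGE.STORAGE_BINDING | USAGE.TEXTURE_BINDING }); // → true
```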
TBH I'm not sure this is needed; it seems to add preemptive validation just because it might be an interesting warning for developers. Maybe it should just be a warning in this case. IIRC there are other cases where we don't error even if the operation creates things that cannot be used.
Added a note to the texture validation algorithm encouraging implementations to warn in this scenario.
After discussion in the WG this needs to be changed to a TODO.
spec/index.bs (Outdated)
: GPUObjectDescriptorBase {
    GPUTextureFormat format;
    GPUTextureViewDimension dimension;
    GPUTextureUsageFlags usage;
Note: I imagine for webgpu.h we'll want usage == None to mean inheriting the usages, so we might need something special to say that the usages are exactly 0.
Given that creating a view with no usages won't be useful (there's validation to ensure the texture's usages aren't 0), is that a scenario we need to support? If you are asking explicitly for a usage of 0, we can pretty confidently say you are doing it wrong, and thus we don't need to allow it when 0 can have a reasonable and useful alternative interpretation.
Sounds good, would this be a type error or something still surfaced as a validation error on the GPUDevice? (in both cases it is fairly implementable)
Maybe I misunderstood, but I thought None would be given the value of 0? In which case the default behavior is to inherit, but you can also specify any combination of other usages (which will be validated against the view format and the texture usages.) The only thing you can't do is specify a view usage of 0, which is OK because that's useless anyway. No new validation needed.
Cleared up the confusion offline. I thought the concern was primarily for native interfaces but it's actually about the JS interface where undefined and 0 are distinct.
My proposal to handle that is to have the dictionary default to 0 and define that a view usage of 0 means the usage is inherited. That way the native and JS interfaces function more similarly and the JS interface still ends up interpreting undefined as "inherit".
The alternative is that we only treat undefined as "inherit" and throw a validation error if 0 is passed (which matches the texture validation); native would then either differ and treat 0 as inherit, or introduce a new, explicit inherit enum. Given that this route involves more opportunity for errors with no real upside AND makes native and JS differ, I'm not in favor of it.
WG approved of defaulting to 0 and having that mean inherit.
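The approved defaulting rule (the descriptor's usage defaults to 0, and 0 means "inherit the texture's full usage") can be sketched as follows; the flag values are illustrative assumptions:

```javascript
// Hypothetical usage flag values (illustration only).
const USAGE = { TEXTURE_BINDING: 4, STORAGE_BINDING: 8 };

// Sketch of resolving a view's effective usage under the WG-approved rule:
// undefined and 0 behave identically, both meaning "inherit".
function effectiveViewUsage(texture, descriptor = {}) {
  const requested = descriptor.usage ?? 0; // dictionary default is 0
  return requested === 0 ? texture.usage : requested;
}

const texture = { usage: USAGE.TEXTURE_BINDING | USAGE.STORAGE_BINDING };
effectiveViewUsage(texture);                                   // → 12 (inherited full usage)
effectiveViewUsage(texture, { usage: 0 });                     // → 12 (0 also means inherit)
effectiveViewUsage(texture, { usage: USAGE.TEXTURE_BINDING }); // → 4
```

This keeps the JS interface and a native interface like webgpu.h behaving the same way, since native has no undefined to distinguish from 0.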
Force-pushed from 92974c7 to f65c54c.
Changed non-normative note to a TODO comment pointing at #4852. PR should match the WG's decisions now.
Fixes #4426

Adds usage to texture views, defaulting them to the full set of usages from the texture. Validates that they must be a subset of the texture usage, and validates that the view format is storage compatible if the `STORAGE_BINDING` usage is specified.
Co-authored-by: Kai Ninomiya <kainino@chromium.org>
GPU Web WG 2024-09-04 Atlantic-time
CC @teoxoy and @austinEng