Minutes 2021 05 10
Chair: Kai/Jeff
Scribe: Ken / others
Location: Google Meet
Tentative agenda
- Pluralize requestAdapters() and add GPUAdapter.isSoftware #1660
- Add external_texture luminance functions #1681
- Behavior of importing canvases as GPUExternalTextures #1670
- What happens in configureSwapChain and on canvas resize? #1691
- Add initial color spaces #1690
- Agenda for next meeting
Attendance
- Google
- Austin Eng
- Brandon Jones
- James Darpinian
- Kai Ninomiya
- Ken Russell
- Shrek Shao
- Microsoft
- Rafael Cintron
- Mozilla
- Dzmitry Malyshau
- Jeff Gilbert
- Francois Guthmann
- Mehmet Oguz Derin
- Rick Battagline
Pluralize requestAdapters() and add GPUAdapter.isSoftware #1660
Add external_texture luminance functions #1681
- Last week's notes
- KR: would lean toward deferring this / doing it later. It's pretty complicated, would need emulation, would only have benefit in limited use cases.
- KN: agree - our media team has stated they don't really have any users requesting this.
- JG: what part of it exactly?
- KN: saving bandwidth by only looking at the Y plane. Not top thing we've heard from customers.
- JG: for this issue - punt to post-MVP?
- KN: yes, that's my opinion.
- DM: use case we'll be missing - someone runs edge detection filter and only interested in the luminance?
- KN: yes. You'd only do that to save bandwidth. Could do a better job using the color data. But for algorithms that already downsample to 1 channel, like segmentation algorithms, it would be useful. (A sketch of the manual path is at the end of this section.)
- KN: last week we agreed to have a non-precise version in for MVP.
- KR: at this point let's narrow the feature set and get a really good base shipped.
- KN / DM / JG: agreed.
- KN: recording this resolution in a comment.
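- A minimal sketch of the manual path mentioned above: reducing a full-color external-texture sample to one luma channel in the shader, which is what apps can do while dedicated luminance functions are deferred. The attribute syntax and the textureSampleBaseClampToEdge builtin follow the current WGSL spec, not necessarily the 2021 draft discussed here.

```ts
// Sketch only: compute Rec. 709 luma from a texture_external sample.
const lumaWGSL = /* wgsl */ `
  @group(0) @binding(0) var extTex : texture_external;
  @group(0) @binding(1) var samp : sampler;

  @fragment
  fn main(@location(0) uv : vec2<f32>) -> @location(0) vec4<f32> {
    let rgb = textureSampleBaseClampToEdge(extTex, samp, uv).rgb;
    // Reduce the full-color sample to a single channel.
    let luma = dot(rgb, vec3<f32>(0.2126, 0.7152, 0.0722));
    return vec4<f32>(luma, luma, luma, 1.0);
  }
`;
```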
Behavior of importing canvases as GPUExternalTextures #1670
- KN: lot of possible semantics. Could:
- copy-on-write
- clear canvas, in various different ways
- could lock canvas in some way, prevent it from being written to
- work only with OffscreenCanvas
- KN: first part of proposal: can only import 2d, webgl and webgl2 canvases. Think we should cut out importing other canvases now. Importing webgpu canvases, ImageBitmapRenderingContext, placeholder canvases for transferControlToOffscreen, etc. Think we don't want to handle those initially.
- KN: also proposing only allow importing OffscreenCanvases. Don't think that will fly, but want to know peoples' thoughts.
- Mainly because OffscreenCanvases don't have "present" semantics.
- DM: why would you want to import WebGL canvases?
- KN: interop. Have preexisting image processing pipeline, do some steps in WebGL, then others in WebGPU. Doesn't have to be super optimized, but some people are interested in this.
- DM: can't they already do this with a copy?
- KN: don't think they can do without a CPU copy.
- DM: we have copyFromExternalImage?
- KR: yes, that's optimizable to stay on the GPU.
- DM: if designed for transition of apps, but have to support it forever, that's some code that can be complicated. Wondering if it's necessary.
- KN: being able to import WebGL canvases?
- DM: yes.
- KN: think it might be OK to omit that. Have to talk with people about whether they think it's needed.
- DM: what do you expect in practice? Suppose you limit the import. Would you copy upon import?
- KN: don't think it would be hard for us to support it directly. Already have RGB texture underneath WebGL. Just import to WebGPU as external RGBA texture.
- DM: technically could say - if you want it to be performant, just write in WebGPU.
- KR: think we should work on the 1-copy path. The Google Meet team needs this, btw.
- RC: I'm OK with the 1-copy path. Folks working with 2d canvas might want faster copying though. They've reported issues with perf to date.
- KN: we should work with people that have these perf issues. Probably good to have the 2d canvas import. WebGL we could exclude.
- JG: what are we punting, and to when - post-MVP? Would be good to have a 2d solution for v1. The rest, WebGL, could come in response to people who want it. Maybe a good thing to discuss post-MVP, which won't be too far off.
- DM: so postpone both?
- JG: that's my proposal.
- KN: totally fine with me. Thought a lot about this, complicated, makes me not want to do it.
- DM: how would sharing semantics work in 2d? Skia, based on Dawn, shares same texture IDs?
- KN: think Skia will be on Dawn or Vulkan, or Metal. Maybe D3D12. Not sure whether Skia will use Dawn or not. Maybe platform API, maybe Dawn. Have ways to import textures from GL and Vulkan, and so on, into Dawn. We use those for a bunch of things, canvas swap chains, etc.
- KR: issues like: importing then drawing to 2d canvas, what do you do?
- KN: think copy-on-write is the easiest thing. Bit of work. Other proposal: always clear the source canvas when you import. It doesn't come back. Not great; 2d canvas apps frequently expect contents to stick around forever. Depends on the use cases. If drawing 1 string, may not care, but entire UI, it's a problem.
- KR: let's get copyExternalImageToTexture working well in all browsers, and tested. (A sketch of that path is at the end of this section.)
- JG: agreed.
- DM: so importing will be just for videos for MVP?
- KN: yes.
- KN: took resolution down in the issue.
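- A minimal sketch of the copy path resolved on above, using names from the current WebGPU spec (some usage flag names differed in the 2021 draft); a GPUDevice `device` and an already-drawn 2d `canvas` are assumed.

```ts
declare const device: GPUDevice;
declare const canvas: HTMLCanvasElement; // already drawn into via its "2d" context

// Destination texture; copyExternalImageToTexture requires COPY_DST and
// RENDER_ATTACHMENT usage on the destination.
const texture = device.createTexture({
  size: [canvas.width, canvas.height],
  format: "rgba8unorm",
  usage: GPUTextureUsage.COPY_DST |
         GPUTextureUsage.TEXTURE_BINDING |
         GPUTextureUsage.RENDER_ATTACHMENT,
});

// One GPU-side copy per update, instead of importing the canvas as a
// GPUExternalTexture.
device.queue.copyExternalImageToTexture(
  { source: canvas },
  { texture },
  [canvas.width, canvas.height],
);
```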
What happens in configureSwapChain and on canvas resize? #1691
- KN: juamnihd@ from Chrome Canvas team posted summary of what happens in Canvas today. I put together proposal about what should happen. That's the comment linked above.
- JG: 2d canvas resets a lot of state. WebGL just resizes, but activates the back buffer clearing code paths.
- JG: one key - when you resize the canvas, we don't publish the previous frame.
- JG: might be better if you have to reconfigure the swap chain. If swap chain's invalid, it doesn't do the publishing.
- KN: so resize would invalidate the swap chain, which would indirectly invalidate the texture? Rather than keeping swap chain jumping to texture? That's the other option - don't see a lot of good ones. Those possibilities seem all right.
- JG: difference is - if I change canvas width/height, I set some device/canvas swap chain to a new swap chain; and in the rendering loop, I say "whatever swap chain is attached to the canvas is the one I use". It's the way the other system works, just that you'd use the same swap chain object.
- KN: only argument in favor of the way I wrote - unnecessary busy work. Our swap chains aren't really swap chains in the native sense. If they were, obviously we'd have to recreate them. If we want to preserve native semantic we can. But right now nothing about the swap chain cares about the size. Just configured with format, etc. No real reason we need to invalidate it. We don't return the same texture over and over again; new textures any time.
- BJ: in WebGL, size of canvas onscreen and size of implicit back buffer are detached. Based on actual width/height of canvas. Can keep stretch blitting of canvas. When you configure swap chain initially, it has its own internal width/height, and it adopts the canvas's width/height? Then later resizes independently?
- KN: reason I wouldn't do that - we already have width/height attributes on canvas, and you can change them. Wouldn't want: you can set canvas.width to something, expect something to happen, and it doesn't.
- BJ: agree. Most of this revolves around spooky action at a distance. Avoiding that is the only benefit of having an explicit resize.
- JG: fine to do the thing that doesn't match what we do in native. Leave swap chain configured, and if we want to configure with explicit number of buffers, explicit widths/heights, etc., we can make different function calls.
- KN: if we didn't have the canvas width/height already I'd put this in configureSwapChain. Or put the canvas in a new mode where we delete the width/height properties when you create a WebGPU context.
- JG: they tie into the style/layout system. That's the hardest part. Don't remember whether we spec'd this, but what do we do for too-large requests?
- KN: that's another thing I need to think about. Proposals welcome. I don't love that. Would be great if we could say, sorry, couldn't create that swap chain, it's too big. Moreso than in WebGL - if your canvas dimensions aren't what you expect, things will go south. Literally Friday I messed this up, set canvas width/height to something * devicePixelRatio, and everything broke because my constant width/height were wrong. WebGPU's a bit more fragile than WebGL in this sense.
- BJ: if you want to reject creation of the swap chain due to bad dimensions, then can't respond automatically to size changes. Probably don't want swap chain breaking after creation.
- KN: a semantic we could use - configureSwapChain takes width/height. Those are the drawing buffer size. Canvas width/height does nothing except set the size for layout. (A sketch of this proposal is at the end of this section.)
- JG: I do like that. Would like to see proposal for it.
- KN: I like it too.
- DM: so width/height in swap chain descriptor are nullable? Or required?
- BJ: would like them to inherit the canvas dimensions. Feels reasonably easy way to onboard people expecting WebGL behavior a bit more.
- KN: think it'd be OK if they inherit. Wondered about WebGL semantic of too-large size. Would just reject swap chain creation though.
- BJ: we don't have an analogue to WebGL's drawingBufferWidth/Height, right?
- JG / KN: no.
- BJ: something like that would be needed to keep the implicit size matching. But if setting this, and it might fail, easier to manage, and less state.
- JG: sort of orthogonal. Should be easy to have swap chain width/height. WebGL-style feature for too-large canvases.
- BJ: also easy for libraries taking an incoming swap chain and rendering to them. They need access to the width/height.
- KN: this is the reflection question we have open.
- KN: KR, any feedback?
- KR: Think this is a good direction. Seems easier to manage. Already doing scaling from the drawing buffer size to the CSS size. There was some back and forth in the past on apps responding to drawingBufferWidth/Height in WebGL. We thought about automatically setting the OpenGL viewport, for example, but tried it and determined we couldn't. Better for apps to do less upon resize.
- DM: sounds good.
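- A hypothetical sketch of the proposal above, not shipped API: configureSwapChain takes an explicit drawing-buffer size, and canvas width/height only affect layout. The "gpupresent" context name matches the 2021 draft; the width/height members and the reject-if-too-large behavior are the parts being proposed, so their names here are assumptions.

```ts
declare const device: GPUDevice;
declare const canvas: HTMLCanvasElement;

// 2021-era context; typed loosely since the descriptor fields below are proposed.
const context = canvas.getContext("gpupresent") as any;

function configureForCurrentSize() {
  // Drawing-buffer size chosen explicitly (e.g. backing-store pixels), decoupled
  // from canvas.width/height, which would only drive layout.
  const width = Math.floor(canvas.clientWidth * devicePixelRatio);
  const height = Math.floor(canvas.clientHeight * devicePixelRatio);

  // Per the discussion: width/height could default to the canvas dimensions, and
  // configuration could reject if the requested size is too large.
  return context.configureSwapChain({
    device,
    format: "bgra8unorm",
    usage: GPUTextureUsage.RENDER_ATTACHMENT,
    width,
    height,
  });
}
```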
Add initial color spaces #1690
- KN: defines the color space enum, copied from CanvasColorSpaceProposal. We'll point to that eventually. Defines only one color space - sRGB. Adds a bunch of follow-up issues, to be addressed after this has landed, about using those color spaces.
- KN: gist: when converting between color-managed content and the bytes in a texture, the bytes will be tagged with a color space. e.g. on import, "I want the values I sample to be in this color space"; on copy, convert to the color space you've tagged the destination with. This is a pretty simple / elegant solution. We treat inputs/outputs to WebGPU as color-managed to sRGB by default. If you don't want that, you can take your image source and convince the browser it's sRGB. Can do this with ImageBitmap colorSpaceConversion="none". Puts everyone on the portable / consistent behavior path, but it's easy to opt out of. (A sketch is at the end of this section.)
- KR: note that some of this has already landed in the whatwg / html spec: https://github.com/whatwg/html/pull/6562
- No longer necessary to refer to CanvasColorSpaceProposal for this functionality specifically.
- KN: maybe we'll just add the sRGB enum right now, and think about display-p3 later.
- RC: what about videos that aren't sRGB?
- KN: right now you can tag what the source is. If not sRGB, we'll define the video's color space -> sRGB. CSS Color Level 4 defines this for example. There'll always be a conversion if it's not the same color space.
- RC: OK. So if you don't tag anything, no conversion happens? Or it'll convert to this?
- KN: if no tag, defaults to sRGB.
- RC: it'll default to what you typed here?
- KN: the import texture will have a colorSpace field defaulting to "sRGB".
- JG: also the underlying resource is tagged as sRGB. If you have existing resource in p3, and request it to be sampled in sRGB, it'll be converted.
- KN: yes. Not sure about extended values.
- RC: so this is only used when the browser's compositing the canvas to the page?
- KN: no, it works on import too. Videos might be tagged with colorspace profile. We'll convert from that to sRGB by default. If you have target color space we'll do that conversion instead.
- JG: if the video's in rec.709 we'll give it to you in that form, etc.
- KN: have to go through ImageBitmap for that. That only works with copyToTexture.
- JG: would be nice to have
- KN: can talk about adding it.
- RC: if we add another predefined enum - when will that be taken into consideration upon import?
- KN: if we add another you can use it.
- RC: say we add another one and the web dev uses it. When will that be used when color-managing the video? When you import this video and don't tell it anything (color space) in the dictionary?
- KN: item in dictionary defines the colorspace you get your samples in, when you sample the thing.
- RC: so importing this never looks at this predefined color space?
- KN: can think of it like - the importing doesn't do it, but the sampling does. Doesn't affect how the video's imported. We get some things from the video, YUV encoded, rec.2020, and we do the conversion from whatever that is to an sRGB value.
- RC: so you mean rather than sRGB, the value the person spec'd in this enum?
- KN: yes.
- RC: does this affect the color space when drawing the canvas to the page?
- KN: yes. That's in there as a TODO. When defining a swap chain, you can say what color space it'll be treated as. Just haven't hooked that in yet. Fundamentally the same thing: I have some real numbers, what colors do they represent. It defines what the sampler's numbers mean, and how the browser should interpret the swap chain texture.
- RC: OK.
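- A sketch of the model described above. The colorSpace member on the external texture descriptor is the field added by #1690 (its exact name and "srgb" default are assumed here from the current spec); createImageBitmap's colorSpaceConversion option is existing API, used for the opt-out path.

```ts
declare const device: GPUDevice;
declare const video: HTMLVideoElement;
declare const imageBlob: Blob;
declare const dstTexture: GPUTexture; // must have COPY_DST | RENDER_ATTACHMENT usage

// Sampling this external texture yields values in sRGB (the default); the browser
// converts from whatever color space the video frames are actually tagged with.
const extTex = device.importExternalTexture({
  source: video,
  colorSpace: "srgb",
});

// Opt-out path: disable color-space conversion at decode time so the bytes are
// copied as-is rather than being treated as color-managed content.
const bitmap = await createImageBitmap(imageBlob, {
  colorSpaceConversion: "none",
});
device.queue.copyExternalImageToTexture(
  { source: bitmap },
  { texture: dstTexture },
  [bitmap.width, bitmap.height],
);
```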
- KR: Kai ran a great "CTS-a-thon" for the Chrome team members last week
- KN: Got a few new CTS tests written, and more people up to speed on how to contribute
- Ken tested large dispatch compute calls, for example
- How can we get more people contributing actively to the CTS? It's a lot of fun and very rewarding to see new functionality being exercised!
- Do people have potential contributors on their teams?
- Also looking for WGSL tests.
- Not sure of Firefox's status on running the WPT tests. Some hiccups around error scopes, but other than that it's working well.
- KN: do people foresee being able to put in effort to the CTS?
- DM: maybe we can find someone for the summer of code. Me, unlikely for the next few months.
- KN: would you be able to support getting things running, so people can run against FF?
- DM: of course.
- KN: anyone, let me know if you can contribute. I can guide you on what needs to be done. I have all the work tracked.