Replace ArrayBuffers with garbage-friendly alternatives #261

Conversation

FYI, WebGL does this with (ArrayBufferView, srcOffset, length). This is also fine; it just makes it unclear whether srcOffset and length are in units of bytes or multiples of BYTES_PER_ELEMENT.
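
For reference, a minimal sketch of that WebGL2-style overload, assuming a WebGL2 context `gl` with a buffer bound to `ARRAY_BUFFER`; the unit mismatch is exactly the ambiguity described above, since dstByteOffset is in bytes while srcOffset and length count elements of the view:

```js
// WebGL2 overload: bufferSubData(target, dstByteOffset, srcData, srcOffset, length)
function uploadSlice(gl /* assumed WebGL2 context with a buffer bound to ARRAY_BUFFER */) {
  const data = new Float32Array([0, 1, 2, 3, 4, 5, 6, 7]);
  // dstByteOffset (64) is in bytes, but srcOffset (2) and length (4) are in
  // elements of the Float32Array, i.e. this uploads 16 bytes, not 4.
  gl.bufferSubData(gl.ARRAY_BUFFER, 64, data, 2, 4);
}
```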

I remembered BufferSource was a thing, and updated accordingly.

Thinking about it more, it doesn't really make sense to pass ArrayBuffers into these entry points, since ArrayBuffers don't semantically expose access to their contents (a view is required). Updated.

ArrayBufferViews are not great for packed buffers, where you will be using DataView to fill in the contents.

In practice, IIRC, Emscripten always keeps around one of each type of view into its entire heap, and resets them when the heap grows. It uses these views every time it needs to access the heap from JS, because it's the only way.

Good point about DataView (which is especially useful for uniform buffers). I thought of it before but forgot when I made this change. Still, semantically I think it makes more sense to make a view into the buffer behind the DataView, for the reasons above. Or allow DataView to be passed in here. OTOH, if we get rid of setSubData then it's moot; DataView isn't particularly useful for SPIR-V code.
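
A minimal sketch of the DataView use case mentioned here, with an assumed (made-up) packed layout of four floats followed by a u32 flag:

```js
// Fill a packed staging buffer with mixed field types at explicit byte
// offsets; the explicit little-endian flag is the main reason DataView is
// convenient for uniform-style data. The layout below is illustrative only.
const staging = new ArrayBuffer(32);
const dv = new DataView(staging);

dv.setFloat32(0, 1.0, true);   // color.r
dv.setFloat32(4, 0.5, true);   // color.g
dv.setFloat32(8, 0.25, true);  // color.b
dv.setFloat32(12, 1.0, true);  // color.a
dv.setUint32(16, 1, true);     // flags
```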

To me, the ArrayBuffer is an opaque blob of data. You can always access the underlying ArrayBuffer from a view for free, but creating a view just to wrap the buffer is garbage. So I would much prefer ArrayBuffer.
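
To illustrate the "free" direction versus the garbage-producing one, a small sketch (standard typed-array behaviour, nothing WebGPU-specific):

```js
const view = new Float32Array(64);   // one long-lived view

// Going view -> buffer is plain property access; no allocation happens.
const buffer = view.buffer;
const offset = view.byteOffset;
const length = view.byteLength;

// Going buffer -> view allocates a new wrapper object each time, which is
// the per-call garbage being objected to here.
const wrapper = new Uint8Array(buffer, offset, length);
```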

Allowing only ArrayBuffer would be kind of unwieldy; if you do have an ArrayBufferView, it requires unpacking it into (ArrayBuffer, offset, size) manually in JS. Also, if createShaderModule only takes ArrayBuffer and not ArrayBufferView, we have to add the offset and size arguments to GPUShaderModuleDescriptor, and, again, users have to unpack them manually.

As I said above, since the cost of createShaderModule is already so high, I think I'd prefer to stick with the web platform's existing primitives and take only ArrayBufferView. I actually would be fine with requiring this to be Uint32Array, as it is semantically correct. And if setSubData is gone (it kind of seems like it's going that way), this is the only entry point left for now.
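
A sketch of the call-site difference being argued here. The `codeOffset`/`codeSize` descriptor fields are hypothetical, made up only to show what the ArrayBuffer-only variant would force on callers:

```js
const heap = new ArrayBuffer(1024);            // stand-in for a larger heap
const spirv = new Uint32Array(heap, 256, 32);  // SPIR-V words living inside it

// ArrayBuffer-only descriptor: the caller has to unpack the view by hand
// (codeOffset/codeSize are hypothetical field names).
const unpacked = {
  code: spirv.buffer,
  codeOffset: spirv.byteOffset,
  codeSize: spirv.byteLength,
};

// ArrayBufferView (here, Uint32Array) descriptor: one field, no unpacking.
const direct = { code: spirv };
```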

Sounds good then; thanks. Didn't realize that setSubData was heavily on the chopping block.

Thanks for all your input. I'll be sure to re-raise this if we don't remove it.

@kainino0x, can you rebase this? Also, shouldn't the variant for SPIR-V be Uint32Array?

Rebased. But that said, I recall WebGL was adding entry points for (ArrayBuffer, offset) to make things work better when resizing WASM heaps. Should we take that approach? (cc @kenrussell)
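
For context, a sketch of the heap-resizing issue being referred to: growing a non-shared WebAssembly.Memory detaches the old ArrayBuffer, so previously created views go stale and have to be rebuilt (which is what Emscripten does with its HEAP* views):

```js
const memory = new WebAssembly.Memory({ initial: 1 });  // 1 page = 64 KiB
let heapU8 = new Uint8Array(memory.buffer);

memory.grow(1);                 // the old buffer is detached at this point
console.log(heapU8.byteLength); // 0: the old view no longer sees the heap

heapU8 = new Uint8Array(memory.buffer);  // recreate the view over the new buffer
```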

Chatted with Ken about this and we decided it's best to keep things simple for now and use Uint32Array+offset. In the future we might want to change it to BufferSource+offset, which would be a generalization.

Er, per my previous comments, the cost of createShaderModule is so high compared with the cost of garbage collecting one Uint32Array wrapper that it's nicer to just pass Uint32Array here instead of Uint32Array+offset+length.

Ready for review/landing. Note this PR now only affects ShaderModuleDescriptor (as setSubData is gone), and only the SPIR-V version of it.
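
A sketch of what the resulting call site looks like under this PR (SPIR-V-era descriptor; the exact shipped API may differ):

```js
function makeShaderModule(device, spirvBytes /* ArrayBuffer holding the SPIR-V binary */) {
  // One short-lived Uint32Array wrapper per call -- negligible next to the
  // cost of createShaderModule itself, per the argument above.
  return device.createShaderModule({ code: new Uint32Array(spirvBytes) });
}
```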

I agree that better ergonomics are preferable to absolute perf/lightness for cold/expensive entrypoints.

kdashg left a comment
Sounds good to me, and to the rest of the WG.
Taking ArrayBuffer as input is problematic: if an application wants to provide a sub-range of an existing ArrayBuffer, it must make a copy of all of the data into a new ArrayBuffer.
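
A sketch of that sub-range problem (offsets are illustrative):

```js
const big = new ArrayBuffer(1024);  // e.g. a heap with the SPIR-V somewhere inside it

// If the API accepted only ArrayBuffer, passing bytes 256..383 would require
// copying them out into a fresh ArrayBuffer:
const copied = big.slice(256, 384);

// A typed-array view over the same range is zero-copy:
const viewed = new Uint32Array(big, 256, 128 / 4);
```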

setSubData: Uses BufferSource + srcOffset + size to avoid the garbage overhead of creating a Uint32Array.