
Pretty sure WebGL2 doesn't have true compute shader support. It's more of a hack where you write to a texture via a fragment shader to imitate a compute shader. True compute shader support is supposed to come with WebGPU.
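To make the trick concrete, here's a minimal sketch of GPGPU-via-fragment-shader in WebGL2: the "kernel" is a full-screen fragment pass that reads an input texture and writes one output texel, with the output texture attached to a framebuffer. All names here are illustrative, not from any particular library.

```javascript
// GPGPU via WebGL2 fragment shader: each fragment computes one output texel.
// This "kernel" just doubles every value in the input texture.
const gpgpuFragmentShader = `#version 300 es
precision highp float;
uniform sampler2D inputData;   // input array packed into a texture
out vec4 result;               // lands in a texture via a framebuffer attachment
void main() {
  ivec2 texel = ivec2(gl_FragCoord.xy);
  result = texelFetch(inputData, texel, 0) * 2.0;  // the "compute kernel"
}`;

// Feature detection: this path exists wherever WebGL2 does, unlike real
// compute shaders. Float render targets need an extra extension, though.
function supportsWebGL2Gpgpu() {
  if (typeof document === 'undefined') return false; // not a browser
  const gl = document.createElement('canvas').getContext('webgl2');
  return gl !== null && gl.getExtension('EXT_color_buffer_float') !== null;
}
```

The missing pieces (vertex shader for the full-screen quad, framebuffer setup, readback with `gl.readPixels`) are boilerplate, but they illustrate why this is an imitation: there's no shared workgroup memory, no scattered writes, and every "thread" is pinned to one output pixel.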


Correct. The Chrome team is responsible for killing the WebGL 2.0 compute shader effort, with the reasoning that WebGPU was just around the corner.

https://github.com/9ballsyndrome/WebGL_Compute_shader/issues...

Now, five years later, how is WebGPU compute shader adoption going?


Not bad at all! WebGPU has compute shaders in the base spec, and it looks like 66%+ of internet traffic already supports it: https://web3dsurvey.com/webgpu

By the way, if you are running any high-traffic websites you can donate your users' device data to web3dsurvey (with a simple JS snippet). I'm sure it will be appreciated.
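Since compute is in the base spec, a dispatch really is first-class in WebGPU, no fragment-shader workaround needed. Here's a hedged sketch of the same "double every element" kernel as a WGSL compute shader; the function and buffer names are made up for illustration, and readback is omitted.

```javascript
// WGSL compute shader: part of the base WebGPU spec, no extension required.
const doubleKernel = `
@group(0) @binding(0) var<storage, read_write> data: array<f32>;

@compute @workgroup_size(64)
fn main(@builtin(global_invocation_id) id: vec3<u32>) {
  data[id.x] = data[id.x] * 2.0;
}`;

function supportsWebGPU() {
  return typeof navigator !== 'undefined' && 'gpu' in navigator;
}

// Hypothetical helper: dispatches the kernel over a Float32Array in place.
async function runDoubleKernel(input) {
  if (!supportsWebGPU()) return null; // e.g. Node, or an unsupported browser
  const adapter = await navigator.gpu.requestAdapter();
  const device = await adapter.requestDevice();

  const buffer = device.createBuffer({
    size: input.byteLength,
    usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC,
    mappedAtCreation: true,
  });
  new Float32Array(buffer.getMappedRange()).set(input);
  buffer.unmap();

  const module = device.createShaderModule({ code: doubleKernel });
  const pipeline = device.createComputePipeline({
    layout: 'auto',
    compute: { module, entryPoint: 'main' },
  });
  const bindGroup = device.createBindGroup({
    layout: pipeline.getBindGroupLayout(0),
    entries: [{ binding: 0, resource: { buffer } }],
  });

  const encoder = device.createCommandEncoder();
  const pass = encoder.beginComputePass();
  pass.setPipeline(pipeline);
  pass.setBindGroup(0, bindGroup);
  pass.dispatchWorkgroups(Math.ceil(input.length / 64)); // one thread per element
  pass.end();
  device.queue.submit([encoder.finish()]);
  return buffer; // copy to a MAP_READ staging buffer to read results back
}
```

Note the things the WebGL2 hack can't express: a read_write storage buffer (scattered writes) and an explicit workgroup size.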


Meaning anyone using ChromeOS, formerly known as The Web.



