Can you elaborate on this? Is it different from "detecting if WebGL/WebGPU is supported"? Because I'm pretty sure you can detect that, according to all the answers on this thread:
"Supported" doesn't mean you get what you are expecting: if anything is blacklisted, you get a "supported" context backed by software rendering instead.
Naturally, random Joe user has no idea what is going on, and will dismiss your game as crap because it is running at, e.g., 3 FPS.
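A sketch of the distinction being made here: `getContext()` succeeding doesn't tell you whether the context is hardware accelerated. One common heuristic is to read the renderer string via the `WEBGL_debug_renderer_info` extension (a real extension, though browsers may mask or omit it) and match it against known software rasterizer names. The name list below is illustrative, not exhaustive:

```javascript
// Known software rasterizers that browsers fall back to when the GPU or
// driver is blacklisted (illustrative list, an assumption on my part).
const SOFTWARE_RENDERERS = [
  "swiftshader",
  "llvmpipe",
  "software rasterizer",
  "microsoft basic render",
];

// Pure string check: does a renderer name look like a software rasterizer?
function isSoftwareRenderer(rendererString) {
  const s = String(rendererString).toLowerCase();
  return SOFTWARE_RENDERERS.some((name) => s.includes(name));
}

// Browser-side sketch: report both "supported" and "software-rendered".
function detectSoftwareWebGL(canvas) {
  const gl = canvas.getContext("webgl2") || canvas.getContext("webgl");
  if (!gl) return { supported: false, software: false };
  const ext = gl.getExtension("WEBGL_debug_renderer_info");
  const renderer = ext
    ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL)
    : gl.getParameter(gl.RENDERER); // often a generic masked string
  return { supported: true, software: isSoftwareRenderer(renderer) };
}
```

Note this is a heuristic: if the browser masks the renderer string, the check silently reports "hardware", which is exactly why people fall back to benchmarking a dummy load.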
Then even if everything is supported in hardware, you are limited to an OpenGL ES 3.0 subset (defined in 2011); no matter how good the GPU is, there is only so much you can do.
Browser vendors want people to be able to experience rich 3D accelerated content in their browsers, and they will be aware of issues like the one you mention. The idea that it won’t be addressed makes no sense… how could it be a security risk for the client to do its own assessment of whether the available hardware is suitable for 3D content and then set a flag accordingly?
10 years of WebGL experience in the field prove otherwise.
There are browser flags to disable blacklisting, which regular Joe/Jane has no idea even exist.
Besides, you can head over to webgpu.io and follow up on the meeting minutes.
Even better, attend the upcoming WebGL/WebGPU meeting from Khronos (registration currently open) and pose that question if you prefer to hear the same from the browser vendors themselves.
It looks like this code detects whether the client is making WebGL2 available, not the presence or amount of hardware acceleration the client is giving you access to via WebGL2.
But in this case WebGPU is basically a waste of time… why would they bother creating it if it won’t end up being of any use? Couldn’t you assess performance with a dummy load anyway?
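The dummy-load idea can be sketched like this: render a throwaway heavy scene for a fixed number of frames, measure per-frame times with `requestAnimationFrame`, and gate features on the median frame time. `renderDummyScene` and `showLowFiMode` are hypothetical placeholders for your own draw calls and fallback UI:

```javascript
// Pure helper: median frame time (ms) from a list of per-frame deltas.
function medianFrameMs(deltas) {
  const sorted = [...deltas].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

// Browser-side sketch: run the dummy load for `frames` frames and resolve
// with the median frame time. The first delta includes warm-up cost, which
// the median is fairly robust to.
function benchmark(renderDummyScene, frames = 60) {
  return new Promise((resolve) => {
    const deltas = [];
    let last = performance.now();
    function tick(now) {
      renderDummyScene(); // placeholder: your representative draw calls
      deltas.push(now - last);
      last = now;
      if (deltas.length < frames) requestAnimationFrame(tick);
      else resolve(medianFrameMs(deltas));
    }
    requestAnimationFrame(tick);
  });
}

// Usage sketch: anything slower than ~30 FPS (33 ms/frame) gets a fallback.
// benchmark(renderDummyScene).then((ms) => { if (ms > 33) showLowFiMode(); });
```

The design trade-off is that you spend a second of the user's time up front, but you measure the thing you actually care about (end-to-end frame cost on this machine) rather than inferring it from renderer strings.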
https://stackoverflow.com/questions/11871077/proper-way-to-d...