Hacker News

No need to read the texture in the vertex shader. Render to a buffer object and use that as your vertex array. But broadly yes: that's the problem with OpenGL support, and WebGL in particular is still very bleeding edge.

I'd be pretty surprised if vertex texture fetch wasn't supported, though. It works on basically all hardware from the PVR SGX on up, and unified shaders are pervasive on both phones and desktops. I did find this, which implies that for a while the browsers weren't properly exposing support:

http://stackoverflow.com/questions/4349389/webgl-texture-acc...
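The kind of check that Stack Overflow thread is about can be done at runtime. A minimal sketch, assuming `gl` is a WebGLRenderingContext (the function name is mine, not from any library):

```javascript
// Probe whether the implementation actually exposes vertex texture fetch.
// `gl` is assumed to be a WebGLRenderingContext.
function hasVertexTextureFetch(gl) {
  // Implementations that can't sample textures in the vertex stage
  // report zero vertex texture image units.
  return gl.getParameter(gl.MAX_VERTEX_TEXTURE_IMAGE_UNITS) > 0;
}
```

If this returns false you have to fall back to CPU-side math, even if the underlying hardware could do it.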



> Render to a buffer object and use that as your vertex array.

I don't think that's an option in ES2 or WebGL. I'd love to be proven wrong!


FBOs are absolutely part of ES2, and thus presumably an official part of WebGL. I've used them in embedded contexts, but never in a browser. And as always, this is on the bleeding edge of what the drivers are prepared for, so dragons may lurk. But at least in principle it should work.
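For reference, creating a texture-backed FBO really is plain ES2/WebGL 1 API. A minimal sketch, assuming `gl` is a WebGLRenderingContext and `W`/`H` are the target size (float textures would additionally need OES_texture_float):

```javascript
// Create a texture and attach it to a framebuffer object as the color
// target, so a render pass can write into it. ES2/WebGL 1 calls only.
function createRenderTarget(gl, W, H) {
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // Allocate storage without uploading any initial data.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, W, H, 0,
                gl.RGBA, gl.UNSIGNED_BYTE, null);
  // NEAREST filtering: we want raw texel values, not interpolation.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

  const fbo = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, tex, 0);
  return { fbo, tex };
}
```

Whether a given driver lets you then use the result efficiently is exactly the bleeding-edge part.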


I've seen talk of CPU-less render-to-vertex-buffer dating back to 2004, but I'd never dug into how to actually do it until now. From what I can tell, it requires PBOs, which are not available in ES2. I guess copying back and forth over the bus, i.e. gl.readPixels() into a typed array followed by gl.bufferData(gl.ARRAY_BUFFER, pixels, gl.STREAM_DRAW), is still better than doing the math in JS. I might have to try combining the glReadPixels with the mapped buffer extension that is available on the iPhone :P
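That round trip over the bus looks roughly like the sketch below. This is the ES2/WebGL 1 fallback, not true render-to-vertex-buffer; `gl` is assumed to be a WebGLRenderingContext with the positions already rendered into the currently bound framebuffer, and `W`/`H` its size (the function name is mine):

```javascript
// Read the rendered data back to CPU memory, then re-upload it as a
// vertex buffer. One readback + one upload per frame, over the bus.
function copyFramebufferToVertexBuffer(gl, W, H) {
  // readPixels fills a typed array rather than returning one, so the
  // copy has to be staged through CPU memory.
  const pixels = new Uint8Array(W * H * 4);
  gl.readPixels(0, 0, W, H, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

  // Re-upload the same bytes as a streaming vertex array; STREAM_DRAW
  // hints that the contents are replaced every frame.
  const vbo = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, vbo);
  gl.bufferData(gl.ARRAY_BUFFER, pixels, gl.STREAM_DRAW);
  return vbo;
}
```

The cost is two full bus transfers per frame, which is why PBO-based render-to-vertex-buffer (where the copy stays on the GPU) is the thing everyone actually wants.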



