Arbitrary Computation on the GPU Using WebGL


WebGL is already widely used for 3D graphics and image processing, but it could be used for much more. We'll learn how to pass arbitrary data to the GPU for parallel processing, how to get that processed data back into JavaScript, and all the reasons this doesn't work even when it should.

This talk starts with a review of WebGL shaders and how data is passed through the WebGL pipeline and processed in parallel. Next, I'll show some examples of traditional image processing using WebGL. After that, I'll explain how to pass arbitrary data into the GPU and how to retrieve output from those parallel calculations. Finally, there will be an example of this system in action, followed by several examples that don't work as expected, with explanations of the limitations of the current WebGL architecture.

Code of Conduct

We are an inclusive, kind community that is constantly growing. Please read our Code of Conduct, and try to make your fellow Cascadians (& fellow programmers) feel welcome!