Using the GPU from JavaScript

Everyone knows that writing programs that exploit the GPU (Graphics Processing Unit) in your computer’s video card requires special arcane tools, right? Well, thanks to [Matthew Saw], [Fazil Sapuan], and [Cheah Eugene], perhaps not. At a hackathon, they turned out a JavaScript library that lets you create “kernel” functions to execute on the target system’s GPU. There’s a demo available with a benchmark that, on our machine, sped up a 512×512 calculation by well over five times. You can download the library from the same page, and there’s also a GitHub page.

The documentation is a bit sparse but readable. You simply define the function you want to execute and the dimensions of the problem. You can specify one, two, or three dimensions, as suits your problem space. When you execute the associated function, it will try to run the kernels in parallel on your GPU. If it can’t, it will still get the right answer, just more slowly.
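The kernel model is easy to picture in plain JavaScript. The sketch below is not the library’s actual code, just a hypothetical CPU emulation of what a two-dimensional kernel amounts to: the kernel body reads its coordinates from `this.thread`, and a runner invokes it once per output cell — which is essentially what the slow, no-GPU fallback path has to do, one thread at a time.

```javascript
// Hypothetical CPU emulation of a GPU.js-style 2D kernel (names are ours).
// In the library, a kernel body reads this.thread.x / this.thread.y;
// here we replicate that contract with an ordinary nested loop.
function runKernel2D(kernel, width, height) {
  const out = [];
  for (let y = 0; y < height; y++) {
    const row = [];
    for (let x = 0; x < width; x++) {
      // Bind a `thread` object so the kernel body can use this.thread.x/y.
      row.push(kernel.call({ thread: { x, y } }));
    }
    out.push(row);
  }
  return out;
}

// Example kernel: each cell holds the product of its thread indices.
const result = runKernel2D(function () {
  return this.thread.x * this.thread.y;
}, 4, 4);
// result[3][2] === 6
```

On a real GPU, each of those cells would be computed by its own shader invocation in parallel; the serial loop is only the conceptual (and fallback) view.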

You can get the results as an array or as a graphic on a canvas object. That should open up some interesting possibilities. However, you should know that there are limits to the kinds of JavaScript you can execute as part of the kernel function. You can read the documentation, but the upshot is that the functions are pretty much numeric, with if statements and for loops that have fixed limits.
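To make that restriction concrete, here is a sketch of the style a kernel body has to follow: numeric math, `if` statements, and `for` loops whose bounds are fixed up front. The function below is plain JavaScript standing in for a kernel body (the library would compile something like it to a shader); the names and the explicit `x`/`y` parameters are ours, for illustration only.

```javascript
// Kernel-style matrix multiply body: numeric ops only, and a for loop
// with a fixed bound known before the kernel runs (as the docs require).
const SIZE = 4; // fixed loop limit

function matMulKernel(a, b, x, y) {
  let sum = 0;
  for (let i = 0; i < SIZE; i++) { // fixed-limit loop
    sum += a[y][i] * b[i][x];
  }
  return sum;
}

// Quick CPU check: multiplying by the identity returns the original entry.
const I = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]];
const A = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]];
const cell = matMulKernel(A, I, 2, 1);
// cell === A[1][2] === 7
```

Dynamic loop bounds, closures, string handling, and the like are off the table because the body ultimately has to become straight-line shader code.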

There’s at least one other project that does this: turbo.js, which also has a GitHub site. It takes a slightly different approach, as the kernel is written in GLSL — a C-like shader language — instead of JavaScript. It also requires you to code your own fallback for when there is no GPU present.
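A hand-rolled fallback of that sort is just the same element-wise math in an ordinary loop. The sketch below is hypothetical and not turbo.js’s API: the CPU function mirrors what a trivial GLSL kernel would compute per element, so you could call it whenever WebGL isn’t available.

```javascript
// Hypothetical CPU fallback for a turbo.js-style element-wise kernel.
// The GLSL version would compute `v * factor` per element on the GPU;
// this loop does the same work when no GPU (WebGL) is available.
function scaleAllCPU(data, factor) {
  const out = new Float32Array(data.length);
  for (let i = 0; i < data.length; i++) {
    out[i] = data[i] * factor;
  }
  return out;
}

const doubled = scaleAllCPU(new Float32Array([1, 2, 3]), 2);
// doubled is Float32Array [2, 4, 6]
```

The annoyance, as the project notes, is that you have to keep the GLSL and JavaScript versions of the math in sync yourself — something GPU.js avoids by deriving both from the same function.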

There was a time when buying a vector processor was a major purchase. Now there’s one in your PC, drawing your web pages. If you want a different approach, you could always build your own cluster. You can even use a stack of Pis.


Filed under: Software Development, software hacks

from Hackaday http://ift.tt/2vwfiRb