You can find the full source of the fragment shader here: https://www.shadertoy.com/view/XtcyW4.
The theme for this year's competition was “humans”. A distinctive trait of humans is the way they move: it is very easy for us to pick the human out of a row of dancing animals. That is why I chose to focus on animation for this shader.
To get natural movement, I wanted to use an animation based on motion capture. An animation typically contains a lot of data, so I had to find a way to compress it so that it could be stored in the code of the fragment shader. In the end, a Fourier transform gave me the compression I needed.
To create the shader above, I preprocessed the animation by taking the Fourier transform of the positions of all bones (14) over all frames of the animation (760). Only a fraction of the calculated coefficients are actually stored in the shader: the first coefficients with 16-bit precision, later coefficients with 8-bit precision. This greatly reduces the amount of data while keeping the precision needed for a recognizable dancing human.
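The preprocessing step can be sketched as follows. This is an illustrative Python sketch, not the actual offline tool: the coefficient counts (`n_hi`, `n_lo`), the `1/n` normalization, and the assumption that positions are pre-scaled into `[-1, 1]` are all mine; in practice one such channel would exist per coordinate of each of the 14 bones.

```python
import cmath

def dft_coeffs(signal, n_coeffs):
    """First n_coeffs coefficients of the DFT of a real-valued signal,
    normalized by 1/n so coefficient magnitudes stay in a predictable range."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) / n
            for k in range(n_coeffs)]

def quantize(value, bits, max_abs=1.0):
    """Map a value in [-max_abs, max_abs] to a signed integer of the given bit depth."""
    scale = (1 << (bits - 1)) - 1
    return max(-scale, min(scale, round(value / max_abs * scale)))

def compress_channel(signal, n_hi=4, n_lo=12):
    """Keep only the first n_hi + n_lo coefficients: the first n_hi
    quantized to 16 bits, the remaining n_lo to 8 bits."""
    coeffs = dft_coeffs(signal, n_hi + n_lo)
    return [(quantize(c.real, 16 if i < n_hi else 8),
             quantize(c.imag, 16 if i < n_hi else 8))
            for i, c in enumerate(coeffs)]
```

Because the low-frequency coefficients dominate smooth motion, truncating and quantizing this way loses little of the perceived movement while shrinking 760 frames of data to a handful of integers per channel.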
When the shader is running, the positions of the bones are reconstructed each frame by taking the inverse Fourier transform of this data (this is done in Buffer A).
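The per-frame reconstruction (the job Buffer A does) is then a truncated inverse transform. A minimal sketch, assuming the one-sided, `1/n`-normalized coefficients of the preprocessing step (for a real signal, coefficients `k` and `n-k` are complex conjugates, so only one side needs to be stored):

```python
import cmath

def reconstruct(coeffs, frame, n_frames):
    """Evaluate one channel at a given frame from one-sided, truncated
    DFT coefficients of a real signal:
        x(t) ~ Re(c_0) + 2 * sum_{k>=1} Re(c_k * e^(2*pi*i*k*t/n))
    """
    x = coeffs[0].real
    for k in range(1, len(coeffs)):
        x += 2.0 * (coeffs[k] * cmath.exp(2j * cmath.pi * k * frame / n_frames)).real
    return x
```

In the shader the same sum would run over the dequantized coefficients for every bone, once per frame; a naive loop is enough because only a handful of coefficients survive the compression.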
I have used part of an animation from the Carnegie Mellon University Motion Capture Database. The animations in this database are free to use.
Image Based Lighting (IBL) is used to render the scene. Have a look at my shader “Old watch (IBL)” for a clean implementation of IBL.
If you like this post, you may also like one of my other posts:
- Image-Based Lighting
- Augmented Reality and Shadertoy
- Raymarching distance fields
- A shader quine
- More spheres