You can find (the full source of) the fragment shader here: https://www.shadertoy.com/view/XtcyW4.
The theme for this year’s competition was “humans.” A distinctive trait of humans is the way they move: it is very easy for us to pick out a human from a row of dancing animals. That is why I chose to focus on animation for this shader.
To get natural movement, I wanted to base the animation on motion capture. In general, an animation contains a lot of data, so I had to find a way to compress it enough to store the whole animation in the code of the fragment shader. In the end, a Fourier transform gave me the compression I needed.
To create the shader above, I preprocessed the animation by taking the Fourier transform of the position of each of the 14 bones over all 760 animation frames. Only a fraction of the calculated coefficients is stored in the shader: the first coefficients with 16-bit precision and the later ones with 8-bit precision. This greatly reduces the amount of data while keeping enough precision to produce a recognizable dancing human.
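The preprocessing step can be sketched in NumPy roughly as follows. This is not the actual tool I used; the number of coefficients kept at each precision (`n16`, `n8`) and the normalization scheme are assumptions for illustration, and only one coordinate track of one bone is shown:

```python
import numpy as np

def compress_track(track, n16=8, n8=24):
    """Keep the first n16 Fourier coefficients at 16-bit precision and
    the next n8 at 8-bit precision; drop all the rest."""
    coeffs = np.fft.rfft(track)            # complex spectrum of the track
    kept = coeffs[:n16 + n8]
    scale = np.abs(kept).max() or 1.0      # normalize before quantizing
    q16 = np.round(kept[:n16] / scale * 32767)   # 16-bit signed range
    q8 = np.round(kept[n16:] / scale * 127)      # 8-bit signed range
    return q16, q8, scale

def decompress_track(q16, q8, scale, n_frames):
    """Inverse of compress_track: rebuild the truncated spectrum,
    zero-pad the dropped coefficients, and inverse-transform."""
    kept = np.concatenate([q16 / 32767, q8 / 127]) * scale
    spectrum = np.zeros(n_frames // 2 + 1, dtype=complex)
    spectrum[:len(kept)] = kept
    return np.fft.irfft(spectrum, n=n_frames)

# Toy example: a smooth periodic track, like a bone swaying while dancing.
t = np.linspace(0, 2 * np.pi, 760, endpoint=False)
track = 0.5 * np.sin(2 * t) + 0.1 * np.sin(5 * t)
q16, q8, scale = compress_track(track)
rebuilt = decompress_track(q16, q8, scale, len(track))
print(np.max(np.abs(track - rebuilt)))  # small reconstruction error
```

Because smooth motion concentrates its energy in the low frequencies, truncating and coarsely quantizing the high ones loses very little visible detail.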
When the shader is running, the positions of the bones are reconstructed in each frame by taking the inverse Fourier transform of this data (this is done in Buffer A).
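In a shader there is no FFT library, so the inverse transform amounts to evaluating the truncated Fourier series directly at the current frame. A minimal Python sketch of that per-frame evaluation (the coefficient layout follows NumPy's `rfft` convention; the function name is hypothetical):

```python
import numpy as np

def bone_position_at(coeffs, frame, n_frames):
    """Evaluate a real-valued track from its first rfft coefficients
    at a single frame, i.e. a truncated inverse Fourier transform."""
    pos = coeffs[0].real / n_frames            # DC term (mean position)
    for k in range(1, len(coeffs)):
        angle = 2 * np.pi * k * frame / n_frames
        # Re(X[k] * e^{i*angle}), doubled because negative
        # frequencies are folded into the real spectrum
        pos += 2 / n_frames * (coeffs[k].real * np.cos(angle)
                               - coeffs[k].imag * np.sin(angle))
    return pos

# Cross-check against the original signal on a toy track.
t = np.arange(760)
track = np.cos(2 * np.pi * 3 * t / 760)
coeffs = np.fft.rfft(track)[:16]    # keep only the first 16 coefficients
print(abs(bone_position_at(coeffs, 100, 760) - track[100]))  # close to zero
```

The sum over a few dozen sines and cosines per bone is cheap enough to run every frame, which is essentially what Buffer A does with the stored coefficients.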
I used (part of) an animation from the Carnegie Mellon University Motion Capture Database. The animations in this database are free to use.
Image-Based Lighting (IBL) is used to render the scene. See my shader “Old watch (IBL)” for a clean implementation of IBL.
If you like this post, you may also like one of my other posts:
- Image-Based Lighting
- Raymarching distance fields
- Augmented Reality and Shadertoy
- A shader quine
- Ray Tracing – Primitives