Storing data in textures
Sampling an animation is not a trivial task: it involves many loops and function calls, which makes animation sampling on the GPU a difficult problem. One way to address this problem is to simplify it.
Instead of sampling an animation in real time, it can be sampled ahead of time at set intervals. The process of sampling an animation at set intervals and writing the resulting data to a file is called baking.
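The baking step can be sketched as a simple loop over evenly spaced sample times. The `Clip` struct and `Bake` function below are hypothetical stand-ins, not the book's actual API; a real clip would produce joint transforms rather than a single float, but the sampling pattern is the same.

```cpp
#include <cmath>
#include <vector>

// Hypothetical stand-in for an animation clip. A real clip would
// evaluate skeletal poses; here, a sine wave keeps the sketch small.
struct Clip {
    float duration;                 // length of the clip, in seconds
    float Sample(float t) const {   // placeholder evaluation at time t
        return std::sin(t);
    }
};

// Bake the clip into numSamples evenly spaced samples covering
// the full range [0, duration].
std::vector<float> Bake(const Clip& clip, unsigned int numSamples) {
    std::vector<float> baked(numSamples);
    for (unsigned int i = 0; i < numSamples; ++i) {
        // Normalized sample position in [0, 1], scaled to clip time
        float t = (float)i / (float)(numSamples - 1);
        baked[i] = clip.Sample(t * clip.duration);
    }
    return baked;
}
```

The resulting array would then be written out, for example as the texels of a texture, so the shader never has to evaluate the clip itself.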
Once the animation data is baked, the shader no longer has to sample an actual animation clip. Instead, it can look up the nearest sampled pose based on the playback time. So, where does this animation data get baked to? It can be baked into textures. Textures can serve as data buffers, and shaders already have an easy way to read texture data.
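Looking up the nearest sampled pose amounts to mapping a playback time to a sample index. The helper below is an illustrative sketch (the name `NearestFrame` is assumed, not from the source), assuming the baked samples evenly cover the range from zero to the clip's duration.

```cpp
// Hypothetical helper: map a playback time to the index of the
// nearest baked sample. numSamples samples evenly cover [0, duration].
unsigned int NearestFrame(float time, float duration,
                          unsigned int numSamples) {
    float t = time / duration;       // normalize time to [0, 1]
    if (t < 0.0f) { t = 0.0f; }      // clamp out-of-range times
    if (t > 1.0f) { t = 1.0f; }
    // Round to the closest sample index
    return (unsigned int)(t * (float)(numSamples - 1) + 0.5f);
}
```

A shader would perform the same mapping, except that the index is further converted into a texture coordinate.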
Normally, the storage type and layout of a texture are abstracted away by the sampling function in the shader. For example, the texture2D function in GLSL takes normalized uv coordinates as an argument and returns a color value.