At this year’s SIGGRAPH, the USC Institute for Creative Technologies and Imperial College London presented a technique for “synthesizing the effects of skin microstructure deformation by anisotropically convolving a high-resolution displacement map to match normal distribution changes in measured skin samples.”
In plain terms, this technique brings us closer to realistic dynamic skin, and therefore to more believable characters: as skin stretches or compresses, its micro-scale furrows flatten or deepen, and the method reproduces that change in surface detail on the fly.
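To make the idea concrete, here is a minimal sketch of the core operation as described in the abstract: anisotropically convolving a displacement map so that stretch along one axis smooths the micro-relief along that axis. This is an illustrative approximation only; the function name, the Gaussian blur model, and the stretch-to-sigma mapping are assumptions for the example, whereas the actual paper matches measured normal-distribution changes from real skin samples.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def deform_microstructure(displacement, stretch_u, stretch_v, base_sigma=1.0):
    """Approximate microstructure deformation by anisotropically blurring
    a high-resolution displacement map (hypothetical sketch).

    When skin stretches along an axis, micro-furrows flatten in that
    direction, narrowing the distribution of surface normals; a larger
    Gaussian sigma along the stretched axis mimics that narrowing.
    stretch_u / stretch_v > 1.0 mean the patch is stretched on that axis.
    """
    sigma_u = base_sigma * max(stretch_u - 1.0, 0.0)
    sigma_v = base_sigma * max(stretch_v - 1.0, 0.0)
    # A per-axis sigma makes this an anisotropic Gaussian convolution.
    blurred = gaussian_filter(displacement, sigma=(sigma_v, sigma_u))
    # Attenuate residual relief in proportion to area stretch so the
    # overall displacement amplitude stays plausible.
    return blurred / (stretch_u * stretch_v) ** 0.5

# Example: a random micro-displacement patch stretched horizontally.
rng = np.random.default_rng(0)
patch = rng.normal(0.0, 0.01, size=(64, 64))
deformed = deform_microstructure(patch, stretch_u=1.5, stretch_v=1.0)
```

In the real system this convolution runs per-frame in a GPU shader over measured facial micro-geometry rather than on the CPU as above.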
As the USC Institute for Creative Technologies and Imperial College London note, the technique works in real time: the paper and the videos below include both real-time renders done with GPU shaders and offline renders.
Whether game developers will take advantage of this tech remains to be seen. Still, what ICTGraphicsLab has achieved here is really impressive.