Blender will get OpenPGL for Apple Silicon soon. What is the use case for OpenPGL?
devtalk.blender.org
Whenever a ray hits an object, you have to bounce it again in a new "random" direction. This is why 3D renders are so noisy: with only 1 sample, one pixel's ray might bounce off and hit a light (so it renders bright), while the neighbouring pixel's ray bounces off into space (and is rendered as black).
The more samples you take and average, the closer the result converges on the "correct" answer.
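To make that concrete, here's a minimal Python sketch (nothing Blender-specific; the hit probability P_HIT_LIGHT is invented for illustration). One pixel's bounced ray either finds a light or misses, and the noise across repeated renders shrinks roughly as 1/sqrt(samples):

```python
import random
import statistics

# Toy model of one pixel: a bounced ray "hits the light" with
# probability P_HIT_LIGHT and returns brightness 1.0, otherwise 0.0.
# The true pixel value is therefore P_HIT_LIGHT itself.
P_HIT_LIGHT = 0.2

def one_sample():
    return 1.0 if random.random() < P_HIT_LIGHT else 0.0

def render_pixel(num_samples):
    return sum(one_sample() for _ in range(num_samples)) / num_samples

# Noise (std dev across repeated renders) falls roughly as
# 1/sqrt(num_samples): 4x the samples only halves the noise.
for n in (1, 16, 256):
    estimates = [render_pixel(n) for _ in range(1000)]
    print(f"{n:3d} samples: mean={statistics.mean(estimates):.3f} "
          f"noise={statistics.stdev(estimates):.3f}")
```

That 1/sqrt(n) rate is why brute-force sampling gets expensive fast.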
The problem is that just randomly picking a direction to bounce in kinda sucks, because then you need a huge number of samples to converge on the "true" values.
It's much better if you can sort of scan the scene and say "well I know that there is a light here and the majority of the light hitting this point in space is coming from this light, so I'll try to fire most of my samples off towards that light instead of just randomly."
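Here's a rough 1D analogue of why that wins (a Python sketch; the peaked light() function stands in for "most of the light arrives from one direction" and is made up for illustration). Drawing samples from a distribution shaped like the light, then dividing each by that distribution's pdf to keep the estimate unbiased, converges far faster than uniform sampling:

```python
import math
import random

SIGMA = 0.01  # width of the "light source" peak

# 1D stand-in for directions: nearly all the light arrives in a
# narrow peak around x = 0.5 (the visible light source).
def light(x):
    return math.exp(-((x - 0.5) ** 2) / (2 * SIGMA ** 2))

def estimate_uniform(n):
    # "Random bounce": sample uniformly on [0, 1], pdf = 1 everywhere.
    return sum(light(random.random()) for _ in range(n)) / n

def estimate_guided(n):
    # "Guided bounce": draw from a Gaussian centred on the light,
    # then divide by its pdf so the estimator stays unbiased.
    total = 0.0
    for _ in range(n):
        x = random.gauss(0.5, SIGMA)
        pdf = light(x) / (SIGMA * math.sqrt(2 * math.pi))
        if 0.0 <= x <= 1.0:
            total += light(x) / pdf
    return total / n

# True integral is SIGMA * sqrt(2*pi) ~= 0.0251. With 64 samples the
# guided estimate is essentially exact; the uniform one is mostly noise.
print("uniform:", estimate_uniform(64))
print("guided :", estimate_guided(64))
```

Path guiding generalizes this: instead of being told up front where the light is, the renderer learns the distribution as it goes.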
This is even more important when it comes to caustics, where light is focused through reflective or refractive surfaces along paths that plain random bouncing almost never finds, so it can take an incredible number of samples to converge on the "truth".
OpenPGL (the Open Path Guiding Library) is a library for exactly this: it learns, as the render progresses, which directions carry the most light at each point in the scene, so the renderer can steer its bounces toward the important parts instead of wasting its first samples on rays that randomly bounce off into the void.
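Conceptually it works something like this hand-rolled Python sketch (the class and method names are illustrative only, not OpenPGL's actual C++ API): the renderer records which bounce directions actually carried light, and later bounces sample from that learned distribution:

```python
import random

class GuidingField:
    """Illustrative stand-in for a path-guiding structure: a learned
    distribution over discretized bounce directions (not OpenPGL's API)."""

    def __init__(self, num_bins):
        self.weights = [1.0] * num_bins  # start out uniform

    def record(self, direction_bin, radiance):
        # Training pass: remember which directions carried light.
        self.weights[direction_bin] += radiance

    def sample(self):
        # Guided bounce: pick a direction with probability proportional
        # to the light seen so far; also return its pdf so the renderer
        # can weight the sample and stay unbiased.
        total = sum(self.weights)
        r = random.random() * total
        for i, w in enumerate(self.weights):
            r -= w
            if r <= 0.0:
                return i, w / total
        return len(self.weights) - 1, self.weights[-1] / total

field = GuidingField(num_bins=8)
field.record(direction_bin=3, radiance=5.0)  # a bounce that found light
print(field.sample())  # now heavily biased toward bin 3
```

In the real library the field is spatial (different distributions for different regions of the scene) and the distributions are continuous mixtures rather than bins, but the record-then-sample loop is the same idea.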