
Real-time ray tracing explained


Ray tracing is recognised as a significant leap in game engine technology because it makes light in real-time virtual environments behave as it does in real life. Employed in film VFX for years, and demanding significant amounts of compute power, it uses an algorithm to trace the path a beam of light would take in the physical world, so that light appears to bounce off objects, cast realistic shadows and create lifelike reflections. Whereas a single ray-traced film frame can take up to a day to render, real-time environments must achieve the effect at 60 or even 120 fps, which leaves roughly 16 milliseconds per frame at 60 fps and about 8 milliseconds at 120 fps.
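As a back-of-the-envelope illustration of that budget, this small Python snippet (not tied to any engine or API) simply divides one second by the frame rate:

# Illustrative arithmetic only: the render budget available per frame.
for fps in (24, 60, 120):
    print(f"{fps:>3} fps -> {1000.0 / fps:.1f} ms per frame")
# 24 fps -> 41.7 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms

A film renderer can spend hours inside a single frame; a real-time renderer cannot, which is why the shortcuts described below matter.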

PCs with fast GPUs, typically from the NVIDIA RTX range, have the rendering power to produce ray-traced scenes on the fly, replacing predictable, baked-in effects with genuinely realistic, dynamic reflections, shadows, blooms and lens flares.

How it works

In real life, light sources emit streams of photons that bounce off surfaces before reaching our retinas, where our brains combine them into one complete picture. Ray tracing works in reverse, so that only the light paths the viewer actually sees are traced and rendered: rays are cast outward from the viewer / camera lens perspective, bouncing off objects and surfaces, sometimes picking up their colour and reflective properties, until the light sources that would affect each ray are determined. Because real-time budgets allow only a few rays per pixel, the raw result is noisy, and different classes of denoisers are used to assemble the final image.
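To make the backward-tracing idea concrete, here is a deliberately tiny sketch in Python. Everything in it is invented for illustration (the single sphere, the point light, the character-ramp "screen" and the camera at the origin): it casts one ray per pixel outward from the camera, finds the nearest sphere intersection, and shades the hit with a simple Lambertian term. It traces no secondary bounces and does no denoising; those are the parts real-time pipelines layer on top.

import math

# A minimal backward ray tracer (illustrative sketch only).
# Scene values are invented: one sphere, one point light, camera at the origin.
WIDTH, HEIGHT = 40, 20
SPHERE_CENTRE, SPHERE_RADIUS = (0.0, 0.0, -3.0), 1.0
LIGHT_POS = (2.0, 2.0, 0.0)

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def hit_sphere(origin, direction):
    # Nearest positive t with |origin + t*direction - centre|^2 = radius^2.
    oc = sub(origin, SPHERE_CENTRE)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

for j in range(HEIGHT):
    row = ""
    for i in range(WIDTH):
        # One ray per pixel, cast outward from the viewer / camera position.
        x = 2.0 * (i + 0.5) / WIDTH - 1.0
        y = 1.0 - 2.0 * (j + 0.5) / HEIGHT
        ray_dir = norm((x, y, -1.0))
        t = hit_sphere((0.0, 0.0, 0.0), ray_dir)
        if t is None:
            row += " "  # ray escaped the scene: background
        else:
            p = tuple(t * d for d in ray_dir)   # hit point on the sphere
            n = norm(sub(p, SPHERE_CENTRE))     # surface normal at the hit
            l = norm(sub(LIGHT_POS, p))         # direction towards the light
            shade = max(dot(n, l), 0.0)         # Lambertian (diffuse) term
            row += ".:-=+*#%@"[int(shade * 8.999)]
    print(row)

Real-time tracers perform this same camera-outward casting on the GPU, add secondary bounces for reflections and shadows, and, because budgets allow only a few rays per pixel, rely on denoisers to reconstruct a clean image.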

Making the most of ray tracing in real-time visualisation is one of Mondatum’s particular areas of expertise. To find out how we can help you with your plans for this exciting new area of technology, get in touch – contact@mondatum.com.

Source: Wired


