A list of path tracing shaders

I have gathered a list of path tracing shaders on ShaderToy.

Path tracing is a surprisingly simple technique to render realistic images. That would be my definition if you are unfamiliar with the term; if you already have experience with various ray tracing techniques, I would rather say that path tracing is a remarkably elegant solution to the rendering equation. You can implement a toy path tracer in a weekend or, if you’ve already done it a few times before, within 25 minutes.
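
To give an idea of how little is needed, here is a minimal sketch of the core loop of such a toy path tracer. It is written in C++ rather than as a shader, and the toy scene (a grey ground plane under a constant-ish sky) and every helper in it are made up for the illustration, not taken from any of the shaders in the list.

```cpp
#include <cmath>
#include <cstdio>
#include <random>

struct Vec { double x, y, z; };
static Vec add(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec mul(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }

static std::mt19937 rng(1234);
static double rnd() { return std::uniform_real_distribution<double>(0.0, 1.0)(rng); }

// Cosine-weighted direction in the upper hemisphere (around +Y); pdf = cos(theta)/pi,
// which conveniently cancels the cosine factor and the 1/pi of the Lambert BRDF.
static Vec cosineSampleUp() {
    const double PI = 3.14159265358979323846;
    double u = rnd(), v = rnd();
    double r = std::sqrt(u), phi = 2.0 * PI * v;
    return {r * std::cos(phi), std::sqrt(1.0 - u), r * std::sin(phi)};
}

// Estimate the radiance along one ray by following a random light path:
// the scene is a ground plane at y = 0 with albedo 0.5, under a sky that is
// brighter towards the zenith.
static Vec radiance(Vec origin, Vec dir) {
    Vec throughput = {1.0, 1.0, 1.0};
    for (int bounce = 0; bounce < 16; ++bounce) {
        if (dir.y >= 0.0) {                       // the ray escapes: gather the sky
            Vec sky = mul({0.6, 0.7, 1.0}, dir.y);
            return {throughput.x * sky.x, throughput.y * sky.y, throughput.z * sky.z};
        }
        double t = -origin.y / dir.y;             // intersect the ground plane y = 0
        origin = add(origin, mul(dir, t));
        throughput = mul(throughput, 0.5);        // multiply by the plane's albedo
        dir = cosineSampleUp();                   // the plane's normal is +Y, so no frame change needed
        if (bounce > 3) {                         // Russian roulette on long paths
            double p = 0.8;
            if (rnd() > p) break;
            throughput = mul(throughput, 1.0 / p);
        }
    }
    return {0.0, 0.0, 0.0};
}

int main() {
    Vec sum = {0.0, 0.0, 0.0};
    const int N = 10000;                          // average many random paths for this one ray
    for (int i = 0; i < N; ++i)
        sum = add(sum, radiance({0.0, 1.0, 0.0}, {0.0, -1.0, 0.0}));
    std::printf("estimated radiance: %f %f %f\n", sum.x / N, sum.y / N, sum.z / N);
    return 0;
}
```

The same structure (bounce, multiply the throughput by the material response, terminate with Russian roulette, average many random paths per pixel) is what the shaders in the list implement, with far more interesting scenes and sampling strategies.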

Recently I was reading up on path tracing and some of the techniques that can be used with it, like next event estimation, bidirectional path tracing, Russian roulette, etc. This is a case where ShaderToy can be an invaluable source of examples and information, so I was browsing path tracing shaders there. As the number of open tabs was starting to get impractical, I decided to use the “playlist” feature of ShaderToy to bookmark them all.

You can find the list here: Path tracing, on ShaderToy.

The path tracers listed range from very naive or hacky implementations to ones showcasing rendering features like advanced BRDFs, volumetric lighting or spectral rendering, and various noise reduction techniques such as next event estimation, bidirectional path tracing, multiple importance sampling, accumulation over frames with temporal reprojection, screen space blue noise, or convolutional neural network based denoising.

Some of those shaders are meant to be artworks, but even the technical experimentation ones look nice, because the global illumination inherent to path tracing tends to generate images that are pretty.

Screenshot of the list on ShaderToy, with various kinds of path tracers visible.

Reading list on ReSTIR

Recently a short video from dark magic programmer Tomasz Stachowiak made the rounds in the graphics programming community, to the sound of jaws hitting the floor in its wake. It shows his recent progress on his pet renderer project: beautiful real-time global illumination with fast convergence and barely any noise, in a static environment with dynamic lighting.

In a Twitter thread where he discussed some details, one keyword in particular caught my attention: ReSTIR.

ReSTIR stands for “Reservoir-based Spatio-Temporal Importance Resampling”. It is a sampling technique published at SIGGRAPH 2020 that has been refined ever since.
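
The heart of the technique is weighted reservoir sampling, used to perform resampled importance sampling in a streaming fashion and then to reuse results across neighbouring pixels and frames. Here is a minimal sketch of the per-pixel data structure as I understand it from the paper; the LightSample type and the target function p_hat (for example the unshadowed contribution of a light) are hypothetical placeholders, not code from any actual implementation.

```cpp
struct LightSample {
    int lightIndex = -1;   // placeholder payload: which light, sampled point, emission, ...
};

struct Reservoir {
    LightSample y;         // the candidate currently kept
    double wSum = 0.0;     // running sum of resampling weights
    int M = 0;             // number of candidates seen so far
    double W = 0.0;        // contribution weight to use when shading with y

    // Stream in one candidate x with resampling weight w = p_hat(x) / p(x),
    // where p is the (cheap) distribution x was drawn from. The candidate is
    // kept with probability w / wSum; u is a uniform random number in [0, 1).
    void update(const LightSample& x, double w, double u) {
        wSum += w;
        M += 1;
        if (u * wSum < w) y = x;
    }

    // Once all candidates have been streamed in, this weight lets y be used as
    // if it had been drawn (approximately) proportionally to p_hat.
    void finalize(double pHatOfY) {
        W = (pHatOfY > 0.0) ? wSum / (M * pHatOfY) : 0.0;
    }
};
```

Roughly speaking, the spatial and temporal parts of ReSTIR then come from merging the reservoirs of neighbouring pixels and of the previous frame (each merge is itself an update weighted by the other reservoir’s W and M), which is what lets a huge number of candidate light samples be amortized over very few shading evaluations.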

The original publication

Spatiotemporal reservoir resampling for real-time ray tracing with dynamic direct lighting
The publication page includes the recording of the SIGGRAPH presentation, with a well-articulated explanation of the technique by lead author Benedikt Bitterli.
(same publication hosted on the NVidia website).

Explanations of ReSTIR

Improvements over the original publication

After the initial publication, NVidia published a refined version producing images with less noise at a lower cost, which they call “RTXDI” (for RTX Direct Illumination).

Other limitations

When discussing some of the limitations of ReSTIR on Twitter, Chris Wyman made the following remarks:

To be clear, right now, ReSTIR is a box of razor blades without handles (or a box of unlabeled knobs). It’s extremely powerful, but you have to know what you’re doing. It is not intuitive, if your existing perspective is traditional Monte Carlo (or real-time) sampling techniques.

People sometimes think SIGGRAPH paper = solved. Nope. We’ve learned a lot since the first paper, and our direct lighting is a lot more stable with that knowledge. We’re still learning how to do it well on full-length paths.

And there’s a bunch of edge cases, even in direct lighting, that we know how to solve but haven’t had time to write them up, polish, and demo.

We haven’t actually tried to solve the extra noise at disocclusions in (what I think of as) a very principled way. Right now a world-space structure is probably the best way. I’m pretty sure it can be done without a (formal) world-space structure, just “more ReSTIR.”

Physically based shading references, at the end of 2019

A lot has happened in the graphics community in the last ten years, especially when it comes to physically based rendering (PBR). It started to become popular around 2009, as hardware made more powerful models affordable, and really took over between 2010 and 2014. Real-time engines started replacing Phong and Blinn-Phong with a normalized Blinn-Phong, until pretty much everyone switched to GGX and its long trailing reflections. Researchers explored how to make it work with image based lighting (IBL) and area lights, and it seems that nowadays everyone is looking at the secondary bounce problem.

I am not sure why adoption happened at the same time in the film industry (instead of much earlier), despite it having different constraints than real-time. Films made before 2010 mostly relied on ad hoc models, until a wave converted nearly the entire industry to unbiased path tracing.

I gathered a first PBR reading list back in 2011, but since then the community has collectively made great strides, and I also have a better understanding of the topic myself. So I think it is time to revisit it with a new, updated (and unfortunately longer) reading list.

However, covering the entire PBR pipeline would be far too vast, so I am going to focus on physically based shading instead, and ignore topics like physical lighting units, physically based cameras or photogrammetry, even though some of the links touch on them.

Note: If you see mistakes, inaccuracies or missing important pieces, please let me know. I expect to update this article accordingly during the next few weeks.

Courses and tutorials

  • Physically Based Shading in Theory and Practice (formerly “Practical Physically Based Shading in Film and Game Production”)
    2010, (no 2011?), 2012, 2013, 2014, 2015, 2016, 2017.
    This recurring SIGGRAPH course by the leading practitioners of the field is a fantastic resource and a must-see for anyone interested in the topic. Naty Hoffman, then Stephen Hill, have been hosting the course material on their websites for several years. Some of the presentations are also available on YouTube.
  • Physically Based Rendering: From Theory To Implementation, Third edition, 2016, Matt Pharr, Wenzel Jakob, and Greg Humphreys
    As of 2018, the content of this reference book is entirely available online.
  • Implementation Notes: Runtime Environment Map Filtering for Image Based Lighting, 2015, Padraic Hennessy.
    Details how to implement the environment map filtering described in the Karis and Lagarde publications (see below), then how to optimize it by reducing the number of samples, thanks to importance sampling and to rejecting samples that don’t contribute.
  • Image Based Lighting, 2015, Chetan Jaggi.
    Focused on specular reflections, the article presents the implementation of image based lighting (IBL) using the split sum approximation from Unreal Engine 4 (described below), and how to improve quality for several cases.
  • Physically Based Rendering Algorithms: A Comprehensive Study In Unity3D, 2017?, Jordan Stevens.
    This tutorial explains what the different parts of the Bidirectional Reflectance Distribution Function (BRDF) mean, lists many of the available building blocks, and shows them in isolation. It is directed at Unity, but translates easily to other environments.
  • LearnOpenGL’s PBR series (theory, lighting, diffuse irradiance, specular IBL), 2017, Joey de Vries.
    An excellent introduction that explains the basics and walks the reader through the implementation of a shader based on the same model as Unreal Engine 4 (detailed below). There seems to be some confusion between albedo and base colour, but it is otherwise clear and well structured.

Real world references

In-depth overviews

The following publications all describe the work of teams who had to take stock of the existing options and choose a model for their particular needs.

  • Physically-Based Shading at Disney (slides), 2012, Brent Burley et al.
    What came to be known as the “Disney BRDF” was a milestone in PBR literature, and a reference many other works are built upon. It compares different existing models to the MERL database (see previous section), notes their strengths and weaknesses, discusses at length the observed behaviour, especially the diffuse response at grazing angles, and proceeds to define their own empirical shading model to mimic that behaviour. The Disney BRDF is designed to be robust and expressive, but also simple and intuitive for artists.
    The appendix gives a brief overview of the history of BRDF models.
    The publication proposes a tool, the BRDF Explorer, to visualize and compare analytic BRDF models or measured ones.
    (In 2015, a follow up publication extended their model to a full BSDF in order to support refraction and scattering, but this falls out of the scope of this already long list.)
  • Real Shading in Unreal Engine 4 (slides), 2013, Brian Karis.
    Strongly inspired by the Disney BRDF, it presents a similar shading model. It prefers a simple Lambert diffuse BRDF, both for its cost and for its integration with spherical harmonics, and uses other approximations suited to real time. The course notes mention that a lot of work was done to compare the various available building blocks, but don’t list them; Karis does so in a separate publication (the next entry).
    For image based lighting, the famous “split sum” approximation of the integral is introduced, which allows one part of the integral to be pre-convolved into the environment map and the rest to be precomputed in a 2D look-up table (LUT); a sketch of the idea is given after this list.
    When explaining how the workflow adapted to this new model, the course notes stress the importance of having linear parameters for material interpolation.
    Warning: I was told a year ago by Yusuke Tokuyoshi that there is an error in one of the derivations, but my understanding of it is not sufficient to spot it. Apparently the error is only in the publication and was fixed in the actual code.
  • Specular BRDF Reference, 2013, Brian Karis.
    Lists the various building blocks available for the Cook-Torrance BRDF, using the same naming convention.
  • Moving Frostbite to Physically Based Rendering 3.0, 2014, Sébastien Lagarde.
    The biggest publication in this list, with over 120 pages of course notes. I haven’t finished reading it yet, but this is an outstanding piece of work that goes deep into the details of many of the aspects involved.
  • Physically Based Rendering in Filament, 2018, Romain Guy et al.
    This documentation presents the shading model used in Filament, the choices that were made (similar to Frostbite’s in many ways), and the alternatives that were available. The quality of this document is outstanding, and it seems to be becoming a reference for PBR implementations.
  • MaterialX Physically-Based Shading Nodes, 2019, Niklas Harrysson, Doug Smythe and Jonathan Stone.
    This specification is meant as a transfer format for the VFX industry. It describes a wide range of materials, not limited to BRDFs but also including emissive and volumetric materials, and allows choosing between a variety of such functions.
    Reading this document can help solidify or confirm one’s understanding of how all these different functions contribute to the rendering ecosystem. However, I would only recommend it to readers who already have a fairly good understanding of PBR models.
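
To make the “split sum” mentioned in the Unreal Engine 4 entry above a bit more concrete, here is the idea as presented in Karis’s course notes (reproduced from memory, so refer to the notes for the exact derivation): the Monte Carlo estimate of the lighting integral is split into the product of two sums,

\[
\frac{1}{N}\sum_{k=1}^{N} \frac{L_i(\omega_k)\, f(\omega_k, \omega_o)\,\cos\theta_k}{p(\omega_k)}
\;\approx\;
\left(\frac{1}{N}\sum_{k=1}^{N} L_i(\omega_k)\right)
\left(\frac{1}{N}\sum_{k=1}^{N} \frac{f(\omega_k, \omega_o)\,\cos\theta_k}{p(\omega_k)}\right).
\]

The first factor is precomputed by prefiltering the environment map (one mip level per roughness), while the second depends only on the roughness and on N·V, so it can be precomputed once into a 2D LUT that gives a scale and a bias to apply to F0.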

Disney BRDF and BRDF Explorer implementations

The Disney BRDF is so popular that many implementations can be found in the wild. Here are a few of them.

Diffuse

It seems that in the literature, diffuse BRDFs are covered a lot less than specular ones. I suppose this is because the problem is harder to solve, while the low-frequency nature of the diffuse component makes its quality less noticeable. As a result, many real-time implementations consider the Lambert model sufficient. However, the following publications explore the topic.

  • Physically-Based Shading at Disney (slides), 2012, Brent Burley et al.
    One of the contributions of the “Disney BRDF” is its diffuse model. It compares several existing diffuse models with the measured data of the MERL database (see earlier section) but, unsatisfied with their response, proposes its own empirical one. One of the features of that model is the retroreflection at grazing angles.
    I have read multiple times that this model is not energy conserving. Yet Disney uses it for offline rendering, which I assume means path tracing (?), so I am not sure what the impact of that decision is.
  • Moving Frostbite to Physically Based Rendering 3.0, 2014, Sébastien Lagarde.
    The diffuse BRDF described is a normalized version of the Disney BRDF to make it energy conserving.
  • Designing Reflectance Models for New Consoles (slides), 2014, Yoshiharu Gotanda.
    Gotanda explains several weaknesses of the Oren-Nayar model for PBR (its geometry term is different from the one used for the specular term, and it is not energy conserving), and proceeds to propose a modified version. Since there is no analytic solution, he suggests a fitted approximation.
    He also recalls his own improvement over Schlick’s Fresnel approximation, but concludes that both models fail for complex indices of refraction.
  • PBR Diffuse Lighting for GGX+Smith Microsurfaces, 2017, Earl Hammon, Jr.
    This presentation tries to combine the Oren-Nayar diffuse model (originally based on a Gaussian normal distribution) with the GGX normal distribution. It studies the Smith geometry function (G), proposes a BRDF to use for testing with path tracing, and concludes with an approximation for a diffuse GGX.
    On a side note, a few final slides give some identities that are useful for shader optimization.

Energy conservation

One of the pillars of PBR is to respect the law of energy conservation: when designing a BRDF, no more energy should come out than goes in. This is especially important for path tracing to converge. The following links explain how to take that constraint into account.
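
As a simple worked example (a standard result rather than something specific to the links below), the 1/π in the Lambert BRDF f = ρ/π comes directly from this constraint:

\[
\int_{\Omega} \frac{\rho}{\pi}\,\cos\theta \,\mathrm{d}\omega
= \frac{\rho}{\pi}\int_{0}^{2\pi}\!\!\int_{0}^{\pi/2}\cos\theta\,\sin\theta\,\mathrm{d}\theta\,\mathrm{d}\phi
= \frac{\rho}{\pi}\,\pi
= \rho \;\le\; 1.
\]

As long as the albedo ρ stays at or below 1, the surface never reflects more energy than it receives.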

  • Energy Conservation In Games, 2009, Rory Driscoll.
    Explains briefly the problem of energy conservation, and details how to obtain the normalization factor for diffuse Lambert. It’s a good example of how to get started.
    The comments discuss the case of the Phong and Blinn-Phong specular lobe.
  • Phong Normalization Factor derivation, 2009, Fabian Giesen.
    Demonstrates the derivation to obtain the normalization factor for the Phong and Blinn-Phong specular lobe.
  • The Blinn-Phong Normalization Zoo, 2011, Christian Schüler.
    Lists various normalization factors that exist for variants of Phong and Blinn-Phong.
    Also proposes a crude approximation for Cook-Torrance.
  • How Is The NDF Really Defined?, 2013, Nathan Reed.
    Explains conceptually what the Normal Distribution Function (NDF) is, and how this affects the region to integrate over for normalization.
  • Adopting a physically based shading model, 2011, Sébastien Lagarde et al.
    Starts by recalling a few normalization factors (Lambert, Phong and Blinn-Phong). Includes a quick paragraph on the factor to use when combining diffuse with specular.
  • How to properly combine the diffuse and specular terms?, 2016, CG Stack Exchange.
    A question I candidly asked on how to combine diffuse and specular so the energy lost to specular is taken into account in the diffuse term.
  • Designing Reflectance Models for New Consoles (slides), 2014, Yoshiharu Gotanda.
    The third section explains that a Fresnel term should be taken into account for the diffuse part, but also why this is problematic. This Fresnel term should take into account all microfacets, not only the perfect reflection ones that contribute to the specular component.
    This is an answer to my Stack Exchange question above.
  • PBR Diffuse Lighting for GGX+Smith Microsurfaces, 2017, Earl Hammon, Jr.
    Among the other topics it covers, this presentation shows the derivation to normalize a BRDF.
  • Physically Based Shading at DreamWorks Animation, 2017, Feng Xie and Jon Lanz.
    In the appendix of these course notes, the derivation to normalize their fabric BRDF is shown.

Wrapped diffuse

It is a common trick in video games to represent certain diffuse materials that exhibit a lot of scattering with a custom diffuse term that “wraps” around and brings light into the shadowed part. When PBR became popular, several people looked into how to make their wrapped diffuse PBR compliant.
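
As a rough illustration (the exact formulation and normalization vary between publications), the classic wrap trick replaces the cos θ factor of Lambert with max(0, (cos θ + w) / (1 + w)) for some wrap amount w between 0 and 1. One common way to keep it energy conserving is to divide by an extra (1 + w), since

\[
\int_{\cos\theta > -w} \frac{\cos\theta + w}{(1+w)^2}\,\mathrm{d}\omega
= \frac{2\pi}{(1+w)^2}\int_{-w}^{1}(\mu + w)\,\mathrm{d}\mu
= \pi,
\]

which is the same total as the plain clamped cosine: the wrapped version then redistributes light towards the shadowed side instead of creating extra energy.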

Multiple scattering

A more recently tackled problem is the energy loss caused by ignoring multiple scattering. In many BRDF models, rays occluded by the microfacet geometry are simply discarded. This tends to cause a noticeable darkening as roughness increases, visible in many of the charts showing material appearance at various roughnesses. The trend is changing, however, which is why we see more and more references to the “furnace test”, a way of highlighting energy loss.
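
Concretely, the “white furnace” test surrounds the material with a uniform environment of unit radiance, so the outgoing radiance reduces to the directional albedo,

\[
L_o(\omega_o) = \int_{\Omega} f(\omega_i, \omega_o)\,\cos\theta_i \,\mathrm{d}\omega_i,
\]

which should be exactly 1 for a perfectly reflective, energy-preserving BRDF, whatever the viewing direction: the object should vanish into the background. Any visible darkening (typically at high roughness) reveals the energy discarded by the single-scattering model.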

  • Multiple-Scattering Microfacet BSDFs with the Smith Model, 2016, Eric Heitz et al.
    I haven’t read that paper except for the abstract, but the reception it received indicates that it’s an important publication. Recently, Morgan McGuire even said about it:

    “It is such a beautifully complete piece of work, a short, careful, and clear book on microfacets of the form that typically only arrives out of a complete Ph.D. thesis.”

    If I understand correctly, they extended the Smith model to take multiple scattering into account, and compared their results with a simulation, by raytracing a surface at the micro-facet level.
    Károly Zsolnai-Fehér of Two Minute Papers did a video abstract of their paper.

  • A Multi-Faceted Exploration, part 1, part 2, part 3, part 4, 2018-2019, Stephen Hill.
    This series of articles explores the feasibility of using, in real-time rendering, a model that Sony Pictures Imageworks uses for offline rendering. The first part explains and illustrates what the problem is. The second part presents Heitz’s solution and uses it as a ground truth reference, before presenting the Sony Pictures Imageworks solution and comparing the two; it then proposes an improvement of the latter. The third part gives a brief and clear reminder of the idea behind the split integral technique from UE4 and others, and uses it to propose a further improvement by precomputing a 2D LUT (instead of a 3D one). The fourth part details the precomputation step and shows the results in a WebGL demo.
    The series is not concluded yet, so I imagine one or more articles are still to come.
  • Advances in Rendering, Graphics Research and Video Game Production (PDF version, video), 2019, Steve McAuley.
    This presentation shows the steps involved in implementing multiscattering BRDFs and area lights, for both diffuse and specular, in Far Cry, whose complex rendering engine has to support a wide variety of combinations of cases. It’s a reminder that such a task can become more involved than expected. It’s also a case for academic papers highlighting their main insight and making their code available.

Area lights

I haven’t explored this topic yet, but I have bookmarked some publications that cover it.

  • Lighting of KillZone Shadowfall, 2013, Michal Drobot.
    A part of the presentation is dedicated to area lights. It observes that point lights are inadequate for artists, who tend to tweak roughness to compensate. It then briefly explains the technique, which consists of analytically integrating over the area light. Unfortunately the full derivation is not shown.
  • Real Shading in Unreal Engine 4 (slides), 2013, Brian Karis.
    A part of the course covers area lights. Like Drobot, Karis observes a tendency of artists to use roughness to compensate for the small reflection highlight of point lights. The course notes list their requirements, some solutions that were considered (including Drobot’s) and why they were rejected. They then present a method based on a “representative point”, and how it applies to spherical lights and tube lights.
  • Real-Time Polygonal-Light Shading with Linearly Transformed Cosines, 2016, Eric Heitz et al.
    The current state of the art. This technique approximates physically based lighting from polygonal lights by transforming a cosine distribution (which is simpler to integrate) so it matches the BRDF properties. A demo with the code as well as a WebGL demo showing the result are provided.

Acknowledgements

Many thanks to Calvin Simpson, Dimitri Diakopoulos, Jeremy Cowles, Jonathan Stone, Julian Fong, Sébastien Lagarde, Stefan Werner, Yining Karl Li and the computer graphics community at large for your contributions and suggestions of material to read.

The rendering tools in the film industry

Here is a list of articles published by fxguide, giving fascinating insights into the tools used by the film industry in terms of rendering.

  • Ben Snow: the evolution of ILM’s lighting tools (January 2011)
    A presentation of the evolution of the technology and tools used at Industrial Light and Magic over the years and movies, from the mid-90s to today.
  • Monsters University: rendering physically based monsters (June 2013)
  • The Art of Rendering (April 2012)
    A description of the different techniques used in high end rendering and the major engines.
  • The State of Rendering (July 2013): part 1, part 2
    A lengthy overview of the state of the art in high end rendering, comparing the different tools and rendering solutions available, their approach and design choices, strengths and weaknesses as well as the consequences in terms of quality, scalability and render time.
