The white furnace test

The white furnace test is one of my favourite rendering debug tools. But before I understood it, it seemed rather mysterious and abstract to me. Why would a publication proudly show what looked like empty renders? What does it mean, and why would they care?

Slide from the presentation Revisiting Physically Based Shading at Imageworks, in which a white furnace test of the diffuse term is shown.
What’s up with the empty grey rectangle? The fact that it looks empty is the point.
Revisiting Physically Based Shading at Imageworks, presented at the SIGGRAPH 2017 course: Physically Based Shading in Theory and Practice.

The idea is the following: if you have a 100% reflective object that is lit by a uniform environment, it becomes indistinguishable from the environment. It doesn’t matter if the object is matte, mirror-like, or anything in between: it just “disappears”.
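This property is easy to verify numerically. The sketch below (a minimal Python stand-in, not the article’s ShaderToy shader) estimates the radiance leaving a Lambertian surface lit by a uniform environment, using uniform hemisphere sampling; with an albedo of 100%, the result should converge to exactly the environment radiance:

```python
import math
import random

def furnace_estimate(albedo=1.0, env_radiance=0.5, n=200_000, seed=0):
    """Monte Carlo estimate of the radiance leaving a Lambertian surface
    with the given albedo, lit by a uniform environment of radiance
    env_radiance. A correct estimator converges to albedo * env_radiance,
    so a 100% white surface reflects exactly the environment radiance
    and "disappears"."""
    rng = random.Random(seed)
    brdf = albedo / math.pi          # Lambertian BRDF is a constant
    pdf = 1.0 / (2.0 * math.pi)      # uniform sampling over the hemisphere
    total = 0.0
    for _ in range(n):
        # For directions sampled uniformly over the hemisphere,
        # cos(theta) is itself uniformly distributed in [0, 1].
        cos_theta = rng.random()
        total += env_radiance * brdf * cos_theta / pdf
    return total / n
```

With `albedo=1.0`, the estimate should match `env_radiance` to within Monte Carlo noise; any persistent offset means the weighting is wrong somewhere.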

Accepting this idea took me a while, but there is a real-life situation in which you can experience this effect. Fresh snow can have an albedo as high as 90% to 98%, i.e. nearly perfect white. Combined with overcast weather or fog, it can sometimes appear featureless and become completely indistinguishable from the sky, to the point that you’re left skiing by feel because you can’t even tell the slope two steps in front of you. Everything is just a uniform white in all directions: the whiteout.

Photo taken on a ski track. The ground appears almost uniformly white.
Last time I visited a white furnace test. Note how the snow surface slope and details are almost invisible, and the sign in the background seems to be floating in the air.

With the knowledge that a 100% reflective object is supposed to look invisible when uniformly lit, verifying that it does is a good sanity test for a physically based renderer, and the reason why you sometimes see those curious illustrations in publications. It’s showing that the math checks out.

Those tests are usually intended to verify that a BRDF is energy preserving: making sure that it is neither losing nor adding energy. A typical concern, for example, is making sure materials don’t look darker as roughness increases and inter-reflections become too significant to be neglected. Missing energy is not the only concern though, and a grey environment (as opposed to a white one) is convenient because any excess of reflected energy will appear brighter than the background.

Demonstration of the white furnace test on ShaderToy, or an expensive way to render an empty image. Press the play button to see the scene revealed.

But verifying the energy conservation of a BRDF is just one of the cases where the white furnace test is useful. Since a Lambertian BRDF with an albedo of 100% is perfectly energy preserving and completely trivial to implement, the white furnace test with such a white Lambert material can be used to reveal bugs in the renderer implementation itself.

There are so many aspects of the implementation that can go wrong: the sampling distribution, the proper weighting of the samples, a mistake in the PDF, a factor of pi or 2 forgotten somewhere… Those errors tend to be subtle and can result in a render that still looks reasonable. Nothing looks more like correct shading than slightly incorrect shading.

So whether I’m writing a path tracer or one of its variants, generating a pre-convolved environment map, or trying different sampling distributions, my first sanity check is to make sure it passes the white furnace test with a pure white Lambertian BRDF. Once that is done (and as writing the demonstration shader above showed me once again, that can take a few iterations), I can have confidence in my implementation and test the BRDFs themselves.

Takeaway: the white furnace test is a very useful debugging tool to validate both the integration part and the BRDF part of your rendering.

Update: A comment on Hacker News mentioned that it would be useful to see an example of what failing the test looks like. So I’ve added a macro SIMULATE_INCORRECT_INTEGRATION in the shader above to introduce a “bug”: the kind of mistake like forgetting that the solid angle of a hemisphere amounts to 2π, or forgetting to take the sampling distribution into account. When the “bug” is active, the sphere becomes visible because it doesn’t reflect the correct amount of energy.
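That kind of failure is easy to reproduce outside a shader too. The sketch below (a hypothetical Python analogue, not the actual ShaderToy code) estimates the reflected radiance of a white Lambertian surface under a uniform environment, with a flag that drops the 1/pdf weight, i.e. the missing 2π factor described above:

```python
import math
import random

def furnace(albedo=1.0, env=0.5, n=100_000, forget_pdf=False, seed=0):
    """Uniform-hemisphere Monte Carlo estimate of the reflected radiance
    of a Lambertian surface under a uniform environment. With
    forget_pdf=True, the 1/pdf weight (the 2*pi) is dropped: the kind
    of bug the white furnace test is designed to catch."""
    rng = random.Random(seed)
    brdf = albedo / math.pi          # Lambertian BRDF is a constant
    pdf = 1.0 / (2.0 * math.pi)      # uniform hemisphere sampling
    total = 0.0
    for _ in range(n):
        cos_theta = rng.random()     # uniform hemisphere: cos ~ U[0, 1]
        weight = 1.0 if forget_pdf else 1.0 / pdf
        total += env * brdf * cos_theta * weight
    return total / n
```

The correct estimator matches the environment radiance and the sphere vanishes; the “buggy” one returns roughly env / (2π) instead, so the sphere shows up as a dark silhouette against the grey background.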

Physically based shading references, at the end of 2019

A lot has happened in the graphics community in the last ten years, especially when it comes to physically based rendering (PBR). It started to become popular around 2009, as hardware made more powerful models affordable, and really took over between 2010 and 2014. Real-time engines started replacing Phong and Blinn-Phong with a normalized Blinn-Phong, until pretty much everyone switched to GGX and its long trailing reflections. Researchers explored how to make it work with image-based lighting (IBL) and area lights, and it seems that nowadays everyone is looking at the secondary bounce problem.

I am not sure why adoption happened at the same time in the film industry (instead of much earlier), despite its different constraints compared to real-time. Films made before 2010 mostly used ad hoc models, until a wave converted nearly the entire industry to unbiased path tracing.

I gathered a first PBR reading list back in 2011, but since then, the community has collectively made great strides. I also have a better understanding of the topic myself. So I think it is time to revisit it with a new, updated (and unfortunately, longer) reading list.

However, covering the entire PBR pipeline would be way too vast, so I am going to focus on physically based shading instead, and ignore topics like physical lighting units, physically based camera or photogrammetry, even though some of the links cover those topics.

Note: If you see mistakes, inaccuracies or missing important pieces, please let me know. I expect to update this article accordingly during the next few weeks.

Courses and tutorials

  • Physically Based Shading in Theory and Practice (formerly “Practical Physically Based Shading in Film and Game Production”)
    2010, (no 2011?), 2012, 2013, 2014, 2015, 2016, 2017.
    This recurring SIGGRAPH course by the leading figures in the field is a fantastic resource and a must-see for anyone interested in the topic. Naty Hoffman and then Stephen Hill have been hosting the course material on their websites for several years. Some of the presentations are also available on YouTube.
  • Physically Based Rendering: From Theory To Implementation, Third edition, 2016, Matt Pharr, Wenzel Jakob, and Greg Humphreys
    As of 2018, the content of this reference book is entirely available online.
  • Implementation Notes: Runtime Environment Map Filtering for Image Based Lighting, 2015, Padraic Hennessy.
    Details how to implement the environment map filtering described in Karis and Lagarde publications (see below), then how to optimize it by reducing the number of samples thanks to importance sampling and rejecting samples that don’t contribute.
  • Image Based Lighting, 2015, Chetan Jaggi.
    Focused on specular reflections, the article presents the implementation of image based lighting (IBL) using the split sum approximation from Unreal Engine 4 (described below), and how to improve quality for several cases.
  • Physically Based Rendering Algorithms: A Comprehensive Study In Unity3D, 2017?, Jordan Stevens.
    This tutorial explains what the different parts of the Bidirectional Reflectance Distribution Function (BRDF) mean, lists many available building blocks, and shows them in isolation. It is directed at Unity, but translates easily to other environments.
  • LearnOpenGL’s PBR series (theory, lighting, diffuse irradiance, specular IBL), 2017, Joey de Vries.
    An excellent introduction that explains the basics and walks the reader through the implementation of a shader based on the same model as Unreal Engine 4 (detailed below). There seems to be some confusion between albedo and base colour, but it’s otherwise clear and well structured.

Real world references