Reverse engineering the rendering of a frame in Deus Ex: Human Revolution

Earlier this year, Adrian Courrèges wrote an article detailing his findings while reverse engineering the rendering pipeline in Deus Ex: Human Revolution.

Starting from a given frame, he illustrates the successive stages of the rendering: creation of the G-buffer, shadow maps, ambient occlusion, the light pre-pass, how opaque and transparent objects are treated differently, volumetric lights, the bloom effect (done in LDR), anti-aliasing and color correction, depth of field, and finally the visual feedback for object interaction.
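Among those stages, the light pre-pass is perhaps the least self-explanatory. As a hedged, minimal illustration of the idea (not the game's actual implementation), here is a NumPy sketch: lighting is first accumulated per pixel using only the G-buffer normals, then the geometry is rendered a second time and its albedo is modulated by the accumulated light buffer:

```python
import numpy as np

h, w = 4, 4
normals = np.zeros((h, w, 3))
normals[..., 2] = 1.0                                  # flat surface facing +Z
albedo = np.full((h, w, 3), [0.8, 0.4, 0.2])           # known only in pass 2

lights = [  # (direction toward the light, RGB color), directions normalized
    (np.array([0.0, 0.0, 1.0]), np.array([1.0, 1.0, 1.0])),
    (np.array([0.6, 0.0, 0.8]), np.array([0.2, 0.3, 0.9])),
]

# Pass 1: accumulate lighting per pixel from the G-buffer alone.
light_buffer = np.zeros((h, w, 3))
for direction, color in lights:
    n_dot_l = np.clip(normals @ direction, 0.0, None)  # Lambertian term
    light_buffer += n_dot_l[..., None] * color         # additive accumulation

# Pass 2: the geometry is rendered again; albedo modulates the light buffer.
final = albedo * light_buffer
print(final[0, 0])  # shaded color of one pixel
```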

Here are a few screenshots stolen from his article:

Normal map

The light pre-pass

Final image

Update:
Adrian has since posted a new article, this time breaking down the rendering of a frame in Supreme Commander. The comments also include insights from Jon Mavor, the programmer who was in charge of the rendering at the time.

A real-time post-processing crash course

Revision 2015 took place last month, on the Easter weekend as usual. I was lucky enough to attend and experience the great competitions that took place this year; I can’t recommend enough that you check out all the good work that came out of it.

Like on previous occasions, I shared some insights in a seminar, as an opportunity to practice public speaking. Since our post-processing has improved quite a bit with our last demo (Ctrl-Alt-Test : G – Level One), the topic was the implementation of a few post-processing effects in a real-time renderer: glow, lens flare, light streaks, motion blur…

Having been fairly busy over the last few months though, with work and the organising of Tokyo Demo Fest among other things, I unfortunately couldn’t spend as much time on the presentation as I wanted. An hour before the talk I was still working on the slides, but all in all it went better than expected. I also experimented with doing a live demonstration, hopefully more engaging than screenshots or even a video capture.

Here is the video recording made by the team at Revision (kudos to you guys for the fantastic work this year). I will provide the slides later on, after I properly finish the credits and references section.

Abstract:
Over the decades, photographers, then filmmakers, have learned to take advantage of optical phenomena, and perfected the recipes of the chemicals used in film, to affect the visual appeal of their images. Transposed to rendering, those lessons can make your image more pleasant to the eye, change its realism, affect its mood, or make it easier to read. In this course we will present different effects that can be implemented in a real-time rendering pipeline, the theory behind them, the implementation details in practice, and how they can fit in your workflow.
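For a taste of what such an implementation looks like, here is a minimal, hedged sketch of a glow effect, the simplest of the effects listed: a bright-pass, a separable Gaussian blur, and an additive combine. It is written with NumPy rather than as shader code for readability; a real implementation would run the blur on downscaled buffers on the GPU:

```python
import numpy as np

def gaussian_kernel(radius, sigma):
    # Normalized 1D Gaussian weights over [-radius, radius].
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x * x / (2.0 * sigma * sigma))
    return k / k.sum()

def blur_1d(img, kernel, axis):
    # Convolve along one axis; padding clamps to the edge pixels.
    pad = len(kernel) // 2
    padding = [(0, 0)] * img.ndim
    padding[axis] = (pad, pad)
    padded = np.pad(img, padding, mode="edge")
    out = np.zeros_like(img)
    for i, weight in enumerate(kernel):
        window = [slice(None)] * img.ndim
        window[axis] = slice(i, i + img.shape[axis])
        out += weight * padded[tuple(window)]
    return out

def glow(hdr, threshold=1.0, radius=8, sigma=3.0):
    bright = np.maximum(hdr - threshold, 0.0)              # bright-pass
    kernel = gaussian_kernel(radius, sigma)
    halo = blur_1d(blur_1d(bright, kernel, 0), kernel, 1)  # separable blur
    return hdr + halo                                      # additive combine

frame = np.zeros((64, 64, 3))
frame[30:34, 30:34] = 4.0                                  # a small bright light
print(glow(frame).max())
```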

GDC 2015 presentations

The Game Developers Conference took place last week in San Francisco. As I am starting to see more speakers publish their slides, I am creating this post to keep track of some of them (this list is not meant to be exhaustive).

For a more extensive list, Cédric Guillemet has been gathering links to GDC 2015 papers on his blog.

John Carmack on Oculus at GDC 2015

John Carmack, the CTO of Oculus VR, gave a talk at the Game Developers Conference, which ended this week. He addresses various topics, including the story behind Samsung’s Gear VR and what’s coming next, the democratization of virtual reality, the work on the API, the unsolved problem of controllers in VR, and the use of real-time ray tracing in VR.

John Carmack’s GDC 2015 talk.

It is a fairly long video (1h30), and as is often the case with him, there is nothing to look at: you just listen to his personal views and insights on the work he is currently taking care of.

Issues when rendering cones on a GPU

In a blog article, Eric Haines reminds us of the problems that arise when rendering a cone, a surprisingly tricky task for such a seemingly simple shape. A discussion on the topic also sprang up on Twitter.

Illustration of texture mapping issues.
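One classic difficulty is the normal at the apex: the tip of a cone has no single well-defined normal (it is different for every direction around the tip), so smooth-shading a shared apex vertex gives visibly wrong results. Here is a minimal sketch of one common workaround (my own illustration, not code from the article): duplicate the apex for every side triangle, each copy carrying the normal evaluated at that triangle's mid-angle:

```python
import numpy as np

def side_normal(a, radius, height):
    # Smooth normal on the cone's side at angle a: proportional to
    # (h*cos(a), r, h*sin(a)) for a cone of base radius r and height h.
    n = np.array([height * np.cos(a), radius, height * np.sin(a)])
    return n / np.linalg.norm(n)

def cone_side(radius, height, segments):
    vertices, normals = [], []
    apex = np.array([0.0, height, 0.0])
    for i in range(segments):
        a0 = 2.0 * np.pi * i / segments
        a1 = 2.0 * np.pi * (i + 1) / segments
        rim0 = np.array([radius * np.cos(a0), 0.0, radius * np.sin(a0)])
        rim1 = np.array([radius * np.cos(a1), 0.0, radius * np.sin(a1)])
        # One triangle per segment; the apex vertex is duplicated for each
        # triangle, with the normal taken at the segment's mid-angle, since
        # a single shared apex normal cannot be correct for all directions.
        vertices += [rim0, rim1, apex]
        normals += [side_normal(a0, radius, height),
                    side_normal(a1, radius, height),
                    side_normal(0.5 * (a0 + a1), radius, height)]
    return np.array(vertices), np.array(normals)

v, n = cone_side(radius=1.0, height=2.0, segments=16)
print(v.shape, n.shape)  # (48, 3) (48, 3)
```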

Relativistic and non-Euclidean space rendering

The Portal series built a full game concept out of non-Euclidean spaces. Besides being great games, I find it fascinating how true the tagline “Now you’re thinking with portals” is.

Here are two interesting experiments that place the viewer in spaces different from the ones real-world conditions have accustomed us to. This video by Varun Ramesh demonstrates a non-Euclidean ray tracer:
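The core trick behind such a ray tracer can be sketched in a few lines: instead of intersecting straight rays with the scene analytically, rays are marched in small steps, and the space is allowed to reconnect in unusual ways along the path. Here is a hedged 2D toy version (my own illustration, not Varun Ramesh's code), where stepping through a portal teleports the ray; in a truly curved space, the direction would also be bent a little at every step:

```python
import numpy as np

# Toy setup: two vertical portal lines; crossing the first while moving
# right teleports the ray to the second. Names and values are made up.
PORTAL_A_X, PORTAL_B_X = 5.0, 20.0

def march(origin, direction, steps=1000, dt=0.05):
    p = np.array(origin, dtype=float)
    d = np.array(direction, dtype=float)
    d /= np.linalg.norm(d)
    for _ in range(steps):
        q = p + dt * d
        if p[0] < PORTAL_A_X <= q[0]:       # the step crossed portal A
            q[0] += PORTAL_B_X - PORTAL_A_X  # re-emerge at portal B
        p = q
    return p

print(march(origin=(0.0, 0.0), direction=(1.0, 0.2)))
```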

This other video, by the MIT Game Lab, demonstrates OpenRelativity, a Unity toolkit for simulating navigation at relativistic speeds, used for the prototype game A Slower Speed of Light:
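One of the effects such a toolkit has to reproduce is relativistic aberration: for an observer moving at a significant fraction of the speed of light, the whole visual field appears squeezed toward the direction of travel. The standard formula is easy to play with (this is textbook physics, not necessarily the exact conventions OpenRelativity uses):

```python
import numpy as np

def apparent_angle(theta, beta):
    # Relativistic aberration: a source at angle theta from the direction
    # of motion (in the world frame) is seen by an observer moving at
    # beta = v/c at angle theta', where
    #   cos(theta') = (cos(theta) + beta) / (1 + beta * cos(theta)).
    c = np.cos(theta)
    return np.arccos((c + beta) / (1.0 + beta * c))

# At 90% of the speed of light, a source directly to the side (90 degrees)
# appears shifted far toward the direction of travel.
print(np.degrees(apparent_angle(np.radians(90.0), beta=0.9)))  # ~25.8
```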

Update: Sylvain mentioned in the comments that Carl Sagan explains those effects in the following video:

Billboards and particle rendering

Geeks3D recently published a couple of posts on billboard rendering using either a vertex shader or a geometry shader, and on particle rendering performance with point sprites versus a geometry shader.
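Whichever shader stage does the work, the math is the same: each particle center is expanded into a camera-facing quad using the camera's right and up vectors, which are the first two rows of the view matrix's rotation part. Here is a hedged NumPy stand-in for that computation (made-up names, for illustration only):

```python
import numpy as np

def billboard_corners(center, size, view):
    # Expand a particle center into a camera-facing quad. The first two
    # rows of the view rotation are the camera's right and up vectors
    # expressed in world space.
    right = view[0, :3]
    up = view[1, :3]
    h = 0.5 * size
    return np.array([center - h * right - h * up,
                     center + h * right - h * up,
                     center + h * right + h * up,
                     center - h * right + h * up])

view = np.eye(4)  # identity view: camera at the origin, looking down -Z
print(billboard_corners(np.array([0.0, 0.0, -5.0]), 2.0, view))
```

With the geometry shader variant, this expansion happens on the GPU from a single input point per particle; with the vertex shader variant, each particle is submitted as four vertices and the shader offsets each corner, which is what the performance comparison in those posts is about.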