Today this tweet came to my attention and I think it is worth sharing:
— Julian Fong (@levork) June 13, 2015
Earlier this year, Adrian Courrèges wrote an article detailing his findings while reverse engineering the rendering pipeline in Deus Ex: Human Revolution.
Starting from a given frame, he illustrates the different stages of the rendering: creation of the G-buffer, shadow maps, ambient occlusion, the light prepass, how opaque and transparent objects are treated differently, volumetric lights, the bloom effect in LDR, anti-aliasing and color correction, depth of field, and finally the visual feedback for object interaction.
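The frame breakdown above is essentially an ordered list of passes; a schematic sketch (pass names are paraphrased from the article, not actual engine code):

```python
# Schematic order of the render passes described in the article.
# Names are illustrative paraphrases; the actual engine internals differ.
PASSES = [
    "g_buffer",             # normals, depth, albedo, specular
    "shadow_maps",
    "ambient_occlusion",
    "light_prepass",        # accumulate lighting before shading
    "opaque_geometry",
    "transparent_geometry", # handled separately from opaque objects
    "volumetric_lights",
    "bloom_ldr",
    "antialiasing_and_color_correction",
    "depth_of_field",
    "object_highlight",     # object interaction visual feedback
]

def render_frame(execute_pass):
    """Run each pass in order; execute_pass is a callback taking a pass name."""
    for name in PASSES:
        execute_pass(name)
```

The point is simply that a modern deferred pipeline is a fixed sequence of passes, each consuming the outputs of the previous ones.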
Here are a few screenshots stolen from his article:
Revision 2015 took place last month, on the Easter weekend as usual. I was lucky enough to attend and experience the great competitions that took place this year; I can't recommend enough that you check out all the good stuff that came out of it.
Like in previous years, I gave a seminar to share some insights, as an opportunity to practice public speaking. Since our post-processing has improved quite a bit with our last demo (Ctrl-Alt-Test : G – Level One), the topic was the implementation of a few post-processing effects in a real-time renderer: glow, lens flare, light streaks, motion blur…
Having been fairly busy over the last few months though, with work and the organisation of Tokyo Demo Fest among other things, I unfortunately couldn't spend as much time on the presentation as I wanted. An hour before the talk I was still working on the slides, but all in all it went better than expected. I also experimented with a live demonstration, hopefully more engaging than screenshots or even a video capture can be.
Here is the video recording made by the team at Revision (kudos to you guys for the fantastic work this year). I will share the slides later on, once I properly finish the credits and references section.
Over the decades, photographers, then filmmakers, have learned to take advantage of optical phenomena, and have perfected the chemical recipes used in film, to affect the visual appeal of their images. Transposed to rendering, those lessons can make your image more pleasant to the eye, change its realism, affect its mood, or make it easier to read. In this course we will present different effects that can be implemented in a real-time rendering pipeline, the theory behind them, the implementation details in practice, and how they could fit in your workflow.
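One of the simplest of those effects, and a good illustration of the general pattern, is glow (bloom): isolate the bright parts of the image, blur them, and add the result back. Here is a minimal CPU sketch in Python operating on a grayscale image stored as nested lists; the threshold, the box blur, and the intensity are illustrative choices, not values from the course:

```python
# Minimal glow/bloom sketch on a grayscale "image" (list of rows of
# floats in [0, 1]). Real implementations run on the GPU, typically at
# reduced resolution with a separable Gaussian; this only shows the
# three conceptual steps: bright-pass, blur, additive combine.

def bright_pass(img, threshold=0.8):
    """Keep only the energy above the threshold."""
    return [[max(0.0, p - threshold) for p in row] for row in img]

def box_blur(img):
    """Naive 3x3 box blur with edge clamping."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    acc += img[yy][xx]
            out[y][x] = acc / 9.0
    return out

def bloom(img, threshold=0.8, intensity=1.0):
    """Bright-pass, blur, then add the glow back onto the source."""
    glow = box_blur(bright_pass(img, threshold))
    return [[min(1.0, p + intensity * g) for p, g in zip(row, grow)]
            for row, grow in zip(img, glow)]
```

Most of the other effects mentioned (lens flare, light streaks) follow the same structure, differing mainly in the filter applied between the bright-pass and the combine step.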
The Game Developers Conference took place last week in San Francisco. As I am starting to see more speakers publish their slides, I am creating this post to keep track of some of them (this list is not meant to be exhaustive).
The Italian company CoeLux has apparently managed to create an artificial light that uses a material mimicking atmospheric scattering to reproduce the look of sunlight and light from the sky. Judging from the photos and videos available on their website, the effect seems very convincing.
As they point out, this could have a serious impact on architecture, as available sunlight is a factor in the design of buildings. Unfortunately, at this stage it is a prohibitively expensive product for the average consumer, and likely targeted at the construction industry. But it might remain a niche product for only a few years before expanding to a larger scale.
The news websites PetaPixel and Colossal covered this topic last month, and included some larger product photos made available by the company. But the most interesting coverage might be the one by Lux Review, who met CoeLux at an exhibition booth and made the following video. From their article:
No, the light source doesn’t move… yet. No, the colour temperature isn’t dynamic… yet. The void height needed is a metre. It consumes 340W of electrical power, but that will come down as LEDs improve.
John Carmack, the CTO of Oculus VR, gave a talk at the Game Developers Conference that just ended this week. He addresses various topics, including the story behind Samsung's Gear VR and what's coming next, the democratization of virtual reality, the work on the API, the unsolved problem of controllers in VR, and the use of real-time ray tracing in VR.
It is a fairly long video (1h30), and as often with him, there are no pictures to see: just his personal views and insights on the work he is currently taking care of.
The Rescued Film Project aims to develop and archive film rolls that, for one reason or another, were left undeveloped. In this video they present their work, and more specifically the moment they discovered 31 rolls shot by an American soldier during WWII.
The photos can be seen on their website.