People on Twitter are starting to mention that their Oculus DK2 units are shipping or arriving. Exciting.
Meanwhile, some people have mentioned how eye tracking could be a great addition to a VR headset, and some have even started hacking their own. See for example:
So what’s the point? One use is explained in this paper:
Foveated Real-Time Ray Tracing for Virtual Reality Headset. In this excerpt from the AMD Developer Summit 2013, Johan Andersson, rendering architect of the Frostbite engine, talks briefly about foveated rendering.
Long story short: Oculus and eye tracking could really mean real-time ray tracing in a matter of years.
Exciting. But I’m repeating myself.
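To give an idea of why foveated rendering helps so much, here is a small sketch of the core trick: spend full ray-tracing effort only near the tracked gaze point and let quality fall off in the periphery, where the eye can’t resolve detail anyway. The function names and thresholds below are my own illustration, not taken from the paper or the talk:

```python
import math

def shading_rate(pixel_angle_deg):
    """Fraction of full-resolution rays to spend on a pixel, given its
    angular distance (in degrees) from the tracked gaze point.
    Thresholds are made up for illustration; a real system would tune
    them to the eye's acuity falloff."""
    if pixel_angle_deg < 5.0:       # fovea: full quality
        return 1.0
    if pixel_angle_deg < 20.0:      # near periphery: linear falloff
        return 1.0 - 0.75 * (pixel_angle_deg - 5.0) / 15.0
    return 0.25                     # far periphery: quarter rate

def average_rate(half_fov_deg=50.0, steps=1000):
    """Rough average shading rate over a 100-degree field of view,
    weighting each angle by sin(angle) as a stand-in for solid angle."""
    total = weight = 0.0
    for i in range(steps):
        a = (i + 0.5) * half_fov_deg / steps   # angle from gaze
        w = math.sin(math.radians(a))
        total += shading_rate(a) * w
        weight += w
    return total / weight
```

With these made-up numbers the average rate lands around a third of full resolution, i.e. roughly a 3x cut in ray count for the same perceived quality, which is the kind of saving that makes real-time ray tracing in a headset sound plausible.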
In this 2-minute video, Adam Cole briefly explains how a flow-visualization technique known as schlieren photography can be combined with a high-speed camera to film the propagation of sound through air.
Insanely Twisted Shadow Planet is a 2D shooter with an outstanding visual style (and this is where my description ends, since I have yet to try the game, even though I already bought it). In this 10-minute video, Ryan Meyer explains how the camera system he wrote for the game works.
Following on his previous talks on data visualization and
programming interfaces, Bret Victor presents the idea of what he calls a “seeing space”, meant to improve understanding of problems in the context of collaborative engineering.
Seeing Spaces from Bret Victor on Vimeo.
Here are a couple of links on how to render light-scattering effects (a.k.a. volumetric shadows):
Oculus shared a list of recommendations for a good VR experience as a PDF, and has kept updating it since: the Oculus VR Best Practices Guide.
Tom Forsyth gave a talk at GDC 2014 with guidelines on what to do, what not to do, and what they haven’t figured out yet about making VR experiences. The talk is available in the GDC Vault: Developing VR Experiences with the Oculus Rift.
Michael Abrash of Valve gave a talk about the near future of VR:
What VR could, should and almost certainly will be within two years. Much of it deals with the notion of “presence”, the sensation of actually being in the virtual world, and what makes or breaks it.
I’m a bit late to the party in noticing that Mikko Kallinen, co-founder of Praxis, has written a series of articles presenting an overview of the tech behind the rendering engine he’s writing.
Shading and atmospheric effects in Praxis’ engine
The results are visually impressive, so it’s very unfortunate he doesn’t give more details. This video in particular, showcasing real-time atmospheric effects, is outstanding.