Series of articles on noise generation

Last year I discovered these tutorials by Jasper Flick on how to make and use noise in Unity, along with a couple of examples applying it to terrain and particles. They present the difference between value noise and gradient noise, explain how Perlin noise and simplex noise work, and show, among other things, how to use curl noise to control the flow of particles.

The order in which the information is presented is well thought out, even if the intention might not be clear at first. Don’t let the beginner’s tutorial tone (“You’ll learn to: create and fill a texture;”, etc.) turn you away: the series does a great job of detailing the concepts and algorithms in a simple manner, without cutting corners like so many articles on the topic do (when they aren’t blatantly wrong and calling some blurred noise “Perlin noise”). I thought I already had a pretty good grasp of gradient noise, but reading it gave me an even better understanding.

Pushing particles around with Simplex curl noise.
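
For the curious, here is a minimal Python sketch of the idea behind 2D curl noise, using a simple analytic stand-in for the potential (a real implementation would sample simplex or Perlin noise there instead): because the velocity field is the curl of a scalar field, it is divergence-free, which is what gives the particle flow its incompressible, fluid look.

    import math

    def potential(x, y):
        # Stand-in smooth scalar field; a real implementation would
        # sample simplex or Perlin noise here instead.
        return math.sin(1.7 * x) * math.cos(2.3 * y)

    def curl_velocity(x, y, eps=1e-4):
        # 2D curl of a scalar potential psi: v = (dpsi/dy, -dpsi/dx).
        # Central differences approximate the partial derivatives.
        dpsi_dx = (potential(x + eps, y) - potential(x - eps, y)) / (2 * eps)
        dpsi_dy = (potential(x, y + eps) - potential(x, y - eps)) / (2 * eps)
        return dpsi_dy, -dpsi_dx

    # Advect a particle along the field with plain Euler steps.
    x, y, dt = 0.5, 0.5, 0.01
    for _ in range(100):
        vx, vy = curl_velocity(x, y)
        x, y = x + vx * dt, y + vy * dt

The same construction extends to 3D by taking the curl of a vector-valued noise potential.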

While we’re at it, other resources on the topic include Ken Perlin’s GDC 1999 talk and his two-page paper Improving Noise, which explains why a 5th-order polynomial is used for interpolation (a function I’ve sometimes seen called “smootherStep”).
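
In short: the original fade curve, 3t^2 - 2t^3, has a vanishing first derivative at t = 0 and t = 1 but not a vanishing second derivative, which causes visible artifacts as soon as the noise derivative is used (bump mapping, for instance); the 5th-order curve, 6t^5 - 15t^4 + 10t^3, zeroes both. A quick sketch of the two:

    def smoothstep(t):
        # Original fade curve: 3t^2 - 2t^3. Its first derivative is 0
        # at t = 0 and t = 1, but its second derivative is not.
        return t * t * (3.0 - 2.0 * t)

    def smootherstep(t):
        # Fade curve from "Improving Noise": 6t^5 - 15t^4 + 10t^3.
        # First AND second derivatives vanish at t = 0 and t = 1, so
        # nothing discontinuous shows across lattice cell boundaries.
        return t * t * t * (t * (t * 6.0 - 15.0) + 10.0)

Both map [0, 1] to [0, 1]; only the higher-order one keeps the noise derivative continuous across cell boundaries.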

John Carmack on Oculus at GDC 2015

John Carmack, the CTO of Oculus VR, gave a talk at the Game Developers Conference that ended just this week. He addresses various topics, including the story behind Samsung’s Gear VR and what’s coming next, the democratization of virtual reality, the work on the API, the still unsolved problem of controllers in VR, and the use of real-time ray tracing in VR.

John Carmack’s GDC 2015 talk.

It is a fairly long video (1h30) and, as is often the case with him, there is nothing to watch: you just listen to his personal views and insights on the work he is currently taking care of.

Relativistic and non-Euclidean space rendering

The Portal series built a full game concept out of non-Euclidean spaces. Besides being great games, I find it fascinating how true the tagline “Now you’re thinking with portals” is.

Here are two interesting experiments that put the viewer in spaces different from the ones real-world conditions have accustomed us to. This video by Varun Ramesh demonstrates a non-Euclidean ray tracer:

This other video, by the MIT Game Lab, demonstrates OpenRelativity, a Unity toolkit for simulating navigation at relativistic speeds, which was used for the prototype game A Slower Speed of Light:

Update: Sylvain mentioned in the comments that Carl Sagan explains those effects in the following video:

Statistics on gaming hardware and software

If you need figures or maybe just an idea of what hardware and software the gaming public is using, it’s all out there, gathered and published by the big boys:

The frame debugger in Unity 5

Aras Pranckevičius wrote on the Unity blog about the new frame debugger feature added to the editor: Frame Debugger in Unity 5.0. The hack, as he calls it, is very simple: it just consists of interrupting the rendering at a given stage and displaying whichever frame buffer was active at that moment. The feature took just a couple of days of work; most of the effort went into the editor UI.

From the article:

There’s no actual “frame capture” going on; pretty much all we do is stop rendering after some number of draw calls. So if you’re looking at a frame that would have 100 objects rendered and you’re looking at object #10 right now, then we just skip rendering the remaining 90. If at that point in time we happen to be rendering into a render texture, then we display it on the screen.

This means the implementation is simple; getting functionality working was only a few days of work (and then a few weeks iterating on the UI).
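
To make the trick concrete, here is a toy sketch of the idea (in Python, not Unity’s actual code; all the names below are made up for illustration): the renderer simply stops after a given number of draw calls and shows whatever render target was active at that point.

    class ToyRenderer:
        # Toy stand-in for a renderer; all names here are made up.
        def __init__(self, draw_calls):
            self.draw_calls = draw_calls  # list of (render_target, draw_fn)
            self.debug_limit = None       # None = render the whole frame

        def render_frame(self):
            active_target = "back_buffer"
            for i, (target, draw) in enumerate(self.draw_calls):
                if self.debug_limit is not None and i >= self.debug_limit:
                    break                 # skip the remaining draw calls
                active_target = target
                draw()
            # Display whichever target was active when we stopped,
            # be it the back buffer or a render texture.
            return active_target

    calls = [("shadow_map", lambda: None),
             ("g_buffer", lambda: None),
             ("back_buffer", lambda: None)]
    renderer = ToyRenderer(calls)
    renderer.debug_limit = 2              # stop after the second draw call
    print(renderer.render_frame())        # -> g_buffer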

Illustration from the article: “Here we are stepping through the draw calls of depth texture creation, in glorious animated GIF form”

Volumetric light scattering

Here are a couple of links on how to render a light scattering effect (a.k.a. volumetric shadows):
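
For context, one widely used screen-space approach (popularized by the “Volumetric Light Scattering as a Post-Process” chapter of GPU Gems 3) marches from each pixel toward the light’s screen position, accumulating samples from an occlusion-masked image with a decay factor. A rough Python sketch of that sampling loop, with illustrative parameter names:

    import numpy as np

    def light_shafts(occlusion, light_px, num_samples=32,
                     density=0.9, decay=0.96, weight=0.05):
        # occlusion: 2D float array, bright where the light source is
        # visible and 0 where scene geometry occludes it.
        # light_px: (row, col) position of the light on screen.
        h, w = occlusion.shape
        result = np.zeros_like(occlusion)
        for row in range(h):
            for col in range(w):
                # Step from this pixel toward the light's screen position.
                d_row = (light_px[0] - row) * density / num_samples
                d_col = (light_px[1] - col) * density / num_samples
                r, c = float(row), float(col)
                illum, total = 1.0, 0.0
                for _ in range(num_samples):
                    r += d_row
                    c += d_col
                    ri = min(max(int(r), 0), h - 1)
                    ci = min(max(int(c), 0), w - 1)
                    total += occlusion[ri, ci] * illum * weight
                    illum *= decay  # farther samples contribute less
                result[row, col] = total
        return result

In a real renderer this runs as a fragment shader and the result is added on top of the frame; the nested Python loops are only there to keep the sketch self-contained.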

Update: