Coordinate mapping between sphere, disc and square

This paper, published by Martin Lambers in the Journal of Computer Graphics Techniques, compares different mappings between sphere and disc, and between disc and square. It is worth noting that the source code is available on the publication page.

Mappings between Sphere, Disc, and Square.

Abstract:
A variety of mappings between a sphere and a disc and between a disc and a square, as well as combinations of both, are used in computer graphics applications, resulting in mappings between spheres and squares. Many options exist for each type of mapping; to pick the right methods for a given application requires knowledge about the nature and magnitude of mapping distortions.

This paper provides an overview of forward and inverse mappings between a unit sphere, a unit disc, and a unit square. Quality measurements relevant for computer graphics applications are derived from tools used in the field of map projection, and a comparative analysis of the mapping methods is given.
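
To give a concrete taste of one such mapping, here is a minimal sketch of Shirley and Chiu's concentric square-to-disc mapping, a classic of the genre (I'm assuming it is among the variants the paper surveys; check the paper and its source code for the exact list):

```python
import math

def square_to_disc(u, v):
    """Shirley-Chiu concentric mapping: [0,1]^2 -> unit disc (a sketch)."""
    # Recenter the unit square onto [-1,1]^2.
    a, b = 2.0 * u - 1.0, 2.0 * v - 1.0
    if a == 0.0 and b == 0.0:
        return 0.0, 0.0
    # Pick the wedge so that concentric squares map to concentric circles,
    # which keeps area distortion low and preserves adjacency.
    if abs(a) > abs(b):
        r, phi = a, (math.pi / 4.0) * (b / a)
    else:
        r, phi = b, (math.pi / 2.0) - (math.pi / 4.0) * (a / b)
    return r * math.cos(phi), r * math.sin(phi)
```

The inverse mapping follows the same case analysis in reverse; the source code on the publication page covers both directions for every method compared.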

Series of articles on noise generation

Last year I discovered these tutorials by Jasper Flick on how to make and use noise in Unity, along with a couple of terrain and particle examples. They present the difference between value noise and gradient noise, explain how Perlin noise and simplex noise work, and show, among other things, how to use curl noise to control the flow of particles.

The order in which the information is presented is well thought out, although the intention might not be clear at first. Don't let the beginner tutorial tone ("You'll learn to: create and fill a texture;", etc.) turn you away: the series does a great job of detailing the concepts and algorithms in a simple manner, yet without cutting corners like so many articles on the topic do (when they're not blatantly wrong and go ahead calling a blurred noise "Perlin noise"). I thought I already had a pretty good grasp of gradient noise, but reading it gave me an even better understanding.
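
For the impatient, the value/gradient distinction boils down to what is stored at the lattice points. A minimal 1D sketch (the lattice tables and indexing here are placeholders of my own, not Flick's code; real implementations hash the coordinates):

```python
import math
import random

random.seed(1)
# Placeholder lattice tables of random values and random slopes.
values = [random.uniform(-1.0, 1.0) for _ in range(256)]
slopes = [random.uniform(-1.0, 1.0) for _ in range(256)]

def smoothstep(t):
    # Cubic interpolant; its first derivative vanishes at t = 0 and t = 1.
    return t * t * (3.0 - 2.0 * t)

def value_noise(x):
    # Value noise interpolates random *values* stored at integer points.
    i = int(math.floor(x))
    t = x - i
    v0, v1 = values[i & 255], values[(i + 1) & 255]
    return v0 + smoothstep(t) * (v1 - v0)

def gradient_noise(x):
    # Gradient noise interpolates contributions of random *slopes*, so the
    # result is zero at every lattice point and varies in between.
    i = int(math.floor(x))
    t = x - i
    g0 = slopes[i & 255] * t
    g1 = slopes[(i + 1) & 255] * (t - 1.0)
    return g0 + smoothstep(t) * (g1 - g0)
```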

Pushing particles around with Simplex curl noise.
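
The curl part is simpler than it sounds: take a scalar noise potential, differentiate it, and rotate the gradient by 90 degrees. The resulting velocity field is divergence-free, so advected particles swirl without bunching up or draining away. A sketch using finite differences (the placeholder potential is mine; the tutorial uses simplex noise):

```python
import math

def curl2d(psi, x, y, eps=1e-4):
    # v = (d(psi)/dy, -d(psi)/dx): the 2D curl of a scalar potential,
    # estimated here with central differences.
    dpsi_dx = (psi(x + eps, y) - psi(x - eps, y)) / (2.0 * eps)
    dpsi_dy = (psi(x, y + eps) - psi(x, y - eps)) / (2.0 * eps)
    return dpsi_dy, -dpsi_dx

# Smooth placeholder potential standing in for a simplex noise lookup:
vx, vy = curl2d(lambda x, y: math.sin(x) * math.cos(y), 0.5, 0.5)
```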

While we're at it, other resources on the topic include Ken Perlin's GDC 1999 talk and his two-page paper Improving Noise, which explains why a 5th-order polynomial is used for interpolation (a function I've sometimes seen called "smootherstep").
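
For reference, here is the polynomial in question, with the reasoning from the paper condensed into a comment (a sketch, not Perlin's reference code):

```python
def smootherstep(t):
    # Perlin's quintic from Improving Noise: 6t^5 - 15t^4 + 10t^3.
    # Unlike the cubic 3t^2 - 2t^3, its first *and* second derivatives
    # vanish at t = 0 and t = 1, so the noise derivative stays continuous
    # across cell boundaries and bump-mapped normals show no creases.
    return t * t * t * (t * (t * 6.0 - 15.0) + 10.0)
```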

PVRTC2 compression quality

Christophe Riccio posted on his Twitter feed some pictures comparing the quality of different texture compression formats, including PowerVR's native compression format, PVRTC2. In light of his tests, it seems to me the new format is a lot better than its predecessor (unfortunately the two are not compared directly).

Last year at work, while trying to reduce loading times, memory consumption, and application size, we gave PVRTC a try, and in our use case it was a clear no-go. The quality was so badly degraded that the texture size we would have needed to keep the artists happy was well beyond the weight of a PNG of equivalent quality. In the end we settled on WebP.

It is interesting to see here that even at 2bpp, PVRTC2 seems to retain a lot of detail and texture. The edges tend to be muddy, but this is still very good for the price.
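
To put "the price" in numbers, a quick back-of-the-envelope comparison for a 1024×1024 texture (illustrative figures of my own, not from Christophe's tests):

```python
# Fixed-rate formats make the memory cost a straight function of bpp.
width = height = 1024
for name, bpp in (("RGBA8", 32), ("PVRTC 4bpp", 4), ("PVRTC2 2bpp", 2)):
    print(f"{name:>12}: {width * height * bpp // 8 // 1024:5d} KiB")
# -> RGBA8: 4096 KiB, PVRTC 4bpp: 512 KiB, PVRTC2 2bpp: 256 KiB
```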

Various links on ray tracing

Here are some links related to ray tracing, and more specifically, path tracing.

Some ray tracing related projects or blogs:

Some major publications:

  • The rendering equation, SIGGRAPH 1986, James T. Kajiya (a modern statement of the equation is sketched after this list). From the paper:

    We present an integral equation which generalizes a variety of known rendering algorithms.
    […]
    We mention that the idea behind the rendering equation is hardly new.
    […]
    However, the form in which we present this equation is well suited for computer graphics, and we believe that this form has not appeared before.

  • Bi-directional path tracing, Compugraphics 1993, Eric P. Lafortune and Yves D. Willems. From the paper:

    The basic idea is that particles are shot at the same time from a selected light source and from the viewing point, in much the same way. All hit points on respective particle paths are then connected using shadow rays and the appropriate contributions are added to the flux of the pixel in question.

  • Optimally Combining Sampling Techniques for Monte Carlo Rendering, SIGGRAPH 1995, Eric Veach and Leonidas J. Guibas (the balance heuristic they propose is sketched after this list). From the abstract:

    We present a powerful alternative for constructing robust Monte Carlo estimators, by combining samples from several distributions in a way that is provably good.

  • Metropolis Light Transport, SIGGRAPH 1997, Eric Veach and Leonidas J. Guibas. From the abstract:

    To render an image, we generate a sequence of light transport paths by randomly mutating a single current path (e.g. adding a new vertex to the path).

  • Robust Monte Carlo methods for light transport simulation, 1998, Eric Veach's PhD thesis (432-page PDF): it presents bidirectional path tracing, and introduces Metropolis Light Transport and Multiple Importance Sampling. From the abstract:

    Our statistical contributions include a new technique called multiple importance sampling, which can greatly increase the robustness of Monte Carlo integration. It uses more than one sampling technique to evaluate an integral, and then combines these samples in a way that is provably close to optimal. This leads to estimators that have low variance for a broad class of integrands. We also describe a new variance reduction technique called efficiency-optimized Russian roulette.

    […]

    The second algorithm we describe is Metropolis light transport, inspired by the Metropolis sampling method from computational physics. Paths are generated by following a random walk through path space, such that the probability density of visiting each path is proportional to the contribution it makes to the ideal image.
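
For reference, Kajiya's equation is usually quoted nowadays in its hemispherical form (his original statement is in terms of point-to-point transport between surfaces):

```latex
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\,
    (\omega_i \cdot n)\, \mathrm{d}\omega_i
```

Here L_o, L_e, and L_i are the outgoing, emitted, and incoming radiance, f_r is the BRDF, and the integral runs over the hemisphere Ω around the surface normal n. Path tracing and all the methods above are estimators of this integral.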
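
And the combination strategy from the 1995 Veach-Guibas paper, multiple importance sampling, with the balance heuristic as the provably good weighting:

```latex
F = \sum_{i=1}^{n} \frac{1}{n_i} \sum_{j=1}^{n_i}
    w_i(X_{i,j})\, \frac{f(X_{i,j})}{p_i(X_{i,j})},
\qquad
w_i(x) = \frac{n_i\, p_i(x)}{\sum_k n_k\, p_k(x)}
```

Each sampling technique p_i (say, BRDF sampling and light-source sampling) contributes samples, and the balance heuristic down-weights a sample wherever another technique would have produced it more reliably, which is what keeps the variance low across a broad class of integrands.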

Other:

On a slightly different topic, fxguide had a great series of articles on the state of rendering in the film industry, which I previously mentioned.

Reading list on Z-buffer precision

Nathan Reed recently published a blog article plotting his numerical findings on Z-buffer precision under different setups. Along the way he references a couple of previous articles, which in turn reference other resources; I think it's a good opportunity to list some of them. Each tells part of the story, and I recommend reading them all to get the complete picture.
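
As a taste of the kind of measurement involved, here is a toy sketch (my own assumptions: a D3D-style [0,1] depth range, a 24-bit fixed-point buffer, and arbitrary near/far values) computing how large a view-space interval collapses into a single depth value at various distances:

```python
# Standard projective depth concentrates precision near the near plane.
near, far, bits = 0.1, 1000.0, 24
scale = (1 << bits) - 1

def window_depth(z):
    # d = f/(f-n) * (1 - n/z) maps z = near to 0 and z = far to 1.
    return far / (far - near) * (1.0 - near / z)

def view_z(d):
    # Inverse of window_depth.
    return near / (1.0 - d * (far - near) / far)

for z in (0.2, 1.0, 10.0, 100.0, 900.0):
    d = round(window_depth(z) * scale)
    # View-space size of the bucket that z falls into after quantization:
    dz = view_z((d + 1) / scale) - view_z(d / scale)
    print(f"z = {z:7.1f}  ->  one depth step covers ~{dz:.6f} units")
```

If I read the article right, switching to a floating-point buffer with reversed Z (near plane at 1, far plane at 0) makes this distribution far more uniform, which is one of its main takeaways.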

Unreal Engine experimental scene videos

Since the beginning of 2014, there have been a lot of videos demonstrating the realism that can now be achieved with Unreal Engine 4.

Often these videos showcase a static scene, or even concentrate on a single detail: the lighting in an architectural structure, the look of rain hitting the ground, or some wet pebbles on a beach.

Physically based rendering, global illumination, and screen space reflections seem to manage to trick the brain and get it confused between what is real and what isn't. Even when some artifacts become salient, like reflections popping in and out or changing with the camera orientation, we are quick to overlook them and find the image very believable.

Here are some of these videos, by Alexander Dracott, Koola, and Benoît Dereau.

Unreal 4 Lighting Study: Forest Day from Alexander Dracott on Vimeo.