Quick change list overview script

A part of my daily routine when arriving at the office in the morning is a quick overview of the code changes since the previous day. It’s not a full-blown code review; I only take a rapid look at the most important parts of the code base, but I make a point of doing it every day. It allows me to stay informed and to notice if something is going wildly in the wrong direction.

This is something I do every day, so it has to be quick and simple. I don’t want to have to open three windows, Ctrl-select and right-click all over the place. Especially since, like I said, it’s the morning, meaning my neurons aren’t talking to each other much yet. I need something dead simple.

Here is a short command line shell script I wrote for doing just that: show me the changes since the previous working day. It uses date with some fancy options to figure out weekdays, as well as colordiff and less for comfortable reading. Just launch it from the terminal, and there you have all the changes ready to be scrolled. Feel free to use it if you find it useful.
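
For reference, the weekday logic relies on GNU date: the --date option accepts human-friendly expressions such as "yesterday", %F prints the ISO date, and %u prints the ISO weekday number, 1 being Monday and 7 Sunday. You can try it directly in a terminal:

date --date="yesterday" +"%F %u"
date --date="last Friday" +"%F %u"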

#!/bin/sh

# Show the SVN changes committed since the previous working day,
# for each of the paths given as arguments.
if [ $# -lt 1 ]
then
    echo "Usage: $0 <paths>"
    exit 1
fi

(
# %F gives an ISO date (YYYY-MM-DD), %u the ISO weekday number (1=Monday).
dateRequest="yesterday"
startDate=$(date --date="$dateRequest" +"%F")
startDay=$(date --date="$dateRequest" +"%u")

# If yesterday was a Saturday or a Sunday, start from last Friday instead.
if [ "$startDay" -gt 5 ]
then
    dateRequest="last Friday"
    startDate=$(date --date="$dateRequest" +"%F")
    startDay=$(date --date="$dateRequest" +"%u")
fi

echo "Changes since $startDate:";echo;echo

# "$@" is quoted so paths containing spaces are preserved.
for d in "$@"
do
    echo "$d"
    svn log -r "{$startDate}:HEAD" "$d"
    svn diff -r "{$startDate}:HEAD" "$d" -x --ignore-space-change | colordiff
done
) | less -R
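
As an example of use, assuming the script is saved under some name like showchanges.sh (the name is up to you) and made executable, you would point it at the working copy paths you care about:

# Hypothetical paths; pass whichever parts of your working copy you want to review.
chmod +x showchanges.sh
./showchanges.sh ~/dev/trunk/engine ~/dev/trunk/tools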

Using a TI-83 / TI-84 calculator as an intervalometer

This article on Instructables explains how to turn a TI calculator into an intervalometer for a DSLR, simply by using its communication cable and writing a little script: Turn a TI Graphing Calculator into an Intervalometer and Create Time Lapse Videos.

The author tested it with a TI-83+ and a Canon Rebel, but the TI-84, the TI-89 and other cameras are also discussed in the comments. Other combinations can be found all over the web.

Billboards and particle rendering

Geeks3D recently published a couple of posts on billboard rendering using a vertex shader or a geometry shader, and on particle rendering performance when using point sprites versus a geometry shader.

Real time stereo ray tracing engineer position in Tokyo

I have retweeted this already, but information tends to get buried pretty quickly on Twitter, so I am putting it here as well. Syoyo, a real time ray tracing enthusiast, is looking for a talented ray tracing engineer to join his company, Light Transport.

Given their existing technology (interactive to real time ray tracing, interactive shader editing with JIT compilation) and their current focus on the Oculus DK2, I’ll let you imagine how exciting this position is.

The frame debugger in Unity 5

Aras Pranckevičius wrote on the Unity blog about the new frame debugger feature they added to their editor: Frame Debugger in Unity 5.0. The hack, as he calls it, is very simple: it consists in interrupting the rendering after a given draw call and displaying whichever frame buffer is active at that moment. Getting it to work took only a few days; most of the effort went into the editor UI.

From the article:

There’s no actual “frame capture” going on; pretty much all we do is stop rendering after some number of draw calls. So if you’re looking at a frame that would have 100 objects rendered and you’re looking at object #10 right now, then we just skip rendering the remaining 90. If at that point in time we happen to be rendering into a render texture, then we display it on the screen.

This means the implementation is simple; getting functionality working was only a few days of work (and then a few weeks iterating on the UI).

Illustration from the article: “Here we are stepping through the draw calls of depth texture creation, in glorious animated GIF form”

Oculus Rift and eye tracking

People on Twitter are starting to mention that their Oculus DK2s are shipping or arriving. Exciting.

Meanwhile, some people have mentioned how eye tracking could be a great addition to a VR headset, and some have even started hacking it into their own.

So what’s the point? One use is explained in this paper: Foveated Real-Time Ray Tracing for Virtual Reality Headset. In this excerpt from the AMD Developer Summit 2013, Johan Andersson, the rendering architect of the Frostbite engine, talks briefly about foveated rendering.

Long story short: Oculus and eye tracking could really mean real-time ray tracing in a matter of years.

Exciting. But I’m repeating myself.