This Instructables article explains how to turn a TI calculator into an intervalometer for a DSLR, simply by using its communication cable and writing a little script: Turn a TI Graphing Calculator into an Intervalometer and Create Time Lapse Videos.
The author tested it with a TI-83+ and a Canon Rebel, but the TI-84 and TI-89, as well as other cameras, are also discussed in the comments. Other combinations can be found all over the web.
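The script itself is essentially just a timing loop that fires the shutter at a fixed interval. Here is a rough sketch of that logic, written in C++ rather than the calculator's TI-BASIC, with a hypothetical triggerShutter() standing in for pulsing the link-port line that the camera's remote-shutter input listens to:

    #include <chrono>
    #include <cstdio>
    #include <thread>

    // Hypothetical stand-in: on the calculator, this would pulse the
    // link-port line wired to the camera's remote-shutter input.
    static void triggerShutter() { std::puts("click"); }

    int main()
    {
        const auto interval = std::chrono::seconds(10);  // one frame every 10 s
        for (int frame = 0; frame < 360; ++frame) {      // 360 frames of time lapse
            triggerShutter();
            std::this_thread::sleep_for(interval);
        }
    }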
Geeks3D recently published a couple of posts on billboard rendering using a vertex shader or a geometry shader, and on particle rendering performance with point sprites versus a geometry shader.
This 8 min video shows the first 30 seconds after ignition of the Saturn V rocket carrying the Apollo 11 mission, launched on July 16th, 1969.
Apollo 11 Saturn V Launch (HD) Camera E-8 from Spacecraft Films on Vimeo.
I have retweeted this already, but information tends to get buried pretty quickly on Twitter, so I am putting it here as well. Syoyo, a real-time ray tracing enthusiast, is looking for a talented ray tracing engineer to join his company, Light Transport.
Given their existing technology (interactive to real-time ray tracing, interactive shader editing with JIT compilation) and their current focus on the Oculus DK2, I'll let you imagine how exciting this position is.
Aras Pranckevičius wrote on the Unity blog about the new frame debugger feature they added to their editor: Frame Debugger in Unity 5.0. The hack, as he calls it, is very simple: it just consists in interrupting the rendering at a given stage and displaying whichever frame buffer was active at that moment. Just a couple of days of work; most of the effort went into the editor UI.
From the article:
There’s no actual “frame capture” going on; pretty much all we do is stop rendering after some number of draw calls. So if you’re looking at a frame that would have 100 objects rendered and you’re looking at object #10 right now, then we just skip rendering the remaining 90. If at that point in time we happen to be rendering into a render texture, then we display it on the screen.
This means the implementation is simple; getting functionality working was only a few days of work (and then a few weeks iterating on the UI).
Illustration from the article: “Here we are stepping through the draw calls of depth texture creation, in glorious animated GIF form”
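To make the mechanism concrete, here is a minimal sketch of the same idea in C++, with invented names (this is my own illustration, not Unity's code): a render loop that stops issuing draw calls past a selected index, then displays whatever render texture was bound at that point.

    #include <cstdio>
    #include <vector>

    // Invented minimal types, for illustration only.
    struct RenderTexture { int id; };
    struct DrawCall { const char* name; RenderTexture* target; };  // null target = back buffer

    struct FrameDebugger {
        bool enabled = false;
        int  stopAfter = 0;  // index of the draw call being inspected
    };

    static RenderTexture* g_currentTarget = nullptr;  // the active render target

    static void Execute(const DrawCall& dc)
    {
        g_currentTarget = dc.target;         // track where we are rendering
        std::printf("draw: %s\n", dc.name);  // stand-in for the actual GPU work
    }

    static void DisplayOnScreen(const RenderTexture* rt)
    {
        std::printf("showing render texture %d on screen\n", rt->id);
    }

    // The whole "hack": stop after some number of draw calls, and if we were
    // rendering into a render texture at that point, show it on screen.
    static void RenderFrame(const std::vector<DrawCall>& calls, const FrameDebugger& dbg)
    {
        for (size_t i = 0; i < calls.size(); ++i) {
            if (dbg.enabled && static_cast<int>(i) > dbg.stopAfter)
                break;  // simply skip the remaining draw calls
            Execute(calls[i]);
        }
        if (dbg.enabled && g_currentTarget)
            DisplayOnScreen(g_currentTarget);
    }

    int main()
    {
        RenderTexture depth{1};
        std::vector<DrawCall> frame = {
            {"depth pass: floor", &depth},
            {"depth pass: hero",  &depth},
            {"opaque: floor",     nullptr},
            {"opaque: hero",      nullptr},
        };
        FrameDebugger dbg{true, 1};  // inspect draw call #1, mid depth pass
        RenderFrame(frame, dbg);
    }

Stopping at draw call #1 here leaves the depth texture on screen, which is exactly the kind of intermediate result the animated GIF above steps through.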
People on Twitter are starting to mention that their Oculus DK2s are shipping or arriving. Exciting.
Meanwhile, some people have mentioned how eye tracking could be a great addition to a VR headset, and some have even started hacking their headset. See for example:
So what’s the point? One use is explained in this paper: Foveated Real-Time Ray Tracing for Virtual Reality Headset. In this excerpt from the AMD Developer Summit 2013, Johan Andersson, rendering architect of the Frostbite engine, briefly talks about foveated rendering.
Long story short: Oculus and eye tracking could really mean real-time ray tracing in a matter of years.
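To give a rough idea of why, here is a small C++ sketch (my own illustration, not taken from the paper): the per-pixel ray budget falls off with angular distance from the tracked gaze point, so most of the work is spent where the fovea is actually looking. The falloff thresholds below are made up.

    #include <cmath>
    #include <cstdio>

    // Illustrative only: map a pixel's angular distance from the gaze
    // point (eccentricity, in degrees) to a ray budget. Thresholds invented.
    static int raysPerPixel(float eccentricityDeg)
    {
        if (eccentricityDeg < 5.0f)  return 4;  // fovea: full quality
        if (eccentricityDeg < 20.0f) return 1;  // near periphery: reduced
        return 0;                               // far periphery: reconstruct from neighbors
    }

    int main()
    {
        const int   width = 64, height = 36;      // tiny screen for the demo
        const float fovDeg = 90.0f;               // horizontal field of view
        const float gazeX = 0.25f, gazeY = 0.5f;  // gaze point from the eye tracker, in [0,1]

        long foveated = 0;
        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                // Crude estimate of this pixel's angular distance from the gaze point.
                float dx = (x + 0.5f) / width  - gazeX;
                float dy = (y + 0.5f) / height - gazeY;
                float eccDeg = std::sqrt(dx * dx + dy * dy) * fovDeg;
                foveated += raysPerPixel(eccDeg);
            }
        }
        long uniform = static_cast<long>(width) * height * 4;  // 4 rays everywhere
        std::printf("foveated: %ld rays, uniform: %ld rays\n", foveated, uniform);
    }

Even with this crude falloff, the foveated budget is a small fraction of the uniform one, which is exactly the kind of saving that could bring real-time ray tracing within reach.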
Exciting. But I’m repeating myself.
In this 2 min video, Adam Cole briefly introduces how a flow visualization technique known as schlieren photography can be used with a high-speed camera to film the propagation of sound through air.