Note: This is *really* late in being posted, but I had work to catch up on when I got back 🙂
The last day of SIGGRAPH 2006. I divided my time between papers and sketches, so there was a lot to push into my already overloaded and sleep-deprived brain at this stage. But I gave it the old college try.
The first paper I caught (most of it, anyway) was “Streaming Computation of Delaunay Triangulations”. The main advantage of this approach was being able to do both the input and output of the triangulation as streams, without large intermediate storage and with only a modest memory requirement. The algorithm took advantage of the typical locality of the input data and used a system for figuring out when partial quadtrees of the input data were finished. The time and memory results for very large data sets were very impressive. One thing that stuck in my mind was a data set from an aeroplane flying over some area, with the comment from the presenter that the pilot forgot to turn the data collection off. Even with this issue hurting data locality, the algorithm handled things well.
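The finalization idea can be sketched in a few lines. This is my own toy illustration of the concept (two passes over a uniform grid rather than the paper's adaptive quadtrees), not the authors' code:

```python
from collections import Counter

def finalization_events(points, cell_size):
    """Two-pass spatial finalization: first count points per grid cell,
    then replay the stream, emitting a 'finalized' event the moment the
    last point of a cell has been seen.  A streaming triangulator can
    then flush any triangles that lie entirely inside finalized cells."""
    cell = lambda p: (int(p[0] // cell_size), int(p[1] // cell_size))
    remaining = Counter(cell(p) for p in points)   # pass 1: counts
    for p in points:                               # pass 2: replay
        c = cell(p)
        remaining[c] -= 1
        yield ('point', p)
        if remaining[c] == 0:
            yield ('finalized', c)   # this cell will see no more points

# Example stream: four clustered points, cell size 1.0
stream = list(finalization_events(
    [(0.1, 0.1), (0.4, 0.2), (1.5, 0.3), (0.7, 0.8)], 1.0))
```

The memory win comes from the second pass: once a cell is finalized, everything inside it can be written to the output stream and forgotten.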
The next paper in the session that I caught was “Spectral Surface Quadrangulation”, which presented a technique for taking an input mesh and creating large quad patches that can be further divided into mostly undistorted quads across the whole mesh. I have read some papers on this kind of thing before, but never in huge detail. The approach here was to construct a function over the mesh with maxima, minima and saddle points, choosing a function that places these extremes in appropriate locations. The points are then refined with various techniques, and the quads can be generated.
After the morning break I caught “Simulating Multiple Scattering in Hair Using a Photon-Mapping Approach”. I have never looked into hair rendering before, but from what I gather, scattering in hair is quite directional, so typical methods end up with very few useful samples, making even blonde hair look dark. The approach shown in this paper deposits photons uniformly along path lengths rather than at interaction points. This makes photon density map directly to radiance, which is exactly the quantity that needs to be calculated. The results are very interesting – especially the simulation of hair viewed in front of a light.
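The deposition trick is simple to sketch. This is my own minimal illustration of the idea as I understood it, not the paper's implementation:

```python
import math

def deposit_along_segment(start, end, spacing):
    """Instead of storing one photon where a scattering event happens,
    deposit photons at a roughly uniform spacing along the travelled
    segment.  Photon *density* in the map then tracks the radiance along
    the path, which is the quantity the renderer wants to estimate."""
    sx, sy, sz = start
    ex, ey, ez = end
    length = math.dist(start, end)
    n = max(1, int(length / spacing))
    photons = []
    for i in range(n):
        t = (i + 0.5) / n   # midpoints of n equal sub-segments
        photons.append((sx + t * (ex - sx),
                        sy + t * (ey - sy),
                        sz + t * (ez - sz)))
    return photons

# A 2-unit path with 0.5 spacing yields 4 photons along the segment.
pts = deposit_along_segment((0.0, 0.0, 0.0), (2.0, 0.0, 0.0), 0.5)
```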
Following that was “Statistical Acceleration for Animated Global Illumination”, which demonstrated a mechanism for sampling temporally as well as spatially for global illumination. Since the changes in illumination are very small from frame to frame (as well as from pixel to pixel in most cases), and you need to render every frame anyway, this approach gives both better-looking results and faster rendering. The paper also showed how dramatic changes in scene geometry or lighting can be handled.
Next up was “Multidimensional Lightcuts”, which showed how to extend the lightcuts algorithm to handle effects such as motion blur. Effects like this effectively add another dimension to the lighting integral, and this paper showed how to use techniques similar to those of the original lightcuts algorithm to reduce the discretized version of the integral to a small enough set of samples to calculate the lighting efficiently.
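For context, the core lightcuts idea is to cluster many point lights into a tree and shade from a "cut" through it instead of from every light. The following is my own very crude sketch of that idea (naive pairing, an unshadowed 1/r² falloff, and the cluster's whole contribution used as a stand-in for the real error bound), nothing like the paper's actual bounds:

```python
class Node:
    def __init__(self, pos, intensity, left=None, right=None):
        self.pos, self.intensity = pos, intensity
        self.left, self.right = left, right

def build_tree(lights):
    """Cluster point lights into a binary tree; each inner node stores the
    summed intensity and a representative position (the brighter child's).
    Real implementations cluster by spatial proximity, not list order."""
    nodes = [Node(p, i) for p, i in lights]
    while len(nodes) > 1:
        a, b = nodes.pop(), nodes.pop()
        rep = a if a.intensity >= b.intensity else b
        nodes.insert(0, Node(rep.pos, a.intensity + b.intensity, a, b))
    return nodes[0]

def contribution(node, x):
    d2 = sum((a - b) ** 2 for a, b in zip(node.pos, x))
    return node.intensity / max(d2, 1e-6)   # unshadowed 1/r^2 falloff

def lightcut(root, x, rel_error=0.02):
    """Refine the cut while some cluster's (crude) error proxy exceeds
    rel_error times the current total estimate."""
    cut = [root]
    while True:
        total = sum(contribution(n, x) for n in cut)
        worst = max((n for n in cut if n.left), default=None,
                    key=lambda n: contribution(n, x))
        if worst is None or contribution(worst, x) <= rel_error * total:
            return cut, total
        cut.remove(worst)
        cut += [worst.left, worst.right]

lights = [((0, 0, 2), 1.0), ((5, 0, 2), 1.0),
          ((5.1, 0, 2), 1.0), ((10, 0, 2), 1.0)]
cut, estimate = lightcut(build_tree(lights), (0.0, 0.0, 0.0))
```

The multidimensional version applies the same refine-until-bounded machinery to products of trees (lights × time samples, for motion blur), which is what lets the extra integral dimension stay tractable.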
The final paper before lunch was “Direct to Indirect Transfer for Cinematic Relighting”. This showed a system that assumes a fixed camera and calculates the direct-to-indirect transfer matrix for the scene at that viewpoint. The transfer matrix is compressed in a GPU-friendly way using wavelets, which allows the scene to be given completely different lighting at interactive rates. Even though the viewpoint and geometry are fixed, this is still a very effective way to relight a scene.
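The payoff of precomputing that matrix is that relighting collapses to a single matrix-vector product. Here is my own toy illustration with a dense matrix (the paper's contribution is compressing this matrix with wavelets so it fits on the GPU):

```python
def relight(transfer, direct):
    """Direct-to-indirect transfer: with camera and geometry fixed, the
    indirect illumination at each visible point is a fixed linear
    function of the direct illumination, so relighting is one
    matrix-vector product.  transfer[i][j] says how much direct light at
    sample j contributes indirectly to pixel i."""
    indirect = [sum(t * d for t, d in zip(row, direct)) for row in transfer]
    return [d + i for d, i in zip(direct, indirect)]

# Toy 3-pixel scene: pixel 0 receives bounce light from pixels 1 and 2.
T = [[0.0, 0.2, 0.1],
     [0.1, 0.0, 0.0],
     [0.0, 0.3, 0.0]]
final = relight(T, [1.0, 0.5, 0.0])
# final[0] = 1.0 + 0.2 * 0.5 + 0.1 * 0.0 = 1.1
```

Changing the lights only changes the `direct` vector; the expensive matrix never has to be recomputed, which is why arbitrary relighting runs at interactive rates.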
After lunch I decided to round out the SIGGRAPH experience with a couple of sketches. The first sketch session was titled “Fast & Cheap”, which sounded great because I love real-time and interactive stuff, and to do that everything must be fast and cheap. The first presentation, “Fast Approximation to Spherical Harmonics Rotation”, presented a fast and simple way to approximate a rotation in spherical harmonic space. At the sketch it was admitted that a previous method is actually about as fast (something only discovered right before the conference), but the mathematics employed were very interesting regardless. The basic idea was to decompose the transform into smaller, simpler transforms (rotations around Z are cheap and easy) and then approximate the remaining non-simple transform (apparently Y rotations are very yucky). It is interesting to see something here that I think happens often in graphics – expanding a simple formula into a big complex one sometimes makes it much faster.
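As a reminder of why Z rotations are the cheap building block in these decompositions, here is my own minimal sketch (not the presenters' code, and real-SH sign conventions for the negative-m basis functions vary between sources):

```python
import math

def sh_rotate_z(coeffs, alpha, bands):
    """Rotate a real spherical-harmonic coefficient vector about Z by
    angle alpha.  Within each band l, the (m, -m) coefficient pair mixes
    by a plain 2x2 rotation of angle m * alpha; m = 0 terms are
    untouched.  Coefficients use the usual flat ordering l*(l+1)+m."""
    out = list(coeffs)
    for l in range(bands):
        base = l * (l + 1)
        for m in range(1, l + 1):
            c, s = math.cos(m * alpha), math.sin(m * alpha)
            a, b = coeffs[base + m], coeffs[base - m]  # cos-like, sin-like
            out[base + m] = c * a - s * b
            out[base - m] = s * a + c * b
    return out

# A full turn about Z must reproduce the input (up to floating point).
v = [0.3, -0.1, 0.5, 0.2]   # bands 0 and 1: four coefficients
assert all(abs(x - y) < 1e-12
           for x, y in zip(v, sh_rotate_z(v, 2 * math.pi, 2)))
```

Because each band only needs these 2×2 mixes, a Z rotation is O(n) in the number of coefficients, while a general rotation couples all 2l+1 coefficients within a band – hence the appeal of reducing everything to Z rotations plus one approximated step.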
Speaking of complex mathematical constructs, the next presentation in the session (Real-time All-Frequency Relighting in Local Frame) used some very interesting constructs to achieve its impressive results. BRDFs are stored in the local frame and compressed using a spherical wavelet approach. Since the BRDF is in the local frame, effects such as bump mapping can be done more easily. The lighting is also done per-pixel, using a visibility texture for shadowing. As with many other approaches I saw at SIGGRAPH, there is a pattern here of very targeted precalculation and mathematical transformation enabling things to be done in real time that would have been unthinkable five or ten years ago (by me, anyway).
After this Sing Bing Kang from MSR presented a technique to render rain in real time using the GPU. After seeing the ATI presentation on a similar topic, it was interesting to contrast the approaches. This approach first used image-processing techniques to extract rain from video, and then used a PRT-based rendering approach, with raindrops assumed to be spherical and non-interacting, to render the drops. It is a less memory- and shader-intensive technique than the ATI one.
This concluded the last day for me. The whole conference was a blast, and it exceeded the expectations I had built up over years of wondering what it was all about.