
Photon Mapping

Photon Mapping - Description

Overview

This project implements an algorithm for global illumination called photon mapping. Photon mapping is combined with ray tracing to simulate lighting in a 3D scene from both direct and indirect light sources.

Scene Rendered with Photon Mapping


The Algorithm

Photon mapping works by bouncing hundreds of thousands of photons from each light source around the scene and recording where they land. There are two main steps to this algorithm: generating the photon map, and ray tracing the scene.

Photon Map Generation Step

The photon map's purpose is to store how much light strikes various parts of the 3D environment. To accomplish this, rays (photons) are traced in random directions from each light source, and their paths are intersected with the objects in the environment to determine where the light shines. At each intersection, the photon is bounced again in a random direction to simulate light reflecting off objects and indirectly lighting other areas of the environment. For each intersection of a photon with the environment, the location and intensity of the photon are recorded in a data structure.
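Below is a minimal, self-contained sketch of this step. Everything in it is illustrative: the Vec3 and Photon types, the toy one-sphere scene, and the 50% reflectance are assumptions made for the example, not code from the project.

#include <cmath>
#include <random>
#include <vector>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};

float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Each stored photon records where it struck, how much light it carried,
// and the direction it arrived from.
struct Photon { Vec3 position, power, incident; };

// Uniform random direction on the unit sphere (rejection sampling).
Vec3 randomDirection(std::mt19937& rng) {
    std::uniform_real_distribution<float> u(-1.f, 1.f);
    Vec3 d{0.f, 0.f, 0.f};
    do { d = {u(rng), u(rng), u(rng)}; } while (dot(d, d) > 1.f || dot(d, d) < 1e-8f);
    return d * (1.f / std::sqrt(dot(d, d)));
}

// Toy stand-in for the environment: a single unit sphere at the origin.
bool intersectScene(const Vec3& o, const Vec3& d, Vec3& hit, Vec3& normal) {
    float b = 2.f * dot(o, d), c = dot(o, o) - 1.f;
    float disc = b * b - 4.f * c;
    if (disc < 0.f) return false;
    float t = (-b - std::sqrt(disc)) * 0.5f;
    if (t <= 1e-4f) return false;
    hit = o + d * t;
    normal = hit;                                      // unit sphere: normal == position
    return true;
}

std::vector<Photon> emitPhotons(const Vec3& lightPos, const Vec3& lightPower,
                                int photonCount, int maxBounces) {
    std::vector<Photon> photonMap;
    std::mt19937 rng(1234);
    for (int i = 0; i < photonCount; ++i) {
        Vec3 origin = lightPos;
        Vec3 dir = randomDirection(rng);                // random emission direction
        Vec3 power = lightPower * (1.f / photonCount);  // split power among photons
        for (int bounce = 0; bounce < maxBounces; ++bounce) {
            Vec3 hit{0.f, 0.f, 0.f}, n{0.f, 0.f, 0.f};
            if (!intersectScene(origin, dir, hit, n)) break;
            photonMap.push_back({hit, power, dir});     // record location and intensity
            do { dir = randomDirection(rng); } while (dot(dir, n) <= 0.f); // bounce away
            origin = hit + n * 1e-3f;                   // offset to avoid self-intersection
            power = power * 0.5f;                       // assumed 50% reflectance
        }
    }
    return photonMap;
}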

Ray Tracing Step

During the ray tracing stage, a ray is traced from the eye point through each pixel on the screen. The color of the pixel is set to the color of the point where the ray intersects an object in the scene; reflections are handled by recursively tracing additional bounces and combining their contributions. To determine the lighting at the intersection point, a ray is traced from it to each light source to determine whether the point is shadowed. However, even when a point is shadowed by another object, some light should still strike it indirectly. This is compensated for by looking up the photons that have struck near the point and using them to compute the indirect illumination there.
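As a rough illustration of that lookup, the sketch below (reusing the Vec3 and Photon types from the previous sketch) estimates indirect illumination with a brute-force radius search over the photon map. A real photon mapper would typically store the photons in a kd-tree and query the nearest neighbors; the fixed gathering radius here is an assumption for the example.

// Sum the power of all photons within `radius` of the shading point and
// divide by the area of the gathering disc to get power per unit area.
Vec3 estimateIndirect(const std::vector<Photon>& photonMap,
                      const Vec3& point, float radius) {
    Vec3 sum{0.f, 0.f, 0.f};
    float r2 = radius * radius;
    for (const Photon& p : photonMap) {
        Vec3 d = p.position - point;
        if (dot(d, d) <= r2)
            sum = sum + p.power;
    }
    float discArea = 3.14159265f * r2;
    return sum * (1.f / discArea);
}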

Scene Rendered Without Photon Mapping

In this image, rendered with the ray tracing step alone, only areas directly struck by the light source are illuminated; corners and backfaces are completely black.



The Application

Screen Shot of the UI

The UI uses FLTK.

The UI allows the user to specify several parameters for the ray tracing and photon mapping stages. The user can select the number of threads the ray tracer will use to trace multiple rays simultaneously on multi-core processors.
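A sketch of one common way to divide that work is below, with image rows split evenly across worker threads; the tracePixel callback is a stand-in for the project's actual per-pixel routine, not its real API.

#include <algorithm>
#include <functional>
#include <thread>
#include <vector>

// Each worker traces its own block of rows, so no synchronization is
// needed on the pixel buffer.
void renderParallel(int width, int height, int threadCount,
                    const std::function<void(int x, int y)>& tracePixel) {
    std::vector<std::thread> workers;
    int rowsPerThread = (height + threadCount - 1) / threadCount;
    for (int t = 0; t < threadCount; ++t) {
        int yBegin = t * rowsPerThread;
        int yEnd = std::min(height, yBegin + rowsPerThread);
        workers.emplace_back([=] {
            for (int y = yBegin; y < yEnd; ++y)
                for (int x = 0; x < width; ++x)
                    tracePixel(x, y);
        });
    }
    for (std::thread& w : workers)
        w.join();   // wait for every row to finish
}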

The user can also dynamically create additional objects and add them to the scene. The ray tracer and photon mapper support a real-time simulation mode as well: objects in the scene are added to a rigid-body physics simulation, allowing them to bounce and collide in 3D while the ray tracer updates in real time (I use the term "real-time" loosely). Activating the simulation also turns the camera into a first-person camera controllable with the mouse and keyboard.

Quantum Rendering

The application also supports a Quantum Rendering optimization. Quantum Rendering is much faster, but slightly less accurate: instead of tracing a ray through each pixel, it randomly chooses the color of each pixel on the screen. This takes advantage of the multi-universe theory, because a separate universe is generated for each possible outcome of each pixel's color. In at least one generated universe, the randomly-generated image should be correct.
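In the same spirit, a sketch of the naive algorithm (the 32-bit color packing is an arbitrary assumption):

#include <cstdint>
#include <random>
#include <vector>

// One independent truly random color per pixel: a new universe is forked
// for every pixel of every frame.
std::vector<std::uint32_t> renderQuantumFrameNaive(int width, int height) {
    std::random_device universeFork;
    std::vector<std::uint32_t> pixels(width * height);
    for (std::uint32_t& p : pixels)
        p = universeFork();   // random 0xRRGGBBAA color
    return pixels;
}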

Quantum Rendering

A rendering of a desk and chair in our universe. Notice how the image accuracy is slightly less than that of conventional ray-tracing.

Further Quantum Rendering Optimizations

The Quantum Rendering algorithm can be further optimized. In the naive implementation, the color of each pixel is generated randomly, which creates a new universe for every single pixel on the screen. This is a waste of universes. Instead, we can randomly generate a single seed value and use it as a parameter to a fixed function that generates the color of each pixel. This method creates only one universe per frame.
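A sketch of the optimized version, again in the same spirit; the particular integer hash is an arbitrary stand-in for the fixed per-pixel function:

#include <cstdint>
#include <random>
#include <vector>

// The fixed function: deterministically expands one seed into a color for
// each (x, y), so only the seed itself is a universe-forking event.
std::uint32_t hashPixel(std::uint32_t seed, std::uint32_t x, std::uint32_t y) {
    std::uint32_t h = seed ^ (x * 374761393u) ^ (y * 668265263u);
    h = (h ^ (h >> 13)) * 1274126177u;
    return h ^ (h >> 16);
}

std::vector<std::uint32_t> renderQuantumFrame(int width, int height) {
    std::random_device trulyRandom;      // truly random in spirit (see caveats below)
    std::uint32_t seed = trulyRandom();  // the single fork for this frame
    std::vector<std::uint32_t> pixels(width * height);
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            pixels[y * width + x] = hashPixel(seed, x, y);
    return pixels;
}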

Quantum Rendering Caveats

In theory, new universes are only generated in real life for each outcome of a truly random event. Since random number generators in computers are pseudo-random, it is likely that their use will not cause the creation of new universes. To fix this, we use a USB Geiger counter to seed the number generator in the program, and place a block of uranium next to the computer when running the renderer. This will cause truly random events because of physics and science.

2011 - Matt Swarthout and James Lee
