Reza Ali
Media Art & Technology
594CM -
Fundamentals of Spatial Computing
Final Project: Light Storm
Light Storm is a project that explores 3D space, particle systems, flocking, spheres, shaders, light scattering, interaction, and electromagnetic forces. The original idea was to develop a particle system that would flock like birds on the surface of a sphere, and then to have that sphere's surface broken up into patches via a 3D surface voronoi algorithm; this was documented in my research presentation for MAT 594CM. As the project grew, I used a sculptor's methodology to determine what the end aesthetic of the piece should be. I added different elements and relationships (particles, particle-particle awareness, spherical flocking, light scattering, GLSL shaders, simple circles, squares, lines, dots, bezier curves, OSC interaction, and audio) that produced the final visual aesthetic. Light Storm is an interactive audio-visual piece that allows viewers to manipulate the visual system with a mobile device equipped with Wi-Fi and OSC capabilities. The interactivity lets viewers manipulate system parameters; however, the system does not rely on user interaction, and it can sustain itself and run forever if left alone. Please view the video below, which demonstrates the piece being manipulated while music (13 & God - "afterclap") was played.
The sunburst effect (also known as crepuscular rays, sunbeams, light scattering, star flares, god rays, or light shafts) in Light Storm was implemented with frame buffer objects and a pixel shader in OpenGL, following the process Kenny Mitchell describes in GPU Gems 3.

Kenny Mitchell writes in GPU Gems 3, “Under the right conditions, when a space contains a sufficiently dense mixture of light scattering media such as gas molecules and aerosols, light occluding objects will cast volumes of shadow and appear to create rays of light radiating from the light source.” This complex natural phenomenon can be imitated in 3D virtual space using a simple post-process method that mimics the effect of volumetric light scattering due to shadows in the atmosphere. A detailed description of the effect and its implementation can be found here.
A high-level description of the 2D post-processing effect is as follows:

  • First, create a frame buffer object (FBO) and render the light source and the occluding objects in the scene into it (to save on computation, you can disable depth testing and texturing and reduce the resolution of this render).
  • Next, clear the depth buffer and render the scene normally.
  • Finally, switch to an orthographic projection and blend the FBO with the current frame buffer; the shader is activated here to generate the light scattering effect.

This approach differs from other methods of light scattering because it does not require preprocessing and/or modification of the scene; instead it uses a per-pixel processing operation that allows for detailed light shafts in scenes of arbitrary complexity (Mitchell).

The per-pixel processing is accomplished by a fragment shader written in GLSL (the OpenGL Shading Language). Shaders allow users to reprogram the graphics pipeline and program the graphics processing unit directly, which lets very complex shading models be applied to objects at very high speed. Effects that are very computationally intensive can thus run on the GPU, saving CPU power for other operations, such as the flocking computation in my case. The shader used here was taken from here. I spent some time figuring out how it implements light scattering; here is my overview of light scattering:

The analytic model equation of daylight scattering is:

L(s, θ) = Esun · e^(−βex · s) + (βsc(θ) / βex) · Esun · (1 − e^(−βex · s))

where:

  • s is the distance traveled through the media
  • θ is the angle between the ray and the sun
  • Esun is the source illumination from the sun
  • βex is the extinction constant composed of light absorption and out-scattering properties
  • βsc is the angular scattering term composed of Rayleigh and Mie scattering properties

The first term calculates the amount of light absorbed from the point of emission to the viewpoint. The second term calculates the additive amount due to light scattering into the path of the view ray. The effect due to occluding matter such as clouds, buildings, and other objects is modeled here simply as an attenuation of the source illumination (Mitchell):

L(s, θ, φ) = (1 − D(φ)) · L(s, θ)
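As a quick numerical illustration of these terms, here is my own C++ sketch (not code from the project; the function name is made up, and βsc is passed in already evaluated for a given θ):

```cpp
#include <cassert>
#include <cmath>

// Daylight scattering model: extinction of the source illumination plus
// in-scattering along the view ray, attenuated by the occluders' opacity d.
float scatteredLight(float eSun,   // source illumination from the sun
                     float betaEx, // extinction constant
                     float betaSc, // angular scattering term, already
                                   // evaluated for the view angle theta
                     float s,      // distance traveled through the media
                     float d)      // combined occluder opacity, D(phi)
{
    float extinction = eSun * std::exp(-betaEx * s);
    float inscatter = (betaSc / betaEx) * eSun * (1.0f - std::exp(-betaEx * s));
    return (1.0f - d) * (extinction + inscatter);
}
```

At s = 0 nothing has been absorbed or scattered in, so the unoccluded result is just Esun; for large s the extinction term vanishes and the result approaches (βsc/βex) · Esun.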

D(φ) is the combined attenuated opacity of the sun-occluding objects for the image location φ. This attenuation complicates the calculation, because the occlusion of the light source now needs to be determined for every point in the image. However, in screen space we don't have volumetric information to determine the occlusion. A way to overcome this is to estimate the probability of occlusion at each pixel by summing samples along a ray to the light source in image space. The proportion of samples that hit the emissive region versus those that strike occluders gives us the desired percentage of occlusion (Mitchell). This basically compares the samples along the ray that hit the light source against the samples that hit occluding surfaces:

L(s, θ, φ) ≈ Σ_{i=1..n} L(s_i, θ_i) / n
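The image-space sampling can be sketched on the CPU like this (my own illustrative C++, not the project's shader; the image layout and names are made up):

```cpp
#include <cassert>
#include <vector>

// Hypothetical image-space occlusion buffer: 1.0f where the light source
// is visible (emissive region), 0.0f where an occluder was rendered.
struct OcclusionImage {
    int w, h;
    std::vector<float> pixels;           // row-major, w * h entries
    float at(int x, int y) const { return pixels[y * w + x]; }
};

// Estimate the fraction of unoccluded samples along the ray from pixel
// (px, py) to the light at (lx, ly), i.e. 1 - D(phi) in Mitchell's terms.
float estimateVisibility(const OcclusionImage& img,
                         int px, int py, int lx, int ly, int n) {
    float sum = 0.0f;
    for (int i = 1; i <= n; ++i) {
        float t = static_cast<float>(i) / n;      // march toward the light
        int x = px + static_cast<int>(t * (lx - px));
        int y = py + static_cast<int>(t * (ly - py));
        sum += img.at(x, y);                      // 1 if the sample hits light
    }
    return sum / n;                               // proportion unoccluded
}
```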

This equation yields an additive sampling of the image, dividing each sample's illumination by the number of samples, n. This basically means that if a sample is not occluded, its illumination is positive and adds to the total; if it is occluded, the sample contributes nothing to the total illumination. Moreover, the shader uses attenuation coefficients to parameterize control of the summation:

  • exposure controls the overall intensity of the post-process
  • weight controls the intensity of each sample
  • decay (0 → 1) dissipates each sample’s contribution as the ray progresses away from the light source

The exponential nature of the decay factor allows each light shaft to fall off smoothly away from the light source. Additionally, the shader has a density variable, which is used to control the separation between samples. After examining this equation and the shader code, I was able to break down the logic of the shader. The vertex shader is basic and does the minimum required to work: it multiplies the position of each vertex by the concatenation of the modelview and projection matrices. The fragment (pixel) shader implements the light scattering formula above and lets the program change the parameters mentioned above to fine-tune the shader's effect. The shader code can be viewed here. I did not go into the specifics of how the frame buffer objects work with the shader here because it is very technical and lengthy. Instead, if you are curious, download the source code for the project below and look specifically at the render function and the fragment shader; I have commented the OpenGL code that handles the off-screen drawing and blending.
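Translated into plain C++ for readability (the real per-pixel work happens on the GPU; the sampling callback, names, and parameter values here are my own assumptions), the fragment shader's accumulation loop looks roughly like this:

```cpp
#include <cassert>
#include <functional>

// One pixel of the radial sampling loop from the light-scattering
// post-process: march from the pixel's texture coordinate toward the
// light, accumulating weighted, decaying samples of the occlusion FBO.
float scatterPixel(std::function<float(float, float)> sampleFbo,
                   float u, float v,           // pixel tex coords
                   float lightU, float lightV, // light pos in tex space
                   int numSamples, float density,
                   float weight, float decay, float exposure) {
    // Step vector toward the light, scaled by density / sample count.
    float du = (u - lightU) * density / numSamples;
    float dv = (v - lightV) * density / numSamples;
    float illumination = 0.0f;
    float attenuation = 1.0f;                   // decay^0 for first sample
    for (int i = 0; i < numSamples; ++i) {
        u -= du;                                // step toward the light
        v -= dv;
        illumination += sampleFbo(u, v) * attenuation * weight;
        attenuation *= decay;                   // decay^i falloff
    }
    return illumination * exposure;
}
```

With decay below 1, samples far from the light contribute exponentially less, which is exactly what makes the shafts fade smoothly.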

Light Storm uses a particle system class that I wrote to handle flocking particles on the surface of a sphere. My original intent was to create a particle system that would be free to move about 3D space but would be attracted to the surface of a sphere, so that a 3D surface voronoi could subdivide the sphere's surface into patches, using the particles' locations as sites for the voronoi. After implementing the particle class, I realized that the particle system was more interesting than the voronoi subdivision of the sphere's surface, which would have ended up looking like a delaunay triangulation of a mesh. I started playing with electromagnetic forces and ended up creating an effect that resembles charges distributed on the surface of a conductor. These charges (particles) spread themselves out so that they minimized the energy of the system. Hence, I created particles with electromagnetic characteristics; see the particle class code for the intricacies of the attraction/repulsion system. Moreover, I realized that for the light scattering rays to be constantly changing, the particle system would have to be more dynamic, so I gave the particles bird-like characteristics that make them flock. I used a hybrid of Reynolds' flocking algorithm to make the particles attracted to their center of mass and repel each other once they come within range of hitting each other. The photos above show the static particle system once it had distributed itself evenly over a sphere's surface. Lastly, I created different ways of visualizing the particles, which allows the light scattering to be even more complex. In the end, the particle system was used to generate the geometry that occludes light for the light scattering effect.
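A minimal sketch of that charge-like repulsion constrained to a sphere (this is not the project's particle class; the names, constants, and integration scheme are simplified assumptions):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 {
    float x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
    float length() const { return std::sqrt(x * x + y * y + z * z); }
};

// One step of Coulomb-like repulsion: every particle pushes every other
// away with force ~ 1/r^2, then positions are re-projected onto the
// sphere, so the charges spread toward a minimum-energy arrangement.
void repelOnSphere(std::vector<Vec3>& p, float radius, float strength) {
    std::vector<Vec3> force(p.size(), {0, 0, 0});
    for (std::size_t i = 0; i < p.size(); ++i)
        for (std::size_t j = 0; j < p.size(); ++j) {
            if (i == j) continue;
            Vec3 d = p[i] - p[j];
            float r = d.length();
            force[i] = force[i] + d * (strength / (r * r * r)); // (d/r) * 1/r^2
        }
    for (std::size_t i = 0; i < p.size(); ++i) {
        Vec3 moved = p[i] + force[i];
        p[i] = moved * (radius / moved.length()); // snap back to the sphere
    }
}
```

Iterating this step spreads the particles apart, much like the Thomson problem of charges on a conducting sphere, until the mutual repulsion balances out.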
Another aspect of the project was to make the visuals interactive. With the help of Wes Smith and Lance Putnam, I was able to quickly integrate OSC communication into Light Storm using their example code. The OSC library I researched and used for the project is called oscpack, and I used an application called TouchOSC on an iPod Touch to communicate with the system. OSC communication allows multiple users to manipulate/drive the piece simultaneously. Please click on the photos above to see what each button of the interface was mapped to in the visual system. Moreover, I used RtAudio (a C++ audio library) to play back some simple sound when particles were added to the system. For a dynamic system like this one, I feel synthesized audio would be the best choice: as the system changes, the audio changes/evolves in real time based on the current state of the system, and the system could have infinite states if the particles' positions were used to drive the audio synthesizer. This project is ongoing and will eventually evolve into something else; this is its current state as of June 2009. The source code for the project is posted below.
References:

Frame Buffer Object (Song Ho Ahn)
Light Scattering Game Engine Source Code (Sanglard)
GPU Gems 3 (Mitchell)
Frame Buffer Object (Gamedev)
Frame Buffer Object (OpenGL)
TouchOSC
LuaAV/Muro
oscpack
Lance Putnam
Wes Smith
Angus Forbes
OpenGL Programming
Source Code:

Pixel Fragment Shader

Xcode Project