Volumetric Clouds

This project is based on the techniques described by Jerry Tessendorf during his visit to UPenn last November. Clouds are composed of groups of "pyroclastic" puffs whose surfaces are calculated from 3D Perlin noise. The clouds are then rendered using a ray-marching technique. Users can create and configure puffs in an interactive editor and then render the final result.

Results

Here are some animations and still images created by my application:

Descriptions, from left to right:

• Two keyframes were used to generate a 10-frame animation loop at high render settings (stepsize = 0.5, blocksize = 1). Unlike the default settings, the plasma puff's color is a radial gradient from red at the center to yellow at the edges.
• Smoke ring: a circle of puffs expands and dissipates

Descriptions, from left to right, top to bottom:

• Single puff with no lighting, rendered with blocksize 2 and stepsize 0.1
• Single puff with lighting, rendered with blocksize 1 and stepsize 0.01
• 3 white puffs rendered with orange lighting
• Hazy puffs with low density (0.5)

Rendering Algorithm

To calculate the surface shape, I start with a sphere and then use 3D Perlin noise to add a displacement to each point on the sphere's surface. The user can set the puff density to be constant or to vary linearly from the sphere's center.
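
This displacement step can be sketched in Python. For brevity the sketch uses a simple hash-based value noise in place of true Perlin gradient noise; the function names, the hash constants, and the `amplitude`/`frequency` parameters are illustrative, not the project's actual code:

```python
import math

def value_noise_3d(x, y, z, seed=0):
    """Smooth 3D value noise in [0, 1]: hash the lattice corners and
    blend trilinearly with a smoothstep fade (a stand-in for Perlin noise)."""
    def hash3(i, j, k):
        h = (i * 374761393 + j * 668265263 + k * 28411 + seed) & 0xFFFFFFFF
        h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
        return (h & 0xFFFF) / 0xFFFF
    xi, yi, zi = math.floor(x), math.floor(y), math.floor(z)
    xf, yf, zf = x - xi, y - yi, z - zi
    fade = lambda t: t * t * (3.0 - 2.0 * t)
    u, v, w = fade(xf), fade(yf), fade(zf)
    lerp = lambda a, b, t: a + (b - a) * t
    c = [[[hash3(xi + i, yi + j, zi + k) for k in (0, 1)]
          for j in (0, 1)] for i in (0, 1)]
    return lerp(
        lerp(lerp(c[0][0][0], c[0][0][1], w), lerp(c[0][1][0], c[0][1][1], w), v),
        lerp(lerp(c[1][0][0], c[1][0][1], w), lerp(c[1][1][0], c[1][1][1], w), v),
        u)

def puff_surface_point(theta, phi, radius=1.0, amplitude=0.3, frequency=4.0):
    """Push a unit-sphere point outward by a noise-driven displacement."""
    nx = math.sin(theta) * math.cos(phi)
    ny = math.sin(theta) * math.sin(phi)
    nz = math.cos(theta)
    d = amplitude * value_noise_3d(frequency * nx, frequency * ny, frequency * nz)
    r = radius + d
    return (r * nx, r * ny, r * nz)
```

Because the noise is sampled at the point's direction on the sphere, nearby surface points get correlated displacements, which is what produces the lumpy "pyroclastic" silhouette.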

To render the puffs, I first discretize each puff into a grid so that I can cache how much light penetrates to each point in the cloud. I then cast a ray for each pixel, accumulating opacity and color along the line of sight from the eye point through that pixel. The resulting opacity and color are blended with the background. The accumulation algorithm looks like:

color = (0, 0, 0)
T = 1.0                 # T = transmissivity = 1 - opacity
pt = eyepos
while not done:
    pt += eyeDir * stepsize
    deltaT = exp(-Kappa * stepsize * densityAtPt)
    T *= deltaT
    color += 1/Kappa * (1 - deltaT) * LightColor * colorAtPuff(pt) * T * LightT

# Kappa = user-specified constant
# LightColor and colorAtPuff are combined componentwise (per channel)
pixelColor = color + backgroundColor * T

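
The loop above can be written as runnable Python. This sketch uses a hypothetical Gaussian-blob density in place of a real puff, and computes the light transmissivity (`LightT`) by marching toward the light on the fly rather than reading it from the cached grid; `KAPPA`, the step counts, and the colors are all illustrative values:

```python
import math

KAPPA = 8.0  # illustrative extinction constant (user-specified in the app)

def density_at(pt):
    """Hypothetical puff density: a Gaussian blob centered at the origin."""
    r2 = sum(c * c for c in pt)
    return math.exp(-4.0 * r2)

def light_transmissivity(pt, light_dir=(0.0, 1.0, 0.0), stepsize=0.05, steps=40):
    """March from pt toward the light, accumulating extinction.
    In the real renderer this value would come from the cached grid."""
    T, p = 1.0, list(pt)
    for _ in range(steps):
        p = [c + d * stepsize for c, d in zip(p, light_dir)]
        T *= math.exp(-KAPPA * stepsize * density_at(p))
    return T

def march_pixel(eye, eye_dir, stepsize=0.05, steps=100,
                light_color=(1.0, 1.0, 1.0), puff_color=(1.0, 1.0, 1.0),
                background=(0.1, 0.1, 0.2)):
    """Accumulate color and transmissivity along one eye ray."""
    color = [0.0, 0.0, 0.0]
    T = 1.0  # transmissivity = 1 - opacity
    pt = list(eye)
    for _ in range(steps):
        pt = [c + d * stepsize for c, d in zip(pt, eye_dir)]
        deltaT = math.exp(-KAPPA * stepsize * density_at(pt))
        T *= deltaT
        lt = light_transmissivity(pt)
        for i in range(3):
            color[i] += (1.0 / KAPPA) * (1.0 - deltaT) \
                        * light_color[i] * puff_color[i] * T * lt
    # blend the accumulated color with the background
    return [c + b * T for c, b in zip(color, background)]
```

A ray through the blob ends with low transmissivity, so little background shows through; a ray that misses it returns essentially the background color.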
User Interface

The interface consists of an interactive 3D window and corresponding ray traced render window:

The UI is based on a framework I built in previous projects. The user can load and save scenes and create and configure puffs. Users control the surface shape of each puff via Perlin noise parameters: amplitude (via the persistence parameter), frequency, and number of octaves.
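
How these three parameters interact can be sketched with a standard fractal-sum (fBm) combination of noise octaves. The `fbm` helper and the sine-based stand-in noise below are illustrative, not the application's actual noise code:

```python
import math

def fbm(noise, x, y, z, octaves=4, frequency=2.0, persistence=0.5):
    """Sum several octaves of a base noise function. Each octave doubles
    the frequency; `persistence` scales each octave's amplitude, so lower
    persistence gives a smoother surface and higher persistence a rougher one."""
    total, norm = 0.0, 0.0
    amplitude, freq = 1.0, frequency
    for _ in range(octaves):
        total += amplitude * noise(x * freq, y * freq, z * freq)
        norm += amplitude
        amplitude *= persistence
        freq *= 2.0
    return total / norm  # normalize back to the base noise's range

# Example: a smooth stand-in noise mapped into [0, 1].
noise = lambda x, y, z: (math.sin(x) * math.sin(y) * math.sin(z) + 1.0) / 2.0
```

With `persistence = 0` only the first octave survives, while values near 1 give every octave equal weight, which is why persistence reads as an amplitude control in the editor.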

The 3D interactive view gives an approximation of the puff. To render the scene, select "Render" from the render menu.

References

• "Production Volume Rendering" by Jerry Tessendorf
• http://freespace.virgin.net/hugo.elias/models/m_perlin.htm
• http://www.sepcot.com/blog/2006/08/PDN-PerlinNoise2d