Physically-Based Real-Time Diffraction Using Spherical Harmonics



Clifford Lindsay and Emmanuel Agu

Computer Science Department, Worcester Polytechnic Institute, 100 Institute Road, Worcester, MA 01609-2280, USA

[email protected], [email protected]

Abstract. Diffraction, interference, dispersive refraction and scattering are four wavelength-dependent mechanisms that produce iridescent colors. Wavelength-dependent functions need to be sampled at discrete wavelengths in the visible spectrum, which increases the computational intensity of rendering iridescence. Furthermore, diffraction requires careful sampling since its response function varies at a higher frequency, with sharper peaks, than interference or dispersive refraction. Consequently, rendering physically accurate diffraction has previously either been approximated using simplified color curves or been limited to offline rendering techniques such as ray tracing. We propose a technique for real-time rendering of physically accurate diffraction on programmable hardware. Our technique adaptively samples the diffraction BRDF and precomputes it into a Spherical Harmonic (SH) basis that preserves the peak intensity of the reflected light. While previous work on diffraction used low-dynamic-range lights, we preserve the full dynamic range of the incident illumination and the diffractive response over the entire hemisphere of incoming light directions. We defer conversion from a wavelength representation to a tone-mapped RGB triplet until display.

1 Introduction

Real-time iridescence: Iridescent surfaces exhibit different colors (shimmer) as the view or incident light angles change. Diffraction (CD-ROMs), interference (oil slicks), dispersive refraction (glass prisms) and scattering (rainbows) are four distinct optical mechanisms that cause iridescent colors. Iridescent colors are especially impressive when objects are moved interactively to fully demonstrate their angle-dependent nature. For instance, rotating a CD-ROM in real time, walking past an oil slick in a virtual world, or observing moving butterfly wings all give stunning visual effects. Consequently, rendering iridescent materials at interactive rates is attractive, but challenging.

Issues With Rendering Diffraction: Physically accurate Bidirectional Reflectance Distribution Functions (BRDFs) generate more photorealistic images since they capture the intricate light-surface interactions of the underlying phenomena. However, rendering physically accurate wavelength-dependent iridescent BRDFs is complex because they need to be sampled at many discrete wavelengths



within the human visual range (380–700 nm) [1]. Additionally, since the diffraction BRDF varies at a much higher frequency than interference, scattering and dispersive refraction, more samples are required to capture its full response. Due to these computational challenges, previous work has either rendered physically-based diffraction in offline renderers or used less computationally intense approximate models for real-time rendering of the diffraction BRDF [13].

Real-time Physically-based Diffraction: In this paper, we demonstrate a technique to render physically accurate diffraction at interactive rates. Our approach samples the high-frequency diffraction BRDF intelligently in order to preserve the peak intensities at each maximum. We factorize both the diffraction BRDF and the incident light using a low-order Spherical Harmonics (SH) basis, such that evaluating the BRDF response function reduces to a dot product at each sampled wavelength. Although SH has traditionally been used for low-frequency lighting, we are able to use it for diffraction because our sampling approach aggressively reduces the required sampling frequency. Evaluating the diffraction surface response at many wavelengths requires significantly more storage than traditional trichromatic (RGB) representations [15]. We reduce storage requirements on the GPU by using the Composite Model (CM) [14] to compactly store diffraction reflections.

Our technique also permits the use of a full hemisphere of incident light, as opposed to previous work [1,2,16] that was limited to point or area light sources. It is important to note that since the diffractive response essentially separates incident white light into its wavelength (color) components, the use of a full hemisphere of light generates more realistic images.

The rest of the paper is organized as follows. Section 2 presents related work, Section 3 gives background necessary to understand our technique, Section 4 describes our technique, Section 5 presents our rendering results, and Section 6 concludes and describes future work.

2 Related Work

Diffraction: The majority of work on rendering physically accurate diffraction is designed for use in ray tracers and other offline renderers. Stroke [5] derived the geometric conditions for iridescence. Thorman [4] developed a computer graphics model for diffraction based on the grating equation, applying Stroke's work to derive an illumination model for diffraction and resolving specific issues with rendering iridescent colors. Nakamae et al. [6] developed a technique for rendering natural objects, including diffraction from high-intensity light sources.

Agu [2], Stam [1] and Sun [16] have more recently developed shading models for rendering diffraction. Stam uses Fourier optics to develop an anisotropic reflection BRDF based on Kirchhoff's approximation for light scattering. Agu developed a ray-optics BRDF for rendering diffraction, which uniquely computes the peak intensity at the maxima for each viewing angle. Sun's model is based on wave optics and was applied specifically to rendering optical disks. All three


diffraction models generate photorealistic images and can be rendered in real time using today's graphics hardware, provided the lighting environment is simple enough. However, as more complex lighting is used, the achievable frame rate drops quickly.

Spherical Harmonics: Recently, real-time rendering using Spherical Harmonics (SH) has focused on environment irradiance, BRDF factorization, and radiance transfer (such as inter-reflections, color bleeding, and caustics). Ramamoorthi and Hanrahan [10] developed an efficient irradiance environment map using SH that rendered diffuse surface reflection from an environment map. Westin et al. [12] developed a physically-based BRDF representation in terms of matrices of SH coefficients. Ramamoorthi and Hanrahan improved on their previous model [9] to account for complex BRDFs using a frequency-space analysis that represented the BRDF as SH coefficients. Sloan et al. simultaneously developed a technique [8] for precomputing radiance transfer, which uses SH for the lighting environment as well as for the diffuse BRDF, self-shadowing, and inter-reflections. Kautz et al. [11] improved upon this by generalizing precomputed radiance transfer to render arbitrary BRDFs using SH.

Lindsay and Agu [7] developed a technique that used a full-spectrum wavelength color representation in conjunction with SH to build a wavelength-dependent BRDF model for rendering iridescence. That work is most closely related to ours; however, [7] renders interference, not diffraction. The interference BRDF varies at a much slower rate than the diffraction BRDF (which is extremely spiky, with sharp peaks [14]) and requires less sampling. With the exception of [7], all the SH work listed above uses a trichromatic color representation for rendering reflections, while our technique requires a wavelength representation.

3 Background

3.1 Physically-Based Diffraction BRDF

Stam [1], Agu [2] and Sun [16] have recently derived three alternative physically accurate BRDFs for diffraction. We felt that Agu's BRDF gave us adequate control of variables during SH factorization, hence we utilized the Agu model for our technique. However, although we employ Agu's model, we emphasize that any of the three physically-based models can be used with our technique.

Agu's model [2] derives a closed-form BRDF by applying the Huygens-Fresnel principle for Fraunhofer diffraction. Its main results are now summarized. Incident white light bounces off the CD-ROM grooves, producing planar waves with uniform intensities. Typically, thousands of grooves occur per inch on a CD-ROM, with separations measuring about a wavelength of visible light. The Agu model sums the planar waves emanating from N grooves to produce a closed-form expression, which can easily be evaluated to determine output diffraction intensities for any pair of incident light and view angles.


Fig. 1. Diagram of a measurement between similar locations in a single diffraction pit and groove. The diagram shows the modes (dots locating points where the light waves add together) and the measurement of d = a + b from eq. 3.

Figure 1 shows one of the N diffraction grooves. Each groove has a width of dimension b followed by a gap of distance a; thus, d = a + b is the distance between similar points on adjacent grooves. The incident light makes an angle θi with the normal vector of the CD-ROM, and after diffracting off the CD-ROM, θ is the outgoing light angle. Equation 1 is the final expression for diffraction derived in [2]; for the complete derivation, we refer the reader to [2].

f(\Omega, \lambda) = \text{Ambient} + \text{diffuse} + \sum_{\Omega} \sum_{\lambda} I_{0\Omega\lambda} \, \frac{1}{N^2} \left( \frac{\sin \beta}{\beta} \right)^{2} \left( \frac{\sin^2 N\alpha}{\sin^2 \alpha} \right) \qquad (1)

where

\beta = \frac{\pi b (\sin\theta - \sin\theta_i)}{\lambda} \quad \text{and} \quad \alpha = \frac{\pi (a + b)(\sin\theta - \sin\theta_i)}{\lambda} \qquad (2)

Since \alpha = \frac{\pi d}{\lambda} (\sin\theta - \sin\theta_i), then

d (\sin\theta - \sin\theta_i) = m\lambda \qquad (3)

where m = 0, ±1, ±2, ... are the diffraction modes. The ambient and diffuse terms of equation 1 are the same as in the Phong illumination model. Equation 3 is widely known as the grating equation and gives the locations of the maxima. Figure 2(a) is a plot of equation 1.
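As a worked illustration (our numbers, not taken from the paper): for a CD-like grating with d = 1600 nm at normal incidence ($\theta_i = 0$), a viewing angle of $\theta = 20°$ satisfies the grating equation for the first-order mode ($m = 1$) at $\lambda = d \sin\theta \approx 1600 \times 0.342 \approx 547$ nm, so the viewer sees a green peak in that direction.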

The Nature of the Diffraction BRDF: We shall now highlight some key characteristics of the diffraction BRDF that make it more challenging to render in real time than other iridescent phenomena. Since our general approach is to sample and factorize a physically accurate BRDF using SH, our main concern in rendering diffraction lies with the fast-varying nature of the diffraction BRDF function.

Figure 2(a) is a wavelength plot of the BRDF function in equation 1. In our work, a typical value of N = 1500 was used. The $\left( \frac{\sin^2 N\alpha}{\sin^2 \alpha} \right)$ term causes the function to vary very fast, with extremely sharp, narrow spikes.


[Figure 2: (a) diffraction plot — intensity (W m−2, 0 to 10000) versus sin(θ) − sin(θi) over [−π, π]; (b) interference plot — intensity (W m−2, 0 to 5) versus sin²((2π/λ)wη cos(θt)) over [0, π/2].]

Fig. 2. These two diagrams show the relative intensity plots of diffraction and interference. The lines showing delta functions in the diffraction plot have been exaggerated for demonstration and would normally be on the order of nanometers, or 1/100000th of a degree. From this diagram you can see that sampling interference is less complex than sampling diffraction.

Sampling these peaks is important since they contain a large fraction of the BRDF response. These peaks are also contained within very narrow angular ranges, such that angle changes on the order of 1/100000th of a degree (for a typical CD-ROM) can miss the location of a peak and result in erroneous intensity values. Capturing these peaks is non-trivial: for N = 1500, the diffraction curve can go from a minimum (zero response) to a maximum with a view-angle change of 10−5 degrees.

However, not all wavelength-dependent phenomena exhibit such fast-varying BRDFs. As a basis for comparison, we also plot the response of thin-film interference (soap bubbles) in Figure 2(b). This slower-varying interference function is not as challenging to sample as diffraction and was rendered using SH in [7]. In summary, accurate sampling, representation in terms of a wavelength basis, and efficient storage of samples are the core issues in rendering diffraction.

Adaptively sampling the Agu Diffraction BRDF: Our goal is to determine the response intensity values for a given pair of incident light angle θi and view angle θ. To reduce the number of wasted samples, we invert and solve equation 3 to determine whether there exists a diffraction peak for any mode m and wavelength λ. The mode m can take integer values from −2 to +2, and λ can take values in the visible spectrum (380 nm to 700 nm). If a valid combination of m and λ yields a peak in the visible spectrum, we then plug their values into equation 1 to determine the intensity of the peak. Sampling only areas which have maxima allows us to capture the narrow diffraction peaks using relatively few samples; without this sampling technique, the diffraction response would vary too fast to encode using SH. A sketch of this procedure follows.
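The following Python sketch illustrates the adaptive sampling just described, under our own assumptions (the function names, groove dimensions, and numerical guards are illustrative, not the authors' code): it inverts the grating equation (eq. 3) to locate peak wavelengths, then evaluates the spiky term of eq. 1 only at those locations.

```python
import numpy as np

VISIBLE_NM = (380.0, 700.0)

def find_peaks(theta_i, theta, d_nm):
    """Return (mode, wavelength) pairs with a visible-spectrum maximum,
    by inverting eq. 3: lambda = d (sin theta - sin theta_i) / m."""
    delta = np.sin(theta) - np.sin(theta_i)
    peaks = []
    for m in (-2, -1, 1, 2):           # m = 0 is the specular direction
        lam = d_nm * delta / m
        if VISIBLE_NM[0] <= lam <= VISIBLE_NM[1]:
            peaks.append((m, lam))
    return peaks

def peak_intensity(theta_i, theta, lam_nm, a_nm, b_nm, n=1500, i0=1.0):
    """Evaluate the spiky term of eq. 1 at a known peak location."""
    delta = np.sin(theta) - np.sin(theta_i)
    beta = np.pi * b_nm * delta / lam_nm
    alpha = np.pi * (a_nm + b_nm) * delta / lam_nm
    envelope = (np.sin(beta) / beta) ** 2 if abs(beta) > 1e-12 else 1.0
    # At a maximum alpha = m*pi, where sin^2(N*alpha)/sin^2(alpha) -> N^2.
    s = np.sin(alpha)
    grating = (np.sin(n * alpha) / s) ** 2 if abs(s) > 1e-9 else float(n) ** 2
    return i0 * envelope * grating / n ** 2

# Example: CD-like grating, light at 10 degrees, viewer at 30 degrees.
for m, lam in find_peaks(np.radians(10), np.radians(30), d_nm=1600.0):
    print(m, lam, peak_intensity(np.radians(10), np.radians(30),
                                 lam, a_nm=800.0, b_nm=800.0))
```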


[Figure 3: bar chart of frames per second (FPS, 0 to 100+) versus number of lights (2, 8, 32, 64, and a full hemisphere).]

Fig. 3. Effect of adding lights when directly evaluating the diffraction BRDF. The chart shows the correlation between light complexity and frames per second (FPS). Evaluating a full hemisphere of light contributions drops the frame rate below one frame per second, i.e., several seconds per frame.

Finally, it is important to compare rendering the physically-based diffraction BRDF directly versus our technique. A limitation of directly rendering the BRDF is that the frame rate is highly correlated with the number of lights used in the evaluation. Having a scene bound by the number of lights is undesirable for real-time rendering and limits the light complexity. Figure 3 shows FPS versus light complexity; the left side of the axis corresponds to a single light source and the right to a full hemisphere of light contributions. The frame rate for direct rendering drops dramatically with the number of lights at a sampling rate of fifteen thousand stratified samples. As mentioned previously, our technique does not suffer from this limitation and runs at a constant frame rate because we encode the lighting as SH coefficients.

3.2 Spherical Harmonics

Spherical Harmonics (SH) are a set of basis functions that can approximate a two-dimensional function defined over a sphere. In computer graphics and image synthesis, a 2D signal can be approximated in spherical coordinates. In our case, we approximate smooth distant lighting and the diffraction BRDF as a set of low-frequency weights for the real-valued SH basis. In general, diffraction is considered a high-frequency reflection; as described in Section 3.1, our sampling technique allows the diffraction to be represented as a medium-frequency signal, which affords us the opportunity to approximate it with SH. SH produces a finite, or band-limited, approximation of a function by truncating the infinite sum of an integrable function to a countable set of coefficients. Equation 4 is a common form of the real-valued Spherical Harmonics used in computer graphics. For a more complete description of SH, we refer the reader to [10, 11].


y_l^m(\theta, \phi) =
\begin{cases}
\sqrt{2}\, K_l^m \cos(m\phi)\, P_l^m(\cos\theta), & \text{when } m > 0,\\
\sqrt{2}\, K_l^m \sin(-m\phi)\, P_l^{-m}(\cos\theta), & \text{when } m < 0,\\
K_l^0\, P_l^0(\cos\theta), & \text{when } m = 0
\end{cases}
\qquad (4)

3.3 Composite Model

An accurate, efficient and compact color model, such as the Composite Model (CM) [14], is needed to represent color for real-time diffraction rendering. The CM states that any wavelength-dependent function S(λ) (e.g. diffraction) can be decomposed into a smooth part and a spiky part. Since our diffraction function generates a series of spikes, we store them using the spiky part of the CM; we do not use the smooth part. Each delta function is stored as an intensity value along with its corresponding wavelength location in the visible spectrum, resulting in a compact representation.

Reflections caused by diffraction result in intensity spikes in very narrow regions of the visible spectrum [3]; minima and non-maximal points contribute almost nothing to the final output. The justification for employing the CM for diffraction color representation is that if we ignore the smooth decomposition of the spectral function, because of its insignificance to the overall reflection, we can represent diffraction as a series of delta functions (about 4 per wavelength [3]). Each maximum of our diffraction calculation results in a single delta function, or spike, and storing the information needed to represent it consists of an intensity value and a single location on the visible spectrum. The storage requirement for this spectral representation is therefore constant and extremely compact, as the sketch below illustrates.
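A hypothetical sketch of this storage layout (the class name and fields are our illustration of the scheme described above, not the paper's code):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SpikySpectrum:
    """Spiky part of the Composite Model [14]: each diffraction maximum is
    a delta function stored as a (wavelength, intensity) pair, so the size
    is constant regardless of spectral resolution."""
    wavelengths_nm: np.ndarray  # peak locations in the visible spectrum
    intensities: np.ndarray     # corresponding peak intensities
```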

4 Our Technique for Real-Time Diffraction

4.1 Overview

In this section, we describe our technique for real-time rendering of physically based diffraction using SH. Our technique maintains an accurate representation of physically based diffraction without sacrificing efficiency. To accomplish this, we separate the rendering of diffraction into three phases: precomputation; rotation and upload; and rendering. A visual depiction of our pipeline is outlined in Figure 4.

Precomputation begins with an offline process that samples and projects both the lighting input and the diffraction simulation. After projection, the resulting SH coefficients are stored on the CPU, which is signified by the arrows spanning the precomputation and CPU stages in Figure 4. The lighting coefficients are then rotated to the tangent frame on the CPU and uploaded to the GPU. The texture map of SH coefficients is also uploaded to texture memory on the GPU. In a vertex shader, the view vector is transformed to tangent space and used as texture coordinates to look up the SH texture map. The dot product of the texture-map diffraction coefficients and the rotated lighting coefficients results in


the final wavelength, which is converted to an RGB triplet and uploaded to the framebuffer.

In the following sections, we describe in more detail our precomputation and rendering steps. As mentioned earlier, for clarity of exposition we focus on one specific diffraction BRDF, but our method is general enough to be applied to other diffraction BRDFs.

4.2 BRDF and Lighting Precomputation

The primary goal of the precomputation stage is to generate a hemisphere of light around a point on the diffraction surface and enumerate the diffraction responses for all valid incoming light and outgoing view directions. Separating the

Fig. 4. Block diagram outlining the complete rendering pipeline for our technique. The pipeline is divided into three distinct stages: preprocessing, rotation and transfer, and rendering. The first stage is responsible for sampling and projecting the lighting and BRDF to an SH basis. The second stage rotates the lighting coefficients and uploads both lighting and BRDF coefficients to the rendering context. The third stage is responsible for the final output.

BRDF from the lighting, deferring the light-integral calculation to run time, allows us to dynamically change lighting values without having to re-run the computationally expensive BRDF simulation. By varying the viewing direction, the SH approximation of the diffraction BRDF can be tabulated by view direction in a texture. A small set of SH coefficients is calculated per view; in many cases, at most 25 SH coefficients per view direction are required for most arbitrary BRDFs [11]. The size of the texture depends on the parameterization of the view vector, the spherical samples on s, and the desired accuracy of the approximation (coefficients per view). Equation 5 outlines the projection of an arbitrary BRDF and the resultant set of coefficients c_i, where f(s) can be an arbitrary diffraction BRDF and y_l^m(s) is the general form of the SH basis defined in Section 3.2.

c_i = \tilde{f}(s) = \int_s f(s)\, y_l^m(s)\, ds \qquad (5)
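A minimal Monte Carlo sketch of this projection (it applies equally to eq. 6 below), reusing y(l, m, theta, phi) from the earlier SH sketch; the sample count, seed, and uniform-sphere sampling scheme are our assumptions.

```python
import numpy as np

def project_to_sh(f, order=5, n_samples=20000, seed=0):
    """Estimate c_i = integral of f(s) y_l^m(s) ds over the sphere for all
    (l, m) with l < order, using a 4*pi-weighted Monte Carlo mean. f must
    accept NumPy arrays (theta, phi) and return an array of values."""
    rng = np.random.default_rng(seed)
    u, v = rng.random(n_samples), rng.random(n_samples)
    theta = np.arccos(1.0 - 2.0 * u)   # uniform sampling over the sphere
    phi = 2.0 * np.pi * v
    fs = f(theta, phi)
    coeffs = []
    for l in range(order):
        for m in range(-l, l + 1):
            coeffs.append(4.0 * np.pi * np.mean(fs * y(l, m, theta, phi)))
    return np.array(coeffs)  # order 5 gives 25 coefficients, as in the paper
```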

Precomputing the lighting coefficients from different light sources L(s) has been outlined in several previous papers [8, 11]. We improve on previous techniques for precomputing lighting by using the CM to represent the spectral power distribution instead of RGB triplets. The precomputed vector of


coefficients l_i representing the lighting environment can also be used for the non-diffraction part of the BRDF. The precomputation is done by projecting a captured or simulated lighting environment into the SH basis (see equation 6). In our case, we project an image-based lighting environment that was generated from a series of High Dynamic Range photos (a light probe, courtesy of Paul Debevec).

l_i = \tilde{L}(s) = \int_s L(s)\, y_l^m(s)\, ds \qquad (6)

4.3 Rendering

This section presents a detailed description of the steps needed to use the precomputed lighting and BRDF coefficients from Section 4.2 for real-time rendering. The outlined rendering steps can be performed on a CPU or GPU, and may be used in both hardware and software renderers, including shader-capable GPUs. One advantage of our technique is that the rendering phase uses only standard graphics operations, such as texture mapping, transformations, and simple mathematical operations. In fact, the presented technique can be seamlessly integrated with other interactive rendering techniques to improve overall photorealism. Figure 5 gives our algorithm for rendering diffraction.

In step 1, for efficiency, the lighting is transformed once per vertex instead of per fragment. For a thorough explanation of rotating Spherical Harmonic coefficients, refer to the appendix of [11]. Steps 2 and 3 are standard shader-programming techniques for transforming and rotating vectors. Step 3 refers to a local tangent frame into which the view vector is transformed and rotated; the surface normal and two orthogonal vectors in the texture direction define this frame. For efficiency, the tangent-space basis is precalculated offline and uploaded to the vertex shader at runtime. Once the view vector has been transformed and rotated, it is used as texture coordinates to look up the SH diffraction coefficients.

The texture map of SH coefficients comprises the precomputed diffraction response to arbitrary distant lighting. A listing of view-dependent diffraction

Algorithm. Diffraction Rendering Steps

1. Rotate the lighting coefficients, l_i, into the appropriate frame on the host CPU.
2. Upload the lighting coefficients to the shader.
3. Transform and rotate the view vector to the tangent frame.
4. Look up the diffraction coefficients c_i in a texture.
5. Scale and bias the diffraction coefficients from the 0 ⇒ 1 range to the −1 ⇒ 1 range.
6. Apply the dot product of the coefficients.
7. Convert from wavelength to RGB, then upload to the framebuffer.
8. Tone map the framebuffer from high dynamic range to a displayable range.

Fig. 5. Step-by-step algorithm for rendering diffraction
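As an illustration of steps 4 through 6 above, a minimal NumPy sketch of the per-fragment arithmetic follows; the array names and texture layout are our assumptions, and on the GPU this logic would live in a fragment shader.

```python
import numpy as np

def shade(light_coeffs, coeff_texture, view_uv):
    """Steps 4-6 of Fig. 5 for one fragment (names are illustrative).

    light_coeffs : (25,) rotated SH lighting coefficients (steps 1-2)
    coeff_texture: (H, W, 25) diffraction SH coefficients stored in [0, 1]
    view_uv      : (row, col) texel address derived from the tangent-space
                   view vector (step 3)
    """
    c = coeff_texture[view_uv]             # step 4: texture lookup
    c = 2.0 * c - 1.0                      # step 5: scale and bias to [-1, 1]
    return float(np.dot(light_coeffs, c))  # step 6: lighting integral as a dot product
```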


responses encoded into SH coefficients is stored in an N × V texture, where N is the number of SH coefficients used (we used 25) and V represents all potential view directions. Each texel in the texture map represents a portion of the Spectral Power Distribution (SPD) that will eventually be converted to RGB values. The SPD encoding method dictates the number of textures required; for example, a CIE color representation may use only one texture for the X, Y, Z values. For Agu's diffraction model, the response in a given view direction is essentially a delta function at a certain wavelength, so the resulting texture map utilizes only two values (wavelength and intensity) per texel.

The dot product of the SH coefficients approximates the lighting integral; evaluating this integral on the GPU, when available, further enhances rendering speed. The computed Spherical Harmonic approximations of lighting and diffraction have unbounded dynamic range, so the reflection calculations produce color values outside the displayable range that need to be tone-mapped to the range of the target viewing system. Figure 5 lists the necessary steps for the rendering algorithm on programmable GPUs.

5 Results

We achieved real-time frame rates with our technique without sacrificing image quality. As mentioned earlier, our technique's frame rate does not drop as more lights are added to the scene, unlike direct evaluation of the diffraction BRDF. Figure 6 shows our final results, which were rendered on an NVIDIA 6600 GT graphics card with 128 MB of video RAM in a Pentium 4 2.8 GHz personal computer with 1 GB of RAM. For our CD-ROM model of 25 thousand vertices, we were able to maintain a constant frame rate of 65 frames per second (FPS). With several additional effects such as environment mapping, blur, exposure control, and vignetting, our scene still ran at over 60 FPS (a real-time frame rate).

Limitations of our work: Precomputation time is a factor to consider when employing our technique. Additional computation time is required for spectral conversion, making our diffraction BRDF take significantly longer to precompute than other BRDFs [9, 8, 11] that are not wavelength-dependent. We see this as an area for considerable improvement, as other wavelength-based phenomena may require even more precomputation time.

Our use of SH made spherical mapping an obvious choice of parameterization during precomputation, but this resulted in too few samples at the equator. Storing the precomputed diffraction response using parabolic mapping [17] would produce more evenly distributed samples and result in fewer sampling errors. Additionally, highlights in the lighting directions exhibit an exponential fall-off as the diffractive response moves away from the light direction. The high-frequency nature of this fall-off requires higher sampling than SH can achieve with low-order coefficients (≤ 25), which can lead to approximation errors. In the future, different sampling techniques, factorization, or higher-order approximation with compression (for example, PCA) may be able to capture the fall-off exhibited by CDs with lower error.


Fig. 6. Our final images. The top row of images is produced by directly evaluating our physically-based diffraction BRDF, with a single-light rendering on the left and a multi-light rendering on the right. The bottom images have been rendered with our technique, using a single light on the left and multiple lights on the right. The SH-approximated renderings (bottom) run at the same frame rate even though there are 5 more lights in the multi-light version: since SH lighting is not affected by the number of light sources, a single-light rendering runs at the same speed as full-hemisphere lighting. When the diffraction BRDF is evaluated directly, the scene frame rate drops as more lights are added.


6 Conclusion and Future Work

We have demonstrated a real-time technique for rendering physically based diffraction that uses Spherical Harmonics for a compact and efficient representation. Our technique preserves the accuracy of the diffraction for arbitrary surfaces, including unique intensities for each response maximum, without suffering a slower render time as the number of light sources increases. We believe that this technique can expand the range of complex materials rendered in real time and enrich the experience in many interactive applications such as virtual reality and video games.

In the future, we think a natural progression of our technique would be to expand it to other wavelength-dependent phenomena beyond iridescence and diffraction-type reflection. Phenomena that may benefit from our technique include dispersion, fluorescence, and wavelength-dependent multiple scattering.

References

1. Stam, J., Diffraction Shaders. In Proc. ACM SIGGRAPH 1999.

2. Agu, E., Diffraction Shading Models for Iridescent Surfaces. In Proc. IASTED VIIP 2002.

3. Agu, E., Diffraction Shading Models in Computer Graphics. PhD Dissertation, U. of Mass.-Amherst.

4. Thorman, S., Diffraction Based Models for Iridescent Colors in Computer Generated Imagery. PhD Dissertation, U. of Mass.-Amherst.

5. Stroke, G. W., Diffraction Gratings, pp. 426-754, Handbuch der Physik (Encyclopedia of Physics). Springer-Verlag, 1967.

6. Nakamae, E., Kaneda, K., Okamoto, T. and Nishita, T., A Lighting Model Aiming at Drive Simulators. In Proc. ACM SIGGRAPH '90, pp. 395-404.

7. Lindsay, C., Agu, E., Wavelength Dependent Rendering Using Spherical Harmonics. In Proc. Eurographics 2005.

8. Sloan, P., Kautz, J., Snyder, J., Precomputed Radiance Transfer for Real-Time Rendering in Dynamic, Low-Frequency Lighting Environments. ACM Transactions on Graphics, pp. 527-536, 2002.

9. Ramamoorthi, R., Hanrahan, P., Frequency Space Environment Map Rendering. In Proc. ACM SIGGRAPH 2002, pp. 517-526.

10. Ramamoorthi, R., Hanrahan, P., An Efficient Representation for Irradiance Environment Maps. In Proc. ACM SIGGRAPH 2001, pp. 497-500.

11. Kautz, J., Sloan, P., Snyder, J., Fast Arbitrary BRDF Shading for Low-Frequency Lighting Using Spherical Harmonics. In Proc. 13th Eurographics Workshop on Rendering, pp. 291-296, 2002.

12. Westin, S., Arvo, J., Torrance, K., Predicting Reflectance Functions from Complex Surfaces. In Proc. ACM SIGGRAPH 1992, pp. 255-264.

13. Stam, J., Simulating Diffraction. Chapter 8 in GPU Gems, Addison-Wesley, 2004.

14. Sun, Y., Drew, M. S., and Fracchia, F. D., Representing Spectral Functions by a Composite Model of Smooth and Spiky Components for Efficient Full-Spectrum Photorealism. In Proc. IEEE Workshop on Photometric Modeling for Computer Vision and Graphics '99, pp. 4-11.


15. Sun, Y., A Spectrum-Based Framework for Realistic Image Synthesis. PhD Dissertation, Simon Fraser University, 2000.

16. Sun, Y., Fracchia, F. D., Drew, M. S., and Calvert, T. W., Rendering Iridescent Colors of Optical Disks. Simon Fraser Univ., Tech. Report SFU CMPT TR 1999-08, 1999.

17. Heidrich, W., View-Independent Environment Maps. In Proc. Eurographics/SIGGRAPH Workshop on Graphics Hardware '98.