28.09.2016
I wanted to learn more about spherical harmonics and their use for diffuse light probes in real-time rendering applications.
Even though I read about them in the third edition of "Real-Time Rendering" at some point,
I had to refresh my knowledge and I actually wanted to implement something to get some practice.
I looked up some additional resources and tried to find out how the data is usually precomputed.
Spherical Harmonics for Beginners
lists many good resources that helped me a lot.
It seems that third-order spherical harmonics are usually precise enough to model Lambertian diffuse lighting.
Nine coefficients fully describe the diffuse lighting from all directions.
They are passed to a shader, where they are applied based on the surface normal.
The third-order spherical harmonics coefficients consist of one coefficient for the first-order (constant) function, three for the second-order (linear) functions, and five for the third-order (quadratic) functions.
Each coefficient can be an LDR or HDR color value.
This is all that needs to be done to apply an SH probe in a shader:
vec3 result = vec3(0.0);
result += coefficients[0] * 0.282095;
result -= coefficients[1] * 0.488603 * n.y;
result += coefficients[2] * 0.488603 * n.z;
result -= coefficients[3] * 0.488603 * n.x;
result += coefficients[4] * 1.092548 * n.x * n.y;
result -= coefficients[5] * 1.092548 * n.y * n.z;
result += coefficients[6] * 0.315392 * (3.0 * n.z * n.z - 1.0);
result -= coefficients[7] * 1.092548 * n.x * n.z;
result += coefficients[8] * 0.546274 * (n.x * n.x - n.y * n.y);
As you may have noticed, the coefficients can even be premultiplied by those constants.
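For illustration, folding the constants (including their signs) into the coefficients could look like this on the CPU. This is a hedged C sketch with names of my own choosing, not code from the playground:

```c
/* Fold the SH basis constants (and their signs) into the nine RGB
   coefficients once, so the shader only multiplies by the polynomial
   terms of the normal. */
void sh_premultiply(float coefficients[9][3])
{
    /* Constants and signs taken from the evaluation code above. */
    const float k[9] = {
         0.282095f,
        -0.488603f,  0.488603f, -0.488603f,
         1.092548f, -1.092548f,  0.315392f, -1.092548f,  0.546274f
    };
    for (int i = 0; i < 9; ++i)
        for (int c = 0; c < 3; ++c)
            coefficients[i][c] *= k[i];
}
```

After this step the shader lines reduce to terms like `result += coefficients[1] * n.y;`, with all signs already folded in.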
Looking at the first few lines, it may seem obvious that the first coefficient is basically an ambient lighting term and
that the following three (second-order) coefficients add and subtract color along each axis based on the three components of the normal vector.
But what is the impact of the last five (the third order) coefficients on the result?
There are usually images in the literature that show the positive and negative lobes of each basis function.
But I was still slightly confused and quickly built an application that lets me interactively fiddle with those coefficients, which was very eye-opening.
The code is available on my GitHub page and should be easy to build on all the primary platforms:
spherical_harmonics_playground.
After playing around a bit with the coefficients, I wanted to precompute some real lighting data.
I read about several typical ways to precompute coefficients.
They are all based on applying a similar computation as above but for each incoming light direction to accumulate the coefficients:
vec3 coefficients[9] = zeroes;
for (each incident light ray with direction n and intensity or color c)
{
coefficients[0] += c * 0.282095;
coefficients[1] -= c * 0.488603 * n.y;
coefficients[2] += c * 0.488603 * n.z;
coefficients[3] -= c * 0.488603 * n.x;
coefficients[4] += c * 1.092548 * n.x * n.y;
coefficients[5] -= c * 1.092548 * n.y * n.z;
coefficients[6] += c * 0.315392 * (3.0 * n.z * n.z - 1.0);
coefficients[7] -= c * 1.092548 * n.x * n.z;
coefficients[8] += c * 0.546274 * (n.x * n.x - n.y * n.y);
}
This is especially trivial for directional light sources.
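For a single directional light, the loop above collapses to one iteration. A minimal C sketch of that one step (the function and parameter names are my own, illustrative choices):

```c
/* Accumulate one directional light into third-order SH coefficients.
   'dir' is the unit direction the light arrives from, 'color' its RGB
   intensity. This is exactly one iteration of the loop above. */
void sh_project_directional(const float dir[3], const float color[3],
                            float coefficients[9][3])
{
    const float x = dir[0], y = dir[1], z = dir[2];
    /* Signed basis values, matching the accumulation code above. */
    const float basis[9] = {
         0.282095f,
        -0.488603f * y,  0.488603f * z, -0.488603f * x,
         1.092548f * x * y, -1.092548f * y * z,
         0.315392f * (3.0f * z * z - 1.0f),
        -1.092548f * x * z,  0.546274f * (x * x - y * y)
    };
    for (int i = 0; i < 9; ++i)
        for (int c = 0; c < 3; ++c)
            coefficients[i][c] += color[c] * basis[i];
}
```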
Light rays can also be generated with a typical ray tracing approach with Monte Carlo integration.
But I went with converting environment maps to SH light probes.
In my application I load a cube map, generate a light ray for each of its texels and
weight each ray based on the texel's distance from the unit sphere and its approximate size when projected onto the sphere.
I was actually really surprised that this was all I had to do.
It was so easy that I was able to read up on, understand and implement everything in a single evening :).
I want to thank all the authors for that!
This is what some of my results look like: