This post summarizes the results I obtained from developing a custom Cg/HLSL shader for Unity applications that renders the Earth. The shader implements real-time, ray-traced atmospheric scattering as seen from space, modeling a mathematical approximation of the aerial-perspective phenomenon based on the "GPU Pro 3" chapter "An Approximation to the Chapman Grazing-Incidence Function for Atmospheric Scattering" by Christian Schüler.
Note: Please refer to the description and implementation details published in "GPU Pro 3" to learn about this technique. In this blog and portfolio I am only showing my results and noting the places where I deviated from the article.
Below are frame captures from the running simulation, which uses 4096 x 2048 texture maps for the land/ocean and cloud layers, as well as a 1024 x 512 'ocean mask' texture.
For this algorithm, all of the lighting calculations performed in the fragment shader must be done in model (or object) space. When integrating this shader into Unity, it was therefore necessary to transform the light direction (or light position) and the eye (camera) position into model space, which I did in the vertex shader. Doing so allowed me to move the light and camera programmatically at run-time from the Unity editor, an in-game user interface, or another game-controller script.
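As a rough sketch of what that vertex-shader transform can look like (the v2f field names are mine, not from the article; _WorldSpaceLightPos0 and _WorldSpaceCameraPos are Unity built-ins, and a directional light is assumed):

#include "UnityCG.cginc"

struct v2f
{
    float4 pos           : SV_POSITION;
    float2 uv            : TEXCOORD0;
    float3 modelLightDir : TEXCOORD1;   // light direction in model space
    float3 modelEyePos   : TEXCOORD2;   // camera position in model space
};

v2f vert(appdata_base v)
{
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);
    o.uv  = v.texcoord.xy;

    // Directional light: rotate the world-space direction into object space
    // (3x3 part only, no translation).
    o.modelLightDir = normalize(mul((float3x3)unity_WorldToObject,
                                    _WorldSpaceLightPos0.xyz));

    // The camera position is a point, so use the full 4x4 transform.
    o.modelEyePos = mul(unity_WorldToObject,
                        float4(_WorldSpaceCameraPos, 1.0)).xyz;
    return o;
}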
The cloud map is added to the final color calculation as a separate term, computed in much the same way as the surface color, including the use of the calculated light color and transmittance factor. My goal here was simply to add a cloud layer; because it is rendered at the Earth's surface rather than at altitude, it will not look right from low orbits.
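In rough fragment-shader form, the final combine ends up looking something like the sketch below (the names are mine; lightColor, transmittance, and inscatter are assumed to come from the scattering calculations described in the chapter):

sampler2D _MainTex;    // land/ocean color map
sampler2D _CloudTex;   // cloud map

float3 shadePlanet(float2 uv, float3 lightColor, float3 transmittance, float3 inscatter)
{
    float3 surfaceColor = tex2D(_MainTex,  uv).rgb;
    float3 cloudColor   = tex2D(_CloudTex, uv).rgb;

    // The cloud layer is lit and attenuated the same way as the surface,
    // then simply added in as a separate term.
    float3 litSurface = surfaceColor * lightColor * transmittance;
    float3 litClouds  = cloudColor   * lightColor * transmittance;

    return litSurface + litClouds + inscatter;
}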
After generating the above images of the full-sized Earth, I decided to add a 4096 x 2048 normal map so that the planet would look better from a closer orbit. With the normal map in place (for mountains, etc.), I needed to compute N-dot-L twice: once with the normal-map value for the land shading, and again with the spherical model normal for the rest of the pass (i.e., the aerial-perspective calculation).
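A minimal sketch of those two N-dot-L terms (variable names are mine; the detail normal is assumed to already be in model space):

float2 computeNdotL(float3 detailNormal, float3 sphereNormal, float3 modelLightDir)
{
    // Detail normal (from the normal map) drives the land shading;
    // the smooth sphere normal drives the aerial-perspective/scattering terms.
    float ndlDetail = saturate(dot(normalize(detailNormal), modelLightDir));
    float ndlSphere = saturate(dot(normalize(sphereNormal), modelLightDir));
    return float2(ndlDetail, ndlSphere);   // x: land lighting, y: scattering
}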
I also exposed many of the shader properties to the application, for controlling how much of the aerial-perspective phenomenon one sees when looking at the Earth from space. While all three of the images below are likely slight exaggerations, they illustrate the effectiveness of the algorithm.
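In Unity this is done through the shader's ShaderLab Properties block; the sketch below uses hypothetical property names and default values, not the actual ones from my shader or the article:

Properties
{
    _MainTex          ("Land/Ocean Map", 2D) = "white" {}
    _CloudTex         ("Cloud Map", 2D) = "white" {}
    _AtmosphereRadius ("Atmosphere Radius", Float) = 1.025
    _RayleighCoeff    ("Rayleigh Extinction (RGB)", Vector) = (5.8, 13.5, 33.1, 0)
    _ScatterIntensity ("Scatter Intensity", Range(0, 10)) = 1.0
}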
Finally, below, I show a night-to-day sequence captured by animating the light/sun in a circular path tilted 23.5 degrees with respect to the orbit camera. The algorithm uses Rayleigh extinction coefficients and per-wavelength absorption factors to produce some neat sunrise/sunset effects.
The images above show a few more custom shaders I wrote for this Unity-based project, along with modifications I made to the planet shader that deviate from what is described in "GPU Pro 3". First, I added alpha blending to the planet shader in order to render the background stars as well as the sun. In the fragment shader's ray-sphere intersection calculation, I set the alpha to zero if the ray does not intersect the planet of the given radius (not including the atmosphere); otherwise I set it to one. When the alpha is one, the planet blocks the star field; when it is zero, the background stars and/or sun remain visible through the blend function:
Blend One OneMinusSrcAlpha
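A rough sketch of the intersection test that drives that alpha (names are mine, and the planet is assumed to be centered at the model-space origin):

float raySphereAlpha(float3 rayOrigin, float3 rayDir, float planetRadius)
{
    // Quadratic for |rayOrigin + t * rayDir| = planetRadius, rayDir normalized.
    float b = dot(rayOrigin, rayDir);
    float c = dot(rayOrigin, rayOrigin) - planetRadius * planetRadius;
    float discriminant = b * b - c;

    if (discriminant < 0.0)
        return 0.0;                        // ray misses the planet entirely

    float tNear = -b - sqrt(discriminant); // nearest intersection distance
    return (tNear > 0.0) ? 1.0 : 0.0;      // 1 = hit in front of the camera
}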
Additionally, I forced the planet shader to render after the 'transparent' queue, so that it is drawn after the star field.
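One way to do this in ShaderLab is to bump the queue tag past "Transparent" (queue value 3000); a minimal skeleton might look like this:

SubShader
{
    // "Transparent" is queue 3000; +1 draws the planet after the star field.
    Tags { "Queue" = "Transparent+1" "RenderType" = "Transparent" }

    Pass
    {
        Blend One OneMinusSrcAlpha
        // CGPROGRAM block with the planet vertex/fragment shaders goes here
    }
}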
My sun consists of a few point-flare textures on quads of slightly different scales, and I wrote a simple unlit, additive-blending shader that also accepts a tint color so that I could make the sun more orange, more yellow, or whatever I desired.
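A minimal sketch of that kind of unlit, additive, tinted flare shader (shader, property, and variable names are mine, not the actual ones from my project):

Shader "Custom/AdditiveTintedFlare"
{
    Properties
    {
        _MainTex ("Flare Texture", 2D) = "white" {}
        _Tint    ("Tint Color", Color) = (1, 1, 1, 1)
    }
    SubShader
    {
        Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }
        Blend One One   // additive
        ZWrite Off
        Cull Off

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _Tint;

            struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

            v2f vert(appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv  = v.texcoord.xy;
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                // Flare texture modulated by the tint; additive blending
                // adds the result onto whatever is already in the frame.
                return tex2D(_MainTex, i.uv) * _Tint;
            }
            ENDCG
        }
    }
}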
The stars are rendered on the GPU, based on the technique from the ShaderX 2 book titled "Screen-aligned Particles with Minimal VertexBuffer Locking" by O'dell Hicks; I plan to write a separate post describing it in detail. Using this approach, the shader renders 16,384 billboards (stars) in a single draw call (and pass). A geometry shader could also be used for this, which I will explore another time.
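Until that post, here is a very rough sketch of the billboard expansion idea (not the exact ShaderX 2 listing, and the names are mine): each star contributes four vertices that share the star's position, and a per-vertex corner offset stored in the UVs is expanded along the camera's right/up axes in the vertex shader.

#include "UnityCG.cginc"

struct starAppdata
{
    float4 vertex   : POSITION;   // star position (same for all 4 corners)
    float2 texcoord : TEXCOORD0;  // corner offset in [-1, 1], also reused as UV
};

struct starV2F { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

float _StarSize;   // hypothetical uniform controlling billboard size

starV2F vertStar(starAppdata v)
{
    starV2F o;

    // Camera right/up in world space, taken from the view matrix rows.
    float3 camRight = UNITY_MATRIX_V[0].xyz;
    float3 camUp    = UNITY_MATRIX_V[1].xyz;

    // Push each corner out from the shared star position.
    float3 worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
    worldPos += (camRight * v.texcoord.x + camUp * v.texcoord.y) * _StarSize;

    o.pos = mul(UNITY_MATRIX_VP, float4(worldPos, 1.0));
    o.uv  = v.texcoord * 0.5 + 0.5;   // remap corner offset to texture UVs
    return o;
}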