Created for the Computer Graphics Competition 2019 at Saarland University.

Interactive WebAssembly Version

They say an image is worth more than a thousand words. Pursuing this path even further, I thought that running the ray tracer at interactive framerates would be worth more than a thousand images. With the help of Emscripten, I ported the engine to WebAssembly. To make use of multiple cores, the wrapper creates multiple web workers that receive jobs from the main program. When the camera stands still, a high-quality (720p) rendering is triggered.

Warning: Running this may crash your browser if your machine is not powerful enough.

Controls:
Move: WASD
Height: E / Q
Time: R / F
Randomize: G

Note that this version consumes way more memory than the native one, since each web worker has to load the scene and textures independently. I could have used SharedArrayBuffer to reduce memory usage (and probably improve performance a bit); however, this is currently disabled by default in most browsers for security reasons.

Main Image

The Scene

Right from the beginning, I knew that I wanted to create a beautiful outdoor scene. Since the weather in Saarbrücken was not really comfy in the weeks leading up to the submission deadline, I instinctively pivoted towards a tropical island. Although having great digital weather was not really a good substitute, it still allowed for some nice summer feelings.

I also had the idea to create an interactive WebAssembly version for the project website, which allowed me to make the scene very dynamic. Not only did I make the viewing perspective customizable, but instead of having a fixed time of day, I designed an entire day-night cycle. And while I was at it, I discarded the pre-made heightmap and introduced procedural terrain generation and object placement. All of this brought up a problem: it was hard to select a single image to represent my scene. I experimented with different solutions, such as rendering a triptych (an artwork divided into three parts, showing different aspects of a fixed topic) like the one on the right, but ultimately rejected this idea and put the focus on the original feeling that I wanted to convey.

Mip Mapping

Creating a scene that looks great wherever the camera is positioned spawns many challenges. The sand textures are very detailed, which would otherwise introduce aliasing. Since I wanted to achieve interactive frame rates, I could not just throw expensive sampling at the problem. Instead, I implemented mip mapping.

When mip mapping is requested, we create multiple copies of the original texture, each half the size of the previous one; that is, four pixels are averaged into one. When sampling the texture, we choose two of these mip maps, sample them and blend the results (i.e. trilinear filtering). Note that the image below does not have full resolution and was rendered without multisampling, to simulate the effect of mip mapping in realtime scenarios.
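
The following sketch shows how such a chain of mip levels could be built. The `Image` and `Color` types are hypothetical stand-ins for the engine's texture classes, not its actual API; picking and blending two levels at sampling time is omitted for brevity.

```cpp
#include <utility>
#include <vector>

// Hypothetical stand-ins for the engine's texture types.
struct Color { float r, g, b; };

struct Image {
    int width, height;
    std::vector<Color> pixels;
    Color get(int x, int y) const { return pixels[y * width + x]; }
};

// Build the mip chain: every level is half the size of the previous one,
// averaging each 2x2 block of texels into a single texel of the next level.
std::vector<Image> buildMipChain(Image base) {
    std::vector<Image> chain;
    chain.push_back(std::move(base));
    while (chain.back().width > 1 && chain.back().height > 1) {
        const Image& prev = chain.back();
        Image next;
        next.width = prev.width / 2;
        next.height = prev.height / 2;
        next.pixels.resize(next.width * next.height);
        for (int y = 0; y < next.height; ++y) {
            for (int x = 0; x < next.width; ++x) {
                Color a = prev.get(2 * x, 2 * y),     b = prev.get(2 * x + 1, 2 * y);
                Color c = prev.get(2 * x, 2 * y + 1), d = prev.get(2 * x + 1, 2 * y + 1);
                next.pixels[y * next.width + x] = {
                    (a.r + b.r + c.r + d.r) * 0.25f,
                    (a.g + b.g + c.g + d.g) * 0.25f,
                    (a.b + b.b + c.b + d.b) * 0.25f};
            }
        }
        chain.push_back(std::move(next));
    }
    return chain;
}
```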

Mip Mapping Off
Mip Mapping On

Time of Day

The scene is basically surrounded by an infinitely large sphere. The ray direction is used to compute what a given point should look like. To create a convincing sun, I take the dot product of the sun direction and the view direction, add a very small number (essentially the radius of the sun) and raise the sum to the power of a large number. After clamping the value, this gives a bright disc that quickly fades into the background.
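
A minimal sketch of that sun term, assuming a small `Vec3` helper type; the radius bias and the exponent are illustrative values, not the ones used in the engine:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Bright disc around the sun direction that quickly fades into the sky.
float sunDisc(const Vec3& viewDir, const Vec3& sunDir) {
    const float sunRadius = 0.01f;   // the "very small number", roughly the sun's radius
    const float sharpness = 2000.0f; // large exponent -> sharp falloff
    float d = dot(viewDir, sunDir) + sunRadius;
    return std::clamp(std::pow(std::max(d, 0.0f), sharpness), 0.0f, 1.0f);
}
```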

The clouds are clamped perlin noise. To make the clouds at the horizon smaller, we divide the view direction by its \(z\) component and use that to look up our noise value (which increases the frequency as \(z\) becomes smaller). Additionally, we add an offset vector before the lookup so the clouds can shift as time passes. They are finally blended with the sky background color, which of course also depends on the current time, and with the sun.
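
Sketched in code, the lookup might look as follows; `perlin2D`, the drift speed and the clamping range are assumptions for illustration:

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

float perlin2D(float u, float v); // assumed noise helper from the engine

// Cloud density for a view direction (z is up). Dividing by z shrinks the
// clouds towards the horizon; the time-dependent offset drifts them.
float cloudDensity(const Vec3& viewDir, float time) {
    if (viewDir.z <= 0.0f) return 0.0f;             // below the horizon
    float u = viewDir.x / viewDir.z + 0.02f * time; // offset shifts the clouds over time
    float v = viewDir.y / viewDir.z + 0.01f * time;
    float n = perlin2D(u, v);                       // raw noise, roughly in [-1, 1]
    return std::clamp(n, 0.0f, 1.0f);               // clamped perlin noise
}
```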

To make the night scenes appealing as well, I wanted to add stars. This was harder to implement than the above, since stars are typically very small but really bright, which leads to high-frequency content and hence aliasing in the resulting image. To combat this problem, I devised the following algorithm. Start by dividing the sky sphere into chunks of approximately equal size using spherical coordinates. Each chunk is assigned a certain brightness that determines whether it contains a star and how bright it would be. We then take the distance to the center of the nearest chunk and raise it to a power that depends on the resolution of the image. This lets the stars appear bigger but darker at lower resolutions, which is not totally unrealistic, and solves the aliasing problem. The result is multiplied with the chunk's brightness. Additionally, we can easily rotate the stars using spherical coordinates. Note that the stars being visible during sunrise and sunset is not an accident; I just thought it looks nice.
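
One possible reading of that algorithm is sketched below; the chunk size, the star probability and the resolution-dependent exponent are illustrative guesses, and `hash2D` stands in for whatever per-chunk randomness the engine uses:

```cpp
#include <algorithm>
#include <cmath>

float hash2D(int i, int j); // assumed: deterministic pseudo-random value in [0, 1]

// Star intensity for a sky direction given in spherical coordinates.
float starIntensity(float theta, float phi, int imageHeight) {
    const float chunkSize = 0.01f; // angular extent of a chunk
    int i = static_cast<int>(std::floor(theta / chunkSize));
    int j = static_cast<int>(std::floor(phi / chunkSize));

    float brightness = hash2D(i, j);
    if (brightness < 0.95f) return 0.0f; // most chunks contain no star

    // Distance to the chunk center, in chunk-local coordinates.
    float dx = theta / chunkSize - (i + 0.5f);
    float dy = phi / chunkSize - (j + 0.5f);
    float dist = std::sqrt(dx * dx + dy * dy);

    // Resolution-dependent falloff: a smaller exponent at low resolutions makes
    // the star wider, a larger one at high resolutions keeps it small.
    float falloff = std::pow(std::max(0.0f, 1.0f - 2.0f * dist),
                             imageHeight / 200.0f);
    return falloff * brightness; // modulated by the chunk's brightness
}
```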

Raymarched Terrain

Raymarching is used to render not only the island terrain but also the water; when I talk about terrain here, I mean both. Figuring out the intersection point between a ray and the terrain turned out to have a great impact on performance, e.g. looking into the sky gives a huge framerate boost.

We begin by intersecting the ray with the bounding box of the terrain object to get a minimum intersection distance. We then continue from that point forwards in small steps along the ray. The step size depends on the height difference to the terrain under the current position, such that large valleys are skipped quickly. We use numeric differentiation to compute a normal vector (which is later used for shading).
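
A condensed sketch of this marching loop and the numeric normal, with `terrainHeight` as an assumed heightmap lookup and the step factor and epsilon as placeholder constants:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

float terrainHeight(float x, float y); // assumed heightmap lookup (z is up)

// March from tMin (the bounding-box entry point) to tMax. The step grows with the
// clearance between the ray and the terrain, so large valleys are skipped quickly.
bool raymarchTerrain(const Vec3& o, const Vec3& d, float tMin, float tMax, float& tHit) {
    float t = tMin;
    while (t < tMax) {
        Vec3 p{o.x + t * d.x, o.y + t * d.y, o.z + t * d.z};
        float clearance = p.z - terrainHeight(p.x, p.y);
        if (clearance < 0.001f) { tHit = t; return true; } // close enough: report a hit
        t += 0.4f * clearance;                             // adaptive step size
    }
    return false;
}

// Numeric differentiation (central differences) of the heightmap gives the normal.
Vec3 terrainNormal(float x, float y) {
    const float eps = 0.01f;
    float dx = terrainHeight(x + eps, y) - terrainHeight(x - eps, y);
    float dy = terrainHeight(x, y + eps) - terrainHeight(x, y - eps);
    Vec3 n{-dx, -dy, 2.0f * eps};
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    return {n.x / len, n.y / len, n.z / len};
}
```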

The material of the island terrain blends between two different sand textures to reduce visible tiling, and a third, quite reflective material is used near the water to suggest wet sand. The water itself is a glass-like material. Additionally, I added foam near the beach using perlin noise.

Procedurality

The heightmap of the terrain is generated from 15-octave perlin noise and cached as a bitmap. I also added a smooth falloff so that the result is always an island-like heightmap. The seed changes the offset into the perlin noise. Above, you can see the result of the seeds 1 (which is also used for the other images), 2 and 3.
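
A sketch of how such a heightmap could be assembled, assuming a `perlin2D` helper; the octave weights, the seed offset and the falloff shape are illustrative, not the engine's exact parameters:

```cpp
#include <algorithm>
#include <cmath>

float perlin2D(float x, float y); // assumed noise helper

// Island heightmap over the [-1, 1]^2 domain, built from 15 noise octaves.
float islandHeight(float x, float y, int seed) {
    float h = 0.0f, amplitude = 1.0f, frequency = 1.0f;
    for (int octave = 0; octave < 15; ++octave) {
        // The seed simply offsets the lookup position into the noise.
        h += amplitude * perlin2D(x * frequency + 100.0f * seed,
                                  y * frequency + 100.0f * seed);
        amplitude *= 0.5f;  // each octave contributes half as much...
        frequency *= 2.0f;  // ...at twice the frequency
    }
    // Smooth radial falloff so the heightmap always describes an island.
    float r = std::sqrt(x * x + y * y);
    float f = std::clamp(1.0f - r, 0.0f, 1.0f);
    f = f * f * (3.0f - 2.0f * f); // smoothstep
    return h * f;
}
```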

After generating the terrain, the objects are placed. Here, perlin noise produced bad results, so we add another layer of pseudo-randomness using the typical fract() of a product with large numbers. Additionally, we consider the height of a given terrain point, so that, for example, a boat does not spawn in the middle of the island. I tried to pay attention to the details, e.g. randomizing object orientation or aligning the boat with the coastline (see the image on the right, which was rendered with seed 2).
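
A small sketch of that placement logic, with hypothetical thresholds and a `terrainHeight` lookup standing in for the real terrain query:

```cpp
#include <cmath>

float terrainHeight(float x, float y); // assumed heightmap lookup

// Shader-style hash: fract() of a sine scaled by large numbers.
float hash(float x, float y) {
    float s = std::sin(x * 12.9898f + y * 78.233f) * 43758.5453f;
    return s - std::floor(s); // fract()
}

// Decide whether a boat should be placed at (x, y).
bool placeBoatAt(float x, float y) {
    float h = terrainHeight(x, y);
    if (h > 0.0f || h < -0.05f) return false; // only shallow water, never on land
    return hash(x, y) > 0.97f;                // sparse pseudo-random placement
}
```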

Miscellaneous

Using actual geometry for the palm leaves would be expensive to model and render. We therefore employ alpha masking to discard intersections in transparent regions. The fog seen in the distance is exponential height fog, whose attenuation and emission (the latter especially when the sun is near the horizon) are computed analytically. All images were gamma corrected. All images, except the ones under "Mip Mapping", were rendered using 5x5 stratified sampling. Rendering time for the 1920x1080 images is less than 500 seconds each. Without multisampling, this would take less than 20 seconds. This means that more than 100,000 rays are traced per second.
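
For the fog, "computed analytically" means the exponential density profile is integrated along the ray in closed form instead of being stepped through. A sketch of that transmittance term, with assumed density constants:

```cpp
#include <cmath>

// Optical depth through fog with density d0 * exp(-k * z), along a ray segment
// of length t that starts at height z0 and rises with slope dirZ per unit distance.
float fogOpticalDepth(float z0, float dirZ, float t) {
    const float d0 = 0.02f; // density at height zero (assumed)
    const float k  = 0.5f;  // how quickly the fog thins with height (assumed)
    if (std::fabs(dirZ) < 1e-5f)               // horizontal ray: constant density
        return d0 * std::exp(-k * z0) * t;
    // Closed-form integral of d0 * exp(-k * (z0 + s * dirZ)) for s in [0, t].
    return d0 * std::exp(-k * z0) * (1.0f - std::exp(-k * dirZ * t)) / (k * dirZ);
}

float fogTransmittance(float z0, float dirZ, float t) {
    return std::exp(-fogOpticalDepth(z0, dirZ, t));
}
```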

Credits

Free assets used: Tropical Palms, Wave Heightmap, Boat, Sand One, Sand Two, Rocks, Shark, Treasure Chest