In this tutorial we’ll discuss the ideas and concepts behind rendering water and then talk through some demo code.

By the end of this tutorial you should walk away with a good sense of how water is rendered as well as a reference implementation to serve as a jumping off point should you decide to dive into more complex water rendering techniques.

The full source code for the demo can be found on GitHub.

Using Rust + WebGL + WebAssembly

I asked and you decided!

Our past WebGL tutorials have all used JavaScript, but this time around we’ll be using Rust.

If you’ve never used Rust before don’t worry. All of the core concepts behind rendering water hold true no matter what language you’re using. We’re using WebGL but the same concepts hold for OpenGL or any other graphics API.

What goes into rendering water?

Our basic water renderer boils down to combining a few textures and tweaking numbers until we like what we see.

Click or touch and drag to move the camera. Use your mouse wheel to zoom in and out of the scene.

We first render the scene without the water and store this in a framebuffer that we call our refraction framebuffer. A framebuffer is a collection of buffers, such as a color attachment and a depth attachment, that we can render into instead of rendering directly to the screen.

The refraction framebuffer tells us everything that is under the water.

In the real world, light from under the water is refracted, or deflected, as it passes from the water into the air on its way to your eye. Hence the name refraction framebuffer.

At the top left of the interactive scene above you should see a rendering of the color texture that is attached to our refraction framebuffer.
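If you're curious what attaching that color texture looks like in code, here's a minimal sketch using the web-sys bindings that we use throughout this tutorial. This is not the demo's exact code: the 512x512 size and the function name are arbitrary choices, and the demo's refraction framebuffer also attaches a depth texture since the water shader samples the water's depth.

// Sketch: creating a framebuffer with a color texture attachment via web-sys.
// Not the demo's exact code. The texture size (512x512) is an arbitrary choice.
use wasm_bindgen::JsValue;
use web_sys::{WebGlFramebuffer, WebGlRenderingContext as GL};

fn create_refraction_framebuffer(gl: &GL) -> Result<WebGlFramebuffer, JsValue> {
    let framebuffer = gl.create_framebuffer().unwrap();
    gl.bind_framebuffer(GL::FRAMEBUFFER, Some(&framebuffer));

    // An empty texture that the refraction render pass will draw into
    let color_texture = gl.create_texture().unwrap();
    gl.bind_texture(GL::TEXTURE_2D, Some(&color_texture));
    gl.tex_parameteri(GL::TEXTURE_2D, GL::TEXTURE_MIN_FILTER, GL::LINEAR as i32);
    gl.tex_image_2d_with_i32_and_i32_and_i32_and_format_and_type_and_opt_u8_array(
        GL::TEXTURE_2D, 0, GL::RGBA as i32, 512, 512, 0, GL::RGBA, GL::UNSIGNED_BYTE, None,
    )?;

    // Rendering to this framebuffer now writes into color_texture,
    // which we can later sample like any other texture
    gl.framebuffer_texture_2d(
        GL::FRAMEBUFFER, GL::COLOR_ATTACHMENT0, GL::TEXTURE_2D, Some(&color_texture), 0,
    );

    gl.bind_framebuffer(GL::FRAMEBUFFER, None);
    Ok(framebuffer)
}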



Next we take the camera’s y position and multiply it by -1, while ensuring that the camera is still looking at the same point, and then re-render the scene. You can see an illustration of this at the top right of the demo above. This is our reflection framebuffer.

As you might guess from the name this is everything that could be reflected from the water’s surface and into the camera.
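Flipping the camera is just a couple of lines. Here’s a hedged sketch using nalgebra (the math library the demo uses), assuming the water plane sits at y = 0; eye and target are stand-ins for however your camera tracks its position and look-at point.

// Sketch: building the reflection pass's view matrix by mirroring the camera
// across the water plane (assumed to be y = 0 here).
use nalgebra::{Isometry3, Matrix4, Point3, Vector3};

fn reflection_view(eye: &Point3<f32>, target: &Point3<f32>) -> Matrix4<f32> {
    // Multiply the camera's y position by -1, but keep looking at the same point
    let mirrored_eye = Point3::new(eye.x, -eye.y, eye.z);
    Isometry3::look_at_rh(&mirrored_eye, target, &Vector3::y()).to_homogeneous()
}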

When rendering the water we blend the refraction and reflection textures onto the water quad’s surface. If you disable the Use Refraction and Use Reflection checkboxes in the demo you’ll see this in action.

The way that the water appears to move comes from slightly altering the point on the refraction and reflection textures that we sample from every frame.

These slight offsets come from sampling a du/dv map, which is just a texture that encodes different x and y offsets.

Du/dv map: Our du/dv map encodes small x/y offsets that will be used to distort our water’s surface textures, simulating waves.
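The dudvOffset uniform that drives this movement just needs to creep forward every frame. A minimal sketch, assuming a wave speed constant and a clock measured in milliseconds (these names are illustrative, not the demo’s exact code):

// Sketch: advancing the du/dv sampling offset a little bit every frame
const WAVE_SPEED: f32 = 0.03;

fn dudv_offset(clock_ms: f32) -> f32 {
    // Wrap to the range [0, 1) so the sampled texture coordinates keep cycling
    (WAVE_SPEED * clock_ms / 1000.0) % 1.0
}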

We use a corresponding normal map to make the water’s surface appear non-flat and to drive our specular highlights.

Normal map: Our normal map gives our water quad more varied surface normals, allowing for better specular light simulation.

There are more advanced techniques for simulating water heights, normals and waves, such as displacing the vertices of the quad itself to create waves with real heights.
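We won’t cover those here, but to give a flavor: a simple version displaces each vertex’s height with a sum of sine waves (fancier approaches use Gerstner waves or FFT-based ocean simulation). A hedged sketch:

// Sketch (not used in this demo): real wave heights from a sum of sine waves.
// All of the wave constants here are arbitrary.
fn wave_height(x: f32, z: f32, time: f32) -> f32 {
    let wave_a = 0.05 * (2.0 * x + 1.3 * time).sin();
    let wave_b = 0.03 * (3.0 * z + 1.7 * time).sin();
    wave_a + wave_b
}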



Now that you’re a little more familiar with rendering water we’ll talk through the key pieces of the code that powers the water demo.

A good way to learn is to first follow along as we talk through the major pieces of the code, then clone the source code, run it locally, and play around with it to see your changes in the browser.

Walking through the demo code

#!/bin/bash

# File: build.sh

cd "$(dirname "$0")"

# ./build.sh
if [ -z "$RELEASE" ]; then
  # --------------------------------------------------
  # DEVELOPMENT BUILD
  # --------------------------------------------------

  RUST_BACKTRACE=1 cargo build --target wasm32-unknown-unknown
  wasm-bindgen ./target/wasm32-unknown-unknown/debug/webgl_water_tutorial.wasm --out-dir . --no-typescript --no-modules

# RELEASE=1 ./build.sh
else

  # --------------------------------------------------
  # RELEASE BUILD
  # --------------------------------------------------

  cargo build --target wasm32-unknown-unknown --release &&
  wasm-bindgen ./target/wasm32-unknown-unknown/release/webgl_water_tutorial.wasm --out-dir . --no-typescript --no-modules &&
  wasm-opt -O3 -o optimized.wasm webgl_water_tutorial_bg.wasm  &&
  mv optimized.wasm webgl_water_tutorial_bg.wasm
fi

Our build.sh script is used to compile our application and generate a .wasm file. We first use cargo to build our app for the wasm32 (32-bit WebAssembly) target.

During our cargo build, the wasm-bindgen crate annotates this .wasm file with different bits of information that will help us automatically generate the JS shims that we need for things such as accessing the DOM (in the future WebAssembly will have native DOM support without JS shims).

We then run the wasm-bindgen CLI on our webgl_water_tutorial.wasm file in order to generate the webgl_water_tutorial.js shim file and a new webgl_water_tutorial_bg.wasm file with all of wasm-bindgen’s annotations removed.

We’ll later serve these two files to the client in our index.html file.



// File: build.rs

// ... trimmed ...

fn main() {
    let blender_files = vec!["./terrain.blend".to_string(), "./bird.blend".to_string()];

    // Only re-run this build script if we change our blender files
    for blender_file in blender_files.iter() {
        println!("cargo:rerun-if-changed={}", blender_file);
    }

    // ... trimmed ...

    let blender_stdout = landon::export_blender_data(&blender_files).unwrap();

    let meshes_by_file = blender_mesh::parse_meshes_from_blender_stdout(&blender_stdout).unwrap();
    let flattened_meshes = blender_mesh::flatten_exported_meshes(&meshes_by_file).unwrap();
    let flattened_meshes = bincode::serialize(&flattened_meshes).unwrap();

    let mut f = File::create("./meshes.bytes").unwrap();
    f.write_all(&flattened_meshes[..]).unwrap();

    // ... trimmed ...

    let mut f = File::create("./armatures.bytes").unwrap();
    f.write_all(&flattened_armatures[..]).unwrap();
}

A major component of rendering water is rendering the reflections of the scenery around the water. We use Blender to create some scenery, and then a build.rs script that runs landon to export our scenery into bytes that we can download and deserialize on the client side.

This means that you can make changes to the .blend files, and the next time you build, your application’s 3d models will automatically reflect those changes.
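On the client side, deserializing is one line with bincode. A sketch, assuming the flattened meshes were serialized as a HashMap<String, BlenderMesh> (the exact types come from the blender-mesh crate):

// Sketch: loading the meshes that build.rs wrote to ./meshes.bytes.
// include_bytes! bakes the file into our .wasm binary at compile time.
use blender_mesh::BlenderMesh;
use std::collections::HashMap;

fn load_meshes() -> HashMap<String, BlenderMesh> {
    let bytes: &[u8] = include_bytes!("../meshes.bytes");
    bincode::deserialize(bytes).unwrap()
}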



<!-- File: index.html -->

<html>
  <head>
    <meta content="text/html;charset=utf-8" http-equiv="Content-Type"/>
  </head>
  <body>
    <div id="webgl-water-tutorial"></div>

    <script src='/webgl_water_tutorial.js'></script>

    <script>
      window.addEventListener('load', function () {
        window.wasm_bindgen('/webgl_water_tutorial_bg.wasm')
          .then(function () {
            // Start our rust application. You can find `WebClient` in `src/lib.rs`
            const webClient = new window.wasm_bindgen.WebClient()
            webClient.start()

            let time = Date.now();
            function render () {
              const dt = Date.now() - time

              webClient.update(dt)
              webClient.render()
              window.requestAnimationFrame(render)

              time = Date.now()
            }

            render()
          })
      })
    </script>
  </body>
</html>

index.html is where the story begins from the viewer’s perspective.

We create an empty div with id webgl-water-tutorial that our application will later embed itself into.

We then load up the webgl_water_tutorial.js file that we created earlier. Then we download and initialize our WebAssembly module using the window.wasm_bindgen helper function.

Once it’s been initialized we create a new instance of our Rust application and then begin the request animation frame loop that will render the application.

// File: src/lib.rs

// ... trimmed ...

/// Used to run the application from the web
#[wasm_bindgen]
pub struct WebClient {
    app: Rc<App>,
    gl: Rc<WebGlRenderingContext>,
    renderer: WebRenderer,
}

#[wasm_bindgen]
impl WebClient {
    /// Create a new web client
    #[wasm_bindgen(constructor)]
    pub fn new() -> WebClient {
        console_error_panic_hook::set_once();

        let app = Rc::new(App::new());

        let gl = Rc::new(create_webgl_context(Rc::clone(&app)).unwrap());
        append_controls(Rc::clone(&app)).expect("Append controls");

        let renderer = WebRenderer::new(&gl);

        WebClient { app, gl, renderer }
    }

    /// Start our WebGL Water application. `index.html` will call this function in order
    /// to begin rendering.
    pub fn start(&self) -> Result<(), JsValue> {
        let gl = &self.gl;

        load_texture_image(
            Rc::clone(gl),
            "/dudvmap.png",
            TextureUnit::Dudv,
        );
        load_texture_image(
            Rc::clone(gl),
            "/normalmap.png",
            TextureUnit::NormalMap,
        );
        load_texture_image(
            Rc::clone(gl),
            "/stone-texture.png",
            TextureUnit::Stone,
        );

        Ok(())
    }

    /// Update our simulation
    pub fn update(&self, dt: f32) {
        self.app.store.borrow_mut().msg(&Msg::AdvanceClock(dt));
    }

    /// Render the scene. `index.html` will call this once every requestAnimationFrame
    pub fn render(&mut self) {
        self.renderer
            .render(&self.gl, &self.app.store.borrow().state, &self.app.assets());
    }
}

Our application is a single Rust crate with all of the code in the src folder.

src/lib.rs is the entry point into our Rust application. You’ll recognize the WebClient struct since we made use of it in index.html. Towards the bottom we expose a render function that gets called every requestAnimationFrame from index.html.

We could’ve started the requestAnimationFrame loop from the Rust side but I didn’t learn how to do that until after I started working on this tutorial and I didn’t feel like changing it. Maybe next time.
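If you do want to drive the loop from Rust, the wasm-bindgen examples show the standard pattern: a closure stored in an Rc<RefCell<...>> that schedules itself. Roughly:

// Sketch: the requestAnimationFrame pattern from the wasm-bindgen examples.
// The Rc<RefCell<Option<Closure>>> dance lets the closure schedule itself.
use std::cell::RefCell;
use std::rc::Rc;
use wasm_bindgen::prelude::*;
use wasm_bindgen::JsCast;

fn request_animation_frame(f: &Closure<dyn FnMut()>) {
    web_sys::window()
        .expect("no global window")
        .request_animation_frame(f.as_ref().unchecked_ref())
        .expect("failed to request animation frame");
}

pub fn start_raf_loop() {
    let f: Rc<RefCell<Option<Closure<dyn FnMut()>>>> = Rc::new(RefCell::new(None));
    let g = Rc::clone(&f);

    *g.borrow_mut() = Some(Closure::wrap(Box::new(move || {
        // ... update and render the application here ...
        request_animation_frame(f.borrow().as_ref().unwrap());
    }) as Box<dyn FnMut()>));

    request_animation_frame(g.borrow().as_ref().unwrap());
}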



// File: src/shader/water-vertex.glsl

attribute vec2 position;

uniform mat4 perspective;
uniform mat4 model;
uniform mat4 view;

uniform vec3 cameraPos;
varying vec3 fromFragmentToCamera;

varying vec4 clipSpace;
varying vec2 textureCoords;

const float tiling = 4.0;

void main() {
    vec4 worldPosition = model * vec4(position.x, 0.0, position.y, 1.0);

    clipSpace = perspective * view *  worldPosition;

    gl_Position = clipSpace;

    textureCoords = position + 0.5;
    textureCoords = textureCoords * tiling;

    fromFragmentToCamera = cameraPos - worldPosition.xyz;
}

Our water’s vertex shader is mainly used to pass data to the fragment shader where most of the interesting things happen.

The first key piece of our water’s vertex shader is the varying vec4 clipSpace;.

We’ll use the clipSpace coordinates in our fragment shader in order to figure out the water fragment’s x and y location on the screen, ranging from 0 to 1.

From there we’ll use those x and y values to sample from our refraction and reflection textures. This technique is known as projective texture mapping.

Next we set our water’s texture coordinates, which we’ll use for sampling the du/dv map and normal map. Since our water’s position vector values range from -0.5 to 0.5, we just add 0.5 to make them range from 0 to 1.

This means that the bottom left corner of our water is (0, 0) and the top right corner is (1, 1).

Instead of deriving these values you could just as easily pass them in as an attribute. Since we’re dealing with a single quad, the computation vs. memory trade-off is negligible either way.

The line fromFragmentToCamera = cameraPos - worldPosition.xyz; sets the varying vector that points from the current water fragment to our camera. This will be used for our lighting calculations as well as our Fresnel effect simulation.

Water Fragment Shader

Next we’ll do a deep dive into the water’s fragment shader. Read over the entire fragment shader once or twice, and then we’ll talk through the major pieces.

// File: src/shader/water-fragment.glsl

precision mediump float;

uniform sampler2D refractionTexture;
uniform sampler2D reflectionTexture;
uniform sampler2D dudvTexture;
uniform sampler2D normalMap;
uniform sampler2D waterDepthTexture;

vec3 sunlightColor = vec3(1.0, 1.0, 1.0);
vec3 sunlightDir = normalize(vec3(-1.0, -1.0, 0.5));

varying vec3 fromFragmentToCamera;

// Changes over time, making the water look like it's moving
uniform float dudvOffset;

varying vec4 clipSpace;

varying vec2 textureCoords;

const float waterDistortionStrength = 0.03;
const float shineDamper = 20.0;

uniform float waterReflectivity;
uniform float fresnelStrength;

vec4 shallowWaterColor =  vec4(0.0, 0.1, 0.3, 1.0);
vec4 deepWaterColor = vec4(0.0, 0.1, 0.2, 1.0);

vec3 getNormal(vec2 textureCoords);

void main() {
    // The perspective divide gives normalized device coordinates (-1 to 1),
    // which we then remap to texture coordinates (0 to 1)
    vec2 ndc = (clipSpace.xy / clipSpace.w) / 2.0 + 0.5;

    vec2 refractTexCoords = vec2(ndc.x, ndc.y);
    // Reflections are upside down
    vec2 reflectTexCoords = vec2(ndc.x, -ndc.y);

    float near = 0.1;
    float far = 50.0;

    // Get the distance from our camera to the first thing under this water fragment that a
    // ray would collide with. This might be the ground, the under water walls, a fish, or any
    // other thing under the water. This distance will depend on our camera angle.
    float cameraToFirstThingBehindWater = texture2D(waterDepthTexture, refractTexCoords).r;
    // Convert from our perspective transformed distance to our world distance
    float cameraToFirstThingUnderWater = 2.0 * near * far /
     (far + near - (2.0 * cameraToFirstThingBehindWater - 1.0)
      * (far - near));

    float cameraToWaterDepth = gl_FragCoord.z;
    float cameraToWaterDistance = 2.0 * near * far / (far + near - (2.0 * cameraToWaterDepth - 1.0) * (far - near));

    float angledWaterDepth = cameraToFirstThingUnderWater - cameraToWaterDistance;

    vec2 distortedTexCoords = texture2D(dudvTexture, vec2(textureCoords.x + dudvOffset, textureCoords.y)).rg * 0.1;
    distortedTexCoords = textureCoords + vec2(distortedTexCoords.x, distortedTexCoords.y + dudvOffset);

    // Between -1 and 1
    vec2 totalDistortion = (texture2D(dudvTexture, distortedTexCoords).rg * 2.0 - 1.0)
     * waterDistortionStrength;

    refractTexCoords += totalDistortion;
    reflectTexCoords += totalDistortion;

    // Prevent our distortions from sampling from the opposite side of the texture
    // NOTE: This will still cause artifacts towards the edges of the water. You can fix this by
    // making the water more transparent at the edges.
    // @see https://www.youtube.com/watch?v=qgDPSnZPGMA
    refractTexCoords = clamp(refractTexCoords, 0.001, 0.999);
    reflectTexCoords.x = clamp(reflectTexCoords.x, 0.001, 0.999);
    reflectTexCoords.y = clamp(reflectTexCoords.y, -0.999, -0.001);

    vec4 reflectColor = texture2D(reflectionTexture, reflectTexCoords);

    vec4 refractColor = texture2D(refractionTexture, refractTexCoords);

    // The deeper the water the darker the color
    refractColor = mix(refractColor, deepWaterColor, clamp(angledWaterDepth/10.0, 0.0, 1.0));

    vec3 toCamera = normalize(fromFragmentToCamera);

    vec3 normal = getNormal(distortedTexCoords);

    // Fresnel Effect. Looking at the water from above makes the water more transparent.
    float refractiveFactor = dot(toCamera, normal);

    // A higher fresnelStrength makes the water more reflective since the
    // refractive factor will decrease
    refractiveFactor = pow(refractiveFactor, fresnelStrength);

    vec3 reflectedLight = reflect(normalize(sunlightDir), normal);
    float specular = max(dot(reflectedLight, toCamera), 0.0);
    specular = pow(specular, shineDamper);
    vec3 specularHighlights = sunlightColor * specular * waterReflectivity;

    gl_FragColor = mix(reflectColor, refractColor, refractiveFactor);
    // Mix in a bit of blue so that it looks like water
    gl_FragColor = mix(gl_FragColor, shallowWaterColor, 0.2) + vec4(specularHighlights, 0.0);
}

vec3 getNormal(vec2 textureCoords) {
    vec4 normalMapColor = texture2D(normalMap, textureCoords);
    float makeNormalPointUpwardsMore = 2.6;
    vec3 normal = vec3(
      normalMapColor.r * 2.0 - 1.0,
      normalMapColor.b * makeNormalPointUpwardsMore,
      normalMapColor.g * 2.0 - 1.0
    );
    normal = normalize(normal);

    return normal;
}

And now the main pieces of the water fragment shader:

vec2 ndc = (clipSpace.xy / clipSpace.w) / 2.0 + 0.5;

We need to know which pixel on our refraction and reflection textures corresponds to the water fragment, so we’ll take the clip space coordinates (ranging from -1 to 1) and convert them into texture coordinates (ranging from 0 to 1). This is known as projective texture mapping.
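As a quick sanity check of that math, here’s the same conversion written out in Rust (a worked example, not demo code):

// Sketch: clip space -> screen texture coordinates, i.e. projective texture
// mapping. A fragment at the exact center of the screen maps to (0.5, 0.5).
fn clip_space_to_texture_coords(clip: [f32; 4]) -> (f32, f32) {
    let (x, y, w) = (clip[0], clip[1], clip[3]);
    // The perspective divide gives normalized device coordinates (-1 to 1),
    // which we then remap into the 0 to 1 range that texture sampling expects
    ((x / w) / 2.0 + 0.5, (y / w) / 2.0 + 0.5)
}

// clip_space_to_texture_coords([0.0, 0.0, 5.0, 5.0]) == (0.5, 0.5)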

vec2 refractTexCoords = vec2(ndc.x, ndc.y);
vec2 reflectTexCoords = vec2(ndc.x, -ndc.y);

Here we find the point on our refraction and reflection textures that corresponds to this water fragment.

Note that we negate the y coordinate of the reflection texture coordinate. If you scroll up and look at the reflection texture (top right corner of demo) you’ll be able to better visualize why.

float cameraToWaterDepth = gl_FragCoord.z;
float cameraToWaterDistance = 2.0 * near * far / (far + near - (2.0 * cameraToWaterDepth - 1.0) * (far - near));

float angledWaterDepth = cameraToFirstThingUnderWater - cameraToWaterDistance;

Here we determine the distance, along the camera’s viewing ray, from the water’s surface to the first thing below the water.

If our camera is at an angle this won’t necessarily be the water’s floor; it could be the side of the terrain under the water.

We can use this depth for different effects, such as making water that is closer to the shore more transparent.
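Here’s the depth linearization by itself, in Rust for clarity. It undoes the non-linear encoding that the perspective projection wrote into the depth buffer, using the demo’s near and far plane values:

// Sketch: converting a depth buffer value back into a world-space distance
fn linearize_depth(depth: f32, near: f32, far: f32) -> f32 {
    2.0 * near * far / (far + near - (2.0 * depth - 1.0) * (far - near))
}

// With the demo's near = 0.1 and far = 50.0, a depth buffer value of 0.5
// linearizes to roughly 0.2 world units. Most depth precision sits close
// to the camera.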

float refractiveFactor = dot(toCamera, normal);
refractiveFactor = pow(refractiveFactor, fresnelStrength);
gl_FragColor = mix(reflectColor, refractColor, refractiveFactor);

Here we decide how much of the refraction texture to show vs. the reflection texture. We base this on the Fresnel effect, which means that looking at the water from above makes it more transparent and looking at the water from the sides makes it more reflective.
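The same calculation in Rust, using nalgebra vectors (a sketch; the clamp to zero is added here to guard pow against negative inputs):

// Sketch: the Fresnel mix factor. Looking straight down makes toCamera
// parallel to the water's normal, the dot product approaches 1, and mix()
// favors the refraction color, so the water looks transparent.
use nalgebra::Vector3;

fn refractive_factor(to_camera: &Vector3<f32>, normal: &Vector3<f32>, fresnel_strength: f32) -> f32 {
    let dot = to_camera.normalize().dot(&normal.normalize());
    dot.max(0.0).powf(fresnel_strength)
}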



// File: src/shader/mesh-non-skinned-fragment.glsl

// ... trimmed ...

void main(void) {
    if (dot(worldPosition, clipPlane) < 0.0) {
        discard;
    }

    // ... trimmed ...
}

If you look at the refraction texture (top left of demo scene) and the reflection texture (top right of demo scene) you’ll see that the refraction texture is only showing things below the water and the reflection texture is only showing things above the water.

With OpenGL you would accomplish this using gl_ClipDistance, but WebGL provides no such mechanism.

Instead, we determine whether or not the fragment is on the wrong side of our clipping plane in the fragment shader, and if it is we discard it.

// File: src/render/mod.rs

fn render_reflection_fbo(
    &mut self,
    gl: &WebGlRenderingContext,
    state: &State,
    assets: &Assets,
) {
    let Framebuffer { framebuffer, .. } = &self.reflection_framebuffer;
    gl.bind_framebuffer(GL::FRAMEBUFFER, framebuffer.as_ref());

    gl.viewport(0, 0, REFLECTION_TEXTURE_WIDTH, REFLECTION_TEXTURE_HEIGHT);

    gl.clear_color(0.53, 0.8, 0.98, 1.);
    gl.clear(GL::COLOR_BUFFER_BIT | GL::DEPTH_BUFFER_BIT);

    if state.water().use_reflection {
        let clip_plane = [0., 1., 0., -WATER_TILE_Y_POS];
        self.render_meshes(gl, state, assets, clip_plane, true);
    }
}

As shown above, we’ll pass the clip_plane into our mesh renderer and it’ll make its way into the fragment shader and power our clipping.
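To see why clip_plane = [0., 1., 0., -WATER_TILE_Y_POS] keeps only the things above the water: with worldPosition.w equal to 1, the dot product in the fragment shader works out to y - WATER_TILE_Y_POS, the signed height above the water plane. A worked version in Rust:

// Sketch: the plane test from the fragment shader. Anything below the water
// plane produces a negative value and gets discarded in the reflection pass.
fn signed_distance_to_plane(world_position: [f32; 4], clip_plane: [f32; 4]) -> f32 {
    world_position
        .iter()
        .zip(clip_plane.iter())
        .map(|(p, c)| p * c)
        .sum()
}

// signed_distance_to_plane([x, y, z, 1.0], [0.0, 1.0, 0.0, -water_y]) == y - water_y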



// File: src/render/water_tile.rs

impl<'a> Render<'a> for RenderableWaterTile<'a> {
    // ... trimmed ...

    fn buffer_attributes(&self, gl: &WebGlRenderingContext) {
        let shader = self.shader();

        let pos_attrib = gl.get_attrib_location(&shader.program, "position");
        gl.enable_vertex_attrib_array(pos_attrib as u32);

        // These vertices are the x and z values that create a flat square tile on the `y = 0`
        // plane. In our render function we'll scale this quad into the water size that we want.
        // x and z values, y is omitted since this is a flat surface. We set it in the vertex shader
        let vertices: [f32; 8] = [
            -0.5, 0.5, // Bottom Left
            0.5, 0.5, // Bottom Right
            0.5, -0.5, // Top Right
            -0.5, -0.5, // Top Left
        ];

        let mut indices: [u16; 6] = [0, 1, 2, 0, 2, 3];

        RenderableWaterTile::buffer_f32_data(&gl, &vertices, pos_attrib as u32, 2);
        RenderableWaterTile::buffer_u16_indices(&gl, &mut indices);
    }

    fn render(&self, gl: &WebGlRenderingContext, state: &State) {
        let shader = self.shader();

        let model_uni = shader.get_uniform_location(gl, "model");
        let view_uni = shader.get_uniform_location(gl, "view");
        let refraction_texture_uni = shader.get_uniform_location(gl, "refractionTexture");
        let reflection_texture_uni = shader.get_uniform_location(gl, "reflectionTexture");
        let dudv_texture_uni = shader.get_uniform_location(gl, "dudvTexture");
        let normal_map_uni = shader.get_uniform_location(gl, "normalMap");
        let water_depth_texture_uni = shader.get_uniform_location(gl, "waterDepthTexture");
        let dudv_offset_uni = shader.get_uniform_location(gl, "dudvOffset");
        let camera_pos_uni = shader.get_uniform_location(gl, "cameraPos");
        let perspective_uni = shader.get_uniform_location(gl, "perspective");
        let water_reflectivity_uni = shader.get_uniform_location(gl, "waterReflectivity");
        let fresnel_strength_unit = shader.get_uniform_location(gl, "fresnelStrength");

        let pos = (0., 0., 0.);

        let x_scale = 18.;
        let z_scale = 18.;
        let scale = Matrix4::new_nonuniform_scaling(&Vector3::new(x_scale, 1.0, z_scale));

        let model = Isometry3::new(Vector3::new(pos.0, pos.1, pos.2), nalgebra::zero());
        let model = model.to_homogeneous();
        let model = scale * model;
        let mut model_array = [0.; 16];
        model_array.copy_from_slice(model.as_slice());
        gl.uniform_matrix4fv_with_f32_array(model_uni.as_ref(), false, &mut model_array);

        let mut view = state.camera().view();
        gl.uniform_matrix4fv_with_f32_array(view_uni.as_ref(), false, &mut view);

        // ... trimmed ...

        gl.enable(GL::BLEND);
        gl.blend_func(GL::SRC_ALPHA, GL::ONE_MINUS_SRC_ALPHA);

        gl.draw_elements_with_i32(GL::TRIANGLES, 6, GL::UNSIGNED_SHORT, 0);

        gl.disable(GL::BLEND);
    }
}

src/render/water_tile.rs is where we pass all of our uniforms into our water shader. It is here that you’d do much of your experimenting if you decide to extend this tutorial with more uniforms. For example, you might add a foam texture and sample it when the water is more shallow.
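Wiring up one extra uniform for that might look roughly like this (foamTexture and texture unit 5 are made up for this sketch; they aren’t part of the demo):

// Sketch: binding a hypothetical foam texture sampler in the water renderer
use web_sys::{WebGlProgram, WebGlRenderingContext as GL};

fn bind_foam_uniform(gl: &GL, program: &WebGlProgram) {
    let foam_texture_uni = gl.get_uniform_location(program, "foamTexture");
    // Point the sampler at whichever texture unit the foam texture is bound to
    gl.uniform1i(foam_texture_uni.as_ref(), 5);
}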

Before rendering we enable alpha blending, even though we aren’t actually making use of it in this tutorial.

If you look carefully you’ll sometimes see a little fuzziness near the edges of the water.

The water’s depth can also be used to fix this fuzziness artifact by making the edges of the water more transparent. We don’t do this in this demo, but ThinMatrix’s soft water edges tutorial explains how to accomplish it in your fragment shader.

Going forwards

You can find the rest of the demo’s code on GitHub.

If you have any questions or run into any challenges please feel free to open an issue and I’ll get back to you ASAP!

In the meantime, scroll down and join the mailing list and shoot me an email if there’s anything that you’d like to learn more about!

’Til next time,

- CFN