WebGL 02: "Movement and Color"

In this tutorial, you'll learn how to use fragment shaders to color shapes, and move them around the canvas to create the illusion of motion.

I'll cover the following new concepts:

  1. Uniform Values, parameters that can be set between draw calls
  2. Fragment Shader Inputs, or varyings, which are like per-fragment blended attributes
  3. Vertex Array Objects (VAOs), which hold collections of vertex attribute bindings

Movement and Color Demo

The end result of this article is a program that does this:

Most of what's happening here isn't actually new. At a high level, rendering this app is pretty familiar, with a couple of new ideas mixed in:

  1. Prepare the frame (clear the canvas, set up rasterizer and output merger)
  2. Prepare a shader that handles color gradients
  3. Draw a whole bunch of shapes in a loop
    1. Set up input assembler state for whichever shape we want to draw
    2. Tell the shader where to draw this particular shape
    3. Dispatch a draw call

We'll also be covering a more modern approach to setting up the input assembler - Vertex Array Objects (VAOs).

Modern APIs (OpenGL 3.3+, Vulkan, WebGPU) actually require using VAOs or a comparable construct. Technically you never have to use VAOs when using WebGL - but they allow graphics drivers to behave more efficiently, and make picking up other APIs easier, so you should still use them unless you're severely allergic.

First, let's consider how to handle color gradients in a shader.

Fragment Shader Inputs

The fragment shader itself is pretty simple:

Fragment Shader
#version 300 es
precision mediump float;

in vec3 fragmentColor;

out vec4 outputColor;

void main() {
  outputColor = vec4(fragmentColor, 1.0);
}

There are two main differences from the last tutorial:

  1. Now there's an input variable fragmentColor
  2. The output value outputColor uses this input variable

Where does this input variable come from? Recall the rasterizer, which identifies which pixels are inside a triangle. That triangle is defined by the primitive assembly stage, using three vertices output by the vertex shading stage.

Vertex shaders can output data beyond the built-in gl_Position! The values of extra vertex shader outputs are blended together for each pixel fragment the rasterizer considers, based on how close that fragment is to each of the three vertices that form the triangle.

Fragment shader inputs MUST match vertex shader outputs - WebGL checks this when the WebGLProgram is linked. If there's a mismatch, the link will fail, and the failure can be caught by the link error checking from the last tutorial.

Vertex Shader (draft 1)
#version 300 es
precision mediump float;

in vec2 vertexPosition;
in vec3 vertexColor;

out vec3 fragmentColor;

void main() {
  fragmentColor = vertexColor;
  gl_Position = vec4(vertexPosition, 0.0, 1.0);
}

Notice the added vertex shader output, which uses the same out syntax as the fragment shader output. Notice also that the type matches the fragment shader input (vec3).

To find the value of this vertex shader output, I'm pulling from another vertex shader input called vertexColor.

The rasterizer takes care of the dirty work for finding the input value for each fragment based on the vertex output values and where inside the triangle the current fragment is found.

If you're writing your own software rasterizer, or you're just plain curious how it might work, barycentric coordinate interpolation is the standard approach. Scratchapixel is a phenomenal website for learning computer graphics as well - I highly recommend checking them out!
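To make that idea concrete, here's a rough sketch (in TypeScript, and definitely not how a GPU literally implements it) of blending a per-vertex color for a single fragment, given barycentric weights w0 + w1 + w2 = 1 that describe how close the fragment is to each of the triangle's three vertices:

function interpolateColor(
    w0: number, w1: number, w2: number,
    c0: [number, number, number],
    c1: [number, number, number],
    c2: [number, number, number]): [number, number, number] {
  // Weighted average of the three vertex colors - a fragment sitting exactly on
  // vertex 0 has w0 = 1 and gets exactly c0; a fragment in the middle of the
  // triangle gets an even blend of all three.
  return [
    w0 * c0[0] + w1 * c1[0] + w2 * c2[0],
    w0 * c0[1] + w1 * c1[1] + w2 * c2[1],
    w0 * c0[2] + w1 * c1[2] + w2 * c2[2],
  ];
}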

Uniform Values (per-shape parameters)

So fragment shader inputs can be used for the "color" part of "movement and color", but what of the movement? In order to move around shapes using only what I covered in the last tutorial, you'd have to upload a new vertex buffer every frame containing new clip space vertices. For a triangle with only 3 vertices that's not awful, but imagine doing that for a 3D player model with thousands of vertices!

This is where another shader input called a uniform value comes into play. Uniforms are also inputs, but are the same for every vertex and fragment in a draw call. They can be set by an application developer, and remain the same for all draw calls until they are set with a new value.
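As a rough preview of what that looks like from the application side (the real setup appears later in this tutorial - assume here that gl is a WebGL2 context and program has already been compiled and linked):

const shapeLocationUniform = gl.getUniformLocation(program, 'shapeLocation');

gl.useProgram(program);
gl.uniform2f(shapeLocationUniform, 100, 200); // draw the geometry at (100, 200)...
gl.drawArrays(gl.TRIANGLES, 0, 3);
gl.uniform2f(shapeLocationUniform, 400, 250); // ...then the same geometry somewhere else
gl.drawArrays(gl.TRIANGLES, 0, 3);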

Here's an updated vertex shader that uses three new inputs to decide where a vertex should be placed:

Vertex Shader (final)
#version 300 es
precision mediump float;

in vec2 vertexPosition;
in vec3 vertexColor;

out vec3 fragmentColor;

uniform vec2 canvasSize;
uniform vec2 shapeLocation;
uniform float shapeSize;

void main() {
  fragmentColor = vertexColor;

  vec2 worldPosition = vertexPosition * shapeSize + shapeLocation;
  vec2 clipPosition = (worldPosition / canvasSize) * 2.0 - 1.0;

  gl_Position = vec4(clipPosition, 0.0, 1.0);
}

For this tutorial, shapes will be defined to fill up a bounding box that goes from [-1, 1] in both the X and Y dimensions - in clip space, this would fill the entire screen.

To calculate the vertex position with the new uniform values...

  1. Find the position on the surface (in pixels) by multiplying the input by the size of the shape, and adding some offset (worldPosition)
  2. Find the percentage across the canvas in either direction by dividing worldPosition by the size (in pixels) of the canvas
  3. Multiply this percentage by 2 and subtract 1 to find a clip space value from [-1, 1] instead of a percentage from [0, 1]

Starting in the next tutorial, we'll use the concept of linear algebra spaces a lot. It's helpful to think of coordinates as being defined according to some point of view:

  • Model space (the point of view of triangle geometry),
  • World space (the location within a simulation), and
  • Clip space (the location on the screen)

Each line of math in our vertex shader converts from one space into the next, ending in gl_Position's clip space that the rasterizer consumes.

You can see the effect that these uniforms have with a quick worked example. Take the top vertex of the triangle, (0, 1), with the uniforms set to:

vec2 shapeLocation = [150, 150];
float shapeSize = 75.0;
vec2 canvasSize = [300, 300];

Plugging those into the shader math for that vertex gives:

worldPosition = (0, 1) * 75 + (150, 150) = [150, 225]
clipPosition = ([150, 225] / [300, 300]) * 2 - 1 = [0.000, 0.500]
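If you want to double-check that math on the CPU, here's a small hypothetical helper that mirrors the shader (it isn't used by the demo - it's only here for sanity-checking values):

function toClipSpace(
    vertexPosition: [number, number],
    shapeSize: number,
    shapeLocation: [number, number],
    canvasSize: [number, number]): [number, number] {
  // Same math as the vertex shader: model space -> world space -> clip space
  const worldX = vertexPosition[0] * shapeSize + shapeLocation[0];
  const worldY = vertexPosition[1] * shapeSize + shapeLocation[1];
  return [
    (worldX / canvasSize[0]) * 2 - 1,
    (worldY / canvasSize[1]) * 2 - 1,
  ];
}

// toClipSpace([0, 1], 75, [150, 150], [300, 300]) returns [0, 0.5]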

Vertex Array Objects (VAOs)

A VAO is a collection of input assembler state: it holds the list of enabled vertex attributes, plus the buffer bindings attached to each attribute slot.

Take the following code, which might set up attributes for two shapes - a red triangle and a blue square:

Draw 2 Shapes (sample)
// (output merger, rasterizer, and program are already set up)
gl.enableVertexAttribArray(positionAttribLocation);
gl.enableVertexAttribArray(colorAttribLocation);

// Draw red triangle
gl.bindBuffer(gl.ARRAY_BUFFER, trianglePositions);
gl.vertexAttribPointer(
  positionAttribLocation, 2, gl.FLOAT, false, 0, 0);
gl.bindBuffer(gl.ARRAY_BUFFER, redTriangleColors);
gl.vertexAttribPointer(
  colorAttribLocation, 3, gl.FLOAT, true, 0, 0);
gl.bindBuffer(gl.ARRAY_BUFFER, null);
gl.drawArrays(gl.TRIANGLES, 0, 3);

// Draw blue square
gl.bindBuffer(gl.ARRAY_BUFFER, squarePositions);
gl.vertexAttribPointer(
  positionAttribLocation, 2, gl.FLOAT, false, 0, 0);
gl.bindBuffer(gl.ARRAY_BUFFER, blueSquareColors);
gl.vertexAttribPointer(
  colorAttribLocation, 3, gl.FLOAT, true, 0, 0);
gl.bindBuffer(gl.ARRAY_BUFFER, null);
gl.drawArrays(gl.TRIANGLES, 0, 6);

Both position and color attributes are set up for the red triangle, and then a draw call is dispatched to draw it. The same process is repeated for a blue square.

Here's the code to create a VAO for both the red triangle and blue square:

Create VAOs (sample - setup code)
// Red triangle
const redTriangleVao = gl.createVertexArray();
gl.bindVertexArray(redTriangleVao);
gl.enableVertexAttribArray(positionAttribLocation);
gl.enableVertexAttribArray(colorAttribLocation);
gl.bindBuffer(gl.ARRAY_BUFFER, trianglePositions);
gl.vertexAttribPointer(
  positionAttribLocation, 2, gl.FLOAT, false, 0, 0);
gl.bindBuffer(gl.ARRAY_BUFFER, redTriangleColors);
gl.vertexAttribPointer(
  colorAttribLocation, 3, gl.FLOAT, true, 0, 0);
gl.bindBuffer(gl.ARRAY_BUFFER, null);
gl.bindVertexArray(null);

// Blue square
const blueSquareVao = gl.createVertexArray();
gl.bindVertexArray(blueSquareVao);
gl.enableVertexAttribArray(positionAttribLocation);
gl.enableVertexAttribArray(colorAttribLocation);
gl.bindBuffer(gl.ARRAY_BUFFER, squarePositions);
gl.vertexAttribPointer(
  positionAttribLocation, 2, gl.FLOAT, false, 0, 0);
gl.bindBuffer(gl.ARRAY_BUFFER, blueSquareColors);
gl.vertexAttribPointer(
  colorAttribLocation, 3, gl.FLOAT, true, 0, 0);
gl.bindBuffer(gl.ARRAY_BUFFER, null);
gl.bindVertexArray(null);

The same bindBuffer and vertexAttribPointer calls are made here, but during setup code instead of rendering code. A couple other things I want you to notice:

First - as soon as I'm done with a VAO, I immediately un-bind it by calling gl.bindVertexArray(null). This is technically unnecessary, since I'm immediately binding and using another VAO, but it's a good habit. The active VAO captures and will replay any and all input assembler state that is set while it is active! Forgetting to un-bind a VAO is a great way to accidentally introduce extremely hard-to-debug problems in a WebGL app.

Second, I'm calling enableVertexAttribArray and vertexAttribPointer for both VAOs. VAOs capture both the set of enabled attributes and the vertex attribute setup.

VAO setup code is pretty nasty, but it's all part of setup - the performance-critical rendering code gets significantly simpler:

VAO rendering (sample - rendering code)
gl.useProgram(colorShapeProgram);
gl.bindVertexArray(redTriangleVao);
gl.drawArrays(gl.TRIANGLES, 0, 3);

gl.bindVertexArray(blueSquareVao);
gl.drawArrays(gl.TRIANGLES, 0, 6);

gl.bindVertexArray(null);

While I was writing this tutorial, I had two bugs around leaking VAO bindings that took me an entire day to debug. It happens to everyone! WebGL has a lot of global state, and it's extremely easy to mess it up and introduce really difficult-to-debug bugs.

In my case, the render loop for the demo at the top of the page was leaking a VAO binding, and the "uniform" demo later on the page then overwrote the position attribute, turning every shape type into a triangle (all demos on this page share a WebGL context).

The moral of the story: WebGL state is just like any other global state and very error-prone. Do what you can as an application developer to limit the scope of global mutations wherever possible, by un-binding buffers, VAOs, programs, etc. that you're no longer using!

The Render Loop

In the last tutorial, we generated a single image of a triangle and finished the program.

From here on out, we'll be creating animations that generate new images ("frames") as fast as possible to show to the user to create the illusion of motion.

All JavaScript graphics applications will have a structure more or less like this:

Basic App
function runApp() {
  loadStuff();

  function renderFrame() {
    handleUserInput();
    updateApp();
    renderThings();

    requestAnimationFrame(renderFrame);
  }
  requestAnimationFrame(renderFrame);
}

The special function requestAnimationFrame is a browser built-in function that calls the given function as soon as the user's device is ready to draw another image - this usually happens 30 or 60 times per second, but gaming-focused monitors may go much higher.

Expensive one-time setup code happens first - this includes setting up GPU buffers, VAOs, compiling WebGL programs, getting references to attribute locations and uniform locations, etc.

In this tutorial, we won't do anything for handleUserInput, since the demo is more or less a Windows 95 screensaver.

Application logic is updated next - in our case, we'll keep a list of shapes that need to be drawn, and update their positions at this point.

Once our shapes all have the correct calculated positions, we'll render each individual shape - most stages will remain the same frame-to-frame, so the majority of work here will be setting the correct uniform values and VAOs for each shape.

Once the app is updated and rendered, requestAnimationFrame is invoked again to ask the browser to repeat the same process as soon as the user's monitor is ready to display another frame, on and on... forever (or at least until the user navigates away).

TypeScript

I'll be using TypeScript for the rest of this tutorial (and future WebGL tutorials). You can follow along in JavaScript with pretty minor changes, or install TypeScript by following the instructions at typescriptlang.org.

The Code!

For this tutorial, use the same index.html file from the last tutorial, with the script name changed to movement-and-color.js. If you don't have it, here it is again (click to expand the code block):

index.html
<!doctype html>

<html lang="en">
  <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <title>WebGL 2023 - 02 Motion and Color</title>

    <style>
      html, head, body {
        margin: 0;
        padding: 25px;
        background-color: #2a2a2a;
      }

      #demo-canvas {
        width: 800px;
        height: 800px;

        /*
         * WebGL trick: pick an offensive canvas background color to test
         *  opacity/clearing bugs
         */
        background-color: #da6052;

        /* Help to illustrate the rasterizer's job: when artificially lowering resolution, show pixelated image */
        image-rendering: crisp-edges;
      }

      /** CSS to make a nice little error box output for debugging (nice if you can't use devtools, e.g. mobile)  */
      #error-box {
        color: #fd8080;
        font-weight: 500;
        font-size: 18pt;
        border: 1px solid white;
        padding: 25px;
        margin-top: 20px;
      }

      .error-box-title {
        color: #eee;
        border-bottom: 1px solid gray;
      }
    </style>
  </head>
  <body>
    <canvas id="demo-canvas" width="800px" height="800px">
      <!-- This message shows up only if the Canvas element isn't supported -->
      HTML5 canvas not supported in your browser! These demos will not work.
    </canvas>

    <!--
      Doing a little error output in the HTML is nice when testing on mobile,
       where you don't have access to console.log.
    -->
    <div id="error-box">
      <span class="error-box-title">Error messages, if any, go in here.</span>
    </div>
    <script src="movement-and-color.js"></script>
  </body>
</html>

To use TypeScript, you also need a tsconfig.json file. I'm using a very basic one, since I'm really not using a lot of fancy TypeScript configuration for these tutorials:

tsconfig.json
{
  "compilerOptions": {
    "target": "es2016",
    "module": "CommonJS",
    "strict": true,
    "outDir": "."
  },
  "files": [
    "movement-and-color.ts",
  ],
}

And here's the NPM package.json file I'm using. You don't need this if you're using a global TypeScript installation, but I'm including it anyway:

package.json
{
  "name": "02-motion-and-color",
  "version": "1.0.0",
  "license": "MIT",
  "devDependencies": {
    "typescript": "^5.1.6"
  },
  "scripts": {
    "build": "tsc --watch"
  }
}

I use Yarn, and run the command yarn build while I'm writing code to automatically re-compile it whenever I save my TypeScript file. The same command with NPM is npm run build.

WebGL Helper Functions

There are a few WebGL tasks that we'll be doing several times in this app - copying and pasting code once or twice to avoid spaghetti is all well and good, but constantly copy/pasting huge chunks gets messy fast. Here are some helper functions we can use - define these at the top of your TypeScript file. I'll be re-using these functions in later tutorials, so hold onto them!

getContext (movement-and-color.ts)
function getContext(canvas: HTMLCanvasElement) {
  const gl = canvas.getContext('webgl2');
  if (!gl) {
    const isWebGl1Supported = !!(document.createElement('canvas')).getContext('webgl');
    if (isWebGl1Supported) {
      showError('WebGL 1 is supported, but not v2 - try using a different device or browser');
    } else {
      showError('WebGL is not supported on this device');
    }
    return null;
  }

  return gl;
}

Basically every WebGL app you'll write in this tutorial series will use that same code, so you might as well pull it out to keep your actual app code a touch shorter.

createStaticVertexBuffer (movement-and-color.ts)
function createStaticVertexBuffer(
    gl: WebGL2RenderingContext, data: ArrayBuffer) {
  const buffer = gl.createBuffer();
  if (!buffer) {
    showError('Failed to create buffer');
    return null;
  }
  gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
  gl.bufferData(gl.ARRAY_BUFFER, data, gl.STATIC_DRAW);
  gl.bindBuffer(gl.ARRAY_BUFFER, null);

  return buffer;
}

Most of the time that we want to create a vertex buffer, we have a chunk of data in an ArrayBuffer somewhere and can repeat this same copy-paste boilerplate. Might as well pull it into a helper function too.

FYI - there are optimizations that involve multiple pieces of geometry sharing a vertex buffer, or multiple attributes sharing a vertex buffer. Those get somewhat hard to follow pretty quickly, so I will not be covering those techniques.

createProgram (movement-and-color.ts)
function createProgram(
    gl: WebGL2RenderingContext,
    vertexShaderSource: string,
    fragmentShaderSource: string) {
  const vertexShader = gl.createShader(gl.VERTEX_SHADER);
  if (!vertexShader) {
    showError('Failed to create vertex shader');
    return null;
  }
  gl.shaderSource(vertexShader, vertexShaderSource);
  gl.compileShader(vertexShader);
  if (!gl.getShaderParameter(vertexShader, gl.COMPILE_STATUS)) {
    showError(`Failed to compile vertex shader - ${gl.getShaderInfoLog(vertexShader)}`);
    return null;
  }

  const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
  if (!fragmentShader) {
    showError('Failed to create fragment shader');
    return null;
  }
  gl.shaderSource(fragmentShader, fragmentShaderSource);
  gl.compileShader(fragmentShader);
  if (!gl.getShaderParameter(fragmentShader, gl.COMPILE_STATUS)) {
    showError(`Failed to compile fragment shader - ${gl.getShaderInfoLog(fragmentShader)}`);
    return null;
  }

  const program = gl.createProgram();
  if (!program) {
    showError('Failed to create program');
    return null;
  }
  gl.attachShader(program, vertexShader);
  gl.attachShader(program, fragmentShader);
  gl.linkProgram(program);
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    showError(`Failed to link program - ${gl.getProgramInfoLog(program)}`);
    return null;
  }

  return program;
}

Creating a WebGL program uses a LOT of boilerplate. There's something to be said for designing an API differently so that programs can share vertex shaders (which tend to not change a lot), but not for this tutorial.

Random Value Helpers

These are used to generate a random value in a range from a given minimum to a given maximum, which is helpful when generating a new shape to show:

getRandomInRange (movement-and-color.ts)
function getRandomInRange(
    min: number, max: number) {
  return Math.random() * (max - min) + min;
}

function getNewSpawnerLocation(
    canvasWidth: number, canvasHeight: number) {
  return [
    getRandomInRange(150, canvasWidth - 150),
    getRandomInRange(150, canvasHeight - 150)
  ];
}

Demo Constants

This demo will create a bunch of shapes of random type and size, all originating from a point on the screen that changes every few seconds, and all flying out in random directions. The shapes will also accelerate in random directions, to give some curved paths and a bit more visual appeal.

Putting configuration values like minimum/maximum shape size, speed, and force into named constants is a good idea. Here are all the constants I use in this demo:

Demo Constants (movement-and-color.ts)
/** Demo constants */
const SPAWNER_CHANGE_TIME = 5;
const CIRCLE_SEGMENT_COUNT = 12;
const SPAWN_RATE = 0.08;
const MIN_SHAPE_TIME = 0.25;
const MAX_SHAPE_TIME = 6;
const MIN_SHAPE_SPEED = 125;
const MAX_SHAPE_SPEED = 350;
const MIN_SHAPE_FORCE = 150;
const MAX_SHAPE_FORCE = 750;
const MIN_SHAPE_SIZE = 2;
const MAX_SHAPE_SIZE = 50;
const MAX_SHAPE_COUNT = 250;

Times are in seconds, sizes are in pixels, speeds are in pixels/second, and "forces" (really accelerations) are in pixels/(second^2).

Notice I've also included an extra MAX_SHAPE_COUNT - this is nice to have to prevent accidentally creating a demo that spawns waaaaaaay too many shapes and slows down a user's computer.

Generating Circle Geometry

One of the shapes you might have seen is a blue-white circle, and you may have noticed the CIRCLE_SEGMENT_COUNT constant above. Circles are relatively easy to generate programmatically by creating little triangle slices, sorta like a pizza sliced CIRCLE_SEGMENT_COUNT ways.

A couple math things:

  1. Each "slice" of a circle will have an angle of the full circle (2 * PI) divided by the number of slices.
  2. The X coordinate of any spot on a circle with radius 1 is the cosine of the angle around the circle at that spot.
  3. The Y coordinate of any spot on a circle with radius 1 is the sine of the angle around the circle at that spot.

For this shape, it'll be easy to put positions and colors right next to each other in memory, so each vertex will be structured like this: [X, Y, R, G, B].

Each triangle will have a vertex in the center of the circle, and the other two vertices will be the two "edges" along the circle at the start and end angle of that slice.

It looks and sounds like a lot of math, and it is. But the implementation isn't too bad once you understand the math behind it:

buildCircleVertexBufferData (movement-and-color.ts)
function buildCircleVertexBufferData() {
  const vertexData = [];

  // Append the vertices for each of the N triangle segments
  for (let i = 0; i < CIRCLE_SEGMENT_COUNT; i++) {
    const vertex1Angle = i * Math.PI * 2 / CIRCLE_SEGMENT_COUNT;
    const vertex2Angle = (i + 1) * Math.PI * 2 / CIRCLE_SEGMENT_COUNT;
    const x1 = Math.cos(vertex1Angle);
    const y1 = Math.sin(vertex1Angle);
    const x2 = Math.cos(vertex2Angle);
    const y2 = Math.sin(vertex2Angle);

    // Center vertex is a light blue color and in the middle of the shape
    vertexData.push(
      // Position (x, y)
      0, 0,
      // Color (r, g, b)
      0.678, 0.851, 0.957
    );
    // The other two vertices are along the edges of the circle, and a darker blue color
    vertexData.push(
      x1, y1,
      0.251, 0.353, 0.856
    );
    vertexData.push(
      x2, y2,
      0.251, 0.353, 0.856
    );
  }

  return new Float32Array(vertexData);
}

For fun, try playing with the number of segments in your own code.

Notice that the edges get hard to see after about 40 or so slices - human eyes are pretty bad at detecting very gradual edges, especially in the presence of color information that looks smooth. Brightness and color information is much louder than actual shapes to our brains. Remember that - it'll come up later, in a tutorial about 3D lighting models!

Triangle and Square Vertices

I've just hard-coded vertex data for triangles and squares, but I am going to use 8-bit unsigned integers (0-255) for color data instead of floats. It's a bit smaller, it shows a way to store vertex data in separate buffers, and it showcases that tricky normalized parameter in vertexAttribPointer calls. Remember - when an integer attribute is normalized, the inputs are converted into percentages of the maximum integer value (for 8-bit integers, 255).

There is geometry data for a triangle (3 vertices) and a square (2 triangles, 6 vertices), along with colors for an RGB triangle, a brightly colored "firey" triangle, a square with an indigo color gradient, and a solid gray square.

These are constants that can go anywhere before our demo function.

Feel free to examine each vertex and get a feel for how the data is structured, and of course feel free to mess around with these colors in your own code!

Triangle and Square Vertices (movement-and-color.ts)
const trianglePositions = new Float32Array([ 0, 1, -1, -1, 1, -1 ]);
const squarePositions = new Float32Array([ -1, 1, -1, -1, 1, -1, -1, 1, 1, -1, 1, 1 ]);
const rgbTriangleColors = new Uint8Array([
  255, 0, 0,
  0, 255, 0,
  0, 0, 255
]);
const fireyTriangleColors = new Uint8Array([
  // Chili red - E52F0F
  229, 47, 15,
  // Jonquil - F6CE1D
  246, 206, 29,
  // Gamboge - E99A1A
  233, 154, 26
]);
const indigoGradientSquareColors = new Uint8Array([
  // Top: "Tropical Indigo" - A799FF
  167, 153, 255,
  // Bottom: "Eminence" - 583E7A
  88, 62, 122,
  88, 62, 122,
  167, 153, 255,
  88, 62, 122,
  167, 153, 255
]);
const graySquareColors = new Uint8Array([
  45, 45, 45,
  45, 45, 45,
  45, 45, 45,
  45, 45, 45,
  45, 45, 45,
  45, 45, 45
]);

I've included the hex colors in comments if you're curious. There are hexadecimal to decimal converters online if you want to swap in your own colors.

MovingShape class (app logic)

I've separated out the logic to keep track of a single shape into a MovingShape class. Nothing fancy - every frame, the velocity of a shape is updated according to the random force it was given, and the position of the shape is updated according to the speed it is moving on that frame.

I'm not going to spend a lot of time covering this because it's not really graphics related, other than holding a reference to the VAO of the shape to be drawn and the number of vertices it contains.

MovingShape (movement-and-color.ts)
class MovingShape {
  constructor(
    public position: [number, number],
    public velocity: [number, number],
    public size: number,
    public forceDirection: [number, number],
    public timeRemaining: number,
    public vao: WebGLVertexArrayObject,
    public numVertices: number) {}

  isAlive() {
    return this.timeRemaining > 0;
  }

  update(dt: number) {
    this.velocity[0] += this.forceDirection[0] * dt;
    this.velocity[1] += this.forceDirection[1] * dt;

    this.position[0] += this.velocity[0] * dt;
    this.position[1] += this.velocity[1] * dt;

    this.timeRemaining -= dt;
  }
}

Notice that "update" takes in a dt parameter - this is the amount of time that has passed since the last frame, in seconds. Multiplying velocity (in pixels/second) by this term gives the number of pixels that a given shape should move. It's a good idea to do all updates based on the amount of time that has passed instead of moving by some constant amount, because otherwise people with 30 FPS monitors and people with 60 FPS monitors would have very different experiences!

I've also included an isAlive helper - a shape will disappear after some amount of time (given in the constructor as timeRemaining), and this isAlive method helps with that. If I weren't lazy, I might have also had this function return false once the shape leaves the canvas bounds with no way to come back, but since the paths are curved and a shape could possibly disappear and re-appear... that would be too much work!

Building VAOs

Once again, there's some repeated logic here, so I've added in two more helper functions that are specific to this demo:

buildCircleVao (movement-and-color.ts)
function buildCircleVao(
    gl: WebGL2RenderingContext, buffer: WebGLBuffer,
    posAttrib: number, colorAttrib: number) {
  const vao = gl.createVertexArray();
  if (!vao) {
    showError('Failed to create interleaved VertexArrayObject');
    return null;
  }
  gl.bindVertexArray(vao);
  gl.enableVertexAttribArray(posAttrib);
  gl.enableVertexAttribArray(colorAttrib);
  gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
  gl.vertexAttribPointer(
    posAttrib,
    2, gl.FLOAT, false,
    5 * Float32Array.BYTES_PER_ELEMENT, 0);
  gl.vertexAttribPointer(
    colorAttrib,
    3, gl.FLOAT, false,
    5 * Float32Array.BYTES_PER_ELEMENT,
    2 * Float32Array.BYTES_PER_ELEMENT);
  gl.bindBuffer(gl.ARRAY_BUFFER, null);
  gl.bindVertexArray(null);

  return vao;
}

This looks like the VAO sample code above - create a VAO, enable the attributes used in it, bind the buffer containing the data, and attach vertex attributes to that buffer.

I want you to pay special attention to this call, especially the last two parameters (stride and offset, respectively):

gl.vertexAttribPointer(
  colorAttrib,
  3, gl.FLOAT, false,
  5 * Float32Array.BYTES_PER_ELEMENT,
  2 * Float32Array.BYTES_PER_ELEMENT);

Interleaving vertex attributes works great, and is the reason for the parameters "stride" and "offset" in a vertexAttribPointer call.

Stride describes the total size of one vertex in this buffer, in bytes - in this case, 5 floats (2 for position and 3 for color), or 20 bytes.

Offset describes at which byte within a vertex this attribute begins. In this case, "position" takes up 2 floats before color, so skip 2 floats (8 bytes).
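To make the byte math concrete, here's the same layout written out as named constants (these names are only illustrative - they aren't used anywhere else in the demo):

const FLOAT_BYTES = Float32Array.BYTES_PER_ELEMENT; // 4 bytes
// One circle vertex is laid out as [ X, Y, R, G, B ]
const STRIDE = 5 * FLOAT_BYTES;          // 20 bytes from the start of one vertex to the next
const POSITION_OFFSET = 0 * FLOAT_BYTES; // X begins at byte 0
const COLOR_OFFSET = 2 * FLOAT_BYTES;    // R begins at byte 8, after X and Y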

The other VAOs will include two separate buffers, one for position and one for color, so they'll be built a bit differently:

buildVaoFromTwoBuffers (movement-and-color.ts)
function buildVaoFromTwoBuffers(
    gl: WebGL2RenderingContext,
    positionBuffer: WebGLBuffer, colorBuffer: WebGLBuffer,
    posAttrib: number, colorAttrib: number) {
  const vao = gl.createVertexArray();
  if (!vao) {
    showError('Failed to create parallel VertexArrayObject');
    return null;
  }

  gl.bindVertexArray(vao);
  gl.enableVertexAttribArray(posAttrib);
  gl.enableVertexAttribArray(colorAttrib);
  gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
  gl.vertexAttribPointer(
    posAttrib, 2, gl.FLOAT, false, 0, 0);
  gl.bindBuffer(gl.ARRAY_BUFFER, colorBuffer);
  gl.vertexAttribPointer(
    colorAttrib, 3, gl.UNSIGNED_BYTE, true, 0, 0);

  gl.bindBuffer(gl.ARRAY_BUFFER, null);
  gl.bindVertexArray(null);

  return vao;
}

Similar sort of thing - except this time each attribute pulls data from its own buffer.

I want to call attention to the color attribute - remember how it defines data with unsigned 8-bit integers (0-255) but still reads into a regular float vec3 color attribute? Let's look at the vertexAttribPointer call for that in more detail:

gl.vertexAttribPointer(
  colorAttrib, 3, gl.UNSIGNED_BYTE, true, 0, 0);

Notice the type parameter is now gl.UNSIGNED_BYTE. This specifies that the data in the input buffer is an unsigned byte type - NOT that the attribute itself is that type! The attribute itself is still a float vec3 in the vertex shader.

The normalized parameter is now true, and this is where the magic happens - the bytes are interpreted as percentages across the range of possible byte values. The color bytes [255, 255, 255] would be interpreted in the shader as [1.0, 1.0, 1.0], full values across all three channels. The color bytes [0, 127, 255] would be interpreted as approximately [0.0, 0.5, 1.0], representing a color of 0%, 50%, and 100% across the red, green, and blue channels, respectively.

Graphics programming uses quite a few values that make the most sense expressed as numbers between 0 and 1, so storing those values in an 8-bit or 16-bit integer instead of a full 32-bit float can save memory where you don't need the full precision of a float.
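For example, if you had 0-to-1 float colors lying around, a tiny hypothetical helper like this would pack them into normalized bytes (the demo below just writes the byte values by hand):

function packColors(floatColors: number[]): Uint8Array {
  // Math.round(1.0 * 255) === 255, which reads back in the shader as exactly 1.0
  return new Uint8Array(floatColors.map((c) => Math.round(c * 255)));
}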

Shaders

One more thing before we get into the demo - I'll put the shader code in this huge messy top section too, to try to keep the actual demo function as short as possible.

I've already talked about everything in these shaders above - putting it all together, it looks like this:

Shader Source (movement-and-color.ts)
const vertexShaderSource = `#version 300 es
precision mediump float;

in vec2 vertexPosition;
in vec3 vertexColor;

out vec3 fragmentColor;

uniform vec2 canvasSize;
uniform vec2 shapeLocation;
uniform float shapeSize;

void main() {
  fragmentColor = vertexColor;

  vec2 worldPosition = vertexPosition * shapeSize + shapeLocation;
  vec2 clipPosition = (worldPosition / canvasSize) * 2.0 - 1.0;

  gl_Position = vec4(clipPosition, 0.0, 1.0);
}`;

const fragmentShaderSource = `#version 300 es
precision mediump float;

in vec3 fragmentColor;

out vec4 outputColor;

void main() {
  outputColor = vec4(fragmentColor, 1.0);
}`;

Demo Code

Okay! Now that we have a bunch of helpers, a class to help move around an individual shape, definitions for all of our vertex data, and shader code prepared, we can write the demo function that puts everything together!

Let's get down the basic boilerplate, as mentioned above, but with a bit of extra logic to help with tracking time between frames:

movementAndColor (movement-and-color.ts)
interface Geometry {
  vao: WebGLVertexArrayObject;
  numVertices: number;
}

function movementAndColor() {
  // SETUP HERE

  let lastFrameTime = performance.now();
  function frame() {
    const thisFrameTime = performance.now();
    const timeElapsed = (thisFrameTime - lastFrameTime) / 1000;
    lastFrameTime = thisFrameTime;

    // UPDATE HERE

    // RENDER HERE

    requestAnimationFrame(frame);
  }
  requestAnimationFrame(frame);
}

try {
  movementAndColor();
} catch (e) {
  showError(`Uncaught JavaScript exception: ${e}`);
}

The function performance.now() in JavaScript returns the number of milliseconds that have passed since... some point in time, depending on where you're running your JavaScript code. Thankfully we don't care when the timer begins, we only care how much time passes between calling it the last frame and calling it this frame!

SETUP section

With all the helper functions above, the setup is still fairly long but not awful.

First, get a WebGL context...

const canvas = document.getElementById('demo-canvas');
if (!canvas || !(canvas instanceof HTMLCanvasElement)) {
  showError('Could not find HTML canvas element - check for typos, or loading JavaScript file too early');
  return;
}
const gl = getContext(canvas);
if (!gl) {
  return;
}

... and use that WebGL context to create WebGL buffers for all the vertex data we defined above:

SETUP geometry buffers
// Create geometry buffers
const circleInterleavedBuffer =
    createStaticVertexBuffer(gl, buildCircleVertexBufferData());
const trianglePositionsBuffer =
    createStaticVertexBuffer(gl, trianglePositions);
const squarePositionsBuffer =
    createStaticVertexBuffer(gl, squarePositions);
const rgbTriangleColorsBuffer =
    createStaticVertexBuffer(gl, rgbTriangleColors);
const fireyTriangleColorsBuffer =
    createStaticVertexBuffer(gl, fireyTriangleColors);
const indigoGradientSquareColorsBuffer =
    createStaticVertexBuffer(gl, indigoGradientSquareColors);
const graySquareColorsBuffer =
    createStaticVertexBuffer(gl, graySquareColors);

if (!circleInterleavedBuffer || !trianglePositionsBuffer || !squarePositionsBuffer
    || !rgbTriangleColorsBuffer || !fireyTriangleColorsBuffer
    || !indigoGradientSquareColorsBuffer || !graySquareColorsBuffer) {
  showError('Failed to build vertex buffers!');
  return;
}

Once that's done, create the motion and color demo WebGLProgram, and get the attribute and uniform locations.

SETUP program (movement-and-color.ts)
Expand
// Create effect and get attribute+uniform handles
const motionAndColorProgram =
    createProgram(gl, vertexShaderSource, fragmentShaderSource);
if (!motionAndColorProgram) return;

const positionAttribLocation =
    gl.getAttribLocation(motionAndColorProgram, 'vertexPosition');
const colorAttribLocation =
    gl.getAttribLocation(motionAndColorProgram, 'vertexColor');
const canvasSizeUniformLocation =
    gl.getUniformLocation(motionAndColorProgram, 'canvasSize');
const shapeLocationUniformLocation =
    gl.getUniformLocation(motionAndColorProgram, 'shapeLocation');
const shapeSizeUniformLocation =
    gl.getUniformLocation(motionAndColorProgram, 'shapeSize');

if (positionAttribLocation < 0 || colorAttribLocation < 0) {
  showError(`Failed to get attributes - position=${positionAttribLocation}, color=${colorAttribLocation}`);
  return;
}
if (!canvasSizeUniformLocation || !shapeLocationUniformLocation || !shapeSizeUniformLocation) {
  showError(`Failed to get uniform locations - canvasSize=${!!canvasSizeUniformLocation} shapeLocation=${!!shapeLocationUniformLocation} shapeSize=${!!shapeSizeUniformLocation}`);
  return;
}

Notice that the API for getting uniform locations is very similar to getting attribute locations. But unlike attribute locations, uniform locations are an opaque handle (like WebGLProgram) and not just a number!

If a uniform location can't be found, gl.getUniformLocation will return null.

Now, with the WebGL program created and vertex buffers ready, create a VAO for each type of shape.

SETUP VAOs (movement-and-color.ts)
Expand
// Create Vertex Array Objects (VAOs) - input assembler states for each piece of geometry
const circleVao = buildCircleVao(
  gl, circleInterleavedBuffer, positionAttribLocation, colorAttribLocation);
const rgbTriangleVao = buildVaoFromTwoBuffers(
  gl, trianglePositionsBuffer, rgbTriangleColorsBuffer,
  positionAttribLocation, colorAttribLocation);
const fireyTriangleVao = buildVaoFromTwoBuffers(
  gl, trianglePositionsBuffer, fireyTriangleColorsBuffer,
  positionAttribLocation, colorAttribLocation);
const indigoGradientSquareVao = buildVaoFromTwoBuffers(
  gl, squarePositionsBuffer, indigoGradientSquareColorsBuffer,
  positionAttribLocation, colorAttribLocation);
const graySquareVao = buildVaoFromTwoBuffers(
  gl, squarePositionsBuffer, graySquareColorsBuffer,
  positionAttribLocation, colorAttribLocation);
if (!circleVao || !rgbTriangleVao || !fireyTriangleVao
    || !indigoGradientSquareVao || !graySquareVao) {
  showError(`Failed to build VAOs: circle=${!!circleVao} rgbTri=${!!rgbTriangleVao} fireyTri=${!!fireyTriangleVao} indigoSq=${!!indigoGradientSquareVao} graySq=${!!graySquareVao}`);
  return;
}

const geometryList: Geometry[] = [
  { vao: circleVao, numVertices: CIRCLE_SEGMENT_COUNT * 3 },
  { vao: rgbTriangleVao, numVertices: 3 },
  { vao: fireyTriangleVao, numVertices: 3 },
  { vao: indigoGradientSquareVao, numVertices: 6 },
  { vao: graySquareVao, numVertices: 6 }
];

To help with randomly picking a shape later, I've put all geometry into a geometryList variable as well - picking a random shape will be picking a random element from that array and using the VAO and associated number of vertices found in it.

Finally, set up the simulation state - the location of the spawner, the time until the next time a shape appears, the time until the spawner moves again, and an (empty) list of shapes:

SETUP app logic (movement-and-color.ts)
// Simulation logic data
let timeToNextSpawn = SPAWN_RATE;
let timeToNextSpawnerLocationChange = SPAWNER_CHANGE_TIME;
let spawnPosition = getNewSpawnerLocation(canvas.width, canvas.height);
let shapes: MovingShape[] = [];

Fantastic! Still here? Wow! Let's talk about what goes in the // UPDATE section next.

UPDATE section

Each frame, first decide if the shape spawner needs to be moved:

UPDATE spawner location (movement-and-color.ts)
// Update
timeToNextSpawnerLocationChange -= timeElapsed;
if (timeToNextSpawnerLocationChange < 0) {
  timeToNextSpawnerLocationChange = SPAWNER_CHANGE_TIME;
  spawnPosition = getNewSpawnerLocation(canvas.width, canvas.height);
}

The magic here is in timeToNextSpawnerLocationChange - that value is decreased by the amount of time elapsed each frame, and when it drops below 0 the spawner is moved and the timer is reset.

That same trick is used to decide when to spawn more shapes, but this time using a while loop instead of an if block in case multiple spawns should have happened in the last frame:

UPDATE spawn new shapes (movement-and-color.ts)
Expand
timeToNextSpawn -= timeElapsed;
while (timeToNextSpawn < 0) {
  timeToNextSpawn += SPAWN_RATE;

  const movementAngle = getRandomInRange(0, Math.PI * 2);
  const movementSpeed = getRandomInRange(MIN_SHAPE_SPEED, MAX_SHAPE_SPEED);
  const forceAngle = getRandomInRange(0, Math.PI * 2);
  const forceMagnitude = getRandomInRange(MIN_SHAPE_FORCE, MAX_SHAPE_FORCE);

  const position: [number, number] = [ spawnPosition[0], spawnPosition[1] ];
  const velocity: [number, number] = [
    Math.sin(movementAngle) * movementSpeed,
    Math.cos(movementAngle) * movementSpeed
  ];
  const force: [number, number] = [
    Math.sin(forceAngle) * forceMagnitude,
    Math.cos(forceAngle) * forceMagnitude
  ];
  const size = getRandomInRange(MIN_SHAPE_SIZE, MAX_SHAPE_SIZE);
  const timeToLive = getRandomInRange(MIN_SHAPE_TIME, MAX_SHAPE_TIME);

  const geometry = geometryList[Math.floor(Math.random() * geometryList.length)];

  const shape = new MovingShape(
    position, velocity, size, force,
    timeToLive, geometry.vao, geometry.numVertices);
  shapes.push(shape);
}

Every field for the shapes is randomly generated except for the initial position, which is the position of the shape spawner.

Geometry is also picked randomly from the list of geometry defined above.

Once a new MovingShape instance is made, it's added to the list of shapes.

The last step in the update section is to update all the shapes in the list, and remove the ones that are no longer in use.

UPDATE shapes (movement-and-color.ts)
for (let i = 0; i < shapes.length; i++) {
  shapes[i].update(timeElapsed);
}
shapes = shapes
    .filter((shape) => shape.isAlive())
    .slice(0, MAX_SHAPE_COUNT);

I've also included a .slice call to limit the length of this array to MAX_SHAPE_COUNT, so that I can play around with the other parameters as much as I want without having to worry about accidentally spawning a million shapes.

And that's it for the update section! On to the render section...

RENDER section

The render section of this demo is the shortest of the three, and that's by design! GPU APIs are designed to put as much work as possible in the loading step ("SETUP"), in order to keep the performance-critical render loop as fast as possible.

Most of this code should be familiar, though there's one new API call (gl.uniformNT).

Render Logic (movement-and-color.ts)
// Render
canvas.width = canvas.clientWidth;
canvas.height = canvas.clientHeight;
gl.clearColor(0.08, 0.08, 0.08, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

gl.viewport(0, 0, canvas.width, canvas.height);

gl.useProgram(motionAndColorProgram);
gl.uniform2f(canvasSizeUniformLocation, canvas.width, canvas.height);

for (let i = 0; i < shapes.length; i++) {
  gl.uniform2f(
    shapeLocationUniformLocation,
    shapes[i].position[0], shapes[i].position[1]);
  gl.uniform1f(shapeSizeUniformLocation, shapes[i].size);
  gl.bindVertexArray(shapes[i].vao);
  gl.drawArrays(gl.TRIANGLES, 0, shapes[i].numVertices);
  gl.bindVertexArray(null);
}

The first several lines, up through gl.useProgram(motionAndColorProgram) are all more or less what they were in the last tutorial, and will more or less be the same for the rest of this series.

To set a uniform value, a call to gl.uniformNT is made, replacing N with the number of elements in the uniform, and replacing T with the type of data to be set.

In the case of the canvasSize uniform, the uniform is a vec2, or vector of N=2 floats (f). A call to gl.uniform2f is made, passing (1) the location of the uniform to be set, and (2-3) the values of the two components that should be set in the uniform.
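For reference, a few other members of that family look like this (the last two locations are hypothetical - they're only here to show the naming pattern):

gl.uniform1f(shapeSizeUniformLocation, 25);        // float
gl.uniform2f(canvasSizeUniformLocation, 800, 800); // vec2
gl.uniform3f(someVec3Location, 1.0, 0.0, 0.0);     // vec3 (hypothetical location)
gl.uniform1i(someIntLocation, 0);                  // int (hypothetical location)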

Uniforms are an OpenGL (and therefore WebGL) concept that looks a bit outdated next to modern APIs like Vulkan, DirectX 12, or WebGPU. The modern approach involves uploading a uniform buffer and specifying how data in that buffer is attached to uniform values in a shader.

I'm not going to cover uniform buffers in this series, but it might be helpful for you to think of uniforms as part of a teeny-tiny buffer that the graphics driver manages for you on behalf of WebGL.

Once the per-frame canvasSize uniform is set, a for loop goes through each living shape in the shapes array, binds the uniforms for the shape's location and size, binds the VAO for that shape, issues a draw call, and then cleans up the VAO to avoid accidentally leaking VAO state.

You don't have to call bindVertexArray(null) every time in the for loop - you can un-bind it after all shapes have been drawn, or even just leave it out entirely for this demo.

I want you to build an intuition, though, that any time you see gl.bindVertexArray(something), you should know where a matching gl.bindVertexArray(null or something else) call is. Save yourself the pain of accidentally corrupting geometry in a weird code path.
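One way to make that pairing hard to forget is a tiny wrapper like this - it isn't used in this demo, it's just a sketch of the idea:

function withVao(
    gl: WebGL2RenderingContext,
    vao: WebGLVertexArrayObject,
    draw: () => void) {
  gl.bindVertexArray(vao);
  try {
    draw();
  } finally {
    // Always un-bind, even if the draw callback throws
    gl.bindVertexArray(null);
  }
}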

And... that's it! Run your app, and you should see a similar result to the one that was playing at the top of this tutorial.

Wrapping Up

That was a ton of code! If you want to see the entire source code, visit the GitHub link at the top of this page. I've also linked a live demo, and a YouTube video covering the same material as this tutorial.

If you're feeling like a bit of a challenge, try to change the code to do the following:

  1. Add more shapes with different colors to the geometryList
  2. Make shapes bounce off the walls of the demo
  3. Add another circle shape with a gradient that starts with your favorite color, and ends with the same background color as the canvas
  4. Add a uniform to the fragment shader for the time remaining in a shape, and blend the color with the canvas background color as the time remaining approaches 0.

In the next tutorial, we'll be making the jump into 3D graphics. There won't be any new WebGL API ideas, but there will be a lot more math. I hope you keep reading!