CS 424: Computer Graphics, Fall 2015
Lab 13: Life at 60fps

In this lab, you will implement John Conway's Game of Life in WebGL. This is an example of using the power of a GPU to do something that is more computational than graphical. You will use a texture to store data that is used and generated by the computation.

You will need a copy of the folder lab13-files from /classes/cs424. The starting point for the lab is the file lab13.html.

For this lab, you do not need to add any comments to your code. The lab is due in one week, on Thursday, December 9, but it will not be collected until Saturday morning.

This is our final lab. The lab period next week will be devoted to presentations of final projects.

About The Game of Life

The Game of Life (which is not really a game) is a cellular automaton. It is played on a rectangular game board that is divided into a grid of squares. Each square is called a cell. A cell can be in one of two states, "alive" or "dead". An initial configuration of living and dead cells is created on the board. After that, the game plays itself. At each time step, a new "generation" is computed, based on the current configuration of the board.

A cell has eight neighboring cells, which are next to the cell horizontally, vertically, and diagonally. (For cells along the edges of the board, we can think of the board as a torus, with the top edge connected to the bottom edge and the left edge connected to the right edge.) To compute the state of a cell in the next generation, you need to know its current state and the number of neighboring cells that are alive. The rule is as follows:

if the cell is currently dead, then {
    if the cell has exactly 3 living neighbors, then
        the cell becomes alive in the next generation
    otherwise
        the cell remains dead in the next generation
}
else {
    if the cell has exactly 2 or 3 living neighbors, then
        the cell remains alive in the next generation
    otherwise
        the cell becomes dead in the next generation
}

This rule is applied simultaneously to all of the cells on the board.
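In code, the rule for a single cell boils down to a small pure function. Here is a sketch in plain JavaScript (the function name nextState is just for illustration):

```javascript
// Returns the state of one cell in the next generation.
//   alive:         whether the cell is currently alive (boolean)
//   liveNeighbors: number of living neighbors, 0 through 8
function nextState(alive, liveNeighbors) {
    if (!alive) {
        return liveNeighbors === 3;                       // birth
    }
    return liveNeighbors === 2 || liveNeighbors === 3;    // survival
}
```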

About the WebGL Implementation

The idea for the lab is to represent the grid of cells for the Game of Life as a grid of pixels. Cells are colored white if they are alive and black if they are dead. The original version of lab13.html simply creates several configurations of black and white pixels in the color buffer (where WebGL does its drawing). There is a checkbox labeled "Run"; when that checkbox is checked, a function named doNextGeneration is called repeatedly. That function is supposed to compute the next generation in the Game of Life and display the result. Currently it does nothing. Your job is to implement it.

The point of the lab is to compute the next generation using the massive parallelism of the GPU. The computation will be triggered by drawing a square that fills the entire canvas. To draw the square, the GPU will call the fragment shader for each pixel. We just need the computation in the fragment shader to compute the state of that pixel in the next generation of the Game of Life.

To make that possible, the fragment shader needs access to the current Life board, so that it can check the current state of the pixel and its eight neighbors. The access can be provided by having a copy of the current board in a texture, so that the fragment shader can use texture sampling to read values from the current board.

The Texture

You will need a texture object where you can store a copy of the current board. The texture object should be created and configured in the initGL() function. To configure a texture object, you should first select a texture unit (using gl.activeTexture) and bind the texture (using gl.bindTexture). For the configuration, you should set the texture's minification and magnification filters to gl.NEAREST and you should set the S and T wrap modes to gl.REPEAT (using gl.texParameteri). The commands that you need were covered in class and are in the book.

You need gl.NEAREST for the filters in this application to avoid getting some sort of averaged value when taking samples from the texture. Using gl.REPEAT as the wrap mode will effectively connect the left edge of the texture to the right edge and the top edge to the bottom, which will give the correct values for neighboring cells for cells along the boundary of the Life board.
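Putting the configuration steps together, the texture setup in initGL() might look something like the following sketch. (The variable name lifeTexture is just for illustration; use whatever name you like.)

```javascript
// In initGL():  create and configure the texture that will hold
// a copy of the current Life board.  (lifeTexture is illustrative.)
lifeTexture = gl.createTexture();
gl.activeTexture(gl.TEXTURE0);                // select texture unit 0
gl.bindTexture(gl.TEXTURE_2D, lifeTexture);
// NEAREST filtering: we want exact cell values, not averages.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
// REPEAT wrapping: makes the board wrap around like a torus.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.REPEAT);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.REPEAT);
```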

The remaining problem is to get a copy of the current board into the texture. WebGL has a function gl.copyTexImage2D that copies an image from the color buffer into a texture. As usual, the texture object must be bound before the function is called. The syntax is:

gl.copyTexImage2D(gl.TEXTURE_2D, 0, gl.RGB, x, y, width, height, 0);

To copy the entire image, the value for x and for y should be zero, and the width and height should be the size of the canvas (1024, unless you change it—but if you do change it, remember that the size should be a power of two). The color format used here, gl.RGB, will work for this program; gl.RGBA won't work because the color buffer that I use in the program does not have an alpha component; gl.LUMINANCE might work and would use less memory for the texture, but I haven't tested it. (The second parameter in this command is the mipmap level, which should be zero; the last parameter is the width of a border for the texture, which must be zero in WebGL.)

The Program

There is already a shader program that is used for creating the initial configurations. However, you should create a new shader program to do the next-generation computation. The new program can use the same vertex shader as the existing program, but you will have to write the fragment shader and create the program.

This will be the first time that you have created a shader program. The program should be created in the initGL() function. You will need a global variable to represent the program, and you will need the locations of the attribute variable, a_coords, and of any uniform variables that you use in the fragment shader that you write. Note that for the existing shader, I packed all of this data into an object named createWorldProg. You can either do the same thing for your program, or you can use separate global variables.
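In raw WebGL, creating a program looks roughly like the following sketch. (lab13.html may already supply a helper function for compiling and linking; the names used here, such as nextGenProg and makeProgram, are illustrative, and error checking for shader compilation is omitted for brevity.)

```javascript
// Sketch: compile and link a shader program, then look up the
// locations that will be needed later.  Names are illustrative.
function makeProgram(gl, vertexSource, fragmentSource) {
    var vsh = gl.createShader(gl.VERTEX_SHADER);
    gl.shaderSource(vsh, vertexSource);
    gl.compileShader(vsh);
    var fsh = gl.createShader(gl.FRAGMENT_SHADER);
    gl.shaderSource(fsh, fragmentSource);
    gl.compileShader(fsh);
    var prog = gl.createProgram();
    gl.attachShader(prog, vsh);
    gl.attachShader(prog, fsh);
    gl.linkProgram(prog);
    if (!gl.getProgramParameter(prog, gl.LINK_STATUS)) {
        throw new Error(gl.getProgramInfoLog(prog));
    }
    return prog;
}

// In initGL(), following the pattern of createWorldProg:
nextGenProg = {};
nextGenProg.prog = makeProgram(gl, vertexShaderSource, fragmentShaderSource);
nextGenProg.a_coords_loc = gl.getAttribLocation(nextGenProg.prog, "a_coords");
nextGenProg.prevGen_loc = gl.getUniformLocation(nextGenProg.prog, "prevGen");
```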

Since this application uses two shader programs, you need to use the command gl.useProgram(shaderProg) to select the shader program that will be used. Note that I have already used the function at the beginning of createInitialConfiguration(). As noted above, the operation of computing and displaying the next generation in the game will be triggered by drawing a square that fills the entire canvas. You can draw the square by calling


where location is the location of the a_coords attribute variable in your shader program. This function takes care of setting up and enabling the vertex attribute pointer with the coordinates of the square.

To review, the doNextGeneration() function needs to: copy the previous generation into the texture; use your shader program; set up values for any uniform variables that you need; and render the square.
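Those steps can be sketched as follows. (All of the identifier names here, such as nextGenProg, lifeTexture, and the square-drawing helper, are assumptions for illustration; use the names from your own code and from lab13.html.)

```javascript
// Sketch of doNextGeneration().  Identifier names are illustrative.
function doNextGeneration() {
    // 1. Copy the current color buffer (the previous generation)
    //    into the texture.
    gl.bindTexture(gl.TEXTURE_2D, lifeTexture);
    gl.copyTexImage2D(gl.TEXTURE_2D, 0, gl.RGB, 0, 0, 1024, 1024, 0);
    // 2. Select the next-generation shader program.
    gl.useProgram(nextGenProg.prog);
    // 3. Set the values of any uniform variables.  Note that a sampler
    //    uniform is set to the texture unit number (0 here), not to
    //    the texture object itself.
    gl.uniform1i(nextGenProg.prevGen_loc, 0);
    // 4. Draw the square that fills the canvas, which runs the
    //    fragment shader once for each cell.  (drawSquare stands in
    //    for the square-drawing function provided in lab13.html.)
    drawSquare(nextGenProg.a_coords_loc);
}
```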

The only thing left to discuss is how to write the fragment shader program. The fragment shader will get the value of a vec2 varying variable named v_coords. This variable will hold the texture coordinates corresponding to the pixel that is being rendered. The job of the fragment shader is to decide whether that pixel is alive or dead in the next generation, and to set its color to white or black to represent its status. To do that, it needs to know the state of that pixel and its neighbors in the previous generation. The information that it needs is in the texture.

So, the fragment shader will need a sampler2D variable to access information in the texture. If that sampler variable is named prevGen, then the fragment shader can use the command

vec4 color = texture2D( prevGen, v_coords );

to access the color of the pixel from the previous generation. It also needs to know the colors of the eight neighbors, and for that, it needs to get texture coordinates for each of the neighbors. Given a canvas size of 1024, the difference in texture coordinate from one pixel to a neighboring pixel is 1.0/1024.0. So, for example, the texture coordinates for the neighbor below and to the right of the pixel would be

vec2( v_coords.s + 1.0/1024.0, v_coords.t - 1.0/1024.0 )

and similarly for the other seven neighbors.
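As a sketch, the complete fragment shader might look something like the following. (The uniform and varying names prevGen and v_coords match the discussion above; the loop is just one way to count the neighbors, and the 0.5 thresholds are a robust way to compare values that should be exactly 0.0 or 1.0.)

```glsl
precision mediump float;

uniform sampler2D prevGen;   // the previous generation of the board
varying vec2 v_coords;       // texture coordinates for this cell

const float cellSize = 1.0 / 1024.0;  // one pixel, in texture coords

void main() {
    // A cell is white (alive) or black (dead), so testing the red
    // component is enough.
    float alive = texture2D(prevGen, v_coords).r;
    // Sample all nine cells in the 3x3 block, then subtract the
    // center cell to get the neighbor count.
    float neighbors = 0.0;
    for (int i = -1; i <= 1; i++) {
        for (int j = -1; j <= 1; j++) {
            vec2 coords = v_coords + vec2(float(i), float(j)) * cellSize;
            neighbors += texture2D(prevGen, coords).r;
        }
    }
    neighbors -= alive;  // the loop counted the cell itself; remove it
    // Apply the Life rule; output white for alive, black for dead.
    float next;
    if (alive > 0.5) {
        next = (neighbors > 1.5 && neighbors < 3.5) ? 1.0 : 0.0;
    } else {
        next = (neighbors > 2.5 && neighbors < 3.5) ? 1.0 : 0.0;
    }
    gl_FragColor = vec4(next, next, next, 1.0);
}
```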