This section introduces aspects of the shader language, GLSL, that were not covered in previous sections and uses them to write some sample shaders. The section discusses several examples: one does basic lighting and material, while the others show how to use a cubemap texture.

Note that there are various versions of GLSL. When I say "GLSL" in this section, I mean the WebGL version. GLSL for WebGL is essentially the same as GLSL ES 1.0, which is used with OpenGL ES 2.0. This version has some significant differences from versions used with non-ES OpenGL.

For more information, you can consult the GLSL ES 1.0 spec, or see the WebGL reference card for a brief summary.

I have already noted that the basic data types in GLSL are
*bool*, *int*, and *float*, and that there are
vector types *vec2*, *vec3*, and *vec4* representing
vectors of floats. There are also vector types representing vectors
of bools and vectors of ints, but I have not found a use for them.
And there are matrix types *mat2*, *mat3*, and *mat4*
representing square matrices of floats (but no matrix types for
ints or bools). Finally, there are the types *sampler2D*
and *samplerCube*, which are used for working with textures.

It is worth emphasizing that GLSL does not have pointers or objects
and it does not use a *new* operator. For example, when you declare
a variable of type *vec4*, you get memory space for four floating
point numbers rather than a pointer, as you would in Java.

GLSL uses what it calls constructors to create
values. A constructor is basically a function whose name is the same as the name
of a data type such as *vec4* or *mat3*. As parameters, you have
to provide enough data to exactly fill the value that is being created,
but GLSL is flexible about how the data can be provided. For example,
you can construct a *vec4* from four *floats*, from a *vec3*
and one *float*, from two *vec2s*, and so on. And you can,
for example, construct a *mat4* from 16 *floats* or from
four *vec4s*.
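For example, the following constructor calls are all legal (the variable names here are my own, for illustration):

```glsl
vec4 a = vec4( 1.0, 2.0, 3.0, 4.0 );            // a vec4 from four floats
vec3 v = vec3( 5.0, 6.0, 7.0 );                 // a vec3 from three floats
vec4 b = vec4( v, 8.0 );                        // a vec4 from a vec3 and a float
vec4 c = vec4( vec2(1.0,2.0), vec2(3.0,4.0) );  // a vec4 from two vec2s
mat4 m = mat4( a, b, c, a );                    // a mat4 from four vec4s (the columns)
```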

The components of a vector can be referred to using array notation,
such as *vector*`[1]`, or using subscripts, such as *vector.y*
or *vector.g*. An unusual feature of GLSL is that you can use multiple
subscripts to extract several components of a vector. For example,
if *vector* is a *vec4*, then *vector.xyz* is a *vec3*
consisting of the first three components of *vector*. You can repeat
subscripts and put them in any order you want. So, *vector.xxx* is
a *vec3* consisting of three copies of *vector.x*, and *vector.wzyx*
is a *vec4* containing the components of *vector* in reverse order.
You can even use multiple subscripts on the left side of an assignment
statement:

vector.xy = vec2(3.14,2.72);

(Using multiple subscripts out of order is called swizzling,
and a notation such as *vector.yx* is called a swizzler.)

For control structures, GLSL has *if* statements and *for* loops, but no *while*
or *do..while* loops. Furthermore, only *for* loops of a limited kind are allowed.
Essentially, only counting loops are allowed, where the initializer initializes a variable
to a constant, the update adds a constant to or subtracts a constant from the variable,
and the continuation test compares the variable to a constant. *Continue* and *break*
statements can be used in *for* loops.

(GLSL ES 1.0, the version on which GLSL for WebGL is based, allows while loops and more general for loops as optional extensions, but the documentation for WebGL says that WebGL shaders cannot use any of the optional extensions.)
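A loop of the permitted kind might look like this sketch:

```glsl
float sum = 0.0;
for (int i = 0; i < 10; i++) {  // initializer, test, and update all involve constants
    if (i == 5)
        continue;       // continue (and break) can be used inside the loop
    sum += float(i);    // float(i) is a constructor that converts the int to a float
}
```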

Functions in GLSL look much like they do in C or in Java (without modifiers like
public or static). That is, a function definition consists of a return type, a
function name, a list of parameters, and the function body enclosed between { and }.
A function that does not return a value uses return type *void*.
A *return* statement can be used to end a function and return a value. In a *void*
function, a *return* with no value can be used to end the function early.
Recursion is not allowed. A function must be declared before it can be used. (You can
declare a function by giving its definition or by providing a prototype,
which looks like a function definition with the body replaced by a semicolon. If you provide
a prototype, you must still provide a definition later.)
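As a sketch, here is a hypothetical function that is declared with a prototype and defined later in the shader source (the function itself is my own illustration):

```glsl
float average( float a, float b );  // a prototype; the definition must be given later

float averageOfThree( float x, float y, float z ) {
    return ( 2.0*average(x,y) + z ) / 3.0;  // legal, since average() was declared above
}

float average( float a, float b ) {  // the definition promised by the prototype
    return ( a + b ) / 2.0;
}
```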

Parameters in a function definition can be modified by adding *in*, *out*,
or *inout*. Parameters marked *in* are used for input to the function,
and the value of the actual parameter is copied into the formal parameter when the function is
called; this is the default. Parameters marked *out* are used for output from the
function, and the value of the formal parameter is copied into the actual parameter when
the function returns. Parameters marked *inout* are used for both input and output,
and the value is copied both when the function is called and when it returns.
As a useless example,

void mkvec(in float a, in float b, out vec2 v, inout int count) {
    v = vec2(a,b);
    count++;
}
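A call to *mkvec* might then look like this (assuming the final parameter is declared as an *inout int*):

```glsl
vec2 v;
int count = 0;
mkvec( 1.0, 2.0, v, count );  // afterwards, v holds (1.0,2.0) and count has been incremented
```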

For data structures, GLSL has arrays and structs. Only one-dimensional arrays are allowed, and a constant array size must be provided when an array variable is declared. For example,

float data[5];             // An array of 5 floats
vec3 vectors[3];           // An array of 3 vec3s
lightProperties light[4];  // An array of 4 values of type lightProperties

Note that an array variable is not a pointer to an array; it represents memory space for all the elements of the array. Also, when an array is passed to a function as a parameter, the size of the array must be specified; for example,

float arraySum(float values[10]) { ...

A struct contains a set of named data fields. A struct in GLSL is mostly the same as a struct in C, though with some restrictions. It is similar to a class in Java that contains only data members and no functions. For example, here is a struct that I will use to hold the properties of a light source:

struct lightProperties {
    vec4 position;   // position.w == 0 for a directional light.
    vec3 intensity;  // This is the color, but not restricted to the range 0 to 1.
    vec3 ambient;    // Amount added to global ambient when this light is enabled.
    bool enabled;    // Is this light turned on?
};

This defines *lightProperties* as a data type that can then be
used to declare variables, parameters, and return types. Note that you can
create arrays of structs, and that structs can contain arrays.
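For example, a variable or an array of type *lightProperties* could be declared and used as in this sketch:

```glsl
lightProperties light;    // memory space for all four fields, not a pointer
light.position = vec4( 0.0, 10.0, 5.0, 1.0 );  // fields are accessed with the . notation
light.enabled = true;

lightProperties lights[4];                     // an array of structs
lights[0].intensity = vec3( 1.0, 1.0, 1.0 );
```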

GLSL has most of the same operators as C or Java. Notably missing are
type-casting, the % operator for integer remainder, and the bitwise logical
operators `&`, `|`, `^`, and `~`. The operators
can be applied to vectors and matrices, not just to simple types.
For example, `+` can be used to add vectors and `*` can be
used to multiply matrices or to multiply a matrix times a vector.
All of the arithmetic operations can be applied with a vector as one operand and a float as the other;
the operation is applied to each component of the vector. For
example, `vec3(1.0,2.0,3.0)/2.0` gives the vector `(0.5,1.0,1.5)`.
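A sketch of these operators at work (here, *modelview* stands for some transformation matrix that is assumed to have been assigned a value):

```glsl
mat4 modelview;   // assume this holds a modelview transformation
vec4 coords = vec4( 1.0, 2.0, 3.0, 1.0 );
vec4 eyeCoords = modelview * coords;                // matrix times vector
vec3 sum = vec3(1.0,2.0,3.0) + vec3(4.0,5.0,6.0);   // componentwise: (5.0, 7.0, 9.0)
vec3 half = vec3(1.0,2.0,3.0) / 2.0;                // vector and float: (0.5, 1.0, 1.5)
```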

GLSL has all the usual math functions, including *sin*, *cos*,
*tan*, *asin*, *acos*, *atan*, *pow*, *exp*, *log*,
*sqrt*, *abs*, *floor*, *ceil*, *min*, and *max*.
There is a function *mod*(*a,b*) that computes the remainder when *a*
is divided by *b*. (This is similar to but not quite the same as the `%` operator.)
These functions require floats as parameters and return floats as their result (even *floor*,
*ceil*, and *mod*). Note that the functions can be applied to vectors and in that case apply
to each component of the vector, giving a vector as the result.
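For example, applying some of these functions to floats and to vectors:

```glsl
float r = mod( 10.0, 3.0 );    // 1.0 -- the remainder, returned as a float
float f = floor( 3.7 );        // 3.0 -- a float result, even for floor
vec3 roots = sqrt( vec3(1.0, 4.0, 9.0) );            // componentwise: (1.0, 2.0, 3.0)
vec2 biggest = max( vec2(1.0,5.0), vec2(4.0,2.0) );  // componentwise: (4.0, 5.0)
```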

GLSL also has many built-in functions that are useful for graphics programming and that are
not defined in most programming languages. For example, *mix*(*a,b,t*) returns the "linear blend" of *a* and
*b* defined by `(1-t)*a + t*b`. GLSL has a number of functions for working with vectors,
including

length(v)    -- returns the length of the vector v
normalize(v) -- returns the unit vector equal to v / length(v)
dot(v,w)     -- returns the dot product of v and w
cross(v,w)   -- returns the cross product of v and w (for vec3 only)
reflect(v,N) -- returns v reflected by the (normal) vector N

The last of these can be used with a cubemap texture for reflection mapping.

I should also mention two functions that are used with sampler variables to
"sample" textures. Recall that a *sampler2D* variable represents a texture image.
If *smplr* is a variable of type *sampler2D* and
*texCoords* is of type *vec2*, then

texture2D( smplr, texCoords )

returns a *vec4* containing the color from the texture image at
the point (*texCoords.x,texCoords.y*). The texture coordinates would
generally be derived from an attribute variable. We have already seen an
example of this in the previous section.
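For example, a minimal fragment shader that simply copies the texture color to the fragment might look like this (the variable names are my own):

```glsl
precision mediump float;
uniform sampler2D smplr;   // represents the texture image
varying vec2 texCoords;    // texture coordinates, interpolated from the vertex shader
void main() {
    gl_FragColor = texture2D( smplr, texCoords );
}
```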

WebGL also supports cubemap textures, which we have seen used for
skyboxes and environment mapping in Three.js, in Section 16.
A GLSL variable of type *samplerCube* is used to sample a color from a cube map texture.
If *smplrC* is a variable of type *samplerCube* and *R* is a *vec3*, which should be normalized
to have length equal to one, then

textureCube( smplrC, R )

returns a sample from the cubemap texture in the direction *R*. That is, imagining the
cubemap texture to be assigned to an actual cube, place a ray extending in the direction of *R* from the
center of the cube, and select the color of the point where that ray intersects the cube. For reflection
mapping, the texture color for a point on a surface is obtained by reflecting the direction to the
viewer from the surface, using the normal vector. That reflected direction is used as the
*vec3* *R* to sample the color from a cubemap texture. It looks as though the cubemap
texture is being reflected by the surface.
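As a sketch, a fragment shader for reflection mapping might compute the reflected direction with the built-in *reflect* function and use it to sample the cubemap (the variable names here are my own, and the details differ in the actual sample programs):

```glsl
precision mediump float;
uniform samplerCube cubemap;  // the cubemap texture
varying vec3 viewCoords;      // fragment position in viewing coordinates
varying vec3 vNormal;         // interpolated normal vector
void main() {
    vec3 N = normalize( vNormal );
    vec3 I = normalize( viewCoords );  // direction from the viewer toward the surface point
    vec3 R = reflect( I, N );          // I reflected through the surface normal
    gl_FragColor = textureCube( cubemap, R );
}
```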

The sample program webgl-light-and-material.html
is a demo of using lighting and material in WebGL. The shader program defines a struct, *lightProperties*,
to represent the properties of a light source and another struct, *materialProperties*,
to represent material properties. An array of four *lightProperties* allows for up to four light
sources in a scene. Two variables of type *materialProperties* are defined, to allow for different
materials on front and back faces. Lighting calculations are done in the fragment shader,
where I use a function to compute the lighting equation. Here is the fragment shader from that
program:

precision mediump float;

struct materialProperties {  // A type that defines the set of material properties.
    vec3 ambient;
    vec3 diffuse;
    vec3 specular;
    vec3 emissive;
    float shininess;
};

struct lightProperties {  // A type that defines the set of light properties.
    vec4 position;   // position.w == 0 for a directional light.
    vec3 intensity;  // This is the color, but not restricted to the range 0 to 1.
    vec3 ambient;    // Amount added to global ambient when this light is enabled.
    bool enabled;
};

uniform materialProperties frontMaterial;  // material for front faces
                                           //   (and for back faces if twoSided is false)
uniform materialProperties backMaterial;   // material for back faces, used only if twoSided is true

materialProperties material;  // This is the material that will actually be used on this fragment

uniform bool twoSided;  // If true, back faces will have a different material from front faces

uniform mat3 normalMatrix;  // Matrix for transforming the normal vector.

uniform lightProperties light[4];  // data for four lights

uniform bool lit;  // If false, no lighting is done;
                   //   instead the unmodified diffuse material color is used.

uniform vec3 globalAmbient;  // Amount of global ambient illumination, independent of the four lights.

varying vec3 viewCoords;  // position in viewing coordinates, used for lighting.
varying vec3 vNormal;     // The interpolated normal vector.

vec3 lighting(vec3 vertex, vec3 V, vec3 N) {
       // A function to compute the color of this fragment using the lighting
       // equation.  vertex contains the coords of the point; V is a unit
       // vector pointing to viewer; N is the normal vector.  This function
       // also uses the values in the global variables material and
       // light[0]..light[3].
    vec3 color = material.emissive + material.ambient * globalAmbient;
    for (int i = 0; i < 4; i++) {
        if (light[i].enabled) {
            color += material.ambient * light[i].ambient;
            vec3 L;
            if (light[i].position.w == 0.0)
                L = normalize( light[i].position.xyz );
            else
                L = normalize( light[i].position.xyz/light[i].position.w - vertex );
            if ( dot(L,N) > 0.0) {
                vec3 R;
                R = (2.0*dot(N,L))*N - L;
                color += dot(N,L)*(light[i].intensity*material.diffuse);
                if ( dot(V,R) > 0.0)
                    color += pow(dot(V,R),material.shininess) * (light[i].intensity * material.specular);
            }
        }
    }
    return color;
}

void main() {
    if (lit) {
        vec3 tnormal = normalize(normalMatrix*vNormal);
        if (!gl_FrontFacing)
            tnormal = -tnormal;
        if ( gl_FrontFacing || !twoSided)
            material = frontMaterial;
        else
            material = backMaterial;
        gl_FragColor = vec4( lighting(viewCoords, normalize(-viewCoords), tnormal), 1.0 );
    }
    else {
        if ( gl_FrontFacing || !twoSided )
            gl_FragColor = vec4(frontMaterial.diffuse, 1.0);
        else
            gl_FragColor = vec4(backMaterial.diffuse, 1.0);
    }
}

Note that the light and material properties are *uniform* variables in
the fragment shader, since their values have to come from the JavaScript side. However,
WebGL's JavaScript API only has functions for setting uniform values that are simple variables,
vectors, or matrices. There are no functions for setting the values of structs or
arrays. The solution to this problem requires treating every component of a
struct or array as a separate uniform value. For example, to work with a
uniform variable of type *materialProperties*, we need five variables in
JavaScript to represent the locations of the five components of the
uniform variable. And to set the value of the variable, we have to set
each component separately, with a call to one of the *gl.uniform** functions. For
the array of four *lightProperties* values that is used in this shader, we need a total
of 16 uniform locations.

When using *gl.getUniformLocation* to get the location of a uniform from
a shader program, you need the name of the uniform. When working with uniform structs
and arrays, the name that you need is the full name of the component whose location
you are interested in. For example, to get the location of the *position* component
of element number 2 of the *light* array in the above shader, we could use

loc = gl.getUniformLocation( prog, "light[2].position" );

In fact, to manage all the uniform locations that are needed in the
sample program, I store the locations in data structures that have the same basic
form as the data structures in the shader program. In JavaScript, I use a global
variable *uLight* to refer to an array of objects holding locations of light
properties, and I use global variables *uFrontMaterial* and *uBackMaterial*
to refer to objects holding the locations of material properties. The following
code is used to initialize those JavaScript variables and to initialize the values
of the corresponding uniform variables in the shader program:

uFrontMaterial = {};  // locations of front material properties
uFrontMaterial.ambient = gl.getUniformLocation(prog,"frontMaterial.ambient");
uFrontMaterial.diffuse = gl.getUniformLocation(prog,"frontMaterial.diffuse");
uFrontMaterial.specular = gl.getUniformLocation(prog,"frontMaterial.specular");
uFrontMaterial.emission = gl.getUniformLocation(prog,"frontMaterial.emissive");
uFrontMaterial.shininess = gl.getUniformLocation(prog,"frontMaterial.shininess");

uBackMaterial = {};  // locations of back material properties
uBackMaterial.ambient = gl.getUniformLocation(prog,"backMaterial.ambient");
uBackMaterial.diffuse = gl.getUniformLocation(prog,"backMaterial.diffuse");
uBackMaterial.specular = gl.getUniformLocation(prog,"backMaterial.specular");
uBackMaterial.emission = gl.getUniformLocation(prog,"backMaterial.emissive");
uBackMaterial.shininess = gl.getUniformLocation(prog,"backMaterial.shininess");

uLight = [];  // locations of light properties
for (var i = 0; i < 4; i++) {
    uLight[i] = {};  // locations of properties of light number i
    uLight[i].position = gl.getUniformLocation(prog,"light[" + i + "].position");
    uLight[i].intensity = gl.getUniformLocation(prog,"light[" + i + "].intensity");
    uLight[i].ambient = gl.getUniformLocation(prog,"light[" + i + "].ambient");
    uLight[i].enabled = gl.getUniformLocation(prog,"light[" + i + "].enabled");
}

gl.uniform3f(uFrontMaterial.ambient, 0.1, 0.1, 0.1);  // Set default front material.
gl.uniform3f(uFrontMaterial.diffuse, 0.6, 0.6, 0.6);
gl.uniform3f(uFrontMaterial.specular, 0.3, 0.3, 0.3);
gl.uniform3f(uFrontMaterial.emission, 0, 0, 0);
gl.uniform1f(uFrontMaterial.shininess, 50);

gl.uniform3f(uBackMaterial.ambient, 0.1, 0.1, 0.1);  // Set default back material.
gl.uniform3f(uBackMaterial.diffuse, 0.3, 0.6, 0.6);
gl.uniform3f(uBackMaterial.specular, 0.3, 0.3, 0.3);
gl.uniform3f(uBackMaterial.emission, 0, 0, 0);
gl.uniform1f(uBackMaterial.shininess, 50);

for (i = 0; i < 4; i++) {  // Set default light properties.
    gl.uniform4f(uLight[i].position, 0, 0, 1, 0);
    gl.uniform3f(uLight[i].ambient, 0, 0, 0);
    if (i == 0) {
        gl.uniform3f(uLight[i].intensity, 1, 1, 1);
        gl.uniform1i(uLight[i].enabled, 1);
    }
    else {
        gl.uniform3f(uLight[i].intensity, 0, 0, 0);
        gl.uniform1i(uLight[i].enabled, 0);
    }
}

I have written a set of examples that do skyboxes and reflection mapping in WebGL. (Reflection mapping is also called environment mapping.) The examples show how to load and use a cubemap texture. I won't discuss these examples here, but you can look at the source code, in the directory webgl/skybox-and-reflection, to see how it's done.