The second and last test will be given in class on Friday, November 20. It covers Chapter 4, Chapter 5, and the first three sections of Chapter 6. There might also be something about basic concepts from Blender.
The format of the test will be the usual: some essay questions and definitions; reading some code and explaining its purpose; writing some code. There will not be a lot of code writing. You do not need to memorize every function and method that we have encountered, but you should be able to write code using those that are listed below, except those that are listed as requiring only a "reading knowledge".
Here are some terms and ideas that you should be familiar with:
Light and material in OpenGL 1.0
material properties:
ambient color,
diffuse color,
specular color,
emission color,
shininess (also called the specular exponent)
color represents the fraction of incident light that is reflected by a surface
light properties:
position,
diffuse intensity,
specular intensity
global ambient light
directional light from direction (x,y,z), position = (x,y,z,0)
point light at (x,y,z), position = (x,y,z,1)
normal vectors
vectors
operations on vectors: dot product, cross product, length
unit vector (length = 1)
the dot product of two unit vectors gives the cosine of the angle between the vectors (see the sketch below)
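For example, the basic vector operations can be written in a few lines of JavaScript; this is just a sketch for review, and the function names are not from any library that we used:
    function dot(v, w) {   // dot product of two 3D vectors
        return v[0]*w[0] + v[1]*w[1] + v[2]*w[2];
    }
    function cross(v, w) {   // cross product, perpendicular to both v and w
        return [ v[1]*w[2] - v[2]*w[1],
                 v[2]*w[0] - v[0]*w[2],
                 v[0]*w[1] - v[1]*w[0] ];
    }
    function length(v) {   // length of a 3D vector
        return Math.sqrt( dot(v,v) );
    }
    // For unit vectors, dot(v,w) is the cosine of the angle between them,
    // so the angle itself is Math.acos( dot(v,w) ).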
the lighting equation: how the visible color of a point on a surface is computed
how emission material color contributes to the color
how ambient material color and ambient light contribute to the color
how light angle and normal vector affect diffuse reflection
how light angle, normal, direction to viewer affect specular reflection
the effect of the "shininess" material property
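The following JavaScript sketch shows the general form of the lighting equation for a single light source. It is a simplification of what OpenGL actually computes: colors are treated as single intensities rather than separate RGB components, and the property names such as mat.ambient and light.diffuse are just for illustration. It reuses the dot function from the vector sketch above. N is the unit normal, L is a unit vector pointing toward the light, and V is a unit vector pointing toward the viewer:
    function pointColor( mat, light, globalAmbient, N, L, V ) {
        var color = mat.emission                       // emission color is added directly
                      + globalAmbient * mat.ambient    // global ambient times material ambient
                      + light.ambient * mat.ambient;   // ambient contribution from this light
        var d = dot(N, L);            // cosine of the angle between normal and light direction
        if (d > 0) {                  // the light shines on the front of the surface
            color += d * light.diffuse * mat.diffuse;  // diffuse reflection, scaled by the angle
            // R is the mirror-reflection direction of the light about the normal
            var R = [ 2*d*N[0] - L[0], 2*d*N[1] - L[1], 2*d*N[2] - L[2] ];
            var s = dot(R, V);        // cosine of the angle between R and the viewer direction
            if (s > 0) {
                // a larger shininess makes the specular highlight smaller and sharper
                color += Math.pow(s, mat.shininess) * light.specular * mat.specular;
            }
        }
        return color;
    }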
textures
image textures
texture coordinates, and how they are used on a primitive
texture repeat modes: how texture coordinates outside the range 0 to 1 are treated
minification and magnification filtering of image textures
mipmaps
how and why mipmaps are used for improved minification filtering
texture transformation
how texture transformations affect the appearance of the texture on a surface
setting texture coordinates: glTexCoord2f(s,t)
setting normal vectors: glNormal3f(x,y,z)
[ might also need: glBegin(primitive), glEnd(), glVertex3f(x,y,z) ]
a READING knowledge of the OpenGL API for light and material:
glEnable(GL_LIGHTING), glEnable(GL_LIGHT0), glEnable(GL_LIGHT1), ...
glMaterialfv( side, property, value )
side is GL_FRONT_AND_BACK, GL_FRONT, GL_BACK
property is GL_AMBIENT, GL_DIFFUSE, GL_AMBIENT_AND_DIFFUSE,
GL_SPECULAR, GL_EMISSION
value is an array of 4 numbers (or a pointer)
in JOGL, there is an extra integer parameter, usually 0
glMateriali( side, property, value )
property is GL_SHININESS, value is 0 to 128
glLightfv( light, property, value )
light is GL_LIGHT0, GL_LIGHT1, ...
property is GL_POSITION, GL_DIFFUSE, GL_SPECULAR, GL_AMBIENT
value is an array of 4 numbers (or a pointer)
in JOGL, there is an extra integer parameter, usually 0
Three.js: A 3D scene graph API for WebGL
basic requirements for rendering an image: scene, camera, renderer
the basic rendering command: renderer.render( scene, camera )
the class THREE.Object3D represents scene graph nodes, with properties
obj.add(object) -- adds an object as a child of obj
obj.clone() -- make a copy, including copying transformations
obj.position (translation) -- obj.position.x, obj.position.y, obj.position.z
obj.scale (scaling) -- obj.scale.x, obj.scale.y, obj.scale.z
obj.rotation (rotations) -- obj.rotation.x, obj.rotation.y, obj.rotation.z
applying transformations using obj.position, obj.rotation, obj.scale
building hierarchical structures with Three.js
geometry and material (required for a visible object)
creating a mesh object with new THREE.Mesh( geometry, material )
Lambert material vs. Phong material
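For example, a small hierarchical model might be put together along these lines; the sizes and colors are arbitrary, and a THREE.Scene named scene is assumed to exist already:
    var wheel = new THREE.Mesh(                       // one "wheel" object
            new THREE.CylinderGeometry(1, 1, 0.5, 32),
            new THREE.MeshLambertMaterial({ color: 0x444444 }) );
    var cart = new THREE.Object3D();    // parent node for the hierarchy
    var leftWheel = wheel.clone();      // a copy, including its transformations
    leftWheel.position.x = -2;
    var rightWheel = wheel.clone();
    rightWheel.position.x = 2;
    cart.add(leftWheel);                // the wheels become children of cart
    cart.add(rightWheel);
    cart.rotation.y = Math.PI/4;        // transforming cart transforms both wheels
    cart.scale.set(1.5, 1.5, 1.5);
    scene.add(cart);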
a READING knowledge of basic constructors from the three.js API:
new THREE.Scene() -- the root node of the scene graph
new THREE.PerspectiveCamera(fovy,aspect,near,far)
new THREE.PointLight( color ); // at (0,0,0)
new THREE.DirectionalLight( color ); // shining from (0,1,0)
new THREE.Mesh( geometry, material )
new THREE.BoxGeometry( sizeX, sizeY, sizeZ )
new THREE.SphereGeometry( radius, slices, stacks )
new THREE.CylinderGeometry( topRadius, bottomRadius, height, slices )
new THREE.MeshLambertMaterial({ color: c })
new THREE.MeshPhongMaterial({ color: c, specular: s })
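Putting these constructors together, a complete minimal rendering might look something like the following sketch. It also uses THREE.WebGLRenderer and its canvas, which are not in the list above; the particular sizes and colors are just examples:
    var renderer = new THREE.WebGLRenderer();        // draws the scene onto its own canvas
    renderer.setSize(600, 600);
    document.body.appendChild(renderer.domElement);  // add the canvas to the page
    var scene = new THREE.Scene();
    var camera = new THREE.PerspectiveCamera(45, 1, 0.1, 100);
    camera.position.z = 10;                          // move the camera back so it can see the origin
    var light = new THREE.DirectionalLight(0xffffff);
    light.position.set(0, 0, 1);                     // shines from the direction of its position
    scene.add(light);
    var sphere = new THREE.Mesh(
            new THREE.SphereGeometry(2, 32, 16),
            new THREE.MeshPhongMaterial({ color: 0x0077ff, specular: 0x222222 }) );
    scene.add(sphere);
    renderer.render(scene, camera);                  // the basic rendering command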
WebGL
OpenGL 1.0 uses a fixed-function pipeline
WebGL uses a programmable pipeline
why does modern OpenGL use a programmable pipeline?
shader programs
vertex shader
fragment shader
the flow of data in the programmable pipeline
uniform variable
attribute variable
varying variable
uniform and attribute variable values come from JavaScript
values for varying variables in fragment shader are interpolated from vertices
JavaScript API for uniform variables:
gl.getUniformLocation(prog, uniformName)
gl.uniform*(uniformLocation, value...)
Typed arrays: Float32Array, Uint16Array, Uint8Array
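For example, setting a vec4 uniform variable named "color" might look like this, assuming that prog is the compiled shader program and gl is the WebGL context:
    var colorLocation = gl.getUniformLocation(prog, "color");   // look up the uniform by name
    gl.uniform4f(colorLocation, 1.0, 0.0, 0.0, 1.0);            // set it to opaque red
    // or, passing the values as a typed array:
    gl.uniform4fv(colorLocation, new Float32Array([1, 0, 0, 1]));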
a READING knowledge of the JavaScript API for attribute variables:
gl.createBuffer()
gl.bindBuffer(gl.ARRAY_BUFFER, buffer)
gl.bufferData(gl.ARRAY_BUFFER, typedArray, gl.STATIC_DRAW) // or gl.STREAM_DRAW
gl.getAttribLocation(prog, attributeName)
gl.enableVertexAttribArray(attributeLocation)
gl.vertexAttribPointer(attributeLocation, numsPerVertex, type, false, 0, 0)
gl.drawArrays( primitive, startVertex, vertexCount )
primitives include gl.POINTS, gl.LINES, gl.TRIANGLES
the difference between gl.STREAM_DRAW and gl.STATIC_DRAW
buffers are Vertex Buffer Objects (VBO)
why VBOs are used for attribute and index values
the importance of limiting data exchange between CPU and GPU
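A typical sequence for getting vertex coordinates into an attribute variable and drawing, using the API listed above; gl and prog are again assumed to exist, along with an attribute named "coords" in the vertex shader:
    var coords = new Float32Array([ -0.5,-0.5,  0.5,-0.5,  0.0,0.5 ]);  // one triangle, 2 numbers per vertex
    var buffer = gl.createBuffer();                     // a VBO to hold the coordinates on the GPU
    gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
    gl.bufferData(gl.ARRAY_BUFFER, coords, gl.STATIC_DRAW);   // the data won't change, so STATIC_DRAW
    var coordsLoc = gl.getAttribLocation(prog, "coords");
    gl.enableVertexAttribArray(coordsLoc);
    gl.vertexAttribPointer(coordsLoc, 2, gl.FLOAT, false, 0, 0);  // 2 floats per vertex, from the bound VBO
    gl.drawArrays(gl.TRIANGLES, 0, 3);                  // draw 3 vertices as one triangle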
GLSL shader programming language
basic GLSL:
the entry point to a shader: void main() { ... }
basic types: int, float, vec2, vec3, vec4
constructors such as vec4( color, 1.0 ) or vec3(1,1,1)
referring to vector components as v.x, v.y, v.z, v.w
declaring and using uniform, attribute, and varying variables
vertex shader assigns a value to gl_Position (of type vec4)
fragment shader assigns a value to gl_FragColor (of type vec4)
control structures: if statement and basic for loop
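A minimal shader pair that uses a uniform, an attribute, and a varying variable might look something like this. The source is written here as JavaScript string constants, which is how it is usually passed to WebGL; the variable names coords, vertexColor, transform, and color are just examples:
    var vertexShaderSource =
        "attribute vec3 coords;\n" +        // per-vertex value from a VBO
        "attribute vec3 vertexColor;\n" +
        "uniform mat4 transform;\n" +       // same value for every vertex
        "varying vec3 color;\n" +           // passed on to the fragment shader
        "void main() {\n" +
        "    color = vertexColor;\n" +
        "    gl_Position = transform * vec4(coords, 1.0);\n" +
        "}\n";
    var fragmentShaderSource =
        "precision mediump float;\n" +
        "varying vec3 color;\n" +           // interpolated from the values at the vertices
        "void main() {\n" +
        "    gl_FragColor = vec4(color, 1.0);\n" +
        "}\n";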
some basic ideas from Blender
parenting
tracking
edit mode
procedural textures
image textures
extruding a mesh object
modifiers: subdivision surface (subsurf) modifier / displace modifier
keyframes and keyframe animation
particle systems: forces and velocities / halo materials
path animation