Last time we ended up with a solid but flat-shaded model of our glass. The next step in making this look more realistic is to add per-pixel lighting. For our purposes we will use Phong illumination, but this (and every other lighting formula) requires that we know the normal at each point on the surface.

A normal is the vector that is perpendicular to the surface. For flat surfaces this is easy to picture, but for curved surfaces it can be easier to think of it as the cross product of two tangents to the surface (in essence the two tangents define a flat plane, and the normal is then perpendicular to that plane). More information on normals can be found over on Wikipedia.
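As a quick illustration of that idea, the cross product of two (non-parallel) tangent vectors gives a vector perpendicular to both. A minimal sketch in plain C (the function name is mine):

```c
#include <assert.h>

// cross product of two 3D vectors: out = a x b
void cross(const double a[3], const double b[3], double out[3])
{
    out[0] = a[1]*b[2] - a[2]*b[1];
    out[1] = a[2]*b[0] - a[0]*b[2];
    out[2] = a[0]*b[1] - a[1]*b[0];
}
```

For example, crossing a tangent along the x-axis with one along the y-axis yields the z-axis, exactly the normal of the x-y plane.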

For the specific case of our surface of revolution things are a bit easier than this. We have a 2D curve defined as a Bézier curve, for which we can find the 2D tangent by taking the derivative of the Bézier equation. i.e. we use:

B'(t) = Σ (i = 0 … n) b'_{i,n}(t) · P_i

and

b'_{i,n}(t) = n · ( b_{i-1,n-1}(t) − b_{i,n-1}(t) )
This is easier than it looks and mostly reuses the code from the previous post on creating the surface.

But we need the vector that is perpendicular to the surface, not tangential to it, so there is one further step before we can use this: we rotate the tangent by 90 degrees to get the orthogonal vector (the direction of the rotation determines whether the normal points into or out of the surface).
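In 2D this rotation is just a swap of the two components with one sign flip — a quick sketch in C (the function name is mine):

```c
#include <assert.h>

// rotate (x, y) by a quarter turn:
// anti-clockwise gives (-y, x); clockwise gives (y, -x)
void rotate90(double x, double y, double out[2])
{
    out[0] = -y;   // anti-clockwise quarter turn
    out[1] =  x;
}
```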

The function itself looks something like this:

-(NSPoint) tangentOnCurve:(double) t
{
    NSPoint ret = NSMakePoint(0, 0);
    double bern;

    // iterate over all points adding the influence of each point,
    // swapping the x and y components (and negating one) so the
    // tangent is rotated by 90 degrees into the normal
    for( int i = 0; i < [mControlPoints count]; i++ )
    {
        bern = Basis_Derv([mControlPoints count], i, t);
        ret.x += [[mControlPoints objectAtIndex: i] pointValue].y * bern;
        ret.y -= [[mControlPoints objectAtIndex: i] pointValue].x * bern;
    }

    // we then normalize this result (make sure the vector has a length of 1)
    double len = sqrt( ret.x*ret.x + ret.y*ret.y );
    if( len > 0.0 )
    {
        ret.x /= len;
        ret.y /= len;
    }

    return ret;
}

The rotation is handled inside the loop, where the x and y components are swapped (and one is negated) to create the orthogonal vector.

This gives us a 2D vector which we know is associated with a 2D point that is being rotated around the z-axis to create the surface of revolution. The same rotation can be applied to the normal vector to produce the final 3D normal.

Carrying on from last time, we need to put these normals into an array and upload them to the graphics card for use. As with the vertices, to get the normals into graphics memory we can use:

// generate the buffer
glGenBuffers(1, &normalRef);

// fill the buffer
glBindBuffer(GL_ARRAY_BUFFER, normalRef);
glBufferData(GL_ARRAY_BUFFER, numberVerts * 3 * sizeof(float), normals, GL_STATIC_DRAW);

And then when drawing there are a couple of extra lines:

// let it know which buffers we will be supplying
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);

// let it know the reference/offsets for the vertex array
glBindBuffer(GL_ARRAY_BUFFER, drawRef);
glVertexPointer(3, GL_FLOAT, 0, 0);

// let it know the reference/offsets for the normals array
glBindBuffer(GL_ARRAY_BUFFER, normalRef);
glNormalPointer(GL_FLOAT, 0, 0);

// tell it to draw the specified number of vertices
glDrawArrays(GL_TRIANGLE_STRIP, 0, numberVerts);

// turn stuff off again
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_NORMAL_ARRAY);

If we were to colour the surface using these vectors (a false colouring using the normal's x, y, z values as the r, g, b colour components) we get an image like the one below:
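Since the normal components lie in [-1, 1] while colour components live in [0, 1], the negative halves would otherwise be lost. A common scale-and-bias mapping (an assumption on my part — a simple clamp works too) looks like:

```c
#include <assert.h>

// map normal components in [-1, 1] to colour components in [0, 1]
void normalToColour(const float n[3], float rgb[3])
{
    for (int i = 0; i < 3; i++)
        rgb[i] = n[i] * 0.5f + 0.5f;
}
```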

Next: Lighting
