Saturday, May 10, 2014

OpenGL|ES 2.0 on Android #2 - Polygons

GL|ES app rendering a triangle and a square.
Now that we have a blank OpenGL|ES client we can use as a starting point, it's time to put it to use. We'll not be doing anything fancy. Just baby steps.

So the next step after making a client is to draw something. Anything. We want to have something to see so we know things are working.

Fixing screen orientation

Here I'm fixing the screen to landscape orientation. To do that, you'll need to change your AndroidManifest.xml file.

<activity android:name="GLActivity"
          android:screenOrientation="sensorLandscape"
          android:label="@string/app_name">
    <intent-filter>
        <action android:name="android.intent.action.MAIN"/>
        <category android:name="android.intent.category.LAUNCHER"/>
    </intent-filter>
</activity>

I've added android:screenOrientation to fix the orientation to landscape in both directions.

The Shader class

package com.example.gles;

import android.opengl.GLES20;

public class Shader {
    public static int loadShader(int type, String shaderCode) {
        int shader = GLES20.glCreateShader(type);
        GLES20.glShaderSource(shader, shaderCode);
        GLES20.glCompileShader(shader);

        return shader;
    }

    public static int createProgram(int vertexShader, int fragmentShader) {
        int program = GLES20.glCreateProgram();
        GLES20.glAttachShader(program, vertexShader);
        GLES20.glAttachShader(program, fragmentShader);
        GLES20.glLinkProgram(program);

        return program;
    }
}
I created a small Shader class to handle shader-related tasks. This won't be a big class: loadShader compiles the shader code you pass in and returns either a vertex or a fragment shader depending on the type parameter, while createProgram links a vertex and a fragment shader together into a program.

Save the code in Shader.java.
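Note that the Shader class above never checks whether compilation or linking actually succeeded, so a broken shader will fail silently. As a sketch of how you might add a check (the class and method names here are just for illustration; this uses GLES20 calls, so it only runs with a live GL context on a device or emulator):

```java
import android.opengl.GLES20;

public class ShaderChecks {
    // Compile a shader and throw if the driver reports a compile error.
    public static int loadShaderChecked(int type, String shaderCode) {
        int shader = GLES20.glCreateShader(type);
        GLES20.glShaderSource(shader, shaderCode);
        GLES20.glCompileShader(shader);

        int[] status = new int[1];
        GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, status, 0);
        if (status[0] == 0) {
            // The info log usually names the offending line in the GLSL source.
            String log = GLES20.glGetShaderInfoLog(shader);
            GLES20.glDeleteShader(shader);
            throw new RuntimeException("Shader compile failed: " + log);
        }
        return shader;
    }
}
```

The same pattern works for linking: glGetProgramiv with GL_LINK_STATUS and glGetProgramInfoLog.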

Geometry Engine class

This is where we process geometry data. When onDrawFrame in GLActivity gets called, it will use this class to draw whatever we want to render to the screen.

package com.example.gles;

import android.opengl.GLES20;
import android.opengl.Matrix;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class GeometryEngine {
    private final String vertexShader =
            "attribute vec4 vPosition;" +
            "uniform mat4 uMVPMatrix;" +
            "void main() {" +
            "   gl_Position = uMVPMatrix * vPosition;" +
            "}";

    private final String fragmentShader =
            "precision mediump float;" +
            "void main() {" +
            "   gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);" +
            "}";

    private static float triangleGeometry[] = {     // in counterclockwise order
            0.0f, 0.5f, 0.0f,                       // top
            -0.5f, -0.5f, 0.0f,                     // bottom left
            0.5f, -0.5f, 0.0f                       // bottom right
    };

    private static float squareGeometry[] = {
            -0.5f, 0.5f, 0.0f,                      // top left
            -0.5f, -0.5f, 0.0f,                     // bottom left
            0.5f, 0.5f, 0.0f,                       // top right
            0.5f, -0.5f, 0.0f                       // bottom right
    };

    private static final int POSITION_PER_VERTEX = 3;
    private static final int VERTEX_STRIDE = POSITION_PER_VERTEX * 4; // Size per-vertex in bytes
    private static final int TRIANGLE_VERTEX_COUNT = triangleGeometry.length / POSITION_PER_VERTEX;
    private static final int SQUARE_VERTEX_COUNT = squareGeometry.length / POSITION_PER_VERTEX;

    private int shaderProgram;
    private FloatBuffer triangleBuffer;
    private FloatBuffer squareBuffer;

    public GeometryEngine() {
        initGeometry();
        initShaders();
    }

    private void initGeometry() {
        triangleBuffer = createFloatBuffer(triangleGeometry.length * 4);
        triangleBuffer.put(triangleGeometry);
        triangleBuffer.position(0);

        squareBuffer = createFloatBuffer(squareGeometry.length * 4);
        squareBuffer.put(squareGeometry);
        squareBuffer.position(0);
    }

    private void initShaders() {
        int vertexShader = Shader.loadShader(GLES20.GL_VERTEX_SHADER, this.vertexShader);
        int fragmentShader = Shader.loadShader(GLES20.GL_FRAGMENT_SHADER, this.fragmentShader);
        shaderProgram = Shader.createProgram(vertexShader, fragmentShader);
    }

    public void draw(final float[] mvpMatrix) {
        GLES20.glUseProgram(shaderProgram);

        int positionHandle = GLES20.glGetAttribLocation(shaderProgram, "vPosition");
        GLES20.glEnableVertexAttribArray(positionHandle);
        GLES20.glVertexAttribPointer(positionHandle, POSITION_PER_VERTEX, GLES20.GL_FLOAT, false, VERTEX_STRIDE, triangleBuffer);

        float[] scratch = new float[16];
        Matrix.translateM(scratch, 0, mvpMatrix, 0, 1.0f, 0.0f, 0.0f);
        int matrixHandle = GLES20.glGetUniformLocation(shaderProgram, "uMVPMatrix");
        GLES20.glUniformMatrix4fv(matrixHandle, 1, false, scratch, 0);

        GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, TRIANGLE_VERTEX_COUNT);     // Draw triangle

        GLES20.glVertexAttribPointer(positionHandle, POSITION_PER_VERTEX, GLES20.GL_FLOAT, false, VERTEX_STRIDE, squareBuffer);
        Matrix.translateM(scratch, 0, mvpMatrix, 0, -1.0f, 0.0f, 0.0f);
        GLES20.glUniformMatrix4fv(matrixHandle, 1, false, scratch, 0);

        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, SQUARE_VERTEX_COUNT);  // Draw square

        GLES20.glDisableVertexAttribArray(positionHandle);
    }

    /**
     * Creates a FloatBuffer used to store vertex data (eg. position, color, etc.)
     * @param size Size of data to be stored, in bytes.
     * @return FloatBuffer object.
     */
    private FloatBuffer createFloatBuffer(final int size) {
        return ByteBuffer.allocateDirect(size)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
    }
}
This is the biggest class so far.

initGeometry initialises geometry data by creating buffers for storing vertex related data, then copying the data into said buffers. To keep things simple, data is defined in member variables.
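To make that buffer setup concrete, here's a self-contained sketch of the allocate/put/rewind pattern using plain java.nio, so it runs off-device (the class name is just for illustration):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class BufferDemo {
    public static void main(String[] args) {
        float[] triangleGeometry = {
                 0.0f,  0.5f, 0.0f,   // top
                -0.5f, -0.5f, 0.0f,   // bottom left
                 0.5f, -0.5f, 0.0f    // bottom right
        };

        // A direct buffer (4 bytes per float) in the platform's native byte
        // order, so GL can read the memory without an extra copy.
        FloatBuffer buffer = ByteBuffer
                .allocateDirect(triangleGeometry.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();

        buffer.put(triangleGeometry);   // copy the vertex data in
        buffer.position(0);             // rewind so GL reads from the start

        System.out.println(buffer.capacity());  // 9 floats
        System.out.println(buffer.get(1));      // 0.5
    }
}
```

The allocateDirect and nativeOrder calls matter: a heap-backed or wrongly-ordered buffer can produce garbage geometry or crashes when handed to glVertexAttribPointer.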

initShaders makes use of the Shader class to initialise vertex and fragment shaders we'll be using when rendering our scene.

draw does exactly what it says. It handles rendering our scene using geometry data and shaders we've previously initialised in initGeometry and initShaders. It accepts a float array representing the combined model view projection matrix and uses that to transform the scene. This gets called from onDrawFrame in our renderer.

The rendering can be described simply:
  1. Set desired GL states and upload data to GL.
  2. Render the scene.
  3. Do clean up work.
So, we start with glUseProgram to tell OpenGL to use the shader program we compiled and linked earlier when calling initShaders.

Next, we upload the vertex data by getting a handle to the attribute in our shader using glGetAttribLocation, enabling it with glEnableVertexAttribArray, and finally telling GL where to get the data from with glVertexAttribPointer. Whether uploading vertex positions, normals, colors, or anything else, we'll be using this same basic pattern.

Uniforms are a bit different. We use glGetUniformLocation to get a handle to our matrix, then upload the data with glUniformMatrix4fv; no enabling or disabling is required. In the code above, I prepare a translated copy of the MVP matrix first, since I actually want to render two separate shapes and push one to the left and the other to the right.
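android.opengl.Matrix works on column-major float[16] arrays, which is why the translation ends up in elements 12 to 14. As a rough illustration of what translateM computes, here is a hypothetical plain-Java re-implementation (for explanation only, not the Android API itself):

```java
public class TranslateDemo {
    // Post-multiplies m (column-major float[16]) by a translation T(x, y, z),
    // mirroring what android.opengl.Matrix.translateM produces.
    static float[] translate(float[] m, float x, float y, float z) {
        float[] out = m.clone();
        for (int i = 0; i < 4; i++) {
            // new 4th column = m.col0*x + m.col1*y + m.col2*z + m.col3
            out[12 + i] = m[i] * x + m[4 + i] * y + m[8 + i] * z + m[12 + i];
        }
        return out;
    }

    public static void main(String[] args) {
        float[] identity = {
                1, 0, 0, 0,
                0, 1, 0, 0,
                0, 0, 1, 0,
                0, 0, 0, 1
        };
        float[] moved = translate(identity, 1.0f, 0.0f, 0.0f);
        System.out.println(moved[12]);  // x translation is now 1.0
    }
}
```

Because the translation is post-multiplied onto the already-combined MVP matrix, it moves each shape in its own model space before the view and projection are applied.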

When you're ready to draw, call glDrawArrays to request a render.

Finally, re-upload vertex position data (this time for the square shape), upload a matrix with different translation values (pushing the square to the right), and render once more with glDrawArrays. When you're finished rendering, disable any vertex attribute arrays using glDisableVertexAttribArray.

createFloatBuffer does just what it says. It returns a FloatBuffer of the specified size. This gets used during initGeometry to reduce the lines of code in there so it's easier to read.


GLActivity will need some changes to make use of the new code, of course.

import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.Matrix;        // Added
import android.os.Bundle;

A new import is needed.

private GeometryEngine geometry;
private final float[] viewMatrix = new float[16];
private final float[] projectionMatrix = new float[16];
private final float[] mvpMatrix = new float[16];

All new member variables to hold the geometry engine and various matrices.

GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);    // Black background

geometry = new GeometryEngine();        // Added

onSurfaceCreated has a new line to initialise the geometry engine object.

GLES20.glViewport(0, 0, width, height);

// Added
final float ratio = (float) width / height;
Matrix.frustumM(projectionMatrix, 0, -ratio, ratio, -1, 1, 3, 7);

onSurfaceChanged now initialises the projection matrix.


// Added
Matrix.setLookAtM(viewMatrix, 0, 0, 0, -3, 0.0f, 0.0f, 0.0f, 0.0f, 1.0f, 0.0f);

Matrix.multiplyMM(mvpMatrix, 0, projectionMatrix, 0, viewMatrix, 0);

geometry.draw(mvpMatrix);

In onDrawFrame we initialise the view matrix with setLookAtM, combine the projection and view matrices together with multiplyMM, and pass the result to the geometry engine's draw method to use for rendering.

If everything is in order, you should see the result shown in the screenshot at the very top of this post.