
When you first hear about shaders as a graphics programmer, they seem magical. Write tiny bits of code in a weird language and they make crazy effects possible at lightning speed. Then you look at some shader code and get lost. Go browse ShaderToy sometime. It’s awesome but daunting. And too few blogs talk about how to actually make these pixel effects from scratch. This blog post won’t talk about that either, but we will learn how to extend the default ThreeJS shader to take a few baby steps in that crazy magic shader direction.

This article is part of my ongoing series of medium-difficulty ThreeJS tutorials. I’ve long wanted something in between the intro “How to draw a cube” and “Let’s fill the screen with shader madness” levels. So here it is.

Here’s a picture of what we want to create:

Live demo

Believe it or not, this is just a normal sphere with the vertices pushed around a bit. Click on the demo link to see it in action. Crazy stuff. And yet the core is just one line of code.

```glsl
transformed.x = position.x + sin(position.y*10.0 + time*10.0)*0.1;
```

The hard part is providing that one line of code with everything else it needs to work.

## 15-Second Overview of Shaders

In OpenGL, and really any modern graphics API, there is a process for getting pixels on the screen. Suppose you want to draw a sphere. The geometry of the sphere, made of three-dimensional points called vertexes (or vertices), is generated on the CPU and then sent to the GPU. There’s no point in sending the same geometry on every frame, so the GPU stores it in a buffer.

On every frame the GPU runs a tiny program called a vertex shader on every single vertex in the geometry. This tiny program calculates the final position of vertexes on the screen then sends them on to a second tiny program called a fragment shader. The fragment shader is run on every single fragment (pixel) of the final picture on screen. This tiny program calculates the final color of each pixel based on textures, lighting, and other settings.
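To make that concrete, here is a sketch of a minimal vertex/fragment shader pair, written as GLSL source strings the way ThreeJS’s `ShaderMaterial` accepts them. The matrix names are ones ThreeJS injects for you; everything else is illustrative.

```javascript
// A minimal vertex/fragment shader pair as GLSL source strings
// (the form THREE.ShaderMaterial accepts; names are illustrative).
const vertexShader = `
    void main() {
        // compute this vertex's final position on screen
        gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
`
const fragmentShader = `
    void main() {
        // color every fragment (pixel) solid green
        gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
    }
`
```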

If we want we can modify the vertex shader to move the points of the sphere around a little before sending them on to the fragment shader. That’s what we will do today.

The challenge is that the default materials ThreeJS uses have rather complicated shaders. We don’t want to replace this code since it handles useful things like lighting and transforms. Instead we want to just modify a tiny piece of it and keep the rest intact.

The MeshLambertMaterial is one of the standard materials ThreeJS comes with. It is composed of two shaders, each assembled from many conditional pieces into one final program. You can see one example of these shader chunks in the ThreeJS source. Pretty yucky stuff.

In a simple program with the MeshLambertMaterial a sphere looks like this:

Let’s create a material then start modifying the contents of the vertex shader.

```javascript
const mat = new THREE.MeshLambertMaterial({
    color: 'green',
    transparent: true,
    opacity: 0.5
})
```

The built-in materials provide an `onBeforeCompile` hook that gives us the raw shader source. We can then modify it before final assembly.

```javascript
mat.onBeforeCompile = (shader) => {
    const token = '#include <begin_vertex>'
    const customTransform = `
        vec3 transformed = vec3(position);
        transformed.x = position.x + position.y/20.0;
    `
    shader.vertexShader = shader.vertexShader.replace(token, customTransform)
}
```

The final vertex shader is composed of many pieces identified with tokens like `#include <begin_vertex>`. Normally the `begin_vertex` token will be replaced by something which creates the `transformed` point from the original `position` point. The code above replaces the `begin_vertex` token with our own code which modifies the x coordinate of each point. That’s the only thing we need to change to get an effect.
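Under the hood this token replacement is plain string substitution. Here is a standalone sketch, using a made-up miniature vertex shader rather than the real MeshLambertMaterial one:

```javascript
// Standalone sketch of the token substitution. The shader source here is a
// made-up miniature, not the real MeshLambertMaterial vertex shader.
const vertexShader = `
void main() {
    #include <begin_vertex>
    gl_Position = projectionMatrix * modelViewMatrix * vec4(transformed, 1.0);
}
`
const token = '#include <begin_vertex>'
const customTransform = `
    vec3 transformed = vec3(position);
    transformed.x = position.x + position.y/20.0;
`
// splice our code in where the token used to be
const patchedShader = vertexShader.replace(token, customTransform)
```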

With this custom material the sphere now looks like this.

Pretty cool. The x coordinate of every vertex is shifted to give it a shear effect. Now let’s try something more interesting. Let’s apply a sine wave.

Change the main line of the shader modification from this:

```glsl
transformed.x = position.x + position.y/20.0;
```

to this:

```glsl
transformed.x = position.x + sin(position.y*10.0)*0.1;
```
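To get a feel for the numbers: the `10.0` inside the sine controls how many ripples wrap around the sphere, and the `0.1` outside scales the offset so it stays small relative to the sphere. Here is the same formula evaluated on the CPU in JavaScript:

```javascript
// The same displacement formula, evaluated for a few y values.
// 10.0 is the ripple frequency, 0.1 the amplitude.
const displace = (y) => Math.sin(y * 10.0) * 0.1

console.log(displace(0))             // 0 at y = 0: no offset
console.log(displace(Math.PI / 20))  // 0.1 at a crest (sin hits 1)
```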

Now the sphere looks like this:

Woah! That’s a big change. Now let’s make the effect animate over time. We need to add a time uniform to the shader, then use it in the equation.

```javascript
mat.onBeforeCompile = (shader) => {
    // register the uniform on the shader object
    shader.uniforms.time = { value: 0 }
    // declare it in the vertex shader source (outside of main)
    shader.vertexShader = 'uniform float time;\n' + shader.vertexShader
    const token = '#include <begin_vertex>'
    const customTransform = `
        vec3 transformed = vec3(position);
        transformed.x = position.x + sin(position.y*10.0 + time*10.0)*0.1;
    `
    shader.vertexShader = shader.vertexShader.replace(token, customTransform)
    // keep a reference so the render loop can update the uniform
    mat.userData.shader = shader
}
```

This is the same code as before but with the addition of a time uniform. The uniform must be registered on the shader object itself with `shader.uniforms.time` and also declared in the vertex shader source as `uniform float time;`. Since uniform declarations have to live outside of `main`, the declaration is prepended to the shader rather than placed inside the `begin_vertex` replacement. Then we can use it in the equation, which now modifies x using the sine of y plus time.

All of the code above makes the shader react to a changing time. Now we just need to actually make time change. Save a reference to the compiled shader inside `onBeforeCompile` (for example with `mat.userData.shader = shader`), then update the uniform in your `render` function that is called every frame:

```javascript
function render(time) {
    // `time` from requestAnimationFrame is in milliseconds
    if (mat.userData.shader) {
        mat.userData.shader.uniforms.time.value = time / 1000
    }
    renderer.render(scene, camera)
    requestAnimationFrame(render)
}
```

Now the waves will smoothly move down the sphere.

It makes more sense if you look at the demo.