Sketchy Pencil Effect with Three.js Post-Processing

In this tutorial, you’ll learn how to create a sketchy pencil effect using Three.js post-processing. We’ll go through the steps of creating a custom post-processing render pass, implementing edge detection in WebGL, rendering the scene’s normal buffer to a render target, and tweaking the end result with generated and imported textures.

This is what the end result looks like, so let’s get started!

Post-processing in Three.js

Post-processing in Three.js is a way of applying effects to your rendered scene after it has been drawn. In addition to all the out-of-the-box post-processing effects provided by Three.js, it is also possible to add your own filters by creating custom render passes.

A custom render pass is essentially a function that takes in an image of the scene and returns a new image with the desired effect applied. You can think of these render passes like layer effects in Photoshop: each one applies a new filter based on the previous effect’s output. The resulting image is a combination of all the different effects.

Enabling post-processing in Three.js

To add post-processing to our scene, we need to set up our scene rendering to use an EffectComposer in addition to the WebGLRenderer. The effect composer stacks the post-processing effects one on top of the other, in the order in which they’re passed. For our rendered scene to be handed on to the effects that follow, the RenderPass post-processing pass needs to be added first.

Then, inside the tick function that runs our render loop, we call composer.render() instead of renderer.render(scene, camera).

const renderer = new THREE.WebGLRenderer()
// ... settings for the renderer are available in the Codesandbox below

const composer = new EffectComposer(renderer)
const renderPass = new RenderPass(scene, camera)

composer.addPass(renderPass)

function tick() {
	requestAnimationFrame(tick)
	composer.render()
}

tick()

There are two ways of creating a custom post-processing effect:

  1. Creating a custom shader and passing it to a ShaderPass instance, or
  2. Creating a custom render pass by extending the Pass class.

Because we want our post-processing effect to have access to more information than just uniforms and attributes (we’ll be re-rendering the scene later to produce a normal buffer), we will be creating a custom render pass.

Creating a custom render pass

While there isn’t much documentation currently available on how to write your own custom post-processing pass in Three.js, there are plenty of examples to learn from inside the library itself. A custom pass inherits from the Pass class and has three methods: setSize, render, and dispose. As you may have guessed, we’ll mostly be focusing on the render method.

We’ll start by creating our own PencilLinesPass that extends the Pass class; we’ll implement our own rendering logic in it later.

import { Pass, FullScreenQuad } from 'three/examples/jsm/postprocessing/Pass'
import * as THREE from 'three'

export class PencilLinesPass extends Pass {
	constructor() {
		super()
	}

	render(
		renderer: THREE.WebGLRenderer,
		writeBuffer: THREE.WebGLRenderTarget,
		readBuffer: THREE.WebGLRenderTarget
	) {
		if (this.renderToScreen) {
			renderer.setRenderTarget(null)
		} else {
			renderer.setRenderTarget(writeBuffer)
			if (this.clear) renderer.clear()
		}
	}
}

As you can see, the render method takes in a WebGLRenderer and two WebGLRenderTargets, one for the write buffer and the other for the read buffer. In Three.js, render targets are essentially textures to which the scene can be rendered, and they serve to send data between passes. The read buffer receives data from the previous render pass (in our case, the default RenderPass), and the write buffer sends data out to the next render pass.

If renderToScreen is true, that means we want to send our buffer to the screen instead of to a render target. The renderer’s render target is set to null, so it defaults to the on-screen canvas.

At this point, we’re not actually rendering anything, not even the data coming in through the readBuffer. In order to get things rendered, we’ll need to create a FullScreenQuad and a shader material that takes care of rendering. The shader material gets rendered onto the FullScreenQuad.

To test that everything is set up correctly, we can use the built-in CopyShader, which displays whatever image we put into it: in this case, the readBuffer’s texture.

import { Pass, FullScreenQuad } from 'three/examples/jsm/postprocessing/Pass'
import { CopyShader } from 'three/examples/jsm/shaders/CopyShader'
import * as THREE from 'three'

export class PencilLinesPass extends Pass {
	fsQuad: FullScreenQuad
	material: THREE.ShaderMaterial

	constructor() {
		super()

		this.material = new THREE.ShaderMaterial(CopyShader)
		this.fsQuad = new FullScreenQuad(this.material)
	}

	dispose() {
		this.material.dispose()
		this.fsQuad.dispose()
	}

	render(
		renderer: THREE.WebGLRenderer,
		writeBuffer: THREE.WebGLRenderTarget,
		readBuffer: THREE.WebGLRenderTarget
	) {
		this.material.uniforms['tDiffuse'].value = readBuffer.texture

		if (this.renderToScreen) {
			renderer.setRenderTarget(null)
			this.fsQuad.render(renderer)
		} else {
			renderer.setRenderTarget(writeBuffer)
			if (this.clear) renderer.clear()
			this.fsQuad.render(renderer)
		}
	}
}

Note: we’re passing the uniform tDiffuse to the shader material. The CopyShader already has this uniform built in, and it represents the image to be displayed on the screen. If you’re writing your own ShaderPass, this uniform will be passed to your shader automatically.

All that’s left is to hook the custom render pass into the scene by adding it to the EffectComposer—after the RenderPass of course!

const renderPass = new RenderPass(scene, camera)
const pencilLinesPass = new PencilLinesPass()

composer.addPass(renderPass)
composer.addPass(pencilLinesPass)

View the Codesandbox example

Scene with a custom render pass and the CopyShader

Now that we have everything set up, we can actually get started with creating our special effect!

Sobel operator for creating outlines

We need to be able to tell the computer to detect edges in our input image, in this case the rendered scene. The kind of edge detection we’ll be using is called the Sobel operator, and it consists of only a few steps.

The Sobel operator does edge detection by looking at the gradient of a small section of the image: essentially, how sharp the transition from one value to another is. The image is processed in 3px by 3px squares (the same size as the Sobel kernels), where the central pixel is the one currently being processed. The image below shows what this might look like: the red square in the center represents the current pixel being evaluated, and the rest of the squares are its neighbors.

3px by 3px kernel

The weighted value for each neighbor is then calculated by taking the pixel’s value (brightness) and multiplying it by a weight based on its position relative to the pixel being evaluated. This is done twice, once with weights biasing the gradient horizontally and once vertically. The two gradient values are then combined into a single magnitude, and if it passes a certain threshold, we consider the pixel to represent an edge.

The horizontal and vertical gradients for the Sobel operator
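
For reference, the two kernels pictured above are the standard Sobel kernels, and the gradient values they produce are combined into a single magnitude:

$$
G_x = \begin{pmatrix} -1 & 0 & +1 \\ -2 & 0 & +2 \\ -1 & 0 & +1 \end{pmatrix} \qquad
G_y = \begin{pmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ +1 & +2 & +1 \end{pmatrix} \qquad
G = \sqrt{G_x^2 + G_y^2}
$$

(The shader later in this tutorial skips the square root and thresholds the squared magnitude instead, which picks out the same edges for an adjusted threshold.)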

While the implementation of the Sobel operator follows the image representations above almost directly, it still takes time to grasp. Thankfully, we don’t have to implement our own: Three.js already provides working code in the SobelOperatorShader. We’ll copy this code over into our shader material.

Implementing the Sobel operator

Instead of the CopyShader, we’ll now need to add our own ShaderMaterial so that we have control over the vertex and fragment shaders, as well as the uniforms sent to those shaders.

// PencilLinesMaterial.ts
import * as THREE from 'three'

export class PencilLinesMaterial extends THREE.ShaderMaterial {
	constructor() {
		super({
			uniforms: {
				// we'll keep the naming convention here since the CopyShader
				// also used a tDiffuse texture for the currently rendered scene.
				tDiffuse: { value: null },
				// we'll pass in the canvas size here later
				uResolution: {
					value: new THREE.Vector2(1, 1)
				}
			},
			fragmentShader, // to be imported from another file
			vertexShader // to be imported from another file
		})
	}
}

We’ll get to the fragment and vertex shaders soon, but first we need to use our new shader material in the scene. We do this by swapping out the CopyShader. Don’t forget to pass the resolution (the canvas size) as a uniform to the shader. It’s also important to update this uniform whenever the canvas resizes; a minimal sketch of that follows the code below.

// PencilLinesPass.ts
export class PencilLinesPass extends Pass {
	fsQuad: FullScreenQuad
	material: PencilLinesMaterial

	constructor({ width, height }: { width: number; height: number }) {
		super()
		
		// swap the material from the CopyShader to our new PencilLinesMaterial
		this.material = new PencilLinesMaterial() 
		this.fsQuad = new FullScreenQuad(this.material)

		// set the uResolution uniform with the current canvas's width and height
		this.material.uniforms.uResolution.value = new THREE.Vector2(width, height)
	}
}
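
Handling window resizes in full is outside the scope of this tutorial, but a minimal sketch might look like the following, assuming the renderer, composer, and pencilLinesPass created earlier are in scope (updating the camera’s aspect ratio is omitted here):

window.addEventListener('resize', () => {
	const width = window.innerWidth
	const height = window.innerHeight

	// keep the renderer and composer in sync with the new canvas size
	renderer.setSize(width, height)
	composer.setSize(width, height)

	// update the resolution uniform used by the edge detection shader
	pencilLinesPass.material.uniforms.uResolution.value.set(width, height)
})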

Next, we can start on the vertex and fragment shaders.

The vertex shader doesn’t do much except set the gl_Position value and pass the uv attribute along to the fragment shader. Because we’re rendering our image to a FullScreenQuad, the uv information corresponds to the position of any given fragment on the screen.

// vertex shader
varying vec2 vUv;

void main() {

    vUv = uv;

    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}

The fragment shader is a fair bit more complicated, so let’s break it down piece by piece. First we want to implement the Sobel operator using the implementation already provided by Three.js. The only difference is that we want to control how we calculate the value at each pixel, since we’ll be introducing edge detection on the normal buffer as well.

float combinedSobelValue() {
    // kernel definition (in glsl matrices are filled in column-major order)
    const mat3 Gx = mat3(-1, -2, -1, 0, 0, 0, 1, 2, 1);// x direction kernel
    const mat3 Gy = mat3(-1, 0, 1, -2, 0, 2, -1, 0, 1);// y direction kernel

    // fetch the 3x3 neighbourhood of a fragment

    // first column
    float tx0y0 = getValue(-1, -1);
    float tx0y1 = getValue(-1, 0);
    float tx0y2 = getValue(-1, 1);

    // second column
    float tx1y0 = getValue(0, -1);
    float tx1y1 = getValue(0, 0);
    float tx1y2 = getValue(0, 1);

    // third column
    float tx2y0 = getValue(1, -1);
    float tx2y1 = getValue(1, 0);
    float tx2y2 = getValue(1, 1);

    // gradient value in x direction
    float valueGx = Gx[0][0] * tx0y0 + Gx[1][0] * tx1y0 + Gx[2][0] * tx2y0 +
    Gx[0][1] * tx0y1 + Gx[1][1] * tx1y1 + Gx[2][1] * tx2y1 +
    Gx[0][2] * tx0y2 + Gx[1][2] * tx1y2 + Gx[2][2] * tx2y2;

    // gradient value in y direction
    float valueGy = Gy[0][0] * tx0y0 + Gy[1][0] * tx1y0 + Gy[2][0] * tx2y0 +
    Gy[0][1] * tx0y1 + Gy[1][1] * tx1y1 + Gy[2][1] * tx2y1 +
    Gy[0][2] * tx0y2 + Gy[1][2] * tx1y2 + Gy[2][2] * tx2y2;

    // squared magnitude of the total gradient
    float G = (valueGx * valueGx) + (valueGy * valueGy);
    return clamp(G, 0.0, 1.0);
}

We pass the offset from the current pixel into the getValue function, identifying which pixel in the kernel we are looking at to get that value. For the time being, only the value of the diffuse buffer is being evaluated; we’ll add the normal buffer in the next step.

float valueAtPoint(sampler2D image, vec2 coord, vec2 texel, vec2 point) {
    vec3 luma = vec3(0.299, 0.587, 0.114);

    return dot(texture2D(image, coord + texel * point).xyz, luma);
}

float diffuseValue(int x, int y) {
    return valueAtPoint(tDiffuse, vUv, vec2(1.0 / uResolution.x, 1.0 / uResolution.y), vec2(x, y)) * 0.6;
}

float getValue(int x, int y) {
    return diffuseValue(x, y);
}

The valueAtPoint function takes any texture (diffuse or normal) and returns the grayscale value at a specified point. The luma vector is used to calculate the perceived brightness of a color, turning the color into grayscale (for example, pure red (1, 0, 0) maps to 0.299, while pure white maps to 1.0). The implementation comes from glsl-luma.

Because the getValue function only takes the diffuse buffer into account, any edge in the scene will be detected, including edges created by both cast shadows and core shadows. It also means that edges we would intuitively expect, such as the outlines of objects, can get missed if they blend in too well with their surroundings. To capture those missing edges, we’ll add edge detection from the normal buffer next.

Finally, we call the Sobel operator in the main function like this:

void main() {
    float sobelValue = combinedSobelValue();
    sobelValue = smoothstep(0.01, 0.03, sobelValue);

    vec4 lineColor = vec4(0.32, 0.12, 0.2, 1.0);

    if (sobelValue > 0.1) {
        gl_FragColor = lineColor;
    } else {
        gl_FragColor = vec4(1.0);
    }
}

View the Codesandbox example

Rendered scene with edge detection using the Sobel operator

Creating a normal buffer rendering

For proper outlines, the Sobel operator is often applied to the normals of the scene and the depth buffer, so the outlines of objects are captured, but not lines within the object. Omar Shehata describes such a method in his excellent How to render outlines in WebGL tutorial. For the purposes of a sketchy pencil effect, we don’t need complete edge detection, but we do want to use normals for more complete edges and for sketchy shading effects later.

Since the normal is a vector that represents the direction of an object’s surface at each point, it is often encoded as a color, producing an image that contains all the normal data of the scene. This image is the “normal buffer.”

In order to create a normal buffer, first we need a new render target in the PencilLinesPass constructor. We also need to create a single MeshNormalMaterial on the class, since we’ll be using this to override the scene’s default materials when rendering the normal buffer.

// render target for the normal buffer, matching the canvas size
const normalBuffer = new THREE.WebGLRenderTarget(width, height)

normalBuffer.texture.format = THREE.RGBAFormat
normalBuffer.texture.type = THREE.HalfFloatType
// nearest filtering and no mipmaps, since we sample the buffer 1:1 at screen resolution
normalBuffer.texture.minFilter = THREE.NearestFilter
normalBuffer.texture.magFilter = THREE.NearestFilter
normalBuffer.texture.generateMipmaps = false
normalBuffer.stencilBuffer = false
this.normalBuffer = normalBuffer

// used to override the scene's materials when rendering the normal buffer
this.normalMaterial = new THREE.MeshNormalMaterial()

In order to render the scene inside the pass, the render pass actually needs a reference to the scene and to the camera. We’ll need to send those through the constructor of the render pass as well.

// PencilLinesPass.ts constructor
constructor({ ..., scene, camera}: { ...; scene: THREE.Scene; camera: THREE.Camera }) {
	super()
	this.scene = scene
	this.camera = camera
    ...
}
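
Assembled, the whole constructor now looks roughly like this (a sketch combining the snippets above; scene, camera, normalBuffer, and normalMaterial are fields declared on the class):

// PencilLinesPass.ts – the assembled constructor
constructor({
	width,
	height,
	scene,
	camera
}: {
	width: number
	height: number
	scene: THREE.Scene
	camera: THREE.Camera
}) {
	super()

	this.scene = scene
	this.camera = camera

	this.material = new PencilLinesMaterial()
	this.fsQuad = new FullScreenQuad(this.material)
	this.material.uniforms.uResolution.value = new THREE.Vector2(width, height)

	// the normal buffer render target and override material from above
	const normalBuffer = new THREE.WebGLRenderTarget(width, height)
	normalBuffer.texture.format = THREE.RGBAFormat
	normalBuffer.texture.type = THREE.HalfFloatType
	normalBuffer.texture.minFilter = THREE.NearestFilter
	normalBuffer.texture.magFilter = THREE.NearestFilter
	normalBuffer.texture.generateMipmaps = false
	normalBuffer.stencilBuffer = false
	this.normalBuffer = normalBuffer

	this.normalMaterial = new THREE.MeshNormalMaterial()
}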

Inside the pass’s render method, we want to re-render the scene with the normal material overriding the default materials. We set the renderTarget to the normalBuffer and render the scene with the WebGLRenderer as we normally would. The only difference is that instead of rendering to the screen with the scene’s default materials, the renderer renders to our render target with the normal material. Then we pass the normalBuffer.texture to the shader material.

renderer.setRenderTarget(this.normalBuffer)
const overrideMaterialValue = this.scene.overrideMaterial

this.scene.overrideMaterial = this.normalMaterial
renderer.render(this.scene, this.camera)
this.scene.overrideMaterial = overrideMaterialValue

this.material.uniforms.uNormals.value = this.normalBuffer.texture
this.material.uniforms.tDiffuse.value = readBuffer.texture
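
Putting it all together, the pass’s render method now performs three steps: render the normal buffer, update the uniforms, and draw the full-screen quad. A sketch assembled from the snippets above:

render(
	renderer: THREE.WebGLRenderer,
	writeBuffer: THREE.WebGLRenderTarget,
	readBuffer: THREE.WebGLRenderTarget
) {
	// 1. render the scene's normals into the normal buffer
	renderer.setRenderTarget(this.normalBuffer)
	const overrideMaterialValue = this.scene.overrideMaterial

	this.scene.overrideMaterial = this.normalMaterial
	renderer.render(this.scene, this.camera)
	this.scene.overrideMaterial = overrideMaterialValue

	// 2. send both buffers to the shader material
	this.material.uniforms.uNormals.value = this.normalBuffer.texture
	this.material.uniforms.tDiffuse.value = readBuffer.texture

	// 3. draw the full-screen quad to the screen or to the write buffer
	if (this.renderToScreen) {
		renderer.setRenderTarget(null)
		this.fsQuad.render(renderer)
	} else {
		renderer.setRenderTarget(writeBuffer)
		if (this.clear) renderer.clear()
		this.fsQuad.render(renderer)
	}
}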

If at this point you were to set gl_FragColor to the value of the normal buffer with texture2D(uNormals, vUv), this would be the result:

Normal buffer of the current scene

Instead, inside the custom material’s fragment shader, we want to modify the getValue function to include the Sobel operator value from the normal buffer as well.

float normalValue(int x, int y) {
    return valueAtPoint(uNormals, vUv, vec2(1.0 / uResolution.x, 1.0 / uResolution.y), vec2(x, y)) * 0.3;
}

float getValue(int x, int y) {
    return diffuseValue(x, y) + normalValue(x, y);
}
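
For the uNormals sampler to exist in the shader, it also needs to be declared both in the material’s uniforms and in the fragment shader (as uniform sampler2D uNormals;). A sketch of the updated uniforms in the PencilLinesMaterial constructor:

// PencilLinesMaterial.ts – updated uniforms
uniforms: {
	tDiffuse: { value: null },
	// the normal buffer texture, set by the pass on every frame
	uNormals: { value: null },
	uResolution: {
		value: new THREE.Vector2(1, 1)
	}
},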

The result will look similar to the previous step, but we will be able to add additional noise and sketchiness with this normal data in the next step.

View the Codesandbox example

Sobel operator applied to the diffuse and normal buffers

Adding generated and textured noise for shading and squiggles

There are two ways to bring noise into the post-processing effect at this point:

  1. By procedurally generating the noise within the shader, or
  2. By using an existing image with noise and applying it as a texture.

Both provide different levels of flexibility and control. For the noise function, I’ve gone with Inigo Quilez’s gradient noise implementation, since it provides nice uniformity in the noise when applied to the “shading” effect.

This noise function is called when getting the value of the Sobel operator and is specifically applied to the normal value, so the getValue function in the fragment shader changes like so:

float getValue(int x, int y) {
    float noiseValue = noise(gl_FragCoord.xy);
    noiseValue = noiseValue * 2.0 - 1.0;
    noiseValue *= 10.0;

    return diffuseValue(x, y) + normalValue(x, y) * noiseValue;
}

The result is a textured pencil line, plus a stippled effect along object curves where the normal vector values change. Notice that flat objects, like the plane, don’t get these effects, since they have no variation in normal values.

The next and final step of this effect is to add distortion to the lines. For this I used a texture file created in Photoshop using the Render Clouds effect.

Generated clouds texture created in Photoshop

The cloud texture is passed to the shader through a uniform, the same way that the diffuse and normal buffers are. Once the shader has access to the texture, we can sample the texture for each fragment and use it to offset the location we’re reading from in the buffer. Essentially, we get the squiggled line effect by distorting the image we’re reading from, not by drawing to a different place. Because the texture’s noise is smooth, the lines don’t come out jagged and irregular.

float normalValue(int x, int y) {
    float cutoff = 50.0;
    float offset = 0.5 / cutoff;
    float noiseValue = clamp(texture(uTexture, vUv).r, 0.0, cutoff) / cutoff - offset;

    return valueAtPoint(uNormals, vUv + noiseValue, vec2(1.0 / uResolution.x, 1.0 / uResolution.y), vec2(x, y)) * 0.3;
}
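
On the TypeScript side, the cloud texture can be loaded and passed to the material like any other uniform. A minimal sketch, assuming a hypothetical file name cloud-texture.jpg and a uTexture uniform declared on the material alongside the others:

// inside the PencilLinesPass constructor – the file name here is hypothetical
const loader = new THREE.TextureLoader()
this.material.uniforms.uTexture.value = loader.load('cloud-texture.jpg')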

You can also play around with how the effect is applied to each buffer individually. This can lead to lines being offset from one another, giving an even better hand-drawn effect.

View the Codesandbox example

Final effect including normal buffer based “shading” and line distortion

Conclusion

There are many techniques for creating hand-drawn or sketchy effects in 3D; this tutorial covers just a few. From here, there are multiple ways to go forward. You could adjust the line thickness by modulating the edge threshold based on a noise texture. You could also apply the Sobel operator to the depth buffer, ignoring the diffuse buffer entirely, to get outlined objects without outlined shadows. You could add generated noise based on lighting information in the scene instead of the normals of the objects. The possibilities are endless, and I hope this tutorial has inspired you to pursue them!