
Implementing a "sketch" style of rendering in WebGL

Outlines are a component of many non-photorealistic rendering styles. We tested a few different ways to achieve them.

Normals

We found that convolving the normal buffer with a Sobel filter produced confusing results. It fails in cases such as two walls at differing depths that share the same normal. Due to aliasing, the detected lines are broken up, which results in broken, unstable outlines.
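
For reference, a minimal sketch of this kind of Sobel pass over the normal buffer is below. The uNormals and uTexelSize uniforms are assumed names (a texture of encoded normals and the reciprocal of the resolution), so the details may differ from the pass we actually tested.

uniform sampler2D uNormals;   // assumed: normals encoded into [0, 1] RGB
uniform vec2 uTexelSize;      // assumed: 1.0 / framebuffer resolution
varying vec2 vUV;

vec3 sampleNormal(const in vec2 offset) {
  // Decode a stored normal from [0, 1] back to [-1, 1].
  return texture2D(uNormals, vUV + offset * uTexelSize).xyz * 2.0 - 1.0;
}

void main() {
  // 3x3 neighborhood of normals around the current fragment.
  vec3 nw = sampleNormal(vec2(-1.0,  1.0));
  vec3 n  = sampleNormal(vec2( 0.0,  1.0));
  vec3 ne = sampleNormal(vec2( 1.0,  1.0));
  vec3 w  = sampleNormal(vec2(-1.0,  0.0));
  vec3 e  = sampleNormal(vec2( 1.0,  0.0));
  vec3 sw = sampleNormal(vec2(-1.0, -1.0));
  vec3 s  = sampleNormal(vec2( 0.0, -1.0));
  vec3 se = sampleNormal(vec2( 1.0, -1.0));

  // Horizontal and vertical Sobel gradients, applied per channel of the normal.
  vec3 gx = (ne + 2.0 * e + se) - (nw + 2.0 * w + sw);
  vec3 gy = (nw + 2.0 * n + ne) - (sw + 2.0 * s + se);
  float edge = length(gx) + length(gy);

  gl_FragColor = vec4(vec3(1.0 - edge), 1.0);
}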

Depth

Using depth alone is also insufficient, since it can produce extraneous edge shading on flat but foreshortened surfaces.

Planes

A plane-distance metric produced more usable results than computing the difference in normals or the difference in depths. Between points A and B, this measures the distance from A's position in space (reconstructed from depth) to the plane defined by B's normal (or vice versa). This yields edge detection fairly similar to change in depth, but it also incorporates change in normals and excludes the extraneous shading on flat surfaces.

// Distance between two samples measured along their normals: for points A and B,
// take the larger of |(B - A) . nA| and |(B - A) . nB|, i.e. the distance from
// each point to the plane through the other.
float planeDistance(const in vec3 positionA, const in vec3 normalA,
                    const in vec3 positionB, const in vec3 normalB) {
  vec3 positionDelta = positionB - positionA;
  float planeDistanceDelta = max(abs(dot(positionDelta, normalA)), abs(dot(positionDelta, normalB)));
  return planeDistanceDelta;
}

void main() {
  float depthCenter = decodeGBufferDepth(camera_uGBuffer, vUV, camera_uClipFar);  
  // ...
  // get positions and normals at cross neighborhood
  // ...

  vec2 planeDist = vec2(
    planeDistance(posWest, geomWest.normal, posEast, geomEast.normal),
    planeDistance(posNorth, geomNorth.normal, posSouth, geomSouth.normal)
  );

  float edge = 240.0 * length(planeDist);
  edge = smoothstep(0.0, depthCenter, edge);

  gl_FragColor = vec4(vec3(1.0 - edge), 1.0);
}

Line-finding was not necessary in our case, as long as we chose an appropriate weighting to bring out the edges. We alpha-composite the Sobel-filtered result into the output buffer. A smoothstep operation on the edge darkness value improves clarity by boosting blacks and whites and reducing gray.
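
Purely as an illustration, a composite pass along these lines might look like the following. The uScene and uEdges samplers are assumed names, and the 0.25/0.75 smoothstep thresholds are placeholder values rather than the weighting we actually use.

uniform sampler2D uScene;   // assumed: the already-shaded frame
uniform sampler2D uEdges;   // assumed: edge buffer, 0.0 = strong edge, 1.0 = no edge
varying vec2 vUV;

void main() {
  vec3 scene = texture2D(uScene, vUV).rgb;
  float edge = texture2D(uEdges, vUV).r;

  // Push edge values toward pure black or pure white to cut down on gray.
  float crisp = smoothstep(0.25, 0.75, edge);

  // Composite the edges over the scene: a crisp value of 0.0 paints the
  // fragment black, a value of 1.0 leaves the shaded scene untouched.
  gl_FragColor = vec4(mix(vec3(0.0), scene, crisp), 1.0);
}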

Shading

Our pipeline has several options for adding 3D shading and depth:

  1. Scalable Ambient Obscurance: discussed in a previous post, this generally darkens areas where geometry changes sharply.
  2. Normals-based shading: this differentiates the three cardinal wall directions. By comparing each face's normal to a predetermined light direction, the face is assigned an appropriate shade of gray (a sketch follows this list). Architectural sketches often have a dark floor, so our shading replicates this with dark ceilings and floors.
  3. Materials: if materials are assigned, we can use the color and/or normal maps.
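
A minimal sketch of the normals-based option (2) is below; uNormals, uLightDirection, and the specific shade and threshold constants are illustrative assumptions, not our production values.

uniform sampler2D uNormals;     // assumed: world-space normals encoded into [0, 1]
uniform vec3 uLightDirection;   // assumed: predetermined light direction, normalized
varying vec2 vUV;

void main() {
  vec3 normal = texture2D(uNormals, vUV).xyz * 2.0 - 1.0;

  // Each of the three cardinal wall directions makes a different angle with the
  // light, so each wall orientation lands on its own shade of gray.
  float shade = 0.5 + 0.5 * max(dot(normal, uLightDirection), 0.0);

  // Darken surfaces facing straight up or down, echoing the dark floors (and,
  // in our case, ceilings) of architectural sketches.
  if (abs(normal.y) > 0.9) {
    shade *= 0.6;
  }

  gl_FragColor = vec4(vec3(shade), 1.0);
}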

Real-Time Hatching

We implemented a cross-hatching shader based on Microsoft Research's Real-Time Hatching paper. The basic idea is to apply a different hatch texture to a fragment depending on its tonal value. The paper uses six different hatch and crosshatch textures.

Blending

Initially, we assumed six steps of value, one for each of the six textures. At each step, we blended two textures so that we would get a nice gradient across objects.

However, we found that this led to precision issues along the borders between sets of textures, producing very nasty artifacts. Instead, we decided to always blend all six textures (even if the weight of a particular texture is sometimes 0.0). This simple change vastly improved our results.
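
A sketch of this always-blend-everything approach is below. The uHatch* samplers, the tone parameter, and the tent-shaped weighting are assumptions for illustration; the tonal-art-map lookup in the paper is more involved.

uniform sampler2D uHatch0;   // assumed: lightest hatch texture
uniform sampler2D uHatch1;
uniform sampler2D uHatch2;
uniform sampler2D uHatch3;
uniform sampler2D uHatch4;
uniform sampler2D uHatch5;   // assumed: darkest crosshatch texture

// Tent-shaped weight: 1.0 when the fragment's tone sits exactly on this
// texture's level, falling linearly to 0.0 one level away. Every texture gets
// a weight on every fragment, even if that weight is 0.0, which is what
// removed the border artifacts between pairs of textures.
float hatchWeight(const in float tone, const in float level) {
  return clamp(1.0 - abs(tone * 5.0 - level), 0.0, 1.0);
}

vec3 hatchColor(const in vec2 hatchUV, const in float tone) {
  vec3 color = vec3(0.0);
  color += texture2D(uHatch0, hatchUV).rgb * hatchWeight(tone, 0.0);
  color += texture2D(uHatch1, hatchUV).rgb * hatchWeight(tone, 1.0);
  color += texture2D(uHatch2, hatchUV).rgb * hatchWeight(tone, 2.0);
  color += texture2D(uHatch3, hatchUV).rgb * hatchWeight(tone, 3.0);
  color += texture2D(uHatch4, hatchUV).rgb * hatchWeight(tone, 4.0);
  color += texture2D(uHatch5, hatchUV).rgb * hatchWeight(tone, 5.0);
  return color;
}

With a tone derived from the shading passes above (0.0 lightest, 1.0 darkest), a fragment whose tone falls between two levels blends exactly those two textures, while the other four contribute zero weight.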
