This blog series is a part of the write-up assignments for my Real-Time Game Rendering class in the Master of Entertainment Arts & Engineering program at the University of Utah. The series will focus on C++, the Direct3D 11 API, and HLSL.

In this post, I will talk about the shaders that I created.

Fixed Pattern Shader

The first shader should display a pattern that is not affected by the mesh's location in the world, which means the pattern needs to be determined by the local coordinates of each fragment (and therefore each vertex).

My first attempt basically converted the local coordinates into spherical coordinates. The result is shown below. Gotta say, I think the deer looks pretty awesome.
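A minimal sketch of what that conversion can look like in the fragment shader (the input/output names like i_position_local are assumptions based on common conventions, not my exact code):

```hlsl
// Convert the interpolated local position into spherical coordinates
// and color the fragment in bands along the polar angle.
float3 p = normalize(i_position_local.xyz);
float theta = acos(p.z);          // polar angle in [0, PI]
float phi = atan2(p.y, p.x);      // azimuthal angle in [-PI, PI]

// Alternate between two colors based on a sine of the polar angle
float stripe = step(0.0, sin(theta * 10.0));
o_color.rgb = lerp(float3(0.9, 0.6, 0.2), float3(0.1, 0.1, 0.1), stripe);
```

Because the input is the local position, the bands stay glued to the mesh no matter where it moves.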

SphereStripeShader.PNG
Stripe pattern using spherical coordinates

However, I wanted to create a pattern more similar to what the teacher showed us. Later on, I realized that my sphere mesh is simply too small, which is why a condition like cos(x) > 0.5 to determine the color actually affects the whole sphere at once.

We can easily scale up the frequency of the cosine wave to achieve the effect I want, and it won't be affected by the world location.
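As a rough sketch (the frequency value here is illustrative, not tuned):

```hlsl
// A higher frequency makes the cosine wave repeat many times across
// the small mesh instead of affecting the whole sphere at once.
static const float frequency = 20.0;
o_color.rgb = cos(i_position_local.x * frequency) > 0.5
    ? float3(1.0, 1.0, 1.0)
    : float3(0.0, 0.0, 0.0);
```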

However, since the resolution of the mesh is so low, some obvious artifacts started to show. On the side of the sphere, you can clearly see the triangles of the mesh. This is what happens when I calculate the result in the vertex shader and the fragment (pixel) shader only works with interpolated values. It is faster, though, since the calculation runs only once per vertex.

SphereTriArtifact.PNG
Stripe pattern calculated in vertex shader

If we want a smoother look, we can pass the vertex position on to the fragment shader and let it handle the calculation. Of course, this is more expensive, since the calculation now runs per fragment.
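Sketched out, the split looks roughly like this (the interpolant and semantic names are assumptions, not my exact engine code):

```hlsl
// Vertex shader: forward the raw local position through an interpolant
o_position_local = i_position_local;   // e.g. bound to TEXCOORD0

// Fragment shader: evaluate the stripe per fragment instead of per vertex,
// so the pattern is smooth even on a low-resolution mesh
float wave = cos(i_position_local.x * 20.0);
o_color.rgb = wave > 0.5 ? float3(1.0, 1.0, 1.0) : float3(0.0, 0.0, 0.0);
```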

SphereStripeWithFrag.PNG
Stripe pattern calculated in fragment shader

Changing Pattern Shader

Next, we’re gonna create a shader which can give changing colors when the object moves through space. This can be achieved with basically the exact same shader with one small change. Instead of passing the “local” coordinate to the fragment shader, we can pass the “world” coordinate after the position has been transformed into world space.

Pulsing Shader

What do we have to do to make a pulsing effect on a mesh? It turns out we can easily use sin/cos to create a scaling matrix and transform the position of each vertex in local coordinates, which gives us a pulsing effect.

// Apply pulsing scale in local space
{
    const float scaleFac = 0.25;
    const float scaleFreq = 2.0;
    // Compute the scale once instead of repeating it per diagonal entry
    float scale = 1.0 + sin(g_elapsedSecondCount_simulationTime * scaleFreq) * scaleFac;
    float4x4 scaleMat = float4x4(
        scale, 0.0, 0.0, 0.0,
        0.0, scale, 0.0, 0.0,
        0.0, 0.0, scale, 0.0,
        0.0, 0.0, 0.0, 1.0
    );
    vertexPosition_local = Transform(scaleMat, vertexPosition_local);
}

UPDATE (01/22):

Actually, using a matrix transformation is overkill. I could just use a simple multiplication and it works fine!

// for uniform scaling
vertexPosition_local.xyz *= 1 + sin(g_elapsedSecondCount_simulationTime * scaleFreq) * scaleFac;

// for non-uniform scaling
float wave = g_elapsedSecondCount_simulationTime * scaleFreq;
vertexPosition_local.xyz *= float3(
    1 + sin(wave) * scaleFac,
    1 + cos(wave) * scaleFac,
    1 - sin(wave) * scaleFac
);

Moreover, we can do something crazy like applying different waves, different frequencies, or phase offsets to create some interesting effects.
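For example, a non-uniform pulse with a different frequency and phase per axis might look like this (the specific values are just illustrative):

```hlsl
float t = g_elapsedSecondCount_simulationTime;
vertexPosition_local.xyz *= float3(
    1 + sin(t * 2.0)       * 0.25,   // base wave on x
    1 + sin(t * 3.0 + 1.6) * 0.15,   // faster, phase-shifted wave on y
    1 + cos(t * 0.5)       * 0.30    // slow wave on z
);
```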

Dynamic Shader

Now we want to do something that changes dynamically according to where the camera is relative to the mesh. The easiest effect to show is changing the color of the fragments when the camera is close enough.
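A minimal sketch of that idea in the fragment shader (g_camera_position, baseColor, and the falloff distance are assumptions, not my exact uniforms):

```hlsl
// Blend toward a highlight color as the camera gets closer to the fragment
float dist = distance(i_position_world.xyz, g_camera_position);
float proximity = saturate(1.0 - dist / 10.0);   // no highlight beyond 10 units
o_color.rgb = lerp(baseColor, float3(1.0, 0.0, 0.0), proximity);
```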

Maya Color as Mask

Now, let’s try to use the rgb color exported with our Maya model as a mask instead of actual color. We can use this to combine multiple different shader effects. To achieve this, I created a sphere inside Maya with roughly separated areas that contain only red, green, and blue accordingly.

ColorSphere.PNG
Maya sphere model

Now, I can combine the color calculations from the previous shaders (local pattern, world pattern, and dynamic color) into one single fragment shader and map each of them to the corresponding channel. However, this is of course very expensive, since it is possible that only a third of the calculation actually matters. We can check each channel and skip its calculation if the value is under some threshold to save a little performance.

// Combine local pattern + world pattern + depth color
o_color.rgb = patternLColor * i_color.g + patternWColor * i_color.r + dColor * i_color.b;
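The threshold idea could be sketched like this (the Compute* helpers are hypothetical stand-ins for the three color calculations above):

```hlsl
static const float threshold = 0.01;
float3 result = float3(0.0, 0.0, 0.0);
// Skip a pattern entirely when its mask channel barely contributes
if (i_color.g > threshold) result += ComputeLocalPattern() * i_color.g;
if (i_color.r > threshold) result += ComputeWorldPattern() * i_color.r;
if (i_color.b > threshold) result += ComputeDepthColor()   * i_color.b;
o_color.rgb = result;
```

Note that on a GPU this only helps when nearby fragments take the same branch; divergent branches within a wave still pay for both sides.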

The combined effect is shown below.