This blog series is part of the write-up assignments for my Real-Time Game Rendering class in the Master of Entertainment Arts & Engineering program at the University of Utah. The series will focus on C++, the Direct3D 11 API, and HLSL.

In this post, I will talk about the shaders that I created.

The first shader should contain a pattern that is not affected by the location of the mesh in the world, which means that the pattern needs to be determined by the local coordinates of each fragment (and therefore each vertex).

My first attempt basically converted the local coordinates into spherical coordinates, and the result is shown below. Gotta say, I think the deer looks pretty awesome.
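A minimal sketch of that conversion, assuming the pixel shader receives the local position as `i_position_local` and writes to `o_color` (the names are illustrative; the actual inputs in my project may differ):

```
// Convert the local-space position into spherical coordinates (r, theta, phi)
// and derive a color from the two angles.
float r = length(i_position_local.xyz);
// theta: polar angle measured from the +y axis, in [0, pi]
float theta = acos(i_position_local.y / max(r, 0.0001));
// phi: azimuthal angle around the y axis, in [-pi, pi]
float phi = atan2(i_position_local.z, i_position_local.x);
// Map both angles into [0, 1] and use them as color channels
o_color.rgb = float3(theta / 3.14159265, (phi / 3.14159265) * 0.5 + 0.5, 1.0);
```

Because the angles come from local coordinates, the pattern sticks to the mesh no matter where it moves in the world.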

However, I wanted to create a pattern closer to what the teacher showed us, so I moved away from spherical coordinates. Later on, I realized that my sphere mesh is very small, which is why a simple condition like cos(x) > 0.5 to determine the color ends up affecting the whole sphere.

We can easily scale up the frequency of the cosine waves to achieve the effect that I want, and it still won't be affected by the world location.
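A sketch of the frequency-scaled version (again assuming illustrative names like `i_position_local` and `o_color`):

```
// Stripe pattern from local coordinates; scaling up the frequency
// makes the bands narrow enough to repeat across a small mesh.
const float frequency = 10.0; // higher value = more, thinner stripes
if (cos(i_position_local.x * frequency) > 0.5)
{
    o_color.rgb = float3(1.0, 1.0, 1.0);
}
else
{
    o_color.rgb = float3(0.0, 0.0, 0.0);
}
```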

However, since the resolution of the mesh is so low, some obvious artifacts started to show. On the side of the sphere, you can clearly see the triangles of the mesh. This is what happens when I calculate the result in my vertex shader and let the fragment (pixel) shader work only with the interpolated values. It is faster, though, since the pattern is computed only once per vertex.

If we want a smoother look, we can pass the vertex position on to the fragment shader and let it handle the calculation. But of course, this is much more expensive, since the pattern is now calculated per fragment.
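The per-fragment version can be sketched like this, with a vertex-shader output that carries the raw local position through an interpolant (the variable names are assumptions, not my project's exact ones):

```
// Vertex shader: forward the unmodified local position to the pixel shader.
o_position_local = i_position_local;

// Pixel shader: evaluate the pattern per fragment using the
// interpolated local position, instead of interpolating a per-vertex result.
const float frequency = 10.0;
float pattern = cos(i_position_local.x * frequency);
o_color.rgb = (pattern > 0.5) ? float3(1.0, 1.0, 1.0) : float3(0.0, 0.0, 0.0);
```

The math is identical; the only difference is where it runs, which is exactly why the triangle artifacts disappear.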

Next, we're gonna create a shader that changes color as the object moves through space. This can be achieved with basically the exact same shader with one small change: instead of passing the "local" coordinates to the fragment shader, we pass the "world" coordinates after the position has been transformed into world space.
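In the vertex shader, the one-line change looks roughly like this (`g_transform_localToWorld` is an assumed name for the local-to-world constant; the `Transform` helper is the same one used later in this post):

```
// Transform the position into world space first, then forward the
// world-space position to the pixel shader instead of the local one.
float4 vertexPosition_world = Transform(g_transform_localToWorld, vertexPosition_local);
o_position_world = vertexPosition_world;
```

Since world coordinates change as the object moves, the same pattern math now slides across the mesh's surface.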

What do we have to do to make a pulsing effect on a mesh? It turns out we can easily use sin/cos to create a scaling matrix and transform the position of each vertex in local coordinates, and that gives us a pulsing effect.

```
// Apply scaling: a uniform pulse driven by simulation time
{
    const float scaleFac = 0.25;
    const float scaleFreq = 2.0;
    const float scale = 1.0 + sin(g_elapsedSecondCount_simulationTime * scaleFreq) * scaleFac;
    float4x4 scaleMat = float4x4(
        scale, 0.0, 0.0, 0.0,
        0.0, scale, 0.0, 0.0,
        0.0, 0.0, scale, 0.0,
        0.0, 0.0, 0.0, 1.0 );
    vertexPosition_local = Transform(scaleMat, vertexPosition_local);
}
```

UPDATE (01/22):

Actually, using a matrix transformation is overkill. I can just use a simple multiplication and it works fine!

```
// For uniform scaling:
vertexPosition_local.xyz *= 1.0 + sin(g_elapsedSecondCount_simulationTime * scaleFreq) * scaleFac;

// For non-uniform scaling:
vertexPosition_local.xyz *= float3(
    1.0 + sin(g_elapsedSecondCount_simulationTime * scaleFreq) * scaleFac,
    1.0 + cos(g_elapsedSecondCount_simulationTime * scaleFreq) * scaleFac,
    1.0 - sin(g_elapsedSecondCount_simulationTime * scaleFreq) * scaleFac );
```

Moreover, we can do something crazy like applying different waves, different frequencies, or phase offsets to create some interesting effects.
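For example, a per-axis mix of waves might look like this (the specific frequencies, amplitudes, and phase offsets are arbitrary choices for illustration):

```
// Different waves, frequencies, and phase offsets per axis
// produce a wobblier, more organic pulse than a uniform scale.
const float t = g_elapsedSecondCount_simulationTime;
vertexPosition_local.xyz *= float3(
    1.0 + sin(t * 2.0) * 0.25,        // base pulse on x
    1.0 + sin(t * 3.0 + 1.57) * 0.15, // faster, phase-shifted pulse on y
    1.0 + cos(t * 0.5) * 0.35 );      // slow, wide pulse on z
```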

Now we want to do something that dynamically changes depending on where the camera is. The easiest effect to show is changing the color of the fragments when the camera is close enough.
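A sketch of that idea in the pixel shader, assuming a constant named `g_eyePosition_world` holds the camera's world-space position and the interpolated world position arrives as `i_position_world` (both names are assumptions):

```
// Tint the fragment red when the camera is within a threshold distance,
// fading the tint out as the distance approaches the threshold.
const float threshold = 5.0;
float distanceToCamera = length(g_eyePosition_world - i_position_world.xyz);
if (distanceToCamera < threshold)
{
    o_color.rgb = lerp(float3(1.0, 0.0, 0.0), o_color.rgb,
                       distanceToCamera / threshold);
}
```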

```
// Combine local pattern + world pattern + depth color