In this article by Raimondas Pupius, the author of the book Mastering SFML Game Development, we will learn about normal maps and specular maps.
Lighting can be used to create visually complex and breathtaking scenes. One of the massive benefits of having a lighting system is the ability it provides to add extra detail to your scene that wouldn't have been possible otherwise. One way of doing so is using normal maps.
Mathematically speaking, the word "normal" in the context of a surface is simply a directional vector that is perpendicular to the said surface. Consider the following illustration:
In this case, the normal is facing up, because that is the direction perpendicular to the plane. How is this helpful? Well, imagine you have a really complex model with many vertices; it would be extremely taxing to render that model because of all the geometry that would need to be processed each frame. A clever trick to work around this, known as normal mapping, is to take the directional information of all of those vertices and save it in a texture that looks similar to this one:
It probably looks extremely funky, especially if you are viewing it in grayscale, but try not to think of it in terms of colors, but directions. The red channel of a normal map encodes the –x to +x range. The green channel does the same for the –y to +y range, and the blue channel covers –z to +z. Looking back at the previous image, it is now easier to tell which direction each individual pixel is facing. Using this information on geometry that is completely flat still allows us to light it in a way that makes it look like it has all of that detail; yet it remains flat and light on performance:
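To make the channel mapping concrete, here is a minimal, hypothetical C++ sketch (not code from the book's engine) that decodes an 8-bit RGB normal-map texel into a unit direction vector:

```cpp
#include <array>
#include <cmath>
#include <cstdint>

// Decode an 8-bit RGB normal-map texel into a direction vector.
// Each channel maps [0, 255] -> [-1.0, 1.0].
std::array<float, 3> decodeNormal(uint8_t r, uint8_t g, uint8_t b) {
    auto remap = [](uint8_t c) { return (c / 255.f) * 2.f - 1.f; };
    std::array<float, 3> n{ remap(r), remap(g), remap(b) };
    // Normalize to compensate for 8-bit quantization error.
    float len = std::sqrt(n[0]*n[0] + n[1]*n[1] + n[2]*n[2]);
    for (auto& v : n) { v /= len; }
    return n;
}
```

The typical "flat" normal-map color (128, 128, 255) decodes to a vector pointing almost exactly along +z, straight out of the surface, which is why untouched areas of a normal map appear as that familiar light blue.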
These normal maps can be hand-drawn or simply generated using software such as Crazybump. Let's see how all of this can be done in our game engine.
In the case of the map, implementing normal map rendering is extremely simple. We already have all the material maps integrated and ready to go, so at this point it's simply a matter of sampling the tile sheet's normal texture:
void Map::Redraw(sf::Vector3i l_from, sf::Vector3i l_to) {
  ...
  if (renderer->UseShader("MaterialPass")) {
    // Material pass.
    auto shader = renderer->GetCurrentShader();
    auto textureName = m_tileMap.GetTileSet().GetTextureName();
    auto normalMaterial = m_textureManager->
      GetResource(textureName + "_normal");
    for (auto x = l_from.x; x <= l_to.x; ++x) {
      for (auto y = l_from.y; y <= l_to.y; ++y) {
        for (auto layer = l_from.z; layer <= l_to.z; ++layer) {
          auto tile = m_tileMap.GetTile(x, y, layer);
          if (!tile) { continue; }
          auto& sprite = tile->m_properties->m_sprite;
          sprite.setPosition(
            static_cast<float>(x * Sheet::Tile_Size),
            static_cast<float>(y * Sheet::Tile_Size));
          // Normal pass.
          if (normalMaterial) {
            shader->setUniform("material", *normalMaterial);
            renderer->Draw(sprite, &m_normals[layer]);
          }
        }
      }
    }
  }
  ...
}
The process is exactly the same as drawing a regular tile to the diffuse map, except that here we have to provide the material shader with the texture of the tile sheet's normal map. Also note that we're now drawing to the normal buffer texture.
The same is true for drawing entities as well:
void S_Renderer::Draw(MaterialMapContainer& l_materials,
  Window& l_window, int l_layer)
{
  ...
  if (renderer->UseShader("MaterialPass")) {
    // Material pass.
    auto shader = renderer->GetCurrentShader();
    auto entities = m_systemManager->GetEntityManager();
    auto textures = entities->GetTextureManager();
    for (auto &entity : m_entities) {
      auto position = entities->GetComponent<C_Position>(
        entity, Component::Position);
      if (position->GetElevation() < l_layer) { continue; }
      if (position->GetElevation() > l_layer) { break; }
      C_Drawable* drawable = GetDrawableFromType(entity);
      if (!drawable) { continue; }
      if (drawable->GetType() != Component::SpriteSheet)
      { continue; }
      auto sheet = static_cast<C_SpriteSheet*>(drawable);
      auto name = sheet->GetSpriteSheet()->GetTextureName();
      auto normals = textures->GetResource(name + "_normal");
      // Normal pass.
      if (normals) {
        shader->setUniform("material", *normals);
        drawable->Draw(&l_window,
          l_materials[MaterialMapType::Normal].get());
      }
    }
  }
  ...
}
Here we attempt to obtain a normal texture through the texture manager; if one is found, the drawable is rendered to the normal material map buffer.
Dealing with particles isn't much different from what we've seen already, except for one little piece of detail:
void ParticleSystem::Draw(MaterialMapContainer& l_materials,
  Window& l_window, int l_layer)
{
  ...
  if (renderer->UseShader("MaterialValuePass")) {
    // Material pass.
    auto shader = renderer->GetCurrentShader();
    for (size_t i = 0; i < container->m_countAlive; ++i) {
      if (l_layer >= 0) {
        if (positions[i].z < l_layer * Sheet::Tile_Size)
        { continue; }
        if (positions[i].z >= (l_layer + 1) * Sheet::Tile_Size)
        { continue; }
      } else if (positions[i].z <
        Sheet::Num_Layers * Sheet::Tile_Size)
      { continue; }
      // Normal pass.
      shader->setUniform("material",
        sf::Glsl::Vec3(0.5f, 0.5f, 1.f));
      renderer->Draw(drawables[i],
        l_materials[MaterialMapType::Normal].get());
    }
  }
  ...
}
As you can see, we're actually using the material value shader here in order to give particles static normals that always point toward the camera. The normal map buffer should look something like this after all of the normal maps have been rendered to it:
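The value (0.5, 0.5, 1.0) passed to the shader is simply the encoded form of the direction (0, 0, 1). A tiny hypothetical helper (not part of the engine) illustrates the encoding:

```cpp
#include <array>

// Encode a direction with components in [-1, 1] into
// normal-map color space [0, 1].
std::array<float, 3> encodeNormal(float x, float y, float z) {
    return { x * 0.5f + 0.5f, y * 0.5f + 0.5f, z * 0.5f + 0.5f };
}
```

Calling encodeNormal(0.f, 0.f, 1.f) yields (0.5, 0.5, 1.0), exactly the constant handed to the material value shader above, so every particle pixel reports a normal facing straight at the camera.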
Now that we have all of this information, let's actually use it when calculating the illumination of the pixels inside the light pass shader:
uniform sampler2D LastPass;
uniform sampler2D DiffuseMap;
uniform sampler2D NormalMap;
uniform vec3 AmbientLight;
uniform int LightCount;
uniform int PassNumber;
struct LightInfo {
  vec3 position;
  vec3 color;
  float radius;
  float falloff;
};
const int MaxLights = 4;
uniform LightInfo Lights[MaxLights];
void main()
{
  vec4 pixel = texture2D(LastPass, gl_TexCoord[0].xy);
  vec4 diffusepixel = texture2D(DiffuseMap, gl_TexCoord[0].xy);
  vec4 normalpixel = texture2D(NormalMap, gl_TexCoord[0].xy);
  vec3 PixelCoordinates =
    vec3(gl_FragCoord.x, gl_FragCoord.y, gl_FragCoord.z);
  vec4 finalPixel = gl_Color * pixel;
  vec3 viewDirection = vec3(0, 0, 1);
  // IF FIRST PASS ONLY!
  if(PassNumber == 1) { finalPixel *= vec4(AmbientLight, 1.0); }
  vec3 N = normalize(normalpixel.rgb * 2.0 - 1.0);
  for(int i = 0; i < LightCount; ++i) {
    vec3 L = Lights[i].position - PixelCoordinates;
    float distance = length(L);
    float d = max(distance - Lights[i].radius, 0.0);
    L /= distance;
    float attenuation = 1.0 / pow(d / Lights[i].radius + 1.0, 2.0);
    attenuation = (attenuation - Lights[i].falloff) /
      (1.0 - Lights[i].falloff);
    attenuation = max(attenuation, 0.0);
    float normalDot = max(dot(N, L), 0.0);
    finalPixel += (diffusepixel *
      ((vec4(Lights[i].color, 1.0) * attenuation))) * normalDot;
  }
  gl_FragColor = finalPixel;
}
First, the normal map texture needs to be passed in and sampled, which is what the NormalMap uniform and the normalpixel sample are for. Once this is done, the normal directional vector N is calculated before the light loop. This is done by first bringing the sampled value into a range that can go negative and then normalizing it. A normalized vector represents only a direction.
Since color values range from 0 to 255 (sampled as 0.0 to 1.0 in the shader), negative values cannot be represented directly. This is why we first bring them into the right range by multiplying by 2.0 and subtracting 1.0.
A dot product is then calculated between the normal vector N and the normalized L vector, which now represents the direction from the pixel to the light. How much a pixel is lit by a specific light is directly contingent on this dot product, which here is clamped to the range 0.0 to 1.0.
A dot product is an algebraic operation that takes two vectors and produces a scalar equal to the product of their magnitudes and the cosine of the angle between them. For two unit vectors, it is simply the cosine of that angle, a measure of how closely aligned they are. We use this property to light pixels less and less as the angle between their normal and the light direction grows.
Finally, the dot product is used again when calculating the final pixel value. The entire influence of the light is multiplied by it, which allows every pixel to be lit differently, as if it had some underlying geometry pointing in a different direction.
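The per-light math above can also be mirrored in plain C++ for experimenting outside the shader. This is a hypothetical sketch of the same attenuation and Lambert (N·L) formulas, not code from the engine:

```cpp
#include <algorithm>
#include <cmath>

// Distance attenuation, mirroring the light-pass shader:
// 1 / (d/radius + 1)^2, rescaled so it reaches zero at the falloff value.
float attenuate(float distance, float radius, float falloff) {
    float d = std::max(distance - radius, 0.f);
    float a = 1.f / std::pow(d / radius + 1.f, 2.f);
    a = (a - falloff) / (1.f - falloff);
    return std::max(a, 0.f);
}

// Clamped Lambert factor between a unit surface normal and a unit
// direction from the pixel to the light.
float lambert(const float n[3], const float l[3]) {
    float dot = n[0]*l[0] + n[1]*l[1] + n[2]*l[2];
    return std::max(dot, 0.f);
}
```

A pixel whose normal points straight at the light (dot product 1.0) receives the light's full attenuated contribution; at 90 degrees or more it receives none, which is exactly what makes flat geometry appear to have depth.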
The last thing left to do now is to pass the normal map buffer to the shader in our C++ code:
void LightManager::RenderScene() {
  ...
  if (renderer->UseShader("LightPass")) {
    // Light pass.
    ...
    shader->setUniform("NormalMap",
      m_materialMaps[MaterialMapType::Normal]->getTexture());
    ...
  }
  ...
}
This effectively enables normal mapping and gives us beautiful results such as this:
The leaves, the character, and pretty much everything in this image now look like they have definition, ridges, and crevices; the scene is lit as if it had real geometry, although it's all paper-thin. Note the lines around each tile in this particular instance. This is one of the main reasons why normal maps for pixel art, such as tile sheets, shouldn't be generated automatically; the generator can sample adjacent tiles and incorrectly add beveled edges.
While normal maps provide us with the possibility to fake how bumpy a surface is, specular maps allow us to do the same with the shininess of a surface. This is what the same segment of the tile sheet we used as an example for a normal map looks like in a specular map:
It's not as complex as a normal map since it only needs to store one value: the shininess factor. We can leave it up to each light to decide how much shine it will cast upon the scenery by letting it have its own values:
struct LightBase {
  ...
  float m_specularExponent = 10.f;
  float m_specularStrength = 1.f;
};
Similar to normal maps, we need to use the material pass shader to render to a specularity buffer texture:
void Map::Redraw(sf::Vector3i l_from, sf::Vector3i l_to) {
  ...
  if (renderer->UseShader("MaterialPass")) {
    // Material pass.
    ...
    auto specMaterial = m_textureManager->GetResource(
      textureName + "_specular");
    for (auto x = l_from.x; x <= l_to.x; ++x) {
      for (auto y = l_from.y; y <= l_to.y; ++y) {
        for (auto layer = l_from.z; layer <= l_to.z; ++layer) {
          ... // Normal pass.
          // Specular pass.
          if (specMaterial) {
            shader->setUniform("material", *specMaterial);
            renderer->Draw(sprite, &m_speculars[layer]);
          }
        }
      }
    }
  }
  ...
}
Once again, we attempt to obtain the specularity texture; if it is found, it is passed down to the material pass shader. The same goes for rendering entities:
void S_Renderer::Draw(MaterialMapContainer& l_materials,
  Window& l_window, int l_layer)
{
  ...
  if (renderer->UseShader("MaterialPass")) {
    // Material pass.
    ...
    for (auto &entity : m_entities) {
      ... // Normal pass.
      // Specular pass.
      if (specular) {
        shader->setUniform("material", *specular);
        drawable->Draw(&l_window,
          l_materials[MaterialMapType::Specular].get());
      }
    }
  }
  ...
}
Particles, once again, use the material value pass shader:
void ParticleSystem::Draw(MaterialMapContainer& l_materials,
  Window& l_window, int l_layer)
{
  ...
  if (renderer->UseShader("MaterialValuePass")) {
    // Material pass.
    auto shader = renderer->GetCurrentShader();
    for (size_t i = 0; i < container->m_countAlive; ++i) {
      ... // Normal pass.
      // Specular pass.
      shader->setUniform("material",
        sf::Glsl::Vec3(0.f, 0.f, 0.f));
      renderer->Draw(drawables[i],
        l_materials[MaterialMapType::Specular].get());
    }
  }
}
For now, we don't want any of them to be specular at all. This can obviously be tweaked later on, but the important thing is that we have that functionality available and yielding results, such as the following:
This specularity texture needs to be sampled inside the light pass shader, just like the normal texture. Let's see what that involves.
Just as before, a uniform sampler2D needs to be added to sample the specularity of a particular fragment:
uniform sampler2D LastPass;
uniform sampler2D DiffuseMap;
uniform sampler2D NormalMap;
uniform sampler2D SpecularMap;
uniform vec3 AmbientLight;
uniform int LightCount;
uniform int PassNumber;
struct LightInfo {
  vec3 position;
  vec3 color;
  float radius;
  float falloff;
  float specularExponent;
  float specularStrength;
};
const int MaxLights = 4;
uniform LightInfo Lights[MaxLights];
const float SpecularConstant = 0.4;
void main()
{
  ...
  vec4 specularpixel = texture2D(SpecularMap, gl_TexCoord[0].xy);
  vec3 viewDirection = vec3(0, 0, 1); // Looking at positive Z.
  ...
  for(int i = 0; i < LightCount; ++i){
    ...
    float specularLevel = 0.0;
    specularLevel =
      pow(max(0.0, dot(reflect(-L, N), viewDirection)),
        Lights[i].specularExponent * specularpixel.a)
      * SpecularConstant;
    vec3 specularReflection = Lights[i].color * specularLevel *
      specularpixel.rgb * Lights[i].specularStrength;
    finalPixel +=
      (diffusepixel * ((vec4(Lights[i].color, 1.0) * attenuation))
      + vec4(specularReflection, 1.0)) * normalDot;
  }
  gl_FragColor = finalPixel;
}
We also need to add the specular exponent and strength to each light's struct, since they are now part of it. Once the specular pixel is sampled, the camera's view direction needs to be set up as well. Since it is static, we can leave it hardcoded in the shader.
The specularity of the pixel is then calculated by taking the dot product between the light direction reflected about the pixel's normal and the view direction, raising it to the light's specular exponent (scaled by the specular map's alpha), and multiplying the result by the specular pixel's color and the light's specular strength. Note the use of a specular constant in the calculation. This is a value that can and should be tweaked in order to obtain the best results, as 100% specularity rarely ever looks good.
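The same Phong-style specular term can be sketched in plain C++ for experimentation. This is a hypothetical, self-contained mirror of the shader math, not engine code:

```cpp
#include <algorithm>
#include <cmath>

const float SpecularConstant = 0.4f;

// Reflect incident vector i about unit normal n: r = i - 2*(i.n)*n,
// matching GLSL's reflect() built-in.
static void reflect3(const float i[3], const float n[3], float r[3]) {
    float d = i[0]*n[0] + i[1]*n[1] + i[2]*n[2];
    for (int k = 0; k < 3; ++k) { r[k] = i[k] - 2.f * d * n[k]; }
}

// Phong specular level, mirroring the light-pass shader:
// pow(max(0, dot(reflect(-L, N), V)), exponent * specAlpha) * constant.
float specularLevel(const float N[3], const float L[3], const float V[3],
                    float exponent, float specAlpha) {
    float negL[3] = { -L[0], -L[1], -L[2] };
    float R[3];
    reflect3(negL, N, R);
    float d = std::max(0.f, R[0]*V[0] + R[1]*V[1] + R[2]*V[2]);
    return std::pow(d, exponent * specAlpha) * SpecularConstant;
}
```

With the light, normal, and view direction all aligned, the reflected ray points straight back at the viewer and the term peaks at the specular constant; it falls off sharply as the exponent grows, producing a tight highlight.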
Then, all that's left is to make sure the specularity texture is also sent to the light-pass shader in addition to the light's specular exponent and strength values:
void LightManager::RenderScene() {
  ...
  if (renderer->UseShader("LightPass")) {
    // Light pass.
    ...
    shader->setUniform("SpecularMap",
      m_materialMaps[MaterialMapType::Specular]->getTexture());
    ...
    for (auto& light : m_lights) {
      ...
      shader->setUniform(id + ".specularExponent",
        light.m_specularExponent);
      shader->setUniform(id + ".specularStrength",
        light.m_specularStrength);
      ...
    }
  }
}
The result may not be visible right away, but upon closer inspection of a moving light source, we can see that correctly mapped surfaces have a glint that moves around with the light:
While this is nearly perfect, there's still some room for improvement.
Lighting is a very powerful tool when used right. Different aspects of a material can be emphasized depending on the setup of the game level, additional levels of detail can be added without much overhead, and the overall aesthetics of the project can be elevated to new heights. The full version of Mastering SFML Game Development offers all of this and more, not only utilizing normal and specular maps but also using 3D shadow-mapping techniques to create omnidirectional point-light shadows that breathe new life into the game world.