Game Shader Programming - GLSL and HLSL for Visual Effects
Shader code can feel like a separate world from normal game logic. You are not updating transforms or health bars; you are telling the GPU how to draw pixels. Once you know the basics of GLSL (OpenGL Shading Language) and HLSL (High Level Shading Language), you can add custom visuals that node-based tools alone cannot easily achieve. This post gives you a practical footing in both languages and shows how they map to Unity and Unreal.
Why Write Shaders by Hand?
Unity has Shader Graph. Unreal has the Material Editor. So why write raw shader code? Node graphs are great for iteration and artists, but they can hide what is actually happening and sometimes get in the way of specific effects. Custom HLSL or GLSL lets you do things like:
- Implement a single, optimized pass instead of many nodes.
- Use loops, conditionals, and math that are awkward or impossible in the graph.
- Share logic between many materials via includes or functions.
- Hit a target performance profile by controlling exactly what runs on the GPU.
You do not have to choose one or the other. Many teams use Shader Graph or the Material Editor for most content and drop into code for the trickiest or most performance-sensitive effects. Our guide on shader programming goes deeper into when to use nodes vs code.
The Two Languages - GLSL vs HLSL
GLSL is the shading language of OpenGL and is also widely used with Vulkan (compiled to SPIR-V). You will see it in engines that target OpenGL (e.g. some Godot shaders, WebGL). HLSL is used in DirectX and by Unity and Unreal on most platforms. The syntax is similar; the main differences are names (e.g. vec3 vs float3), entry points (GLSL always uses main(), HLSL uses named functions), and how you pass data between stages.
- Unity: Surface shaders and URP/HDRP shaders are usually written in HLSL. Unity compiles them for the right backend (e.g. GLES, Vulkan, D3D).
- Unreal: Materials and custom HLSL (e.g. in Material Functions or Plugin shaders) use HLSL.
If you learn one, the other is easy to pick up. This post uses HLSL-style examples that you can adapt in Unity or Unreal.
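As a quick taste of the surface differences, here is the same math written in both; the GLSL equivalent of each HLSL line is shown as a comment (a small sample, not a full mapping):
// HLSL                               // GLSL equivalent
float3 n = normalize(inputNormal);    // vec3 n = normalize(inputNormal);
float d = saturate(dot(n, l));        // float d = clamp(dot(n, l), 0.0, 1.0);
float3 c = lerp(colorA, colorB, t);   // vec3 c = mix(colorA, colorB, t);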
Vertex and Fragment (Pixel) Stages
A minimal pipeline has at least two stages:
- Vertex stage – Runs once per vertex. You take positions, normals, UVs, and output the same (or new) data for the next stage. You often transform position to clip space and pass varyings (e.g. UV, normal) to the fragment stage.
- Fragment (pixel) stage – Runs once per pixel. You receive the interpolated varyings and output a color (and optionally other data like depth). This is where most of the “look” of a material is computed.
In HLSL (Unity URP-style), a minimal vertex step could look like:
struct Attributes {
    float4 positionOS : POSITION;   // object-space position
    float2 uv         : TEXCOORD0;  // mesh UVs
};
struct Varyings {
    float4 positionCS : SV_POSITION; // clip-space position for the rasterizer
    float2 uv         : TEXCOORD0;   // interpolated for the fragment stage
};
Varyings Vert(Attributes input) {
    Varyings output;
    // Object space -> homogeneous clip space (URP helper from Core.hlsl)
    output.positionCS = TransformObjectToHClip(input.positionOS.xyz);
    output.uv = input.uv; // pass UVs through unchanged
    return output;
}
The fragment shader then uses uv to sample a texture or compute a color. The idea is the same in GLSL: you have a vertex main and a fragment main, and varyings are passed between them.
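To make that concrete, a minimal fragment function to pair with the Vert above might look like this (a sketch assuming URP's texture macros and a material texture named _BaseMap):
TEXTURE2D(_BaseMap);
SAMPLER(sampler_BaseMap);
half4 Frag(Varyings input) : SV_Target {
    // Sample the base texture with the UVs interpolated from Vert
    return SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, input.uv);
}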
Lighting - Diffuse and Specular
Many game shaders need some form of lighting. A simple model is:
- Diffuse – Lambert or similar: the color is scaled by how much the light direction aligns with the normal (N dot L). That gives you basic shape and shading.
- Specular – A highlight where the view direction aligns with the reflection of the light (e.g. Blinn-Phong with half-vector). That adds shine.
In code you usually (see the sketch after this list):
- Get the normal (from vertex or normal map) and normalize it.
- Get the light direction (and view direction) in the same space (often world or tangent).
- Compute diffuse = saturate(dot(N, L)) and optionally specular = pow(saturate(dot(N, H)), specPower).
- Combine with light color and attenuation.
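Here is a minimal sketch of those steps as one HLSL function. The inputs are assumed to be supplied by the caller; in URP, for example, L and lightColor could come from GetMainLight():
// Lambert diffuse + Blinn-Phong specular, all vectors in world space
half3 ShadeSimple(half3 albedo, half3 N, half3 L, half3 V,
                  half3 lightColor, half specPower)
{
    N = normalize(N);                       // interpolation denormalizes
    L = normalize(L);
    half3 H = normalize(L + V);             // Blinn-Phong half-vector
    half diff = saturate(dot(N, L));        // Lambert term
    half spec = pow(saturate(dot(N, H)), specPower); // shiny highlight
    return lightColor * (albedo * diff + spec);
}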
Unity and Unreal provide built-in lighting (e.g. URP’s GetMainLight, Unreal’s material nodes), but understanding this math helps when you go custom or debug why something looks wrong. For a full lighting and PBR walkthrough, see our game lighting design and shader resources.
Textures and UVs
Textures are sampled in the fragment stage. You use the interpolated UV (and optionally mip level or derivatives) to read from a texture. In HLSL you often see SAMPLE_TEXTURE2D(tex, sampler, uv). Tiling and offset are just uv * tiling + offset. For normal mapping, you sample a normal map and either use it in tangent space or transform it to world space with the TBN matrix. Once you have a world normal, you can plug it into the same N dot L and specular math.
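In code, tiling and normal mapping might look like this sketch, assuming hypothetical _Tiling and _Offset properties, a _NormalMap texture, URP's UnpackNormal helper, and a tbn matrix whose rows are the interpolated world-space tangent, bitangent, and normal:
float2 uv = input.uv * _Tiling + _Offset; // tiling and offset
half3 normalTS = UnpackNormal(SAMPLE_TEXTURE2D(_NormalMap, sampler_NormalMap, uv));
half3 normalWS = normalize(mul(normalTS, tbn)); // tangent space -> world space
// normalWS now feeds the same N dot L and specular math as above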
Time and Animated Effects
Scrolling UVs, pulsing glow, or distortion often use time. The engine typically provides a time or delta value (e.g. _Time in Unity). You pass it as a uniform and use it in the vertex or fragment stage. For example (combined in the sketch after this list):
- Scrolling: uv += time * scrollSpeed
- Pulse: emission = baseEmission * (0.9 + 0.1 * sin(time))
- Distortion: offset the UV or vertex position by a function of time and position.
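Put together in a fragment function, the first two might look like this sketch; _Time.y is Unity's elapsed time in seconds, while _ScrollSpeed, _PulseFrequency, and _EmissionColor are hypothetical material properties:
half4 Frag(Varyings input) : SV_Target {
    float2 uv = input.uv + _Time.y * _ScrollSpeed; // scroll the texture
    half4 col = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, uv);
    half pulse = 0.9 + 0.1 * sin(_Time.y * _PulseFrequency); // gentle pulse
    half3 emission = _EmissionColor.rgb * pulse;
    return half4(col.rgb + emission, col.a);
}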
Keep the math simple and avoid heavy branching so the GPU stays efficient. Our VFX and particle guides cover more animation and effect patterns.
Unity vs Unreal - Where Code Lives
In Unity, you write HLSL inside .shader files or Shader Graph custom function nodes. URP and HDRP use a lot of helper macros and include files (Core.hlsl, Lighting.hlsl, etc.). Your vertex and fragment functions are named and referenced in the Pass block. In Unreal, you write HLSL in Material Expressions (Custom node or Material Function) or in Plugin shaders. The engine handles the rest of the pipeline; you focus on the math and outputs (e.g. Base Color, Metallic, Roughness).
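In Unity, the boilerplate that ties the pieces together looks roughly like this (a sketch of a URP-style Pass; the include path follows the URP package layout):
Pass {
    HLSLPROGRAM
    #pragma vertex Vert
    #pragma fragment Frag
    #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
    // ... Attributes/Varyings structs and the Vert/Frag functions above ...
    ENDHLSL
}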
In both engines, start from an existing shader or template and change one thing at a time. That way you learn the boilerplate and conventions without getting lost.
Common Mistakes to Avoid
- Normals and vectors in different spaces – If your light direction is in world space, your normal must be in world space too. Mixing tangent and world space is a classic cause of broken lighting.
- Not normalizing vectors – After interpolating, normals and directions are not unit length. Normalize before dot products.
- Expensive work per pixel – Move work to the vertex stage when possible (e.g. if it does not need to vary per pixel). Avoid loops with variable iteration counts and heavy branching.
- Ignoring mobile – Use fewer texture samples and simpler math on mobile; prefer simple lighting or baked look when needed.
When to Use Shader Graph or Material Editor Instead
Use node-based tools when:
- You want artists to own the look and iterate without touching code.
- The effect is standard (PBR, simple toon, scroll) and the graph is fast enough.
- You want to stay within the engine’s lighting and post-process pipeline with minimal custom code.
Use custom HLSL/GLSL when:
- You need a single, tight pass or very specific math.
- You are implementing a known technique from a paper or tutorial that is easier in code.
- You have hit limits or performance issues with the graph.
FAQ
Do I need to learn both GLSL and HLSL?
For most game dev (Unity, Unreal, desktop/mobile), HLSL is enough. GLSL is useful for WebGL, some Godot pipelines, or Vulkan. The concepts are the same; syntax differs slightly.
Can I use Shader Graph and custom code together?
Yes. In Unity you can call custom HLSL from Shader Graph (e.g. Custom Function node). In Unreal you can use Custom nodes or Material Functions that contain HLSL.
How do I debug a shader?
Use the engine’s shader debugger (e.g. Unity Frame Debugger, RenderDoc). Isolate the pass, inspect inputs and outputs, and narrow down where the math goes wrong. Returning a single color or value from a stage (e.g. return float4(uv, 0, 1);) helps verify data flow.
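For example, to eyeball a world normal, remap it from [-1, 1] into the visible [0, 1] color range (a common debugging trick; normalWS here is whatever normal your shader computed):
return half4(normalWS * 0.5 + 0.5, 1); // normals become RGB colors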
Where can I learn more?
Bookmark our game shader and VFX resources and the graphics programming guide. For Unity, the URP and HDRP shader samples in the package repository are a great reference. For Unreal, the Engine Shaders and Material Function library show how the engine does things.
Shader programming opens the door to custom visuals that make your game stand out. Start with a simple unlit or diffuse shader, then add one effect at a time – a texture, then lighting, then animation. Once you are comfortable with vertex and fragment stages and the basics of GLSL and HLSL, you can tackle more advanced effects and optimize for your target platforms. If this post helped, share it with your team or bookmark it for your next VFX task.