Building a custom renderer for Unity - Rendering opaques and transparents
In hindsight this post probably should've come earlier in the series rather than being the last one. Better late than never though, and you don't need this kind of knowledge to understand what's in the other posts.
In this final post I'm going to discuss the difference between rendering an opaque material and rendering a transparent one. Hopefully by the end you'll understand why transparent materials are more expensive to render than opaque ones, and maybe pick up some cheap ways to simulate the effect of transparency without the cost.
The render queue
In the BiRP there were five default render queue tags: Background, Geometry, AlphaTest, Transparent and Overlay, and you chose between them with the Rendering Mode dropdown on the material. In the SRPs it's a similar story, but there are also rendering events that you can hook into to add your own custom Render Objects.
The rendering events also expose different information depending on whether you're using deferred or forward rendering. For example, if you're using deferred rendering it's possible to create a Render Object that runs before or after the GBuffer is created.
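For reference, a shader opts into one of those named queues through its SubShader tags, roughly like this (a sketch rather than a complete shader; a material can also override the queue directly via its Render Queue setting):

SubShader
{
    // The named tags map to queue indices: Background = 1000,
    // Geometry = 2000, AlphaTest = 2450, Transparent = 3000,
    // Overlay = 4000. Lower queues are drawn first.
    Tags { "Queue" = "Geometry" }

    // ... passes go here ...
}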
Opaques
Opaque objects, by default, write to the camera's depth buffer as they're drawn, which helps prevent overdraw in later draws, although it isn't foolproof. You can still cause overdraw in the geometry queue if you have a complex mesh with intersecting geometry (which is bad practice and something whoever makes your 3D models should avoid). Even though skyboxes are opaque objects, they're rendered towards the end of the pipeline, because drawing the skybox first would mean almost every one of its pixels gets overdrawn by the scene in front of it.
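Written out explicitly, the ShaderLab render state for a typical opaque pass looks something like this (these are the defaults, so you'll rarely see them spelled out in a real shader):

Tags { "Queue" = "Geometry" "RenderType" = "Opaque" }
ZWrite On      // record this object's depth as it's drawn
ZTest LEqual   // reject fragments behind what's already in the depth buffer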
It is possible to perform a depth pre-pass before the opaques are rendered, which mostly eliminates overdraw during the opaque pass. Any pass that requires fragment shading thereafter can compare against the values in the depth buffer to determine which fragments should be shaded and which can be skipped because they fail the z-test. You can perform a depth pre-pass in both URP and HDRP, but the pre-pass itself isn't free, so it's worth profiling to make sure your scene actually benefits from it.
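If you were adding one by hand rather than relying on the pipeline-provided option, a depth-only pass would look roughly like this (a sketch; the HLSL program itself, a pass-through vertex shader and an empty fragment shader, is omitted):

Pass
{
    Name "DepthOnly"
    ZWrite On
    ColorMask 0   // write depth only, no color output
}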
Transparents
Transparent objects do not write to the depth buffer (if they did, anything behind them would fail the depth test and never be drawn, so you couldn't see through them), so they need to be rendered back to front.
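In ShaderLab terms, the render state for a typical alpha-blended pass looks something like this (a sketch; SrcAlpha/OneMinusSrcAlpha is the classic transparency blend, but other blend modes exist):

Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }
Blend SrcAlpha OneMinusSrcAlpha   // blend with what's already been drawn
ZWrite Off                        // depth is still tested, just not written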
To understand why the back-to-front order matters, it's easiest to think about how you would paint the scene with a brush and paint. Imagine two panes of transparent glass in front of each other, where the pane furthest from the camera has a dark tint. If you paint the closest pane first, then when you come to paint the tinted pane it will appear to be on top of the one that was supposed to be closest.
Fun fact: the Painter's Algorithm is a real thing. It's a classical rendering technique that was used as an alternative to depth buffering back in the day, but it's too inefficient for real time. Rather than resolving visibility per pixel with a z-buffer, it draws polygon by polygon after sorting the whole scene back to front. The downside is overdraw all over the place: even though you're sorting by depth, you draw each polygon in full every time, so you invariably end up overdrawing pixels that later polygons cover anyway.
This back-to-front sorting is known as 'z-sorting' and is a fairly expensive operation that requires sorting all of the render nodes that are part of the transparent pass. Not only is the sort itself expensive, it also makes batching less efficient: because objects have to be drawn strictly in depth order, objects that share a material can't always be grouped into the same draw call.
Cheap transparency - Alpha Clipping
It is possible to render opaque objects that have holes in them. In a fragment shader, the clip(value) function will discard the current fragment from being drawn if the value in the parentheses is less than 0. This means you can see whatever is behind it, because you aren't overwriting the color buffer at that pixel. Clipped fragments don't get written to the depth buffer either, so the skybox can still render in the gaps.
This is where the AlphaTest queue comes in. If you're going to render some opaques with holes in them, it's important to draw them after the rest of the opaques have been drawn, as opaques are not strictly z-sorted by default and you might not be using a depth pre-pass. In building the custom renderer, the code for alpha clipping looks like this:
#if defined(_CLIPPING)
    // Discards the fragment if base.a - _Cutoff < 0
    clip(base.a - _Cutoff);
#endif
Here, if clipping is enabled, we clip the pixel when the alpha value of the fragment is less than the cutoff, which is defined in the shader's properties. As clipping either discards a fragment or keeps it, you can only render the full color or nothing at all. If you need a translucent or fade effect then you must render as part of the transparent pass.
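For completeness, the _Cutoff value and _CLIPPING keyword used above would be exposed to the material roughly like this (a sketch matching the names in the code; your own property block may differ):

Properties
{
    _Cutoff ("Alpha Cutoff", Range(0.0, 1.0)) = 0.5
    [Toggle(_CLIPPING)] _Clipping ("Alpha Clipping", Float) = 0
}

Then a #pragma shader_feature _CLIPPING in the pass makes sure the #if defined(_CLIPPING) branch gets compiled when the material's toggle is on.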
Series Conclusion
Thanks for reading these posts. I've found it harder than I expected to write them, but it feels like a good investment of time and it's really solidified the things I've learnt, as I'm absolutely terrified of saying something here that ends up being wrong! Even though this is (I think) the shortest post, it took ages because I spent half a day trying to figure out how the render queue in SRP worked, which involved talking to the people who make the render pipeline 😅. If you've read any of this and found it particularly helpful then please let me know and it will motivate me to do more, or don't and I might end up doing it anyway!