Friday 22 June 2018

Guest Blog: The Unexpected Variability of Physically Based Viewports

In the last few years Physically Based Rendering has taken over most pipelines.

PBR is supposed to produce better results and save a lot of artist time, and so far it has delivered on those promises. PBR materials look good under many different lighting conditions, cutting down the fiddling that lighting changes in work-in-progress game scenes used to cause.

To achieve that, PBR goes much deeper into the physics of light than the very approximate models of earlier years. You might remember Blinn-Phong, one of the few lighting models commonly used in games at the time. It didn't include roughness, metalness, and other important material characteristics we now take for granted. To our current, more discerning eyes, it tends to make everything look like plastic.
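To make the contrast concrete, here is a minimal sketch of the Blinn-Phong specular term in plain Python (pure maths, no rendering framework; the helper functions are just for illustration). The whole material response hangs on a single shininess exponent, with no roughness or metalness anywhere:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def add(a, b):
    return [x + y for x, y in zip(a, b)]

def normalize(v):
    length = math.sqrt(dot(v, v))
    return [x / length for x in v]

def blinn_phong_specular(normal, light_dir, view_dir, shininess):
    """Classic Blinn-Phong specular term: (N . H)^shininess.

    Note what's missing compared to PBR: no roughness, no metalness,
    no energy conservation -- just one shininess exponent, which is
    part of why everything tends to look like plastic.
    """
    # Half vector between the light and view directions
    h = normalize(add(light_dir, view_dir))
    n_dot_h = max(dot(normal, h), 0.0)
    return n_dot_h ** shininess
```

With the light, view, and normal all aligned the term peaks at 1.0, and it falls off as the half vector tilts away from the normal; the falloff shape is controlled entirely by the exponent.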

Example of a lighting model supporting roughness

Blinn-Phong can't adequately simulate all types of materials, but then, no single lighting model was ever meant to be one-size-fits-all. In the real world, materials can have wildly different characteristics, and many other lighting models have been devised and published in research papers over the years. Some were invented specifically for rendering, others for optics, manufacturing, and other kinds of applications.

All this research on lighting models extended our understanding of how light works. For a long time games didn't take much advantage of it, though, mainly due to the limited power of our hardware. The advent of PBR depended on an increase in processing power, which duly happened.

Most 3D programs have introduced support for PBR. You might think that every PBR implementation should be the same, but PBR is a fairly flexible label. It only promises "physically based" rendering, not "physically correct" rendering, and with good reason: the physics of light can get very complex, and a perfect simulation can't yet fit within real-time speed constraints.

As a consequence of this inherent complexity, PBR systems are built out of a great number of choices between different, often equally valid, tradeoffs, and it's unlikely that all those choices will match perfectly across different teams and different programs. Implementors of PBR systems have to carefully weigh many approximations that let them reach real-time speed without giving up too much quality.
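A classic example of such an approximation is Schlick's approximation of the Fresnel term, which most real-time PBR implementations use in place of the full Fresnel equations because it is far cheaper and close enough in practice. A minimal sketch in Python:

```python
def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation of Fresnel reflectance.

    cos_theta is the cosine of the angle between the view direction
    and the surface normal (or half vector). f0 is the reflectance at
    normal incidence: roughly 0.04 for most dielectrics, and the base
    colour for metals. The exact Fresnel equations are more expensive,
    so real-time PBR systems typically settle for this cheap polynomial
    -- one of the many speed/accuracy tradeoffs described above.
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5
```

Head-on, the surface reflects only its base amount; at grazing angles the approximation climbs to full reflection, matching the real-world behaviour that makes even dull materials mirror-like at shallow angles. Which f0 defaults, remappings, and companion terms (normal distribution, geometry) an engine picks is exactly where implementations start to diverge.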

Subsurface scattering and translucency support

Different teams will have different priorities: a racing game team might need better-looking metallic materials, and, having to fit within budget and time constraints, they may choose to offset that by neglecting the skin material implementation. They can get away with never showing a character up close, but players are going to look at the cars all the time.

A team that's developing a game engine for general use needs to produce decent-looking materials for any possible object they can think of. The racing game's car material might well end up looking better than the general engine's, because it's the main priority for that team.

This is how PBR implementations end up diverging, as everybody tries to do their best to optimise for their specific objectives. The only way two different 3D programs, engines, or games can match their PBR systems perfectly is by mutual agreement to do so, which is what Substance, Unity, and others attempt to do.

But even if two PBR implementations match perfectly at some point in time, they may still diverge in the future, as renderers get reimplemented, refined, and updated.

As an example of the rapid progress of game renderers, Unity is currently developing two new renderers: the Lightweight and the HD render pipelines. Their new rendering system, the Scriptable Render Pipeline, is scriptable, as it says on the tin. As a consequence, many different ad-hoc renderers built on top of it may pop up, as game developers optimise the rendering pipeline for their needs. These new renderers won't be guaranteed to preserve the look of your material as it was in Substance, even when they happen to be physically based.

Personally, I welcome our new rendering overlords, as I love writing graphics code, and to make even better PBR systems, we need to be able to go beyond the status quo. But this innovation may come with some breakage, as innovations usually do.

Claudia Doppioslash is a Graphics Programmer, a speaker and an author. She works as a game development consultant. She is the author of the book “Physically Based Shader Development for Unity 2017”, published by Apress, and of the Pluralsight course “Developing Custom Shaders in Unity”. She can be found on Twitter @doppioslash and is speaking at Develop:Brighton on Thursday 12th July 4PM in Room 3.