- A huge upgrade to graphics – a new rendering approach that achieves high-fidelity visuals and realistic-looking graphics features (recommended for PC and console platforms).
- Makes heavy use of compute shaders: the targeted platforms must be compute shader-compatible.
- No need to choose a specific rendering path to use a particular feature: the Deferred and Forward rendering paths support the same set of features. Features like subsurface scattering, screen-space ambient occlusion and decals work the same in both paths.
- The lighting and the control of the light are fully linear: no Built-in Unity Gamma mode anymore.
- Advanced effects like parallax occlusion mapping, SSS and tessellation are available with minimal configuration.
Note: SRPs (Scriptable Render Pipelines) were released in Unity 2018.1 as a Preview package, so some features are still in development.
Features and improvements
- Subsurface Scattering (SSS) with Transmission
- Coat Mask
- Lit Shader / Layered Lit Shader
- Light Cookies (Coloured cookies support)
- Post-processing Stack v2 & Volume
- Procedural Sky
- GPU Instancing
- New options for transparent materials
- Image Based Lighting (IBL) improvements
- New light editor
- Lighting improvements
- Debug windows
- Area lights
- Parallax occlusion mapping
- Volume settings (Volumetric lighting does not yet support area lights)
- Atmospheric Scattering effect (still in development)
- Decal support (still in development)
- Physically based camera effects (still in development)
- Character rendering tools (still in development)
Limitations of HDRP
- Will not work on less powerful platforms (Minimum level: DirectX 11. Target supported APIs: D3D11, D3D12, GNM PlayStation 4, Metal and Vulkan. VR support is planned but not implemented as of 2018).
- Grabpass doesn’t exist anymore
- Only compatible with Unlit Particles: not compatible with Lit Particles currently.
- Artifacts in the built-in terrain system
- Anything previously rendered in the Overlay layer like lens flares on light isn’t supported (affects all SRP)
- Incompatibility with shader-based assets (including Asset Store assets)
- Writing custom shaders for HDRP now requires more knowledge – for a shader to be compatible with an SRP, it needs to have a set of passes and tags specific to that pipeline. Currently there is no documentation for porting a shader to HDRP.
- Lightmapping is only viable for smaller scenes – long baking times and occasional glitches for both Enlighten and Progressive
- Screen Space Reflections don’t work with SRP as of 2018.
- Scalable ambient obscurance doesn’t work with SRP (Multi-scale Volumetric Occlusion mode should be used instead: it generally looks better and runs faster than the other mode on console and desktop platforms but requires compute shader support).
Features & Improvements
Subsurface Scattering (SSS / subsurface light transport) with Transmission
Subsurface Scattering works by simulating how light interacts with and penetrates the surface of translucent objects. This optical phenomenon is caused by individual photons penetrating the skin, scattering inside it and finally exiting from a different point.
The lack of subsurface scattering is the reason for lifeless rubber-looking human characters in video games.
Useful for: skin, plant leaves, marble, milk, wax, snails and most non-metallic semi-transparent materials.
Example: the way that light changes colour as it shines through the human ear or fingertip – it travels and bounces underneath the surface.
SSS Alien skin test in Unity (3D Model by Antoine Dekerle):
Why isn’t SSS used more often in video games?
- Very expensive to compute: in order to simulate how light scatters inside an object, the traditional algorithms simulate and track up to thousands of light scattering events for every ray of light, recording their propagation inside each material for millions of rays. Such an approach would be too expensive for games.
- 3D models are usually hollow: how different tissues (such as skin and muscle) scatter light is important information for SSS to work correctly; multiple meshes would be required to solve this problem.
- The complexity of SSS requires bespoke solutions for each material: an SSS shader that works well on uniform materials (such as wax, marble and milk) might not perform as well on complex ones (skin and leaves).
- Sorting issues: Most engines approximate SSS using the depth buffer – a buffer (off-screen texture) that contains each pixel’s depth (distance from the camera). The thickness of a model is estimated by calculating the depth of back and front faces. This approach doesn’t work well due to sorting issues.
How is SSS implemented in Unity?
SSS is approximated with a Diffusion Profile – an image describing how light scatters inside the material. The Diffusion Profile is a drastically faster solution, developed by Activision Blizzard, the University of Zaragoza and the Technical University of Vienna: it simulates subsurface light transport in half a millisecond per image, compared to the hours needed before, using a completely different approach than simulating many millions of light rays:
- Experiment with light sources and big blocks of translucent materials
- Record how light bounces off these materials (this only needs to be done once per material) and store the results in a diffusion profile image.
Diffusion Profile is a convolution-based technique: this allows the optical properties of the diffusion profile to be carried over to the image, rather than simply adding two images together. When this is mixed with a photo-realistic skin image, photo-realistic faces can be achieved (if the optical properties of an apple are added to a human face, it will look like a face carved out of a giant apple).
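The convolution at the heart of the technique can be illustrated in 1D with plain Python. The kernel values below are made up, standing in for a real diffusion profile; real implementations work in 2D on HDR images:

```python
def convolve(signal, kernel):
    """Naive zero-padded 1D convolution -- the same operation that
    spreads light energy according to a diffusion profile, shown in 1D."""
    half = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = i + k - half
            if 0 <= j < len(signal):
                acc += signal[j] * w
        out.append(acc)
    return out

# A single bright pixel: a thin beam of light hitting the surface...
incoming = [0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0]
# ...spread by a (hypothetical, normalised) scattering kernel:
profile = [0.1, 0.2, 0.4, 0.2, 0.1]
scattered = convolve(incoming, profile)
```

Because the kernel is normalised, the total light energy is preserved; it is just redistributed to neighbouring pixels, which is what produces the soft, translucent look.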
Using SSS in HDRP
SSS (subsurface scattering) works well and the performance is quite good. A Diffusion Profile needs to be assigned to the object for SSS to work.
- Change the colour of light absorbed by the material
- Use Scale to change the strength of the SSS effect
- Sample Count: the higher the sample count, the slower the performance
Lit shader > material type: SSS, diffusion profile: skin
Compared to the Translucent material type, SSS is more accurate but more expensive.
- Create a new material
- Create Subsurface Scattering settings
- Create a Rendering Diffusion Profile Settings asset
- Select from a list of Diffusion Profiles (such as Skin or Foliage) as a basis for SSS Materials
- Put those diffusion profile settings into the HDRP settings. Start with a physically plausible result and tailor it to your liking, e.g. overemphasised close-range scattering
- Use the Transmission parameter to determine the translucency of an object via a Thickness Map
HD Shadow settings in scene settings: max distance is normally set to 1000 but shadows may be buggy at that distance. Try setting it to a lower value.
Unity Asset Store Alternatives to SSS in HDRP
SSS is only available on high-end devices and requires the project to be set up with HDRP; however, there are alternatives for SSS on mobile devices and VR:
- Translucent Subsurface Scattering: Works on mobile devices. Good for marble and other solid materials.
- Subsurface Scattering: Works on mobile and VR. Good for realistic skin.
Anisotropy
Anisotropy simulates a surface material whose properties change depending on its orientation. Brushed metals are anisotropic (typical value: 1.0). It affects reflections on materials.
Values can be negative to change the orientation of the specular reflections.
Useful for: metallic materials (especially brushed metals)
Example: mimicking the look of brushed aluminum – use an Anisotropy Map with a Tangent to alter the intensity of the reflections and the orientation.
Anisotropy test in Unity:
Iridescence
The Iridescence shader provides the parameters to create an iridescent effect on the surface of the Material, similar to how light appears on an oil spill. The output is determined by an Iridescence Map and an Iridescence Layer Thickness Map.
Useful for: glass, bubbles, iridescent paint jobs, clouds, shells, effects like oil stains
Example: Iridescence in soap bubbles
Iridescence test in Unity:
Coat Mask
A clear-coat effect that affects smoothness on the surface.
Useful for: car paint, wet surfaces
Example: wet-looking reflection on a car’s window after the rain
Coat Mask test in Unity:
Lit Shader & Layered Lit Shader
The Lit shader is the HDRP version of the Standard Shader: there are more features and material possibilities, such as detail maps, double-sided rendering and the ability to mix various lit shaders together.
- New UV mapping options are available (planar and tri-planar).
- Ability to switch between the Metallic/Smoothness parameters (default) to Specular/Smoothness within the same shader.
Surface Type:
- Opaque: solid material with no light penetration
- Transparent: higher performance costs
Double Sided:
- Enabled: the shader displays on both sides (connected to Global Illumination automatically).
- Disabled: the shader does not render backfaces.
Metallic and Smoothness control how the material reflects the environment: a rough surface absorbs light and a smooth surface reflects light.
Mask Map
An efficient combination of:
- Red Channel – Metallic ranging from 0 to 1
- Green Channel – Ambient Occlusion
- Blue Channel – Detail Map Mask
- Alpha Channel – Smoothness
By default: textures imported into Unity use sRGB. Within the Texture Inspector, un-checking “sRGB (Color Texture)” converts the texture to using a Linear Format.
Since the Mask Map uses maths to generate an output, this texture must be linear.
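The conversion that the sRGB checkbox controls follows the standard sRGB transfer function; a minimal Python sketch of the decode step:

```python
def srgb_to_linear(c):
    """Standard sRGB electro-optical transfer function for one
    channel value c in 0..1: linear below a small threshold,
    a 2.4 power curve above it."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# Mid-grey in sRGB is only ~0.214 in linear light, which is why
# treating a data texture (like a mask map) as sRGB skews the maths.
mid = srgb_to_linear(0.5)
```

This is why unchecking "sRGB (Color Texture)" matters: it skips this decode, so the raw channel values reach the shader unchanged.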
How to create a Mask Map
- Make a new RGB file of the same size as your original grayscale source images.
- Open the channels panel, turn on only the desired channel and paste the corresponding grayscale image on it.
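The channel-packing step above can be sketched with plain Python lists standing in for greyscale images (in practice you would do this in an image editor or with an image library):

```python
def pack_mask_map(metallic, ao, detail_mask, smoothness):
    """Pack four greyscale maps (equal-length lists of floats in 0..1)
    into RGBA pixels following the HDRP mask-map channel layout:
    R = metallic, G = ambient occlusion, B = detail mask, A = smoothness."""
    assert len(metallic) == len(ao) == len(detail_mask) == len(smoothness)
    return list(zip(metallic, ao, detail_mask, smoothness))

# One example pixel: fully metallic, mostly unoccluded, half smooth.
pixels = pack_mask_map([1.0], [0.8], [0.0], [0.5])
```

Each output tuple is one RGBA pixel, so a single texture fetch in the shader retrieves all four material properties at once.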
Detail Map
A secondary map that gives more space to work with details (such as skin pores). It is an amalgamation of additional maps which add minute detail to the Material.
Texture input is similar to Mask Map. The Detail Map uses four channels for efficiency reasons:
- Red Channel – Greyscale using Overlay Blending
- Green Channel – Normal Map Y channel
- Blue Channel – Smoothness
- Alpha Channel – Normal Map X channel
New configurable strength factor from 0 to 2.
The new Post Processing Stack relies on using volumes that describe how things should be drawn, either globally or within a certain area.
Prerequisites for the post processing volume to work:
- The option “is Global” must be ticked.
- The Post-processing Volume must be put on a GameObject on the Post-processing layer.
- A post processing layer must be added to the camera
- TAA (Temporal Anti-Aliasing) and motion blur show better results
- PBR improvements (especially with low-gloss materials) as effects were re-written.
- Improved bloom, vignetting, depth of field and volumetric lighting
- Post processing stack v2 is more flexible (and more complex) – can interpolate sets of effects and write custom effects.
- Cinemachine storyboard (available in extensions) with split view and waveform editor for cinematics references.
- 2018.1: Post Process v2, post process profiles, volumes, package manager
- 2017.x: post process v1, post process profiles, asset store
- 5.x: individual effects stacked on the camera
Light Cookie no longer works: HDRP uses standard textures as Light Cookies.
Need to change the cookie texture import settings and generate lighting again:
- Texture Type to default
- Texture Shape to Cube
- Mapping to Latitude-Longitude Layout (Cylindrical)
- Disable sRGB (Color Texture)
- Alpha Source to None
- Disable Border Mip Maps
- Wrap Mode to Clamp
These units will not match the arbitrary units that the built-in render pipeline uses, so they need to be updated when upgrading from a legacy project.
- Start by adding a Directional Light to represent the main, natural light in this Scene. For example, a full Moon in a clear night sky produces an illuminance of around 0.25 lux.
- Disable all other Lights in the Scene to exclusively see the effect of the Light representing the Moon.
HDRP handles the Sky differently to the built-in render pipeline; this enables the Sky parameters to be altered dynamically at run time using the Volume script.
Select GameObject > Rendering > Scene Settings and adjust the following settings for best effect:
- HD Shadow Settings: the maximum shadow distance and the directional shadow cascade settings.
- Visual Environment: the Sky and Fog type of your Scene.
- Procedural Sky: a port of the legacy procedural Sky – contains the same settings.
- Exponential Fog: the default Fog, which can handle fields such as Density, Color Mode, Fog Distance, and Fog Height.
Unity test: volumetric fog, emissive lighting test and Cinemachine storyboard for colour grading
The GameObject has a Baking Sky component which references the Volume’s procedural Sky. This component passes the Sky data to the lightmapper – only one should ever be present in the Scene at any time. Otherwise, Unity will exclusively use the first loaded Baking Sky component (a warning is shown in the console).
The Procedural Sky’s light intensity is expressed as Exposure and Multiplier. To convert to lux, set the exposure to 0 EV (Exposure Value) and use the Multiplier as the lux value. To create believable visuals in this example, set the Multiplier to 0.02 lux; increase the value to 0.05 to make the Scene more visible, while still being low enough to be plausible.
Generate Lighting in this Scene to create light bounces and directional soft shadows. Go to Window > Rendering > Lighting Settings and, near the bottom of the Scene tab, click Generate Lighting.
Tell the GPU to render many copies of the same Mesh on screen efficiently in one go, using only a small number of draw calls with the Graphics.DrawMeshInstanced API.
Must enable GPU Instancing on the material for this to work: Meshes will be rendered with the same geometry and material / shader in one batch when possible. This makes rendering faster – it’s possible to render thousands of objects with one draw call.
Useful for: drawing objects such as buildings, trees and grass, or other things that appear repeatedly.
Note: HDRP cannot render GameObjects in one batch if they use different Meshes or Materials, or if the hardware does not support GPU instancing. For example, GameObjects with an animation based on the object pivot can’t be statically batched (there is a unique pivot for all), but they can be GPU-instanced.
How to overcome the limit of batches for rendering more than 1023 meshes?
For every instanced draw call, Unity can only handle up to 1023 meshes (exceeding this generates an error message saying that the count must be in the range of 0 to 1023).
Use scripting to split into multiple batches if there are more than 1023 instances – this has a performance cost, since each batch is rendered separately.
GPU Instancing test in Unity (generating 100 transparent bubbles):
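The batching workaround described above boils down to chunking the transform list; here is a language-agnostic sketch in Python (the function name is illustrative), splitting instances into chunks no larger than 1023, the per-call limit of Graphics.DrawMeshInstanced:

```python
def split_into_batches(matrices, max_per_batch=1023):
    """Graphics.DrawMeshInstanced accepts at most 1023 matrices per
    call, so split the full instance list into chunks of that size.
    Each chunk would then be issued as its own instanced draw call."""
    return [matrices[i:i + max_per_batch]
            for i in range(0, len(matrices), max_per_batch)]

# 2500 instances -> three draw calls of 1023, 1023 and 454 instances.
batches = split_into_batches(list(range(2500)))
```

In a Unity script the list elements would be `Matrix4x4` transforms and each chunk would be passed to one `DrawMeshInstanced` call per frame.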
New options for transparent materials
- Backface then front face rendering to help sorting
- Depth post pass to help with depth of field effect on transparent
Control the refraction of transparent objects using the Index of Refraction and Refraction Thickness parameters. Distortion can also be applied to add some blur to the refraction.
Index of Refraction (IoR)
IoR is a way to define reflectivity. It determines how fast light travels through a material in relation to a vacuum.
The default value of 1 generates no refraction. Adjust this value (up to 2.5) to increase the refraction intensity. A value of 1.1 to 1.2 already generates visible refraction – the effect of turning the environment upside down.
Useful for: reflective materials.
Example: refracted reflection in water. An IoR value of 1.33 (typical for water) means that light travels 1.33 times slower through water than it does through a vacuum.
IoR test in Unity:
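The relationship between the index of refraction and light speed can be checked numerically; a plain Python sketch:

```python
C_VACUUM = 299_792_458  # speed of light in a vacuum, m/s

def light_speed_in(ior):
    """Phase velocity of light inside a medium with the given
    index of refraction: v = c / n."""
    return C_VACUUM / ior

# Water (n = 1.33): light travels 1.33x slower than in a vacuum.
v_water = light_speed_in(1.33)
```

The same ratio governs how strongly rays bend at the surface, which is why raising the IoR parameter increases the visible refraction.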
HDRP comes with a new lighting architecture focused on performance:
- A hybrid Deferred / Forward & Tile / Cluster renderer.
- Scales better than built-in Unity rendering with the number of lights in the scene.
Display any material properties for both opaque and transparent materials, either with deferred or forward render path.
- Lighting debug views: diffuse lighting only, specular lighting only.
- Override properties for the whole scene, like normal, albedo and smoothness.
- Display intermediate render targets like motion vectors or the depth buffer.
- Highlight properties, such as objects using lightmaps or tessellation, a NaN checker etc.
- Colour picker mode: read the current on-screen value or the HDR value before post-processing is applied.
- New customisable debugging tool to control debug view modes and render pipeline settings – usable in the Unity Editor or in any Player, so all the debug functionality is available on target devices like PlayStation 4.
HDRP uses camera relative rendering: good rendering precision even far away from the origin of the world. This has an impact on all the shaders used with HDRP.
The Camera can control which lighting architecture is used (it is possible to mix deferred and forward render paths in a scene) and which features are enabled for rendering (fog, shadows, post-processing etc. can be disabled).
A new scene settings system is based on volume settings, similar to what is available for post-processing. The scene settings (Sky, sunlight cascade shadows, screen-space contact shadows etc.) can now be set per volume, and the parameters can be interpolated for a smooth transition between volumes.
New options are available for Sky and Fog, like height-based fog or fog tinted by the sky colour. The fog affects both opaque and transparent materials.
HDRP uses a dedicated render target allocation system that avoids recurrent reallocation when resizing the screen. This avoids extra render target allocation when doing dynamic resolution.
Image Based Lighting (IBL) improvements
Upgrade to Reflection probes:
- The proxy shape (an area approximating the scene geometry) can use an oriented bounding box or a sphere
- The influence shape (the area where pixels are affected) is now separate from the proxy shape
- Various influence fading options (per face, based on normal orientation).
New light editor
- New Spotlight control on inner angles and different shapes (cone, box or pyramid).
- Options to Fade the light, affect only diffuse or specular lighting or use colour temperature to set up the colour of the light.
- Allows the use of real-time area lights (no shadows or baking currently), like rectangle lights.
- Area lights will be improved based on the research of Unity Labs team.
Parallax occlusion mapping
An enhancement of the parallax mapping technique to procedurally create 3D definition in textured surfaces, using a displacement map instead of through the generation of new geometry.
- Once the HD Render Pipeline is installed, create an HD render pipeline asset
- Project Settings > Graphics – assign the HD render pipeline asset there
- Change the colour space to Linear and restart Unity to get rid of bugs (without a restart, the fog in the rendering scene settings just appears black)
- Create a rendering density volume
- Volumetric fog: Scene Settings > Fog Type: set to Volumetric
- Go to Scene > Create Rendering Density Volume, scale it up and adjust the density slider as desired; change its colour to white and the mean free path to about 20
- Create a new material, add base colour albedo inputs and normals – a slider controls how much the normals affect the material
- Go to the material and add it into the appropriate slot
- Bent normal map: a directional AO map (directionally dependent) – from one spot light might be able to escape, from another it is occluded; it is almost never used.
- Coat Mask: provides thin-film simulation (index of refraction of 1) – like a layer of water; increasing it to 1 makes the surface look wet.
- Displacement mode: for a flat surface or sphere, use pixel displacement; otherwise use vertex displacement. Vertex displacement moves vertices, while pixel displacement runs per pixel. When enabled, the material looks as if it has deeper crevices.
- Smoothness remapping: in the real world, nothing has a smoothness of exactly 1 or 0. Unity provides a scale that controls the minimum and maximum smoothness values to fine-tune materials.
- Ambient occlusion remapping
- Anisotropy shader: increase the metallic value and decrease the smoothness value to change how light is reflected – in real life, metals reflect light differently depending on their shape.
- Transparent material: change the surface type to Transparent and make sure the pre-refraction pass is not enabled. Lower the opacity. For the refraction model, choose Sphere or Plane; change the SSRay model to the Proxy one; and set the index of refraction (keep it lower than real-world values, around 1.02)
- Tick Distortion and select a distortion vector map – a map of vectors in two directions in which the light is distorted relative to the normal (blur effect)
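The smoothness remapping mentioned above is a simple linear remap of the raw texture value into a tighter range; a Python sketch (the function name and range values are illustrative):

```python
def remap_smoothness(raw, smooth_min, smooth_max):
    """Linearly remap a raw 0..1 smoothness sample into the
    [smooth_min, smooth_max] range, as the remapping sliders do."""
    return smooth_min + raw * (smooth_max - smooth_min)

# Map the full texture range into a plausible 0.2..0.7 band:
s = remap_smoothness(0.5, 0.2, 0.7)
```

Ambient occlusion remapping works the same way, just applied to the AO channel instead.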
- Lighting uses a Physical Light Units (PLU) system: these units are based on real-life measurable values, like what you would see when browsing for light bulbs at the store or measuring light with a photographic light meter.
- HDRP follows physical inverse square attenuation.
- Lux is used for Directional Lights because in the real world those are the values used to measure the intensity of sunlight, which can easily be done with a lux meter. Other real-world light sources use lumens to measure intensity, which can be used as a reference for the smaller light emitters in our scene.
- The Sun Light intensity is defined in terms of lux at ground level, while point lights and spot lights are defined in lumens.
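The inverse-square attenuation and the lumen/lux relationship can be illustrated for an ideal point light radiating uniformly in all directions (the 800 lm bulb below is just an example value, not an HDRP default):

```python
import math

def illuminance_from_point_light(lumens, distance_m):
    """Illuminance (lux) at a given distance from an ideal point
    light: E = flux / (4 * pi * d^2). This is the physical
    inverse-square falloff HDRP's light units follow."""
    return lumens / (4 * math.pi * distance_m ** 2)

# A hypothetical 800 lm bulb measured 1 m and 2 m away:
e1 = illuminance_from_point_light(800, 1.0)
e2 = illuminance_from_point_light(800, 2.0)  # a quarter of e1
```

Doubling the distance quarters the illuminance, which is exactly the inverse-square behaviour mentioned above.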
Realtime Line Lights
- A realtime Line Light maintains a seamless, constant light output emanating from a line of user-definable length.
- Line lights are commonly used in animated films to achieve realistic lighting – add a cinematic quality to the lighting.
- A lot of modern interiors use line-style lights to illuminate the space, so the Line Light not only produces realistic lighting but is also accurate to what would be found in the real world.
- Line Lights can be created by selecting the shape type in the Inspector after a Light has been placed in a scene.
- The Light Inspector can determine the colour of the emitted light through temperature. On a scale of 1000 to 20000 kelvin, the lower the value, the less heat is emitted and the more red the light appears. In contrast, as you increase the temperature value, the light appears more blue.
- Similarly, the Rectangle shape type emits a light output based on custom X and Y axis values.
- Shadows are currently not supported for Line or Rectangle light shape types.
Controls the colour and strength of specular reflections in the material.
Makes it possible for specular reflections to have a different colour from the diffuse reflection, unlike metallic inputs, since the Specular input replaces the Metallic input and converts it from a slider to a colour.
Translucent material type
- Effective for simulating light interaction for vegetation and jade jewellery (any material that absorbs light deeply).
- Uses profiles (like SSS) – the thickness map is used to determine how light is transmitted.
- Uses a volumetric approach to scattering the light while skin uses diffuse. It’s less physically accurate but much faster to compute.
- Simulates light transmission through an object. This material type offers a less costly solution than Subsurface Scattering, but it is something of a “fake” effect.
- Useful for creating volumetric fog, visual environment, shadows, reflections etc.
- Visually alter environment preferences, adjusting elements such as Visual Environment, Procedural Sky and HD shadow settings.
- Create custom volume profiles and switch between them.
- Volume Settings are managed by creating a GameObject and adding the Volume component (similar to the workflow for creating a volume for the Post-Processing Stack v2).
How to set up Volumetric fog and lighting
- Make sure Volumetric Fog and Volumetric Lighting Controller are added to the scene settings via ‘Add component overrides…’, and that the boxes next to the options in these components are ticked (e.g. by pressing All).
- Make sure Volumetric is selected in Scene Visual Environment Settings.
- Change the Default Mean Free Path – the default value is very high; make it very low to see the fog clearly.
- Alternatively: Create a new gameobject and attach a Density Volume component to it, scale and position the gameobject appropriately, play with Mean Free Path setting in this Density Volume.
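Why lowering the Mean Free Path makes fog visible follows from the Beer-Lambert relation; a hedged Python sketch (HDRP's exact maths may differ, this just shows the principle):

```python
import math

def transmittance(distance_m, mean_free_path_m):
    """Fraction of light surviving a path through homogeneous fog
    (Beer-Lambert law): T = exp(-d / mfp). A shorter mean free path
    means denser fog, so less light gets through."""
    return math.exp(-distance_m / mean_free_path_m)

# With a mean free path of 20 m, roughly 60% of light survives
# 10 m of fog; lowering the mean free path drops this sharply.
t = transmittance(10.0, 20.0)
```

This is why a very high default Mean Free Path looks like no fog at all: almost all light survives the trip to the camera.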
How does HDRP simulate fog effects?
By overlaying a color onto objects, depending on their distance from the Camera. This is good for simulating fog or mist in outdoor environments.
It can also be used to hide the clipping of far-away GameObjects: handy when a Camera’s far clip plane is pulled in to enhance performance.
Choose between different types of fog: Linear, Exponential and Volumetric. All Material types (Lit or Unlit) react correctly to the fog. HDRP calculates density differently, depending on the type of fog, the distance from the Camera, and the world-space height.
Instead of using a constant color, Linear and Exponential fog can use the background sky as a source for color. In this case, HDRP samples the color from different mipmaps of the cubemap generated from the current sky settings. The chosen mip varies linearly between the lowest resolution and the highest resolution mipmaps, depending on the distance from the Camera and the values in the fog component’s Mip Fog properties.
You can also choose to limit the resolution of the highest mip that HDRP uses. Doing this adds a volumetric effect to the fog; it is much cheaper to use with Linear or Exponential fog than the Volumetric fog type.
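The distance-based mip selection described above can be sketched as a linear blend. The parameter names here are illustrative, not HDRP's actual Mip Fog property names:

```python
def mip_fog_level(distance, mip_fog_near, mip_fog_far, max_mip):
    """Pick the sky-cubemap mip used to colour the fog: blend
    linearly from the sharpest mip (0) at mip_fog_near to the
    blurriest (max_mip) at mip_fog_far."""
    t = (distance - mip_fog_near) / (mip_fog_far - mip_fog_near)
    t = min(max(t, 0.0), 1.0)  # clamp to [0, 1]
    return t * max_mip

# Halfway between near and far -> halfway up the mip chain:
level = mip_fog_level(55.0, 10.0, 100.0, 6)
```

Far-away fog therefore samples a blurrier cubemap mip, which is what produces the soft, scattered look at distance.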
Atmospheric Scattering effect (still in development)
Atmospheric scattering is the phenomena that occurs when particles suspended in the atmosphere diffuse (or scatter) a portion of the light passing through them in all directions.
Examples of natural effects that cause atmospheric scattering: fog, clouds, mist.
Decal support (still in development)
- Every material has “Enable Decal” toggle
- Decal Projector component under GameObject > Render Pipeline > High Definition > Decal Projector
- Decal support on both opaque and transparent material.
- Correctly affect the sampling of Global Illumination (lightmap/light probe)
Character rendering tools (still in development)
Physically based camera effects (still in development)
Physically based camera is the next step to get more coherent lighting and postprocessing.
How difficult is it to upgrade to HDRP?
Converting from/to HDRP can be a lot of work since materials, lighting and settings are different. All the lighting, post-processing, scene settings, graphics settings and custom shaders will need to be re-worked.
Generally it is not a good idea to start a project with HDRP and then switch to non-HDRP (unless the project is not graphics-focused or doesn’t make use of custom materials). The HDRP Material converter automatically converts Legacy Standard and Unlit Materials to HDRP Lit and Unlit Materials. Custom materials need to be converted manually.
HDRP uses a new set of Shaders and new lighting units – both are incompatible with the built-in Unity rendering pipeline.
- Add the HDRP package to existing project with Package Manager
- Create and set up a High Definition Render Pipeline Asset, assign it to the Scriptable Render Pipelines Settings.
- Upgrade the materials in the scene to HDRP-compatible materials via Edit > Render Pipeline.
- Change the Color Space to Linear.
- Adjust the lights.
The Legacy Standard to Lit conversion process combines the different Material maps of the Legacy Standard into the separate RGBA channels of the mask map in the HDRP Lit Material. It also adds a smoothness detail.
The process blends detail albedo and smoothness with the base values using an overlay function (like Photoshop).
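That overlay blend can be sketched per channel in Python using the standard Photoshop-style formula (HDRP's exact implementation may differ):

```python
def overlay(base, detail):
    """Photoshop-style overlay blend for one channel in 0..1:
    darkens where the base is dark, brightens where it is bright."""
    if base < 0.5:
        return 2.0 * base * detail
    return 1.0 - 2.0 * (1.0 - base) * (1.0 - detail)

# A mid-grey detail value (0.5) leaves the base essentially unchanged,
# which is why "no detail" in a detail map is encoded as mid-grey.
blended = overlay(0.3, 0.5)
```

This property, that 0.5 is the neutral value, is what lets the detail map add or remove smoothness and albedo variation relative to the base.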
- HDRP Official Unity Guide: Getting Started
- Unity Docs: Scriptable Render Pipeline Docs
- Setting Up Lighting Pipeline: Best Practices
- Unity HDRP Blog
- Stripping scriptable shader variants
- Photogrammetry workflow
- Lit Shader Documentation
- Catlike Coding: SRP
- Catlike Coding: GPU Instancing
- HDRP Unity Stream (Chinese)
- Using HDRP (Japanese)
- Allegorithmic PBR Guide
- Layered Lit Shader
- Exhaustive list of the current state of development of HDRP
- GPU Instancing tutorial
- Utility scripts for the post processing stack: expose the post processing variables
- Unite talk on Post Processing Stack
- Allegorithmic PBR guide part 1
- Allegorithmic PBR guide part 2
- Unity Post-processing Wiki
- Supercharging Materials with the Scriptable Render Pipeline in Unity