
Linear rendering / HDR pages (#704)
* Linear rendering / HDR pages

* lint

* typo

* added few camera frame tips

* lint

* improvement

* Apply suggestions from code review

Co-authored-by: Will Eastcott <will@playcanvas.com>

---------

Co-authored-by: Martin Valigursky <mvaligursky@snapchat.com>
Co-authored-by: Will Eastcott <will@playcanvas.com>
3 people authored Feb 10, 2025
1 parent bfab056 commit 9204380
Showing 8 changed files with 172 additions and 1 deletion.
2 changes: 1 addition & 1 deletion docs/user-manual/engine/migrations.md
@@ -9,7 +9,7 @@ This guide provides an overview of all major breaking changes across releases, o

It’s advisable to use the debug version of the engine when troubleshooting issues, as it provides logs for deprecated messages, warnings, and errors related to incorrect usage.

-## Migration from 1.4.0 to 2.5.0
+## Migration from 2.4.0 to 2.5.0

### Breaking changes in 2.5.0

1 change: 1 addition & 0 deletions docs/user-manual/graphics/index.md
@@ -8,6 +8,7 @@ PlayCanvas incorporates an advanced graphics engine. Internally it uses the WebG
Some of its key features are as follows:

* Physically based rendering (PBR)
* Linear Workflow
* Directional, point and spot lights (all of which can cast shadows)
* Static and skinned mesh rendering
* GPU particle engine
98 changes: 98 additions & 0 deletions docs/user-manual/graphics/linear-workflow/hdr-rendering.md
@@ -0,0 +1,98 @@
---
title: HDR Rendering
sidebar_position: 2
---

High Dynamic Range (HDR) rendering significantly enhances visual realism in computer graphics by capturing and displaying a broader spectrum of light and color. This technique ensures that both the brightest highlights and the deepest shadows retain their details, offering a more lifelike representation of scenes. One notable advantage of HDR rendering is its ability to produce physically based bloom effects, where intense light sources naturally bleed into surrounding areas, mimicking real-world camera and eye behavior. Additionally, HDR rendering facilitates more accurate reflections and refractions, as it allows for light values that exceed the standard displayable range, resulting in visuals that are both striking and true to life.

![HDR](/img/user-manual/graphics/linear-workflow/hdr.webp)

## Camera Settings

The camera provides two key settings for handling HDR rendering:

- **gammaCorrection**
- **toneMapping**

These settings can be configured based on the rendering mode.

### LDR (Low Dynamic Range)

- **toneMapping**: For LDR rendering, you can select any tone mapping method to achieve the desired visual style. The tone mapping compresses HDR values into displayable LDR values.
- **gammaCorrection**: Set to `GAMMA_SRGB` to indicate that the output should be stored in gamma space, since it represents colors ready for display.
- If the output pixel format is sRGB, gamma correction is handled by the hardware.
- Otherwise, gamma encoding is applied in shader code.
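A minimal sketch of this configuration, assuming an existing `cameraEntity` with a camera component (`TONEMAP_ACES` is used here as just one of the available tone mapping options):

```javascript
// Sketch: LDR camera configuration
const camera = cameraEntity.camera;
camera.toneMapping = pc.TONEMAP_ACES;   // any tone mapping curve works for LDR
camera.gammaCorrection = pc.GAMMA_SRGB; // output is stored in gamma space
```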

### HDR (High Dynamic Range)

For HDR rendering, the goal is to preserve HDR color information:

- **toneMapping**: Set to `TONEMAP_LINEAR` to maintain HDR colors.
- **gammaCorrection**: Disable by setting to `GAMMA_NONE`.
- Ensure that a compatible HDR pixel format is used for the render target. This format can be obtained using the `GraphicsDevice.getRenderableHdrFormat()` API.
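Putting these settings together, a sketch of an HDR setup might look like this (assumes an existing `app` and `cameraEntity`; the texture dimensions are illustrative):

```javascript
// Sketch: HDR camera and render target configuration
const camera = cameraEntity.camera;
camera.toneMapping = pc.TONEMAP_LINEAR; // preserve HDR colors
camera.gammaCorrection = pc.GAMMA_NONE; // stay in linear space

// pick an HDR-capable pixel format supported by the device
const format = app.graphicsDevice.getRenderableHdrFormat();
const colorBuffer = new pc.Texture(app.graphicsDevice, {
    name: 'hdr-color',
    width: 512,
    height: 512,
    format: format
});
camera.renderTarget = new pc.RenderTarget({ colorBuffer: colorBuffer, depth: true });
```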

### HDR Display Output

When rendering in HDR mode, an HDR display output can be enabled by configuring the `Application` with the `displayFormat` parameter set to `DISPLAYFORMAT_HDR`.

- **toneMapping**: If HDR output is supported, set to `TONEMAP_NONE`.
- **gammaCorrection**: Keep set to `GAMMA_SRGB` to ensure low-intensity values remain visually similar to LDR rendering.
- After the device has been created, check if HDR display output is supported using `GraphicsDevice.isHdr()`. Note that for `isHdr()` to return `true`, the browser must be running on a display that supports HDR output.

**Note:** Currently, HDR display output is only supported by WebGPU. On other platforms, `GraphicsDevice.isHdr()` will always return `false`.
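As a sketch of the flow described above (the exact device-creation plumbing depends on how your application bootstraps; `canvas` is assumed to be your target canvas element):

```javascript
// Sketch: requesting HDR display output at device creation (WebGPU only)
const device = await pc.createGraphicsDevice(canvas, {
    displayFormat: pc.DISPLAYFORMAT_HDR
});

// after creation, verify that the browser and display actually support HDR output
if (device.isHdr()) {
    cameraEntity.camera.toneMapping = pc.TONEMAP_NONE;
    cameraEntity.camera.gammaCorrection = pc.GAMMA_SRGB;
}
```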

## PlayCanvas Engine - CameraFrame Class

The PlayCanvas Engine offers a comprehensive rendering setup through the `CameraFrame` class, which integrates advanced effects such as High Dynamic Range (HDR) rendering, bloom, Screen Space Ambient Occlusion (SSAO), and more. This setup enhances visual fidelity by simulating realistic lighting and post-processing effects.

### Key Features of CameraFrame

- **Bloom**: Simulates the scattering of light to create a glow around bright areas.
- **SSAO**: Enhances depth perception by simulating ambient light occlusion in crevices and corners.
- **Depth of Field (DoF)**: Mimics camera focus effects, blurring objects outside the focal plane.
- **Temporal Anti-Aliasing (TAA)**: Reduces visual artifacts by smoothing jagged edges over time.
- **Vignette**: Darkens the image's corners to draw attention to the center.
- **Color Grading**: Adjusts the color balance for stylistic effects.

### Configuring CameraFrame on a Camera

```javascript
const cameraFrame = new pc.CameraFrame(app, cameraEntity.camera);
cameraFrame.rendering.toneMapping = pc.TONEMAP_NEUTRAL;
cameraFrame.rendering.samples = 4;
cameraFrame.bloom.enabled = true;
cameraFrame.bloom.intensity = 0.01;
cameraFrame.update();
```

For HDR bloom to be effective, the scene should include bright light sources. This is typically achieved using emissive materials with high intensity. For example:

```javascript
material.emissive = pc.Color.YELLOW;
material.emissiveIntensity = 50;
```

For more detailed information, refer to the CameraFrame [API documentation](https://api.playcanvas.com/classes/Engine.CameraFrame.html).

## CameraFrame in the Editor

There is a `CameraScript` [available here](https://github.com/playcanvas/engine/blob/main/scripts/esm/camera-frame.mjs) for use in PlayCanvas Editor projects. This script integrates `CameraFrame` functionality directly into the Editor's Inspector, making it easy to set up and configure cameras with advanced rendering features.

### Instructions on Use

1. Add the `CameraScript` to your project and parse it.
2. Add it to an entity that has the `CameraComponent`.
3. Use the Inspector to configure the rendering settings for the camera, such as tone mapping, bloom, SSAO, and other effects.

This integration streamlines the process of setting up complex camera effects and enhances the overall workflow within the PlayCanvas Editor.

![CameraFrame Script](/img/user-manual/graphics/linear-workflow/camera-frame.png)

## CameraFrame Tips

- HDR bloom requires at least one renderable float format (e.g., RG11B10, RGBA16F, or RGBA32F). If none of these formats are supported by the device, HDR bloom is automatically disabled.
- The `toneMapping` property of `StandardMaterial` is ignored. Tonemapping is applied as a full-screen post-processing pass, so per-mesh tonemapping control is not possible.
- When using `CameraFrame`, two properties control tonemapping:
- `CameraFrame.rendering.toneMapping` – Controls tonemapping for the 3D scene rendered within the `CameraFrame`.
  - `CameraComponent.toneMapping` – Controls tonemapping applied after the 3D scene, including post-processing, has been rendered. This typically affects UI elements rendered on top.
- When using `CameraFrame`, you may notice differences in the intensity of alpha-blended geometry. This occurs because blending takes place in linear HDR space, which is more physically accurate than blending in gamma space. As a result, you may need to adjust material properties related to alpha blending.
30 changes: 30 additions & 0 deletions docs/user-manual/graphics/linear-workflow/index.md
@@ -0,0 +1,30 @@
---
title: Linear Workflow
sidebar_position: 4
---

In modern rendering engines, a linear workflow is essential for achieving physically accurate lighting and color representation. This approach ensures that all calculations, from shading to post-processing, occur in a linear color space, preventing errors introduced by gamma-compressed textures or incorrect blending. By working in linear space and applying gamma correction only at the final output stage, we maintain consistency across lighting, textures, and effects, resulting in more realistic and predictable visuals.

In engine v1, linear workflow was limited to `StandardMaterial`, but in engine v2, it is fully integrated across all shaders and rendering stages (including `ShaderMaterial`, UI rendering, particles, and every other element) ensuring consistent, physically accurate color processing throughout.

## Shader Input and Output Handling

A proper linear workflow ensures that all color calculations in the shader occur in a physically correct manner. This requires careful handling of both inputs and outputs to maintain accuracy throughout the rendering pipeline.

### **Shader Inputs: Ensuring Linear Data**

Shaders require all input values to be in linear space to avoid incorrect lighting results. This affects both textures and uniform color values:

- **Textures** that store color data (such as albedo maps) should be marked as **sRGB**. When a texture is sampled, the GPU automatically converts sRGB-encoded values into linear space, ensuring correct color calculations.
- **Color uniforms** are automatically converted from gamma space to linear space for `StandardMaterial`, particle rendering, and other built-in rendering systems. However, when setting uniforms manually using `Material.setParameter` or `MeshInstance.setParameter`, it is the caller's responsibility to ensure the values are provided in linear space. This is especially critical for `ShaderMaterial`, where all parameters must be explicitly defined using `setParameter`. To assist with this, the `Color` class provides the `Color.linear()` function, which converts gamma-space colors to linear space.
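For example, a sketch of passing a linear-space color uniform to a `ShaderMaterial` (this assumes `Color.linear()` returns the gamma-space color converted to linear, as described above; check the API reference for the exact signature, and `uBaseColor` is a hypothetical uniform name):

```javascript
// Sketch: supplying a color uniform in linear space
const gammaColor = new pc.Color(1.0, 0.5, 0.25);
const linearColor = gammaColor.linear(); // convert gamma -> linear before upload
material.setParameter('uBaseColor', [linearColor.r, linearColor.g, linearColor.b]);
```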

Once all inputs are in linear space, the shader performs lighting calculations with physically accurate results.

### **Shader Output: Managing Gamma Correction**

When writing the final color output, the handling of gamma correction depends on whether the rendering is LDR (Low Dynamic Range) or HDR (High Dynamic Range):

- **LDR Rendering**: Colors are gamma corrected immediately in the shader before being written to the render target, ensuring they are displayed correctly on standard monitors.
- **HDR Rendering**: Colors remain in linear space when written to the render target, typically requiring a **floating-point format** (e.g., `RGBA16F` or `RGBA32F`) to preserve precision and avoid banding. Gamma correction is then applied later, usually at the final tone-mapping or post-processing stage, allowing effects such as bloom and color grading to work with high-precision linear HDR colors.
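The gamma correction referred to above is the standard sRGB transfer function. As a plain-JavaScript reference (shaders and the GPU's sRGB hardware paths implement the equivalent math), the piecewise conversion from IEC 61966-2-1 looks like this:

```javascript
// sRGB <-> linear conversion (IEC 61966-2-1 piecewise transfer function)
function srgbToLinear(x) {
    // decode a gamma-encoded channel value in [0, 1] to linear space
    return x <= 0.04045 ? x / 12.92 : Math.pow((x + 0.055) / 1.055, 2.4);
}

function linearToSrgb(x) {
    // encode a linear channel value in [0, 1] to gamma (sRGB) space
    return x <= 0.0031308 ? x * 12.92 : 1.055 * Math.pow(x, 1 / 2.4) - 0.055;
}
```

Note that the conversion is per channel and only applies to color data; alpha is always stored linearly.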

This structured approach ensures that lighting, blending, and post-processing operate consistently, leading to more realistic and predictable rendering results.
42 changes: 42 additions & 0 deletions docs/user-manual/graphics/linear-workflow/textures.md
@@ -0,0 +1,42 @@
---
title: Textures
sidebar_position: 1
---

### sRGB Texture Handling

Textures that represent colors, such as Diffuse, Emissive, Specular, and Sheen, are typically stored in sRGB space to maintain color accuracy and reduce banding. When used by the engine, these textures are automatically converted from sRGB to linear space for correct lighting calculations. This conversion is performed by the GPU at no extra cost, provided that the texture is created using an sRGB format.

#### **Specifying sRGB Encoding for Textures**

When loading a texture asset that represents colors in sRGB space, it is important to specify sRGB encoding. The following example demonstrates how to create an asset with sRGB encoding:

```javascript
new pc.Asset(
'color',
'texture',
{ url: 'heart.png' },
{ encoding: 'srgb' }
);
```

#### **Marking sRGB Textures in the Editor**

When working in the Editor, ensure that the color texture is marked as **sRGB** in the inspector panel. This guarantees that the engine correctly interprets the texture as sRGB and applies the necessary conversion to linear space.

![sRGB](/img/user-manual/graphics/linear-workflow/srgb-editor.png)

#### **sRGB Procedural Textures / Render Targets**

When creating a procedural texture or rendering to a texture that represents color and will be read by a shader, it is important to create it with an **sRGB format** to enable automatic conversion. When rendering to this texture, linear values are automatically converted to gamma space to prevent banding. Later, when the texture is used as a color texture, pixels are automatically converted back to linear space.

The following example demonstrates how to create an sRGB render target texture:

```javascript
const texture = new pc.Texture(app.graphicsDevice, {
name: 'color-texture',
width: 512,
height: 512,
format: pc.PIXELFORMAT_SRGBA8
});
```
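To render into such a texture, it can be wrapped in a render target and assigned to a camera. A sketch, assuming the `texture` from the previous example and an existing `cameraEntity`:

```javascript
// Sketch: rendering into the sRGB texture created above
const renderTarget = new pc.RenderTarget({
    colorBuffer: texture,
    depth: true
});
cameraEntity.camera.renderTarget = renderTarget;
```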
