Use gamma texture format (WebGL2) #3715
Labels: area: graphics (Graphics related issue), feature, performance (Relating to load times or frame rate), V2 (Work for initial release of V2 engine)
Currently, when the engine samples a texture stored in gamma space, it uses the gammaCorrectInput function to convert the values to linear space for lighting. This costs a pow instruction per sample, and bilinear filtering also happens on the sRGB-encoded values, which is slightly incorrect.
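For illustration, the current approach amounts to something like the following fragment-shader snippet (a minimal sketch embedded as a GLSL string; the engine's actual chunk may use a more precise sRGB curve than a plain 2.2 power):

```ts
// Sketch of the manual per-sample decode that hardware sRGB textures would remove.
const gammaDecodeChunk = /* glsl */ `
vec3 gammaCorrectInput(vec3 color) {
    // one pow per texture sample; bilinear filtering has already been
    // applied to the encoded (sRGB) values before this runs
    return pow(color, vec3(2.2));
}

vec4 texture2DSRGB(sampler2D tex, vec2 uv) {
    vec4 raw = texture2D(tex, uv);
    return vec4(gammaCorrectInput(raw.rgb), raw.a);
}
`;
```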
On WebGL2 devices, and also on WebGL1 devices with the EXT_sRGB extension (https://developer.mozilla.org/en-US/docs/Web/API/EXT_sRGB), we could use hardware support for sRGB textures. See: http://www.realtimerendering.com/blog/webgl-2-new-features/
An sRGB texture is automatically converted to linear space when it is fetched in the shader.
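Roughly, uploading a texture with a hardware sRGB format would look like the sketch below (not engine code, just the underlying WebGL calls):

```ts
// Sketch: creating an sRGB texture on WebGL2, or on WebGL1 via EXT_sRGB.
function createSRGBTexture(gl: WebGLRenderingContext | WebGL2RenderingContext,
                           image: TexImageSource): WebGLTexture | null {
    const texture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, texture);

    if (gl instanceof WebGL2RenderingContext) {
        // WebGL2: sized sRGB internal format; the shader receives linear values.
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.SRGB8_ALPHA8, gl.RGBA, gl.UNSIGNED_BYTE, image);
    } else {
        const ext = gl.getExtension('EXT_sRGB');
        if (!ext) return null; // no hardware sRGB support; fall back to shader decode
        // WebGL1 + EXT_sRGB: internalformat and format must both be SRGB_ALPHA_EXT.
        gl.texImage2D(gl.TEXTURE_2D, 0, ext.SRGB_ALPHA_EXT, ext.SRGB_ALPHA_EXT, gl.UNSIGNED_BYTE, image);
    }

    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    return texture;
}
```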
The disadvantage is that the texture needs to be marked as sRGB at the time the WebGL texture is created. This likely means we would need to expose a texture import setting in the Editor so that textures can be marked as sRGB (diffuse and similar color textures need sRGB; normal maps, roughness maps and other data textures don't, as they are already in linear space). A rough shape of such a setting is sketched below.
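The names here are hypothetical; the actual Editor/engine API would decide the real option and how it propagates to texture creation:

```ts
// Hypothetical import setting: 'srgb' would default to true for color maps
// (diffuse, emissive) and false for data maps (normal, roughness, metalness, ...).
interface TextureImportSettings {
    srgb: boolean;
}

// Hypothetical mapping from the import setting to a WebGL2 internal format.
function internalFormatFor(gl: WebGL2RenderingContext,
                           settings: TextureImportSettings): number {
    return settings.srgb ? gl.SRGB8_ALPHA8 : gl.RGBA8;
}
```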