Use shader for YUV to RGB conversion on received frames #168
Conversation
chenosaurus commented Nov 7, 2025
- replace the slow CPU conversion step with a GPU shader
Pull Request Overview
This PR introduces GPU-accelerated YUV to RGB video frame conversion for LiveKit's Unity SDK. The change replaces the previous CPU-based RGBA conversion with an optimized shader-based approach that processes YUV420 planar data directly on the GPU.
Key changes:
- Added YUV to RGB conversion shader using BT.709 limited range color space conversion
- Refactored VideoStream to support both GPU and CPU conversion paths with fallback mechanism
- Changed output texture from Texture2D to RenderTexture to support GPU rendering pipeline
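For context, the GPU path described above can be sketched roughly as follows. This is an illustrative sketch, not the PR's actual code: the class name, shader name, and method signatures are hypothetical, though the `_TexU`/`_TexV` sampler names match the shader discussed later in this review.

```csharp
// Hypothetical sketch of a shader-based YUV420 -> RGB converter in Unity.
// See Runtime/Scripts/VideoStream.cs in this PR for the real implementation.
using UnityEngine;

public sealed class YuvToRgbConverterSketch : System.IDisposable
{
    private Material _material;               // wraps the YuvToRgb shader
    private Texture2D _planeY, _planeU, _planeV;
    public RenderTexture Output { get; private set; }

    public YuvToRgbConverterSketch(int width, int height)
    {
        _material = new Material(Shader.Find("Hidden/YuvToRgb")); // shader name assumed
        _planeY = new Texture2D(width, height, TextureFormat.R8, false);
        _planeU = new Texture2D(width / 2, height / 2, TextureFormat.R8, false); // half-res chroma
        _planeV = new Texture2D(width / 2, height / 2, TextureFormat.R8, false);
        Output = new RenderTexture(width, height, 0, RenderTextureFormat.ARGB32);
    }

    // Upload the three planar buffers, then let the fragment shader do the conversion.
    public void Convert(System.IntPtr y, int ySize, System.IntPtr u, int uSize,
                        System.IntPtr v, int vSize)
    {
        _planeY.LoadRawTextureData(y, ySize); _planeY.Apply(false);
        _planeU.LoadRawTextureData(u, uSize); _planeU.Apply(false);
        _planeV.LoadRawTextureData(v, vSize); _planeV.Apply(false);

        _material.SetTexture("_TexU", _planeU);
        _material.SetTexture("_TexV", _planeV);
        Graphics.Blit(_planeY, Output, _material); // Y plane bound as the source texture
    }

    public void Dispose()
    {
        Object.Destroy(_planeY); Object.Destroy(_planeU); Object.Destroy(_planeV);
        Object.Destroy(_material);
        if (Output != null) Output.Release();
        Object.Destroy(Output);
    }
}
```

The key win over the CPU path is that only the raw Y/U/V planes cross the CPU/GPU boundary; the per-pixel color math runs in the fragment shader.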
Reviewed Changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| Runtime/Shaders/YuvToRgb.shader | New HLSL shader implementing BT.709 YUV to RGB color space conversion |
| Runtime/Shaders/YuvToRgb.shader.meta | Unity metadata for shader asset |
| Runtime/Shaders.meta | Unity metadata for Shaders folder |
| Runtime/Scripts/VideoStream.cs | Refactored to use RenderTexture with GPU shader conversion and CPU fallback |
| LICENSE.txt.meta | Unity metadata for license file |
Pull Request Overview
Copilot reviewed 8 out of 8 changed files in this pull request and generated 4 comments.
```csharp
    }
    // Unity objects must be destroyed on main thread
    _converter?.Dispose();
    _converter = null;
```
Copilot AI commented Nov 7, 2025
The Texture property is no longer being cleaned up in the Dispose method. Previously, the code destroyed the Texture2D when disposing. Now that Texture is a RenderTexture owned by _converter, it should either be set to null here, or there should be a clear comment explaining that the converter handles the cleanup. Without this, there's a risk of holding a reference to a disposed RenderTexture.
```suggestion
_converter = null;
// Texture is owned and cleaned up by _converter. Set to null to avoid holding a reference to a disposed RenderTexture.
Texture = null;
```
```csharp
EnsurePlaneTexture(ref _planeU, chromaW, chromaH, TextureFormat.R8, FilterMode.Point);
EnsurePlaneTexture(ref _planeV, chromaW, chromaH, TextureFormat.R8, FilterMode.Point);
```
Copilot AI commented Nov 7, 2025
[nitpick] The chroma plane filter mode should be FilterMode.Bilinear instead of FilterMode.Point for better quality YUV to RGB conversion. Using point filtering on the U and V planes can result in blocky artifacts in the chroma channels. Bilinear filtering helps smooth the upsampling of the half-resolution chroma planes.
```suggestion
EnsurePlaneTexture(ref _planeU, chromaW, chromaH, TextureFormat.R8, FilterMode.Bilinear);
EnsurePlaneTexture(ref _planeV, chromaW, chromaH, TextureFormat.R8, FilterMode.Bilinear);
```
```csharp
tempTex.LoadRawTextureData((IntPtr)rgba.Info.DataPtr, (int)rgba.GetMemorySize());
tempTex.Apply();
Graphics.Blit(tempTex, Output);
UnityEngine.Object.Destroy(tempTex);
rgba.Dispose();
```
Copilot AI commented Nov 7, 2025
Potential memory leak: tempTex is created but if Graphics.Blit throws an exception, the texture won't be destroyed and rgba won't be disposed. Consider wrapping the operations in a try-finally block to ensure cleanup:
```csharp
var rgba = buffer.ToRGBA();
var tempTex = new Texture2D(width, height, TextureFormat.RGBA32, false);
try
{
    tempTex.LoadRawTextureData((IntPtr)rgba.Info.DataPtr, (int)rgba.GetMemorySize());
    tempTex.Apply();
    Graphics.Blit(tempTex, Output);
}
finally
{
    UnityEngine.Object.Destroy(tempTex);
    rgba.Dispose();
}
```
```hlsl
HLSLPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
```
Copilot AI commented Nov 7, 2025
The shader uses HLSLPROGRAM/ENDHLSL but includes "UnityCG.cginc", which is a CG/HLSL include file. While this may work in Unity, the include should typically be "UnityShaderVariables.cginc" or a pure HLSL include when using HLSLPROGRAM. Consider using CGPROGRAM/ENDCG instead of HLSLPROGRAM/ENDHLSL for better compatibility with UnityCG.cginc, or switch to proper HLSL includes if using HLSLPROGRAM.
```suggestion
#include "UnityShaderVariables.cginc"
```
ladvoc left a comment
On-device testing was successful, just a few minor comments.
```diff
 {
     public delegate void FrameReceiveDelegate(VideoFrame frame);
-    public delegate void TextureReceiveDelegate(Texture2D tex2d);
+    public delegate void TextureReceiveDelegate(Texture tex);
```
I think this constitutes a minor breaking API change—are we able to use Texture2D here? If not, we can note this in the release notes.
```csharp
private VideoStreamInfo _info;
private bool _disposed = false;
private bool _dirty = false;
private bool _useGpuShader = true;
```
Do you think we should expose an API to control this setting?
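If it were exposed, a public property over the existing field would likely suffice. A minimal sketch; the property name `UseGpuConversion` is hypothetical:

```csharp
// Hypothetical public toggle over the existing _useGpuShader field;
// when false, the stream would take the CPU fallback path.
public bool UseGpuConversion
{
    get => _useGpuShader;
    set => _useGpuShader = value;
}
```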
```hlsl
    return o;
}

float3 yuvToRgb709Limited(float y, float u, float v)
```
I believe you should be able to use half here, which will save bandwidth and boost performance further.
Assuming you are using the legacy rendering pipeline:
```hlsl
inline half3 YUV709Limited_to_RGB(half y, half u, half v)
{
    half c = y - half(16.0/255.0);
    half d = u - half(128.0/255.0);
    half e = v - half(128.0/255.0);
    half Y = half(1.16438356) * c;
    half3 rgb;
    rgb.r = Y + half(1.79274107) * e;
    rgb.g = Y - half(0.21324861) * d - half(0.53290933) * e;
    rgb.b = Y + half(2.11240179) * d;
    return saturate(rgb);
}

half4 frag(v2f i) : SV_Target
{
    half y = tex2D(_TexY, i.uv).r;
    half u = tex2D(_TexU, i.uv).r; // U,V textures are W/2 x H/2
    half v = tex2D(_TexV, i.uv).r;
    return half4(YUV709Limited_to_RGB(y, u, v), 1.0h);
}
```