
Conversation

@chenosaurus

  • Replace the slow CPU conversion step with a GPU shader


Copilot AI left a comment


Pull Request Overview

This PR introduces GPU-accelerated YUV to RGB video frame conversion for LiveKit's Unity SDK. The change replaces the previous CPU-based RGBA conversion with an optimized shader-based approach that processes YUV420 planar data directly on the GPU.

Key changes:

  • Added YUV to RGB conversion shader using BT.709 limited range color space conversion
  • Refactored VideoStream to support both GPU and CPU conversion paths with fallback mechanism
  • Changed output texture from Texture2D to RenderTexture to support GPU rendering pipeline

Reviewed Changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 4 comments.

Summary per file:
Runtime/Shaders/YuvToRgb.shader: New HLSL shader implementing BT.709 YUV to RGB color space conversion
Runtime/Shaders/YuvToRgb.shader.meta: Unity metadata for the shader asset
Runtime/Shaders.meta: Unity metadata for the Shaders folder
Runtime/Scripts/VideoStream.cs: Refactored to use RenderTexture with GPU shader conversion and CPU fallback
LICENSE.txt.meta: Unity metadata for the license file


@chenosaurus chenosaurus requested a review from Copilot November 7, 2025 23:25

Copilot AI left a comment


Pull Request Overview

Copilot reviewed 8 out of 8 changed files in this pull request and generated 4 comments.



}
// Unity objects must be destroyed on main thread
_converter?.Dispose();
_converter = null;

Copilot AI Nov 7, 2025


The Texture property is no longer being cleaned up in the Dispose method. Previously, the code destroyed the Texture2D when disposing. Now that Texture is a RenderTexture owned by _converter, it should either be set to null here, or there should be a clear comment explaining that the converter handles the cleanup. Without this, there's a risk of holding a reference to a disposed RenderTexture.

Suggested change
 _converter = null;
+// Texture is owned and cleaned up by _converter. Set to null to avoid holding a reference to a disposed RenderTexture.
+Texture = null;

Comment on lines +106 to +107
EnsurePlaneTexture(ref _planeU, chromaW, chromaH, TextureFormat.R8, FilterMode.Point);
EnsurePlaneTexture(ref _planeV, chromaW, chromaH, TextureFormat.R8, FilterMode.Point);

Copilot AI Nov 7, 2025


[nitpick] The chroma plane filter mode should be FilterMode.Bilinear instead of FilterMode.Point for better quality YUV to RGB conversion. Using point filtering on the U and V planes can result in blocky artifacts in the chroma channels. Bilinear filtering helps smooth the upsampling of the half-resolution chroma planes.

Suggested change
-EnsurePlaneTexture(ref _planeU, chromaW, chromaH, TextureFormat.R8, FilterMode.Point);
-EnsurePlaneTexture(ref _planeV, chromaW, chromaH, TextureFormat.R8, FilterMode.Point);
+EnsurePlaneTexture(ref _planeU, chromaW, chromaH, TextureFormat.R8, FilterMode.Bilinear);
+EnsurePlaneTexture(ref _planeV, chromaW, chromaH, TextureFormat.R8, FilterMode.Bilinear);

Comment on lines +132 to +136
tempTex.LoadRawTextureData((IntPtr)rgba.Info.DataPtr, (int)rgba.GetMemorySize());
tempTex.Apply();
Graphics.Blit(tempTex, Output);
UnityEngine.Object.Destroy(tempTex);
rgba.Dispose();

Copilot AI Nov 7, 2025


Potential memory leak: tempTex is created but if Graphics.Blit throws an exception, the texture won't be destroyed and rgba won't be disposed. Consider wrapping the operations in a try-finally block to ensure cleanup:

var rgba = buffer.ToRGBA();
var tempTex = new Texture2D(width, height, TextureFormat.RGBA32, false);
try
{
    tempTex.LoadRawTextureData((IntPtr)rgba.Info.DataPtr, (int)rgba.GetMemorySize());
    tempTex.Apply();
    Graphics.Blit(tempTex, Output);
}
finally
{
    UnityEngine.Object.Destroy(tempTex);
    rgba.Dispose();
}

HLSLPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"

Copilot AI Nov 7, 2025


The shader uses HLSLPROGRAM/ENDHLSL but includes "UnityCG.cginc", which is a CG/HLSL include file. While this may work in Unity, the include should typically be "UnityShaderVariables.cginc" or a pure HLSL include when using HLSLPROGRAM. Consider using CGPROGRAM/ENDCG instead of HLSLPROGRAM/ENDHLSL for better compatibility with UnityCG.cginc, or switch to proper HLSL includes if using HLSLPROGRAM.

Suggested change
-#include "UnityCG.cginc"
+#include "UnityShaderVariables.cginc"

Contributor

@ladvoc ladvoc left a comment


On-device testing was successful, just a few minor comments.

{
public delegate void FrameReceiveDelegate(VideoFrame frame);
-public delegate void TextureReceiveDelegate(Texture2D tex2d);
+public delegate void TextureReceiveDelegate(Texture tex);
Contributor


I think this constitutes a minor breaking API change—are we able to use Texture2D here? If not, we can note this in the release notes.

private VideoStreamInfo _info;
private bool _disposed = false;
private bool _dirty = false;
private bool _useGpuShader = true;
Contributor


Do you think we should expose an API to control this setting?

return o;
}

float3 yuvToRgb709Limited(float y, float u, float v)
Contributor


I believe you should be able to use half precision here, which will save bandwidth and boost performance further.

Assuming you are using the legacy rendering pipeline:

inline half3 YUV709Limited_to_RGB(half y, half u, half v)
{
    half c = y - half(16.0/255.0);
    half d = u - half(128.0/255.0);
    half e = v - half(128.0/255.0);

    half Y = half(1.16438356) * c;
    half3 rgb;
    rgb.r = Y + half(1.79274107) * e;
    rgb.g = Y - half(0.21324861) * d - half(0.53290933) * e;
    rgb.b = Y + half(2.11240179) * d;
    return saturate(rgb);
}

half4 frag(v2f i) : SV_Target
{
    half y = tex2D(_TexY, i.uv).r;
    half u = tex2D(_TexU, i.uv).r; // U,V textures are W/2 x H/2
    half v = tex2D(_TexV, i.uv).r;
    return half4(YUV709Limited_to_RGB(y, u, v), 1.0h);
}



4 participants