
LiveKit Unity SDK: Camera Renders Solid Green When Publishing from RenderTexture #134

Description

Summary

I'm experiencing a critical issue where the LiveKit Unity SDK transmits a solid green video stream when publishing from a Unity Camera that renders to a RenderTexture. The publisher's local preview shows the correct rendered scene, but all subscribers receive only a solid green color (RGB: 0.00, 0.53, 0.00).

Environment

  • Unity Version: 6000.0.47f1 (using URP)
  • LiveKit Unity SDK: 1.2.4
  • Platform: Windows (Unity Editor)
  • Graphics API: DirectX 11
  • Server: LiveKit Cloud

Steps to Reproduce

  1. Create a Unity Camera that renders to a RenderTexture:
    GameObject camObj = new GameObject("Publisher Camera");
    Camera publishCamera = camObj.AddComponent<Camera>();
    publishCamera.clearFlags = CameraClearFlags.SolidColor;
    publishCamera.backgroundColor = Color.black;
    publishCamera.allowHDR = false;
    publishCamera.allowMSAA = false;

    RenderTexture rt = new RenderTexture(1280, 720, 24, RenderTextureFormat.ARGB32);
    rt.Create();
    publishCamera.targetTexture = rt;

  2. Create a simple 3D scene with colored objects
  3. Set up the LiveKit publisher:
    Room room = new Room();
    yield return room.Connect(serverUrl, token);

    CameraVideoSource videoSource = new CameraVideoSource(publishCamera);
    LocalVideoTrack videoTrack = LocalVideoTrack.CreateVideoTrack("camera", videoSource, room);

    var options = new TrackPublishOptions
    {
        VideoCodec = VideoCodec.H264,
        Source = TrackSource.SourceCamera
    };
    yield return room.LocalParticipant.PublishTrack(videoTrack, options);

  4. Set up a subscriber in the same or a different Unity instance
  5. Subscribe to the video track and display it in a RawImage (a minimal subscriber sketch follows this list)
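
For reference, the subscriber side follows the pattern from the SDK samples: handle the room's TrackSubscribed event, wrap the remote track in a VideoStream, and push the decoded texture into a RawImage. This is only a minimal sketch; the exact event and VideoStream signatures are taken from the SDK samples and may differ slightly in 1.2.4:

    using LiveKit;
    using UnityEngine;
    using UnityEngine.UI;

    public class Subscriber : MonoBehaviour
    {
        public RawImage rawImage;     // UI element that displays received frames
        private VideoStream _stream;

        // Call once after the subscriber's Room has connected.
        public void HookRoom(Room room)
        {
            room.TrackSubscribed += (track, publication, participant) =>
            {
                if (track is RemoteVideoTrack videoTrack)
                {
                    _stream = new VideoStream(videoTrack);
                    _stream.TextureReceived += tex => rawImage.texture = tex;
                    StartCoroutine(_stream.Update());   // pump frames every frame
                }
            };
        }
    }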

Expected Behavior

The subscriber should see the 3D scene being rendered by the publisher's camera.

Actual Behavior

  • Publisher's RenderTexture preview shows the correct scene
  • Subscriber receives video but it's solid green (RGB: 0.00, 0.53, 0.00)
  • Video dimensions are correct (320x180, 640x360, 1280x720)
  • No errors in console, connection succeeds

Diagnostic Information

I created a diagnostic tool that samples the received texture at multiple points:
[GreenDiagnostic] Frame 30
Texture: 1280x720 (Texture2D)
Center: R:0.00 G:0.53 B:0.00
WARNING: All samples are the SAME color!

This confirms the transmitted video data is actually green, not a rendering issue on the subscriber side.
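
For reference, the sampler is ordinary Unity code along these lines (an illustrative reconstruction of my GreenDiagnostic helper; the received texture is first blitted into a readable Texture2D so GetPixel works regardless of how the SDK allocated it):

    using UnityEngine;

    public static class GreenDiagnostic
    {
        // Copy the (possibly non-readable) received texture into a readable Texture2D.
        static Texture2D ReadBack(Texture source)
        {
            var tmp = RenderTexture.GetTemporary(source.width, source.height, 0, RenderTextureFormat.ARGB32);
            Graphics.Blit(source, tmp);
            var prev = RenderTexture.active;
            RenderTexture.active = tmp;
            var readable = new Texture2D(source.width, source.height, TextureFormat.RGBA32, false);
            readable.ReadPixels(new Rect(0, 0, source.width, source.height), 0, 0);
            readable.Apply();
            RenderTexture.active = prev;
            RenderTexture.ReleaseTemporary(tmp);
            return readable;
        }

        // Sample the center and a corner and warn if they are identical.
        public static void Sample(Texture received, int frame)
        {
            var tex = ReadBack(received);
            Color center = tex.GetPixel(tex.width / 2, tex.height / 2);
            Color corner = tex.GetPixel(8, 8);
            Debug.Log($"[GreenDiagnostic] Frame {frame} {tex.width}x{tex.height} Center: R:{center.r:F2} G:{center.g:F2} B:{center.b:F2}");
            if (center == corner)
                Debug.LogWarning("[GreenDiagnostic] All samples are the SAME color!");
        }
    }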

What I've Tried

  1. Different RenderTexture formats: ARGB32, BGRA32, Default
  2. WebRTC recommended format: WebRTC.GetSupportedRenderTextureFormat(SystemInfo.graphicsDeviceType) returns BGRA32 (see the sketch after this list)
  3. Various camera settings: Disabled HDR, MSAA, dynamic resolution
  4. Different codecs: Both H264 and VP8 produce the same result
  5. Manual camera rendering: Calling camera.Render() continuously before and after creating the video source
  6. Different Unity versions and rendering pipelines
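
The format experiment in items 2 and 5 was along these lines (a sketch, not SDK-specific code; WebRTC here is Unity's com.unity.webrtc package):

    // Inside the publisher setup (with "using Unity.WebRTC;" at the top of the file).
    // Create the RenderTexture with the format the WebRTC package reports as
    // supported for the current graphics device (BGRA32 on my machine), and
    // force the camera to render so the RT always holds a fresh frame.
    var format = WebRTC.GetSupportedRenderTextureFormat(SystemInfo.graphicsDeviceType);
    RenderTexture rt = new RenderTexture(1280, 720, 0, format);
    rt.Create();
    publishCamera.targetTexture = rt;
    publishCamera.Render();   // called repeatedly from an Update/coroutine loop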

Additional Issues

  1. Unity Restart Required: After disconnecting, I must restart Unity to reconnect. Calling room.Disconnect() doesn't fully clean up resources.
  2. VideoStream._dirty flag: The internal _dirty flag appears to only be set for the first frame, requiring reflection workarounds to force updates.
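
The reflection workaround mentioned in item 2 is roughly this (a sketch only; _dirty is a private SDK field, so this is unsupported and version-dependent):

    using System.Reflection;

    // Force VideoStream's internal _dirty flag so the stream keeps refreshing
    // its texture after the first frame.
    static void ForceDirty(VideoStream stream)
    {
        FieldInfo dirty = typeof(VideoStream).GetField("_dirty",
            BindingFlags.NonPublic | BindingFlags.Instance);
        dirty?.SetValue(stream, true);
    }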

Workaround Attempts

I've tried various workarounds including:

  • Continuous camera rendering loops
  • Using reflection to force VideoStream._dirty = true
  • Creating the RenderTexture with different formats and depth buffers
  • Clearing the RenderTexture before use
  • Using Graphics.Blit to copy textures

None of these resolve the green screen issue.
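
For completeness, the clear/Blit attempts were variations of this pattern (a sketch; copyRT is just the name of the intermediate texture I used):

    // Clear the target RT to a known color, then copy the camera's output into
    // a second RenderTexture via Graphics.Blit before LiveKit consumes it.
    var prev = RenderTexture.active;
    RenderTexture.active = rt;
    GL.Clear(true, true, Color.black);
    RenderTexture.active = prev;

    RenderTexture copyRT = new RenderTexture(rt.width, rt.height, 0, rt.format);
    copyRT.Create();
    Graphics.Blit(rt, copyRT);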

Root Cause Analysis

Based on my investigation, it appears that CameraVideoSource doesn't properly read pixel data from cameras that target a RenderTexture. The video track transmits successfully, but the pixel data is never captured from the RenderTexture, so an effectively empty frame is encoded instead. Notably, an all-zero YUV buffer converts to roughly RGB (0.00, 0.53, 0.00), which is exactly the green subscribers receive, so the native layer appears to be encoding an uninitialized buffer rather than the RenderTexture contents.

This seems to be a fundamental issue with how the LiveKit Unity SDK's native layer interfaces with Unity's RenderTexture system.

Impact

This bug prevents using LiveKit for any Unity application that needs to stream rendered content (game footage, virtual cameras, AR/VR applications, etc.) rather than webcam input. It's a showstopper.
