In the Android project, using the Rust NDK Camera API, call the camera. How to convert YUV to RGB? #74
-
```rust
let mut y_row_stride = 0;
AImage_getPlaneRowStride(self.a_image, 0, &mut y_row_stride);
let mut uv_row_stride = 0;
AImage_getPlaneRowStride(self.a_image, 1, &mut uv_row_stride);

let mut y_len = 0;
let mut y_buffer_ptr = null_mut();
AImage_getPlaneData(self.a_image, 0, &mut y_buffer_ptr, &mut y_len);
let mut u_len = 0;
let mut u_buffer_ptr = null_mut();
AImage_getPlaneData(self.a_image, 1, &mut u_buffer_ptr, &mut u_len);
let mut v_len = 0;
let mut v_buffer_ptr = null_mut();
AImage_getPlaneData(self.a_image, 2, &mut v_buffer_ptr, &mut v_len);

let y_data = slice::from_raw_parts(y_buffer_ptr, y_len as usize);
let u_data = slice::from_raw_parts(u_buffer_ptr, u_len as usize);
let v_data = slice::from_raw_parts(v_buffer_ptr, v_len as usize);

let yuv_image = YuvPlanarImage {
    y_plane: y_data,
    y_stride: y_row_stride as u32,
    u_plane: u_data,
    u_stride: uv_row_stride as u32,
    v_plane: v_data,
    v_stride: uv_row_stride as u32,
    width: self.width,
    height: self.height,
};

let mut rgb = vec![0u8; (self.width * self.height * 3) as usize];
let rgb_stride = self.width * 3;
match yuv420_to_rgb(
    &yuv_image,
    &mut rgb,
    rgb_stride,
    YuvRange::Limited,
    YuvStandardMatrix::Bt601,
) {
    Ok(_) => {}
    Err(e) => {}
}
```

The conversion reports success, but the image content is incorrect. Is the above code correct?

ref: https://developer.android.com/ndk/reference/group/camera
Replies: 1 comment
-
I've already answered once; see the issue .
Android's default YUV layout is NV12 or NV21; I always mix the two up because they use different notation systems. Only some buggy emulators will send you planar YUV 4:2:0.
Your code is effectively already treating the data as if it partially had the NV21 layout, but something goes wrong after that.
After you get the `uv_pixel_stride` of the second plane via AImage_getPlanePixelStride, check the value: if it is 2, the chroma data is interleaved and the frame is actually an NV layout. If the value is 1, it is planar YUV.
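A minimal sketch of what that pixel stride implies for the second plane's byte layout (the `extract_chroma` helper and the sample bytes below are invented for illustration; they are not part of the NDK API):

```rust
/// Collect the samples of one chroma channel from a raw plane buffer.
/// pixel_stride == 2 -> semi-planar (NV12/NV21): every other byte belongs
/// to the other chroma channel; pixel_stride == 1 -> planar I420: the
/// plane is already a contiguous run of a single channel.
fn extract_chroma(plane: &[u8], pixel_stride: usize) -> Vec<u8> {
    plane.iter().step_by(pixel_stride).copied().collect()
}

fn main() {
    // Interleaved bytes as an NV21 chroma plane would deliver them,
    // with the plane pointer starting at a V sample: V0 U0 V1 U1 ...
    let interleaved = [200u8, 100, 201, 101];
    // With pixel stride 2, stepping over the plane yields one channel.
    assert_eq!(extract_chroma(&interleaved, 2), vec![200, 201]);
    // With pixel stride 1 the plane would already be a single channel.
    assert_eq!(extract_chroma(&interleaved, 1), vec![200, 100, 201, 101]);
    println!("ok");
}
```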
Then you get two planes, where one plane is Y and the other is UV; construct a YuvBiPlanarImage and everything should start working.
I've never used exactly those APIs, but I think it should work this way.
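For reference, the BT.601 limited-range math behind an NV21-to-RGB conversion can be sketched in plain Rust. This hand-rolled `nv21_to_rgb` is only to illustrate the bi-planar layout (one full-resolution Y plane plus one quarter-resolution interleaved VU plane); the integer coefficients are the common fixed-point approximation, not yuvutils' actual implementation:

```rust
fn clamp_u8(v: i32) -> u8 {
    v.clamp(0, 255) as u8
}

/// Convert an NV21 frame (Y plane + interleaved VU plane) to packed RGB,
/// using BT.601 limited-range fixed-point coefficients.
fn nv21_to_rgb(y: &[u8], vu: &[u8], width: usize, height: usize) -> Vec<u8> {
    let mut rgb = vec![0u8; width * height * 3];
    for row in 0..height {
        for col in 0..width {
            let c = y[row * width + col] as i32 - 16;
            // One VU pair is shared by each 2x2 block of luma samples.
            let uv_index = (row / 2) * (width / 2) + col / 2;
            let e = vu[uv_index * 2] as i32 - 128; // V comes first in NV21,
            let d = vu[uv_index * 2 + 1] as i32 - 128; // then U
            let i = (row * width + col) * 3;
            rgb[i] = clamp_u8((298 * c + 409 * e + 128) >> 8);
            rgb[i + 1] = clamp_u8((298 * c - 100 * d - 208 * e + 128) >> 8);
            rgb[i + 2] = clamp_u8((298 * c + 516 * d + 128) >> 8);
        }
    }
    rgb
}

fn main() {
    // 2x2 frame, every pixel mid grey: Y = 126, neutral chroma U = V = 128.
    let y = [126u8; 4];
    let vu = [128u8, 128]; // one V/U pair covers the whole 2x2 block
    let rgb = nv21_to_rgb(&y, &vu, 2, 2);
    // 298 * (126 - 16) + 128 = 32908; >> 8 gives 128 on all three channels.
    assert!(rgb.iter().all(|&p| p == 128));
    println!("{:?}", &rgb[0..3]);
}
```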