
The whole Image Processing pipeline with CIImage has issues with color spaces #342

@himanshunaidu

Description

  1. Segmentation Mask as Color Image (potentially)

In SegmentationModelRequestProcessor, we are converting the pixel buffer from the Vision model to a CIImage in the following way:

let segmentationImage = CIImage(
    cvPixelBuffer: segmentationBuffer
)

This initializer ends up assuming that a color space is associated with the image, which is not true for the segmentation mask.
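
For reference, a CVPixelBuffer-backed CIImage can be created without an attached color space by passing NSNull() for the .colorSpace option. A minimal sketch of that variant, reusing the names from the snippet above:

// Sketch: avoid tagging the mask with a color space so Core Image
// performs no color matching on the mask data.
let segmentationImage = CIImage(
    cvPixelBuffer: segmentationBuffer,
    options: [.colorSpace: NSNull()]
)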

However, passing the null color space option seems to cause render issues. Somewhere in the segmentation pipeline, the mask CIImage is required to have a color space when it should not be (the current suspicion is the grayscale-to-color filter class).

We need to fix this pipeline so that we can use Core Image as the 'backbone' of the application's image processing pipeline, where a CIImage is conveniently passed across classes and threads and converted to other formats efficiently when required.

  2. Color Segmentation Image as Non-Color Image (potentially)

In GrayscaleToColorFilter, we load the result image without a color space:
let resultImage = CIImage(mtlTexture: outputTexture, options: [.colorSpace: NSNull()])
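
Since this result is a genuine color image, one option is to tag it with an explicit color space when wrapping the Metal texture. A minimal sketch, assuming sRGB is the intended target; note that CIImage(mtlTexture:options:) is failable, so resultImage comes back as an optional:

// Sketch: attach an explicit (assumed sRGB) color space to the color output.
let colorSpace = CGColorSpace(name: CGColorSpace.sRGB)!
let resultImage = CIImage(
    mtlTexture: outputTexture,
    options: [.colorSpace: colorSpace]
)   // optional; CIImage(mtlTexture:options:) can fail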
