How to convert ARKit depth data (CVPixelBuffer) to a 16-bit grayscale PNG image file?

Following Apple's official example, I made some attempts. Here is my code:

CGImage extension (toGrayscale(), not included here) and the ARSessionDelegate callback:

//ARSessionDelegate
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // sceneDepth is optional, and createCGImage can fail, so both are unwrapped
    guard let depthMap = frame.sceneDepth?.depthMap else { return }
    let ciImage = CIImage(cvPixelBuffer: depthMap)
    guard let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) else { return }
    let grayscaleCGImage = cgImage.toGrayscale() //CGImage extension
    let image = UIImage(cgImage: grayscaleCGImage)
    let data = image.pngData()

    //Save data to iPhone as "example.png" PNG file...
    //Get "example.png" PNG file from iPhone, copy PNG file to macOS...
}
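
For reference, the save step from the comments looks roughly like this. This is a minimal sketch: writing into the app's Documents directory and the helper name savePNG are assumptions, and from there the file can be copied to a Mac via Xcode's device window or the Files app.

// Minimal sketch of the "save data to iPhone" step above.
// Writing into the app's Documents directory is an assumption; from there the
// file can be copied to a Mac via Xcode's device window or the Files app.
func savePNG(_ data: Data?, named name: String = "example.png") {
    guard let data = data,
          let documents = FileManager.default.urls(for: .documentDirectory,
                                                   in: .userDomainMask).first
    else { return }
    do {
        try data.write(to: documents.appendingPathComponent(name))
    } catch {
        print("Failed to write \(name): \(error)")
    }
}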

Then, in the macOS Terminal, I run sips -g all example.png and get the following output:

  pixelWidth: 256
  pixelHeight: 192
  typeIdentifier: public.png
  format: png
  formatOptions: default
  dpiWidth: 72.000
  dpiHeight: 72.000
  samplesPerPixel: 1
  bitsPerSample: 8
  hasAlpha: no
  space: Gray

The bitsPerSample is 8, but I want it to be 16. Can someone help me? Thanks a lot!
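
For completeness, here is a sketch of one direction I could try instead of going through UIImage.pngData() (which presumably writes 8 bits per channel): convert the Float32 depth values to UInt16 myself and write the PNG through ImageIO (CGImageDestination), which can keep 16 bits per sample. The millimetre scaling and the function name write16BitDepthPNG are only assumptions for illustration, not code from Apple's example.

import ARKit
import ImageIO
import UniformTypeIdentifiers

// Sketch only: convert the Float32 depth buffer to UInt16 by hand and write the
// PNG through ImageIO, which preserves 16 bits per sample. Scaling metres to
// millimetres is an assumption; values beyond ~65 m saturate.
func write16BitDepthPNG(from depthMap: CVPixelBuffer, to url: URL) {
    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }

    // ARKit scene depth is Float32 metres; map it to UInt16 millimetres.
    var pixels = [UInt16](repeating: 0, count: width * height)
    for y in 0..<height {
        let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in 0..<width {
            let metres = row[x]
            let millimetres = metres.isFinite ? metres * 1000.0 : 0
            pixels[y * width + x] = UInt16(max(0, min(millimetres, 65535)))
        }
    }

    // Wrap the raw 16-bit samples in a grayscale CGImage and write it as a PNG.
    let data = pixels.withUnsafeBufferPointer { Data(buffer: $0) }
    guard let provider = CGDataProvider(data: data as CFData),
          let cgImage = CGImage(width: width,
                                height: height,
                                bitsPerComponent: 16,
                                bitsPerPixel: 16,
                                bytesPerRow: width * MemoryLayout<UInt16>.size,
                                space: CGColorSpaceCreateDeviceGray(),
                                bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue |
                                                         CGBitmapInfo.byteOrder16Little.rawValue),
                                provider: provider,
                                decode: nil,
                                shouldInterpolate: false,
                                intent: .defaultIntent),
          let destination = CGImageDestinationCreateWithURL(url as CFURL,
                                                            UTType.png.identifier as CFString,
                                                            1, nil)
    else { return }

    CGImageDestinationAddImage(destination, cgImage, nil)
    CGImageDestinationFinalize(destination)
}

Whether this keeps the precision I need, or whether there is a cleaner way to get 16-bit output from the ARKit depth map, is exactly what I am asking about.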



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
