I’m trying to efficiently compute the Shannon entropy (the -sum(pi * log pi) formula) of an image on macOS/iOS with Swift. I’ve found the Accelerate framework and its vImage functions, which look like what I’m after, but the documentation is scarce and I got lost in it.
I’m creating a vImage pixel buffer like this:
import Accelerate

// Describe the source CGImage: 8 bits per channel, 4 channels, Display P3, no alpha.
var format = vImage_CGImageFormat(
    bitsPerComponent: 8,
    bitsPerPixel: 8 * 4,
    colorSpace: CGColorSpace(name: CGColorSpace.displayP3)!,
    bitmapInfo: .init(rawValue: CGImageAlphaInfo.noneSkipFirst.rawValue))!

// Wrap the CGImage `cg` in an interleaved 4-channel 8-bit pixel buffer.
let buf = try vImage.PixelBuffer(
    cgImage: cg,
    cgImageFormat: &format,
    pixelFormat: vImage.Interleaved8x4.self)
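(In case it matters, cg above is just a plain CGImage. For the sake of a self-contained example, assume it is loaded roughly like this; the path is only a placeholder and any CGImage source should do:)

import Foundation
import ImageIO

// Illustrative only: load a CGImage from a file; the path is a placeholder.
let url = URL(fileURLWithPath: "/tmp/test.png")
guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
      let cg = CGImageSourceCreateImageAtIndex(source, 0, nil) else {
    fatalError("Could not load image")
}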
My idea was to then convert it to a 1-channel grayscale buffer (vImage.PixelBuffer<vImage.Planar8>) with buf.multiply(), following this page: https://developer.apple.com/documentation/accelerate/converting_color_images_to_grayscale. Then I would build a histogram from it, and finally iterate manually over its 256 bins and accumulate the entropy sum. However, it seems that vImage.PixelBuffer<vImage.Planar8> doesn’t have a histogram() method at all … whereas vImage.PixelBuffer<vImage.Interleaved8x4> does.
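To show what I mean by the last step: once I have the histogram, the entropy itself is just plain Swift. A rough sketch, assuming I can get the histogram as an array of 256 bin counts (which is exactly the part I’m stuck on):

import Foundation

// Shannon entropy (in bits, log base 2) from a 256-bin histogram of pixel counts.
// `histogram` is assumed to come from whatever vImage histogram call I end up using.
func shannonEntropy(histogram: [UInt]) -> Double {
    let total = Double(histogram.reduce(0, +))
    guard total > 0 else { return 0 }
    return histogram.reduce(0.0) { entropy, count in
        guard count > 0 else { return entropy }   // treat 0 * log(0) as 0
        let p = Double(count) / total             // probability of this intensity level
        return entropy - p * log2(p)              // -sum(p * log2 p)
    }
}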
Can you point me to the correct way to do this?