iOS Camera CVPixelBuffer Formats Explained: Getting YUV and RGB Bytes

1. CVPixelBuffer formats supported by the iOS camera:

kCVPixelFormatType_32BGRA                       = 'BGRA', /* 32 bit BGRA */
kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange = '420v', /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
kCVPixelFormatType_420YpCbCr8BiPlanarFullRange  = '420f', /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, full-range (luma=[0,255] chroma=[1,255]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */

These are the pixel formats an AVCaptureVideoDataOutput can deliver to its AVCaptureVideoDataOutputSampleBufferDelegate.

  • For a detailed explanation of these CVPixelBuffer formats, see my earlier article:
    iOS kCVPixelFormatType详解
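To make the four-character codes above ('BGRA', '420v', '420f') easier to inspect at runtime, a small helper along these lines can decode an OSType into its readable code. This is only a sketch; NSStringFromPixelFormat is a hypothetical name, not an Apple API:

#import <Foundation/Foundation.h>
#import <CoreVideo/CoreVideo.h>

// Hypothetical helper: turn a pixel-format OSType into its four-character code, e.g. 'BGRA'.
static NSString *NSStringFromPixelFormat(OSType format) {
    char code[5] = {
        (char)((format >> 24) & 0xFF),
        (char)((format >> 16) & 0xFF),
        (char)((format >> 8)  & 0xFF),
        (char)( format        & 0xFF),
        '\0'
    };
    return [NSString stringWithUTF8String:code];
}

// NSStringFromPixelFormat(kCVPixelFormatType_32BGRA) returns @"BGRA".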

2. How to set the camera's output format

2.1. First, query which formats the camera actually supports:
@interface AVCaptureVideoDataOutput
// Read-only property listing every pixel format this output can deliver:
@property(nonatomic, readonly) NSArray<NSNumber *> *availableVideoCVPixelFormatTypes;
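A minimal sketch of that query, assuming videoOutput is an AVCaptureVideoDataOutput that has already been added to your capture session (and reusing the hypothetical NSStringFromPixelFormat helper from section 1):

// On iOS devices this list typically contains '420v', '420f' and 'BGRA'.
for (NSNumber *formatNumber in videoOutput.availableVideoCVPixelFormatTypes) {
    OSType format = (OSType)formatNumber.unsignedIntValue;
    NSLog(@"Supported pixel format: %@ (0x%08X)", NSStringFromPixelFormat(format), (unsigned int)format);
}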
2.2. Set the output format on the camera
// '420f'
[videoOutput setVideoSettings:@{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)}];
// '420v'
[videoOutput setVideoSettings:@{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)}];
// 'BGRA'
[videoOutput setVideoSettings:@{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)}];
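Only one of the three calls above should be made for a given output. A hedged sketch that prefers '420f' and falls back to 'BGRA' when the preferred format is not reported as available (again assuming videoOutput is your configured AVCaptureVideoDataOutput):

OSType preferred = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange; // '420f'
if (![videoOutput.availableVideoCVPixelFormatTypes containsObject:@(preferred)]) {
    // Fall back to BGRA if the full-range bi-planar format is not offered.
    preferred = kCVPixelFormatType_32BGRA;
}
[videoOutput setVideoSettings:@{(id)kCVPixelBufferPixelFormatTypeKey : @(preferred)}];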

3. How to obtain the CVPixelBuffer

  • The raw image of every camera preview frame can be obtained through AVCaptureVideoDataOutput's AVCaptureVideoDataOutputSampleBufferDelegate (a minimal sketch follows after these two options):
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection;
  • If you use GPUImage's camera, you can instead implement GPUImageVideoCameraDelegate, which also delivers every raw camera frame:
- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer;
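As a minimal sketch of the AVFoundation callback, assuming the output's delegate has been set to self on a serial dispatch queue, the CVPixelBuffer can be pulled out of the sample buffer like this:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Every video sample buffer wraps one CVPixelBuffer containing the frame.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }
    // The format matches whatever was passed to setVideoSettings: ('BGRA', '420v' or '420f').
    OSType format = CVPixelBufferGetPixelFormatType(pixelBuffer);
    NSLog(@"Got a frame in pixel format 0x%08X", (unsigned int)format);
}

Core Video reuses these buffers, so anything heavier than a quick read should either finish before the callback returns or explicitly retain the buffer (CVPixelBufferRetain / CVPixelBufferRelease).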

4. How to get BGRA-format bytes

- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef cvImageBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(cvImageBufferRef, 0);

    // Get the BGRA buffer's dimensions, row stride and byte address
    size_t width = CVPixelBufferGetWidth(cvImageBufferRef);
    size_t height = CVPixelBufferGetHeight(cvImageBufferRef);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(cvImageBufferRef);
    unsigned char *pImageData = (unsigned char *)CVPixelBufferGetBaseAddress(cvImageBufferRef);

    CVPixelBufferUnlockBaseAddress(cvImageBufferRef, 0);
}
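Note that bytesPerRow can be larger than width * 4 because Core Video may pad each row. A hedged sketch of copying the pixels into a tightly packed buffer, using the variables from the snippet above; the copy has to happen between the lock and unlock calls, while pImageData is still valid:

// Copy the locked BGRA pixels row by row into a tight (width * 4)-bytes-per-row buffer.
size_t tightBytesPerRow = width * 4;
unsigned char *tightBGRA = (unsigned char *)malloc(tightBytesPerRow * height);
for (size_t row = 0; row < height; row++) {
    memcpy(tightBGRA + row * tightBytesPerRow,
           pImageData + row * bytesPerRow,
           tightBytesPerRow);
}
// ... hand tightBGRA off for processing, then free(tightBGRA) when done.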

5. How to get YUV-format bytes

- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef cvImageBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(cvImageBufferRef, 0);

    // Get the Y plane's dimensions, row stride and byte address (plane 0)
    size_t widthY = CVPixelBufferGetWidthOfPlane(cvImageBufferRef, 0);
    size_t heightY = CVPixelBufferGetHeightOfPlane(cvImageBufferRef, 0);
    size_t bytesPerRowY = CVPixelBufferGetBytesPerRowOfPlane(cvImageBufferRef, 0);
    unsigned char *pImageDataY = (unsigned char *)CVPixelBufferGetBaseAddressOfPlane(cvImageBufferRef, 0);

    // Get the interleaved CbCr (UV) plane's dimensions, row stride and byte address (plane 1)
    size_t widthUV = CVPixelBufferGetWidthOfPlane(cvImageBufferRef, 1);
    size_t heightUV = CVPixelBufferGetHeightOfPlane(cvImageBufferRef, 1);
    size_t bytesPerRowUV = CVPixelBufferGetBytesPerRowOfPlane(cvImageBufferRef, 1);
    unsigned char *pImageDataUV = (unsigned char *)CVPixelBufferGetBaseAddressOfPlane(cvImageBufferRef, 1);

    CVPixelBufferUnlockBaseAddress(cvImageBufferRef, 0);
}
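For the '420v' / '420f' bi-planar formats the two planes together form an NV12 layout: a full-resolution Y plane followed by a half-resolution plane of interleaved Cb/Cr pairs (two bytes per element). A hedged sketch that packs both planes into one contiguous NV12 buffer, stripping any per-row padding; like the BGRA copy, it must run between the lock and unlock calls, and it uses the variables from the snippet above:

// widthY * heightY luma bytes, plus widthUV * 2 bytes per chroma row for heightUV rows.
size_t nv12Size = widthY * heightY + widthUV * 2 * heightUV;
unsigned char *nv12 = (unsigned char *)malloc(nv12Size);
unsigned char *dst = nv12;
for (size_t row = 0; row < heightY; row++) {
    memcpy(dst, pImageDataY + row * bytesPerRowY, widthY);
    dst += widthY;
}
for (size_t row = 0; row < heightUV; row++) {
    memcpy(dst, pImageDataUV + row * bytesPerRowUV, widthUV * 2);
    dst += widthUV * 2;
}
// ... hand nv12 off for processing (e.g. to an encoder), then free(nv12) when done.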

