2011-03-03

I'm writing an iPhone app that does a kind of real-time image detection with OpenCV. What is the best/fastest way to convert a CMSampleBufferRef image from the camera (I'm using AVFoundation's AVCaptureVideoDataOutputSampleBufferDelegate) into an IplImage that OpenCV understands? The conversion needs to be fast enough to run in real time.

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
    fromConnection:(AVCaptureConnection *)connection 
{ 
    NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init]; 

    // Convert CMSampleBufferRef into IplImage 
    IplImage *openCVImage = ???(sampleBuffer); 

    // Do OpenCV computations realtime 
    // ... 

    [pool release]; 
} 

Thanks in advance.

Answers

12

This sample code is based on Apple's sample for managing pointers to a CMSampleBuffer:

- (IplImage *)createIplImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    IplImage *iplimage = 0;
    if (sampleBuffer) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // get information about the image in the buffer
        uint8_t *bufferBaseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
        size_t bufferWidth = CVPixelBufferGetWidth(imageBuffer);
        size_t bufferHeight = CVPixelBufferGetHeight(imageBuffer);

        // create IplImage
        if (bufferBaseAddress) {
            iplimage = cvCreateImage(cvSize(bufferWidth, bufferHeight), IPL_DEPTH_8U, 4);
            iplimage->imageData = (char *)bufferBaseAddress;
        }

        // release memory
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    }
    else
        DLog(@"No sampleBuffer!!");

    return iplimage;
}

You need to create a 4-channel IplImage because the phone's camera buffers frames in BGRA.
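For that BGRA assumption to hold, the capture output has to be configured to deliver 32BGRA frames; a minimal configuration sketch using AVFoundation's documented keys:

```objc
// Ask AVFoundation for BGRA frames so the 4-channel IplImage above
// matches the pixel layout the delegate receives.
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.videoSettings = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
```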

In my experience this conversion is fast enough to do in a real-time application, but of course anything you add on top of it will cost time, especially with OpenCV.

This works well: conversion takes 0.00020 s plus or minus 0.00004 for a 640x480 image on my iPhone 4 – cduck 2011-03-04 04:06:55

@cduck: Yes, I got 30 fps with this solution even on a 3GS. – 2011-03-04 08:12:12

2

"iplimage->imageData = (char *)bufferBaseAddress;" causes a memory leak: cvCreateImage already allocated a pixel buffer for imageData, so overwriting the pointer leaks that allocation, and it also leaves the image pointing at memory that is only valid while the pixel buffer is locked. It should be "memcpy(iplimage->imageData, (char *)bufferBaseAddress, iplimage->imageSize);".

So the complete code is:

- (IplImage *)createIplImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    IplImage *iplimage = 0;

    if (sampleBuffer) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // get information about the image in the buffer
        uint8_t *bufferBaseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
        size_t bufferWidth = CVPixelBufferGetWidth(imageBuffer);
        size_t bufferHeight = CVPixelBufferGetHeight(imageBuffer);

        // create IplImage
        if (bufferBaseAddress) {
            iplimage = cvCreateImage(cvSize(bufferWidth, bufferHeight), IPL_DEPTH_8U, 4);

            //iplimage->imageData = (char *)bufferBaseAddress;
            memcpy(iplimage->imageData, (char *)bufferBaseAddress, iplimage->imageSize);
        }

        // release memory
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    }
    else
        DLog(@"No sampleBuffer!!");

    return iplimage;
}
