
Memory leak with CMSampleBufferGetImageBuffer

I am getting a UIImage from a CMSampleBufferRef video buffer every N video frames, like this:

- (void)imageFromVideoBuffer:(void(^)(UIImage* image))completion { 
    CMSampleBufferRef sampleBuffer = _myLastSampleBuffer; 
    if (sampleBuffer != nil) { 
     CFRetain(sampleBuffer); 
     CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)]; 
     _lastAppendedVideoBuffer.sampleBuffer = nil; 
     if (_context == nil) { 
      _context = [CIContext contextWithOptions:nil]; 
     } 
     CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
     CGImageRef cgImage = [_context createCGImage:ciImage fromRect: 
           CGRectMake(0, 0, CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer))]; 
     __block UIImage *image = [UIImage imageWithCGImage:cgImage]; 

     CGImageRelease(cgImage); 
     CFRelease(sampleBuffer); 

     if(completion) completion(image); 

     return; 
    } 
    if(completion) completion(nil); 
} 

Xcode and Instruments detect a memory leak, but I can't get rid of it. I release the CGImageRef and the CMSampleBufferRef as usual:

CGImageRelease(cgImage); 
CFRelease(sampleBuffer); 
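As background for these release calls: Core Foundation's ownership convention is that Get-style functions (like CMSampleBufferGetImageBuffer) return a borrowed reference you must not release, while Create/Copy-style functions and an explicit CFRetain each hand you a reference that must be balanced by exactly one release. A toy reference count in plain C sketches the rule (the names only mimic CF conventions; no Core Foundation is used here):

```c
#include <assert.h>
#include <stdlib.h>

/* Toy refcounted object standing in for a CMSampleBufferRef. */
typedef struct { int refcount; } Buffer;

Buffer *buffer_create(void) {            /* "Create" rule: the caller owns it */
    Buffer *b = malloc(sizeof *b);
    b->refcount = 1;
    return b;
}

Buffer *buffer_get(Buffer *b) {          /* "Get" rule: borrowed, no +1 */
    return b;
}

void buffer_retain(Buffer *b) {          /* like CFRetain */
    b->refcount++;
}

int buffer_release(Buffer *b) {          /* like CFRelease; returns remaining count */
    if (--b->refcount == 0) {
        free(b);
        return 0;
    }
    return b->refcount;
}
```

In the question's method, the CFRetain(sampleBuffer) at the top is what makes the CFRelease(sampleBuffer) at the bottom necessary; the pixel buffer returned by CMSampleBufferGetImageBuffer is borrowed and is correctly never released.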

[UPDATE] This is the AVCapture output callback where I get the sampleBuffer:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection { 
    if (captureOutput == _videoOutput) { 
     _lastVideoBuffer.sampleBuffer = sampleBuffer; 
     id<CIImageRenderer> imageRenderer = _CIImageRenderer; 

     dispatch_async(dispatch_get_main_queue(), ^{ 
      @autoreleasepool { 
       CIImage *ciImage = nil; 
       ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)]; 
       if(_context==nil) { 
        _context = [CIContext contextWithOptions:nil]; 
       } 
       CGImageRef processedCGImage = [_context createCGImage:ciImage 
                  fromRect:[ciImage extent]]; 
       //UIImage *image=[UIImage imageWithCGImage:processedCGImage]; 
       CGImageRelease(processedCGImage); 
       NSLog(@"Captured image %@", ciImage); 
      } 
     }); 
    }
}

The code that leaks is createCGImage:ciImage:

CGImageRef processedCGImage = [_context createCGImage:ciImage 
                  fromRect:[ciImage extent]]; 

even with the autoreleasepool, with CGImageRelease on the CGImage reference, and with the CIContext as an instance property.

This seems to be the same issue addressed here: Can't save CIImage to file on iOS without memory leaks

[UPDATE] The leak seems to be due to a bug. The issue is well described in Memory leak on CIContext createCGImage at iOS 9?

A sample project demonstrating how to reproduce this leak is available here: http://www.osamu.co.jp/DataArea/VideoCameraTest.zip

The latest comments there confirm:

Looks like they fixed this in 9.1b3. If anyone needs a workaround that works on iOS 9.0.x, I was able to get it working with this:

In test code (Objective-C in this case):

[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) 
    { 
     if (error) return; 

     __block NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"ipdf_pic_%i.jpeg",(int)[NSDate date].timeIntervalSince1970]]; 

     NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer]; 
     dispatch_async(dispatch_get_main_queue(),^ 
     { 

      @autoreleasepool 
      { 
       CIImage *enhancedImage = [CIImage imageWithData:imageData]; 

       if (!enhancedImage) return; 

       static CIContext *ctx = nil; if (!ctx) ctx = [CIContext contextWithOptions:nil]; 

       CGImageRef imageRef = [ctx createCGImage:enhancedImage fromRect:enhancedImage.extent format:kCIFormatBGRA8 colorSpace:nil]; 

       UIImage *image = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:UIImageOrientationRight]; 

       [[NSFileManager defaultManager] createFileAtPath:filePath contents:UIImageJPEGRepresentation(image, 0.8) attributes:nil]; 

       CGImageRelease(imageRef); 
      } 
     }); 
    }]; 

And the workaround for iOS 9.0 should be:

extension CIContext { 
    func createCGImage_(image:CIImage, fromRect:CGRect) -> CGImage { 
     let width = Int(fromRect.width) 
     let height = Int(fromRect.height) 

     let rawData = UnsafeMutablePointer<UInt8>.alloc(width * height * 4) 
     render(image, toBitmap: rawData, rowBytes: width * 4, bounds: fromRect, format: kCIFormatRGBA8, colorSpace: CGColorSpaceCreateDeviceRGB()) 
     let dataProvider = CGDataProviderCreateWithData(nil, rawData, height * width * 4) {info, data, size in UnsafeMutablePointer<UInt8>(data).dealloc(size)} 
     return CGImageCreate(width, height, 8, 32, width * 4, CGColorSpaceCreateDeviceRGB(), CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedLast.rawValue), dataProvider, nil, false, .RenderingIntentDefault)! 
    } 
} 
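The workaround above works because it sizes the destination bitmap by hand: 4 bytes per RGBA8 pixel, rowBytes of width * 4, and a total allocation of width * height * 4 bytes that the data provider's release callback must free exactly once. A minimal plain-C sketch of that allocation math (illustrative only; no CoreImage involved):

```c
#include <assert.h>
#include <stdlib.h>

/* RGBA8: 4 bytes per pixel, rows packed with no padding. */
size_t row_bytes(size_t width) {
    return width * 4;
}

size_t bitmap_bytes(size_t width, size_t height) {
    return row_bytes(width) * height;
}

/* Mirrors rawData = UnsafeMutablePointer<UInt8>.alloc(width * height * 4);
   the caller owns the buffer and must free it, like the
   CGDataProvider release callback in the workaround. */
unsigned char *alloc_bitmap(size_t width, size_t height) {
    return calloc(bitmap_bytes(width, height), 1);
}
```

If the rowBytes passed to render did not match the allocation (for example, a padded stride against a tightly packed buffer), the render would write out of bounds, so keeping both derived from the same width is the essential invariant here.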

What does Instruments say is leaking? Where are _myLastSampleBuffer and _lastAppendedVideoBuffer.sampleBuffer set? – ChrisH


@ChrisH see the code above. – loretoparisi


It is not a leak in Apple's code; simply do not invoke CFRetain(sampleBuffer) on the return result of copyNextSampleBuffer and the code works fine. – MoDJ

Answers


We were experiencing a similar issue in an app we created, where we process each frame for feature keypoints with OpenCV and send a frame off every couple of seconds. After running for a while we would end up with quite a few memory pressure messages.

We managed to correct this by running our processing code in its own autorelease pool like so (jpegDataFromSampleBufferAndCrop does something similar to what you are doing, with cropping added):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
{ 
     @autoreleasepool { 

      if ([self.lastFrameSentAt timeIntervalSinceNow] < -kContinuousRateInSeconds) { 

       NSData *imageData = [self jpegDataFromSampleBufferAndCrop:sampleBuffer]; 

       if (imageData) { 
        [self processImageData:imageData]; 
       } 

       self.lastFrameSentAt = [NSDate date]; 

       imageData = nil; 
      } 
     } 
}

Thanks, I found the leaking code (see above). – loretoparisi


Zomg! It works!! –


So it looks like the leak is due to a bug in createCGImage on iOS. Have a look at https://forums.developer.apple.com/message/50981#50981 – loretoparisi


I can confirm that this memory leak still exists on iOS 9.2. (I have also posted on the Apple Developer Forum.)

I get the same memory leak on iOS 9.2. I have tested dropping the EAGLContext by using MetalKit and MTLDevice, and I have tested different CIContext methods such as drawImage, createCGImage and render, but nothing seems to work.

Obviously this is a bug in CoreImage since iOS 9. Try it yourself by downloading the example app from Apple (see below), then running the same project on a device with iOS 8.4 and on a device with iOS 9.2, and watching the memory gauge in Xcode.

Download https://developer.apple.com/library/ios/samplecode/AVBasicVideoOutput/Introduction/Intro.html#//apple_ref/doc/uid/DTS40013109

Add this to APLEAGLView.h:20

@property (strong, nonatomic) CIContext* ciContext; 

Replace APLEAGLView.m:118 with this

[EAGLContext setCurrentContext:_context]; 
_ciContext = [CIContext contextWithEAGLContext:_context]; 

And finally replace APLEAGLView.m:341-343 with this

glDrawArrays(GL_TRIANGLE_STRIP, 0, 4); 

    @autoreleasepool 
    { 
     CIImage* sourceImage = [CIImage imageWithCVPixelBuffer:pixelBuffer]; 
     CIFilter* filter = [CIFilter filterWithName:@"CIGaussianBlur" keysAndValues:kCIInputImageKey, sourceImage, nil]; 
     CIImage* filteredImage = filter.outputImage; 

     [_ciContext render:filteredImage toCVPixelBuffer:pixelBuffer]; 
    } 

glBindRenderbuffer(GL_RENDERBUFFER, _colorBufferHandle);