
How can I efficiently stream a camera feed from one iOS device to another over Bluetooth or Wi-Fi in iOS 7? Below is my code for obtaining the stream's sample buffers. How can I stream the camera from one iOS device to another using Multipeer Connectivity?

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
     didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
     fromConnection:(AVCaptureConnection *)connection 
{ 
    // Create a UIImage from the sample buffer data 
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer]; 

    // ... this is the image we want to stream to the other device ... 
} 

// Create a UIImage from sample buffer data 
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer 
{ 
    // Get a CMSampleBuffer's Core Video image buffer for the media data 
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer 
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer 
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 

    // Get the number of bytes per row for the pixel buffer 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space 
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data 
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, 
     bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    // Create a Quartz image from the pixel data in the bitmap graphics context 
    CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
    // Unlock the pixel buffer 
    CVPixelBufferUnlockBaseAddress(imageBuffer,0); 

    // Free up the context and color space 
    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace); 

    // Create an image object from the Quartz image 
    UIImage *image = [UIImage imageWithCGImage:quartzImage]; 

    // Release the Quartz image 
    CGImageRelease(quartzImage); 

    return (image); 
} 
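
Note that this conversion assumes the capture output delivers 32BGRA pixel buffers. A minimal sketch of the corresponding AVCaptureVideoDataOutput configuration (the queue label and the captureSession variable are illustrative, not from the original question):

AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
// Ask for BGRA pixel buffers, the format the CGBitmapContextCreate call above expects
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[videoOutput setSampleBufferDelegate:self
                               queue:dispatch_queue_create("com.example.capture", DISPATCH_QUEUE_SERIAL)];
if ([captureSession canAddOutput:videoOutput]) {
    [captureSession addOutput:videoOutput];
}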

This is where we get the image captured by the iOS camera.

Can we send the sample buffer data directly to the other device using Multipeer Connectivity, or is there an efficient way to transmit the data to the other iOS device?

Thank you.


Multipeer Connectivity sounds like a valid option, but you need to check the performance. Sending uncompressed images may take too much bandwidth, so you may have to create a real video stream to transmit the live capture. – allprog 2014-09-15 11:18:56
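
For reference, the "real video stream" allprog suggests could be built with VideoToolbox's hardware H.264 encoder. This is only a minimal sketch, not from any of the posts here; note that VideoToolbox became public API in iOS 8, so it would not have been an option on iOS 7 itself. The frame size and callback name are illustrative:

#import <VideoToolbox/VideoToolbox.h>

// Receives each compressed H.264 frame; the encoded bytes in the sample
// buffer are what you would send over the MCSession instead of JPEGs.
static void CompressedFrameCallback(void *outputCallbackRefCon,
                                    void *sourceFrameRefCon,
                                    OSStatus status,
                                    VTEncodeInfoFlags infoFlags,
                                    CMSampleBufferRef sampleBuffer)
{
    if (status != noErr || sampleBuffer == NULL) return;
    // extract the NAL units and send them to the connected peers here
}

// One-time setup (e.g. when the capture session starts):
VTCompressionSessionRef compressionSession;
VTCompressionSessionCreate(kCFAllocatorDefault, 640, 480, kCMVideoCodecType_H264,
                           NULL, NULL, NULL, CompressedFrameCallback, NULL,
                           &compressionSession);
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
VTCompressionSessionPrepareToEncodeFrames(compressionSession);

// Per frame, inside captureOutput:didOutputSampleBuffer:fromConnection:
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
VTCompressionSessionEncodeFrame(compressionSession, pixelBuffer,
                                CMSampleBufferGetPresentationTimeStamp(sampleBuffer),
                                kCMTimeInvalid, NULL, NULL, NULL);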


Edits must be at least 6 characters, so unless we come up with some padding, this post will forever be a "steam" feed. – 2015-08-18 19:08:21


Good question Sandipbhai, upvoted. – NSPratik 2016-01-12 13:37:25

Answers


I found a way to do this. We can use Multipeer Connectivity to stream compressed images so that it looks like a live camera stream.

The peer that sends the stream uses this code, in the captureOutput delegate method:

    // cgBackedImage is the UIImage built from the current sample buffer
    // (e.g. via imageFromSampleBuffer: above); compress it as a JPEG
    NSData *imageData = UIImageJPEGRepresentation(cgBackedImage, 0.2);

    // maybe not always the correct input? just using this to send current FPS...
    AVCaptureInputPort *inputPort = connection.inputPorts[0];
    AVCaptureDeviceInput *deviceInput = (AVCaptureDeviceInput *)inputPort.input;
    CMTime frameDuration = deviceInput.device.activeVideoMaxFrameDuration;
    NSDictionary *dict = @{
                           @"image": imageData,
                           @"timestamp": timestamp, // the frame's capture timestamp, computed elsewhere
                           @"framesPerSecond": @(frameDuration.timescale)
                           };
    NSData *data = [NSKeyedArchiver archivedDataWithRootObject:dict];

    [_session sendData:data toPeers:_session.connectedPeers withMode:MCSessionSendDataReliable error:nil];
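
This assumes _session is an already-connected MCSession. A minimal sketch of that setup (the service type string and the advertiser wiring are illustrative, not part of the original answer):

#import <MultipeerConnectivity/MultipeerConnectivity.h>

// Create the session; both peers do this, and the receiver implements
// MCSessionDelegate to get the didReceiveData:fromPeer: callback.
MCPeerID *peerID = [[MCPeerID alloc] initWithDisplayName:[UIDevice currentDevice].name];
_session = [[MCSession alloc] initWithPeer:peerID];
_session.delegate = self;

// Advertise this peer so others can invite it. "camera-stream" is an
// illustrative service type (1-15 lowercase ASCII characters or hyphens).
MCAdvertiserAssistant *assistant =
    [[MCAdvertiserAssistant alloc] initWithServiceType:@"camera-stream"
                                         discoveryInfo:nil
                                               session:_session];
[assistant start];

The other peer can then discover and invite this one, for example with an MCBrowserViewController created for the same service type.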

And on the receiving side:

- (void)session:(MCSession *)session didReceiveData:(NSData *)data fromPeer:(MCPeerID *)peerID { 

//  NSLog(@"(%@) Read %d bytes", peerID.displayName, data.length); 

    NSDictionary *dict = (NSDictionary *)[NSKeyedUnarchiver unarchiveObjectWithData:data]; 
    UIImage *image = [UIImage imageWithData:dict[@"image"] scale:2.0]; 
    NSNumber *framesPerSecond = dict[@"framesPerSecond"]; 

    // ... display the image here, using framesPerSecond to pace playback ... 
} 

We receive the FPS value, and accordingly we can set parameters to manage our streamed images.
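
One way to act on that value (a sketch, not from the original answer; nextFrameTime and imageView are hypothetical properties) is to schedule each decoded frame one frame interval apart:

#import <QuartzCore/QuartzCore.h>   // for CACurrentMediaTime()

// In didReceiveData:, after decoding `image` and `framesPerSecond`:
NSTimeInterval interval = 1.0 / framesPerSecond.doubleValue;
NSTimeInterval now = CACurrentMediaTime();
// never schedule in the past; otherwise keep a steady cadence
self.nextFrameTime = MAX(self.nextFrameTime + interval, now);
NSTimeInterval delay = self.nextFrameTime - now;
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delay * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    self.imageView.image = image;   // hypothetical UIImageView outlet
});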

Hope it helps.

Thanks.


Sandip, did you try streaming audio files between two iPhone devices using Multipeer Connectivity? – 2016-05-05 10:16:56


Here is the best way to do it (and I explain why at the end):

On the iOS device sending the image data:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
{ 
    // Convert the BGRA pixel buffer into a CGImage-backed UIImage 
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 

    CVPixelBufferLockBaseAddress(imageBuffer, 0); 
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    CGImageRef newImage = CGBitmapContextCreateImage(newContext); 

    UIImage *image = [[UIImage alloc] initWithCGImage:newImage scale:1 orientation:UIImageOrientationUp]; 
    CGImageRelease(newImage); 
    CGContextRelease(newContext); 
    CGColorSpaceRelease(colorSpace); 
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0); 

    // Compress to JPEG and send to every connected peer 
    if (image) { 
        NSData *data = UIImageJPEGRepresentation(image, 0.7); 
        NSError *err; 
        [((ViewController *)self.parentViewController).session sendData:data toPeers:((ViewController *)self.parentViewController).session.connectedPeers withMode:MCSessionSendDataReliable error:&err]; 
    } 
} 

On the iOS device receiving the image data:

typedef struct { 
    size_t length; 
    void *data; 
} ImageCacheDataStruct; 

- (void)session:(nonnull MCSession *)session didReceiveData:(nonnull NSData *)data fromPeer:(nonnull MCPeerID *)peerID 
{ 
    dispatch_async(self.imageCacheDataQueue, ^{ 
        dispatch_semaphore_wait(self.semaphore, DISPATCH_TIME_FOREVER); 

        // Copy the bytes out of the NSData; the buffer returned by -bytes 
        // is not guaranteed to outlive this callback 
        size_t dataLength = [data length]; 
        ImageCacheDataStruct *imageCacheDataStruct = calloc(1, sizeof(ImageCacheDataStruct)); 
        imageCacheDataStruct->length = dataLength; 
        imageCacheDataStruct->data = malloc(dataLength); 
        memcpy(imageCacheDataStruct->data, [data bytes], dataLength); 

        // Hand the struct to the display queue via queue-specific storage 
        __block const void *kMyKey; 
        dispatch_queue_set_specific(self.imageDisplayQueue, &kMyKey, (void *)imageCacheDataStruct, NULL); 

        dispatch_sync(self.imageDisplayQueue, ^{ 
            ImageCacheDataStruct *received = dispatch_queue_get_specific(self.imageDisplayQueue, &kMyKey); 
            NSData *imageData = [NSData dataWithBytes:received->data length:received->length]; 
            free(received->data); 
            free(received); 

            UIImage *image = [UIImage imageWithData:imageData]; 
            if (image) { 
                dispatch_async(dispatch_get_main_queue(), ^{ 
                    [((ViewerViewController *)self.childViewControllers.lastObject).view.layer setContents:(__bridge id)image.CGImage]; 
                    dispatch_semaphore_signal(self.semaphore); 
                }); 
            } else { 
                // Always signal, or the pipeline deadlocks on a bad frame 
                dispatch_semaphore_signal(self.semaphore); 
            } 
        }); 
    }); 
} 

The reason for the semaphore and the separate GCD queues is simple: you want the frames to be displayed at equal time intervals. Otherwise the video sometimes slows down, then speeds up past normal speed to catch up. My scheme ensures that each frame plays one after another at the same pace, regardless of network bandwidth bottlenecks.
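
The queues and semaphore are assumed to exist already; a minimal setup sketch (the queue labels are illustrative, and the semaphore's initial value of 1 is an assumption consistent with the wait/signal pairing above):

// One-time setup, e.g. in viewDidLoad. Serial queues preserve frame order;
// the semaphore lets exactly one frame through the pipeline at a time.
self.imageCacheDataQueue = dispatch_queue_create("com.example.imageCacheData", DISPATCH_QUEUE_SERIAL);
self.imageDisplayQueue = dispatch_queue_create("com.example.imageDisplay", DISPATCH_QUEUE_SERIAL);
self.semaphore = dispatch_semaphore_create(1);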
