
I need to send video from an iPhone to a server in real time. I create a capture session and use AVCaptureMovieFileOutput. How can I stream the video from the iOS device to the server?

NSError *error = nil;

captureSession = [[AVCaptureSession alloc] init]; 
// find, attach devices 
AVCaptureDevice *muxedDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeMuxed];
if (muxedDevice) {
    NSLog(@"got muxedDevice");
    AVCaptureDeviceInput *muxedInput = [AVCaptureDeviceInput deviceInputWithDevice:muxedDevice
                                                                             error:&error];
    if (muxedInput) {
        [captureSession addInput:muxedInput];
    }
} else {
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice) {
        NSLog(@"got videoDevice");
        AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice
                                                                                 error:&error];
        if (videoInput) {
            [captureSession addInput:videoInput];
        }
    }
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    if (audioDevice) {
        NSLog(@"got audioDevice");
        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice
                                                                                 error:&error];
        if (audioInput) {
            [captureSession addInput:audioInput];
        }
    }
}

// create a preview layer from the session and add it to UI 
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession]; 
previewLayer.frame = view.layer.bounds; 
previewLayer.videoGravity = AVLayerVideoGravityResizeAspect; 
previewLayer.orientation = AVCaptureVideoOrientationPortrait; 
[view.layer addSublayer:previewLayer]; 

// create capture file output 

captureMovieOutput = [[AVCaptureMovieFileOutput alloc] init]; 
if (! captureMovieURL) { 
    captureMoviePath = [[self getMoviePathWithName:MOVIE_FILE_NAME] retain]; 
    captureMovieURL = [[NSURL alloc] initFileURLWithPath:captureMoviePath]; 
} 
NSLog (@"recording to %@", captureMovieURL); 
[captureSession addOutput:captureMovieOutput]; 
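
The code that actually starts recording is not shown above; it is essentially the standard AVCaptureMovieFileOutput call, sketched here on the assumption that self adopts AVCaptureFileOutputRecordingDelegate:

[captureSession startRunning];
// start writing to the file URL created above; the delegate callback
// captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error:
// reports when the file is finished
[captureMovieOutput startRecordingToOutputFileURL:captureMovieURL
                                 recordingDelegate:self];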

I use AVAssetExportSession to get videos with a duration of 10 seconds.

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:captureMovieURL options:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey]];

AVMutableComposition *composition = [AVMutableComposition composition]; 

CMTime endTime;
if (asset.duration.value - startFragment.value < 6000)
{
    endTime = asset.duration;
}
else
{
    endTime = CMTimeMake(startFragment.value + 6000, 600);
}
// use the actual distance from startFragment to endTime so the
// edit range never runs past the end of the asset
CMTime duration = CMTimeSubtract(endTime, startFragment);
CMTimeRange editRange = CMTimeRangeMake(startFragment, duration);
startFragment = CMTimeMake(endTime.value, 600);
    NSError *editError = nil; 
// and add into your composition 

[composition insertTimeRange:editRange ofAsset:asset atTime:composition.duration error:&editError];

AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition
                                                                         presetName:AVAssetExportPresetPassthrough];
exportSession.shouldOptimizeForNetworkUse = YES;
NSString *name = [NSString stringWithFormat:MOVUE_SEGMENT_NAME, countMovies];
NSString *path = [NSString stringWithFormat:@"file://localhost%@", [self getMoviePathWithName:name]];
NSURL *url = [NSURL URLWithString:path];
NSLog(@"urlsegment = %@", url);
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.outputURL = url;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (AVAssetExportSessionStatusCompleted == exportSession.status) {
        countMovies++;
        NSLog(@"AVAssetExportSessionStatusCompleted");
    } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
        NSLog(@"AVAssetExportSessionStatusFailed: %@", [exportSession.error localizedDescription]);
    } else {
        NSLog(@"Export Session Status: %d", exportSession.status);
    }
}];

I send the video to the server when the export session status is completed, but it is very slow: producing a 10-second movie and sending it to the server takes about 15 seconds, and it is no faster when the piece is shorter than 10 seconds. How can I fix this? What is the best way to do it, and what should I use for streaming video to a server?
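
The upload code is not included above; it amounts to sending the exported file to the server, for example with a plain HTTP POST like the sketch below (the endpoint URL is only a placeholder and the exact request format is an assumption):

// simplified upload sketch, to be run after the export has completed;
// "url" is the exported segment URL from the completion handler above,
// and http://example.com/upload is a placeholder endpoint
NSData *segmentData = [NSData dataWithContentsOfURL:url];
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
                                   [NSURL URLWithString:@"http://example.com/upload"]];
[request setHTTPMethod:@"POST"];
[request setValue:@"video/mp4" forHTTPHeaderField:@"Content-Type"];
[request setHTTPBody:segmentData];
[NSURLConnection sendAsynchronousRequest:request
                                    queue:[NSOperationQueue mainQueue]
                        completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    if (error) {
        NSLog(@"upload failed: %@", [error localizedDescription]);
    }
}];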

Answer


Use ffmpeg for the encoding and metadata; it will probably do better than AVAssetExportSession. But encoding with ffmpeg is also harder than using AVAssetExportSession.
