How do I write a movie with both video and audio using AVAssetWriter?

2011-03-30

I want to export a movie with AVAssetWriter, but I can't figure out how to include both the video and the audio track. Exporting video only works fine, but when I add the audio, the resulting movie behaves like this:

First I see the video (without audio), then the video freezes (showing the last image frame until the end), and only after a few seconds do I hear the audio.

I tried a few things with CMSampleBufferSetOutputPresentationTimeStamp on the audio (subtracting the first CMSampleBufferGetPresentationTimeStamp from the current one), but none of it worked, and I don't think that is the right direction anyway, since video and audio in the source movie should already be in sync...
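
Roughly what I tried inside the audio loop, as a sketch (firstAudioPTS is a placeholder for the presentation timestamp captured from the first audio buffer):

// Shift each audio buffer so its output timeline starts at zero.
CMTime pts=CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, CMTimeSubtract(pts, firstAudioPTS));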

My setup in short: I create an AVAssetReader and two AVAssetReaderTrackOutputs (one for video, one for audio) and add them to the AVAssetReader, then I create an AVAssetWriter and two AVAssetWriterInputs (video and audio) and add them to the AVAssetWriter... I start with:

[assetReader startReading]; 
[assetWriter startWriting]; 
[assetWriter startSessionAtSourceTime:kCMTimeZero]; 
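
For reference, the setup described above could look roughly like this (a simplified sketch; sourceURL and outputURL are placeholders, and nil output settings mean the samples pass through unconverted):

AVURLAsset *asset=[AVURLAsset URLAssetWithURL:sourceURL options:nil]; // sourceURL: placeholder
NSError *error=nil;
AVAssetReader *assetReader=[[AVAssetReader alloc] initWithAsset:asset error:&error];

AVAssetTrack *videoTrack=[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *audioTrack=[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

// nil outputSettings: the reader vends samples in their stored format
AVAssetReaderTrackOutput *assetReaderVideoOutput=[AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:nil];
AVAssetReaderTrackOutput *assetReaderAudioOutput=[AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];
[assetReader addOutput:assetReaderVideoOutput];
[assetReader addOutput:assetReaderAudioOutput];

AVAssetWriter *assetWriter=[[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeQuickTimeMovie error:&error]; // outputURL: placeholder
// nil outputSettings: the writer appends the samples without re-encoding (passthrough)
AVAssetWriterInput *assetWriterVideoInput=[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:nil];
AVAssetWriterInput *assetWriterAudioInput=[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:nil];
[assetWriter addInput:assetWriterVideoInput];
[assetWriter addInput:assetWriterAudioInput];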

Then I run two queues that do the sample buffer work:

dispatch_queue_t queueVideo=dispatch_queue_create("assetVideoWriterQueue", NULL);
[assetWriterVideoInput requestMediaDataWhenReadyOnQueue:queueVideo usingBlock:^
{
    // Pull video samples from the reader and append them while the input accepts more data.
    while([assetWriterVideoInput isReadyForMoreMediaData])
    {
        CMSampleBufferRef sampleBuffer=[assetReaderVideoOutput copyNextSampleBuffer];
        if(sampleBuffer)
        {
            [assetWriterVideoInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        } else
        {
            // No more video samples: close the track.
            [assetWriterVideoInput markAsFinished];
            dispatch_release(queueVideo);
            videoFinished=YES;
            break;
        }
    }
}];

dispatch_queue_t queueAudio=dispatch_queue_create("assetAudioWriterQueue", NULL);
[assetWriterAudioInput requestMediaDataWhenReadyOnQueue:queueAudio usingBlock:^
{
    // Same pattern for the audio track.
    while([assetWriterAudioInput isReadyForMoreMediaData])
    {
        CMSampleBufferRef sampleBuffer=[assetReaderAudioOutput copyNextSampleBuffer];
        if(sampleBuffer)
        {
            [assetWriterAudioInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        } else
        {
            // No more audio samples: close the track.
            [assetWriterAudioInput markAsFinished];
            dispatch_release(queueAudio);
            audioFinished=YES;
            break;
        }
    }
}];

In the main loop I wait for both queues to finish:

while(!videoFinished || !audioFinished)  // wait until BOTH flags are set (with && the loop would exit as soon as one track finished)
{
    sleep(1);
}
[assetWriter finishWriting];
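
Instead of polling with sleep, a dispatch group can block until both inputs are done; a sketch, assuming dispatch_group_leave is called at the two places above where the finished flags are set:

dispatch_group_t writeGroup=dispatch_group_create();
dispatch_group_enter(writeGroup); // video
dispatch_group_enter(writeGroup); // audio
// ...start reading/writing; call dispatch_group_leave(writeGroup) next to each markAsFinished...
dispatch_group_wait(writeGroup, DISPATCH_TIME_FOREVER); // blocks until both tracks are finished
[assetWriter finishWriting];
dispatch_release(writeGroup); // pre-ARC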

Then I try to save the resulting file to the photo library with the following code...

NSURL *url=[[NSURL alloc] initFileURLWithPath:path]; 
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init]; 
if([library videoAtPathIsCompatibleWithSavedPhotosAlbum:url]) 
{ 
    [library writeVideoAtPathToSavedPhotosAlbum:url completionBlock:^(NSURL *assetURL, NSError *error) 
    { 
     if(error) 
      NSLog(@"error=%@",error.localizedDescription); 
     else 
      NSLog(@"completed..."); 
    }]; 
} else 
    NSLog(@"error, video not saved..."); 

[library release]; 
[url release]; 

...but I get this error:

Video /Users/cb/Library/Application Support/iPhone Simulator/4.2/Applications/E9865BF9-D190-4912-9248-66768B1AB635/Documents/export.mp4 cannot be saved to the saved photos album: Error Domain=NSOSStatusErrorDomain Code=-12950 "Movie could not be played." UserInfo=0x5e4fb90 {NSLocalizedDescription=Movie could not be played.}

The code works without problems in another program, so what is wrong with the movie...?


I created a movie from a song file and an array of pictures, and wrote about it on Stack Overflow. Maybe part of the code can help you. http://stackoverflow.com/questions/6061092/make-movie-file-with-picture-array-and-song-file-using-avasset – TheRonin 2011-11-04 14:39:39

Answers


It seems assetWriterAudioInput ignores the sample buffer timestamps when writing audio. Do it like this:

1) Write the video track.

2) When it is done, mark it as finished: [videoWriterInput markAsFinished];

3) Call [assetWriter startSessionAtSourceTime:timeRangeStart];

4) Instantiate the audio reader and start writing the audio.


startSessionAtSourceTime is meant for the asset writer, not for its inputs; it cannot be used to manage individual inputs. – AlexeyVMP 2014-03-13 18:36:29

-(void)mergeAudioVideo
{
    // Build the input/output paths in the documents directory.
    NSString *videoOutputPath=[_documentsDirectory stringByAppendingPathComponent:@"dummy_video.mp4"];
    NSString *outputFilePath=[_documentsDirectory stringByAppendingPathComponent:@"final_video.mp4"];
    if([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];

    NSURL *outputFileUrl=[NSURL fileURLWithPath:outputFilePath];
    NSString *filePath=[_documentsDirectory stringByAppendingPathComponent:@"newFile.m4a"];
    AVMutableComposition *mixComposition=[AVMutableComposition composition];

    NSURL *audio_inputFileUrl=[NSURL fileURLWithPath:filePath];
    NSURL *video_inputFileUrl=[NSURL fileURLWithPath:videoOutputPath];

    CMTime nextClipStartTime=kCMTimeZero;

    // Insert the whole video track into the composition.
    AVURLAsset *videoAsset=[[AVURLAsset alloc] initWithURL:video_inputFileUrl options:nil];
    CMTimeRange video_timeRange=CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack=[mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    // Insert the whole audio track alongside it, starting at the same time.
    AVURLAsset *audioAsset=[[AVURLAsset alloc] initWithURL:audio_inputFileUrl options:nil];
    CMTimeRange audio_timeRange=CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack=[mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    // Export the composition. Note: AVFileTypeQuickTimeMovie writes a QuickTime container,
    // so a .mov extension would match better than .mp4 (or use AVFileTypeMPEG4 instead).
    AVAssetExportSession *_assetExport=[[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    _assetExport.outputFileType=AVFileTypeQuickTimeMovie; // same UTI as @"com.apple.quicktime-movie"
    _assetExport.outputURL=outputFileUrl;

    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        if(_assetExport.status==AVAssetExportSessionStatusCompleted) {
            //Write Code Here to Continue
        }
        else {
            //Write Fail Code here
        }
    }];
}

You can use this code to merge the audio and video.
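
If the export fails, the session's status and error properties say why; a sketch of a more verbose completion handler body:

if(_assetExport.status==AVAssetExportSessionStatusCompleted)
    NSLog(@"export finished: %@", outputFileUrl);
else if(_assetExport.status==AVAssetExportSessionStatusFailed)
    NSLog(@"export failed: %@", _assetExport.error.localizedDescription);
else if(_assetExport.status==AVAssetExportSessionStatusCancelled)
    NSLog(@"export cancelled");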
