2015-07-03 35 views

What I want: to insert multiple video layers, each with its own opacity, into an AVMutableComposition at time 0:00 — that is, to blend 2 semi-transparent videos into one output video using AVFoundation, without CALayer compositing.

I have carefully read the official AVFoundation documentation as well as many WWDC sessions on this topic, but I don't understand why the result doesn't match what the API declares.

I can achieve the overlapped result during playback with 2 AVPlayerLayer instances. That also suggests I could achieve something similar during export with AVVideoCompositionCoreAnimationTool. But I'd prefer to reserve CALayer for subtitle/image overlays or animations.
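For reference, the playback-time overlap I mean can be sketched roughly like this (`urlA`, `urlB`, and the view controller context are placeholders, not part of the export code below):

```objc
// Playback-only overlap: two AVPlayerLayers stacked in the same view.
// The opacity here is plain CALayer opacity, applied at display time only —
// it does not survive an export, which is exactly the problem below.
AVPlayer *playerA = [AVPlayer playerWithURL:urlA];
AVPlayer *playerB = [AVPlayer playerWithURL:urlB];
AVPlayerLayer *layerA = [AVPlayerLayer playerLayerWithPlayer:playerA];
AVPlayerLayer *layerB = [AVPlayerLayer playerLayerWithPlayer:playerB];
layerA.frame = self.view.bounds;
layerB.frame = self.view.bounds;
layerB.opacity = 0.5f; // second video shows through half-transparent
[self.view.layer addSublayer:layerA];
[self.view.layer addSublayer:layerB];
[playerA play];
[playerB play];
```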

Here is what I do for each AVAsset I insert:

- (void)addVideo:(AVAsset *)asset_in withOpacity:(float)opacity 
{ 
    // Demo of compositing semi-transparent videos: every video is inserted at time 0:00. 
    [_videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset_in.duration) 
            ofTrack:[ [asset_in tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0 ] 
            atTime:kCMTimeZero error:nil ]; 

    AVMutableVideoCompositionInstruction *mutableVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
    AVAssetTrack *assettrack_in = [ [asset_in tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0 ]; 
    mutableVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, assettrack_in.timeRange.duration); 
    AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:_videoCompositionTrack]; 
    [videoCompositionLayerInstruction setTransform:assettrack_in.preferredTransform atTime:kCMTimeZero]; 
    [videoCompositionLayerInstruction setOpacity:opacity atTime:kCMTimeZero]; 
    mutableVideoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction]; 
    [_arrayVideoCompositionInstructions addObject:mutableVideoCompositionInstruction]; 
} 

Note that insertTimeRange: takes atTime:kCMTimeZero as a parameter, so I expected the videos to be placed at the beginning of the composition.

Here is how I export:

- (IBAction)ExportAndPlay:(id)sender 
{ 
    _mutableVideoComposition.instructions = [_arrayVideoCompositionInstructions copy]; 

    // Create a static date formatter so we only have to initialize it once. 
    static NSDateFormatter *kDateFormatter; 
    if (!kDateFormatter) { 
     kDateFormatter = [[NSDateFormatter alloc] init]; 
     kDateFormatter.dateStyle = NSDateFormatterMediumStyle; 
     kDateFormatter.timeStyle = NSDateFormatterShortStyle; 
    } 
    // Create the export session with the composition and set the preset to the highest quality. 
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:_mutableComposition presetName:AVAssetExportPresetHighestQuality]; 
    // Set the desired output URL for the file created by the export process. 
    exporter.outputURL = [[[[NSFileManager defaultManager] URLForDirectory:NSDocumentDirectory inDomain:NSUserDomainMask appropriateForURL:nil create:YES error:nil] URLByAppendingPathComponent:[kDateFormatter stringFromDate:[NSDate date]]] URLByAppendingPathExtension:CFBridgingRelease(UTTypeCopyPreferredTagWithClass((CFStringRef)AVFileTypeQuickTimeMovie, kUTTagClassFilenameExtension))]; 
    // Set the output file type to be a QuickTime movie. 
    exporter.outputFileType = AVFileTypeQuickTimeMovie; 
    exporter.shouldOptimizeForNetworkUse = YES; 
    exporter.videoComposition = _mutableVideoComposition; 
    // Asynchronously export the composition to a video file and save this file to the camera roll once export completes. 
    [exporter exportAsynchronouslyWithCompletionHandler:^{ 
     dispatch_async(dispatch_get_main_queue(), ^{ 
      switch ([exporter status]) { 
       case AVAssetExportSessionStatusFailed: 
       { 
        NSLog(@"Export failed: %@ %@", [[exporter error] localizedDescription], [[exporter error] debugDescription]); 
        break; 
       } 
       case AVAssetExportSessionStatusCancelled: 
       { 
        NSLog(@"Export canceled"); 
        break; 
       } 
       case AVAssetExportSessionStatusCompleted: 
       { 
        NSLog(@"Export complete!"); 
        NSLog(@"Export URL = %@", [exporter.outputURL absoluteString]); 
        [self altPlayWithUrl:exporter.outputURL]; 
        break; 
       } 
       default: 
       { 
        NSLog(@"default"); 
        break; 
       } 
      } 

     }); 
    }]; 
} 

What actually happens: if I pick 2 video clips, the exported video appends the second video after the first instead of overlapping them.

This is not the behavior I read out of the documentation for AVMutableCompositionTrack.

Could anyone shed some light on this for a helpless lamb?

Edit: Is there some missing detail that keeps anyone from lending a hand? If so, please leave a comment so I can fill it in.

Answer


OK, it turned out I had misunderstood the AVMutableCompositionTrack API, so apologies.

If you want to blend 2 videos as 2 overlays, as I did, you need 2 AVMutableCompositionTrack instances, both created from the same AVMutableComposition, like this:

// 0. Set up one AVMutableCompositionTrack for EACH AVAsset! 
AVMutableCompositionTrack *mutableCompositionVideoTrack1 = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 
AVMutableCompositionTrack *mutableCompositionVideoTrack2 = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 

Then insert each of the two AVAssets into its own AVMutableCompositionTrack:

AVAssetTrack *videoAssetTrack1 = [[[_arrayVideoAssets firstObject] tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 
AVAssetTrack *videoAssetTrack2 = [[[_arrayVideoAssets lastObject] tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 
[mutableCompositionVideoTrack1 insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetTrack1.timeRange.duration) ofTrack:videoAssetTrack1 atTime:kCMTimeZero error:nil]; 
[mutableCompositionVideoTrack2 insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetTrack2.timeRange.duration) ofTrack:videoAssetTrack2 atTime:kCMTimeZero error:nil]; 

Then set up the layer instructions of the AVMutableVideoComposition, one per AVMutableCompositionTrack:

AVMutableVideoCompositionInstruction *compInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
compInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAssetTrack1.timeRange.duration); 
AVMutableVideoCompositionLayerInstruction *layerInstruction1 = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mutableCompositionVideoTrack1]; 
[layerInstruction1 setOpacity:0.5f atTime:kCMTimeZero]; 

AVMutableVideoCompositionLayerInstruction *layerInstruction2 = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mutableCompositionVideoTrack2]; 
[layerInstruction2 setOpacity:0.8f atTime:kCMTimeZero]; 
CGAffineTransform transformScale = CGAffineTransformMakeScale(0.5f, 0.5f); 
CGAffineTransform transformTransition = CGAffineTransformMakeTranslation(videoComposition.renderSize.width/2, videoComposition.renderSize.height/2); 
[ layerInstruction2 setTransform:CGAffineTransformConcat(transformScale, transformTransition) atTime:kCMTimeZero ]; 
compInstruction.layerInstructions = @[ layerInstruction1, layerInstruction2 ]; 
videoComposition.instructions = @[ compInstruction ]; 

Finally, the export should work fine. Sorry if any of this is messy.
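One detail the snippets above leave out is configuring the `videoComposition` itself before handing it to the exporter. A minimal sketch, assuming 30 fps footage and using the first track's natural size (adjust both to match your sources):

```objc
// Render settings must be set on the video composition or export will fail.
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);          // 30 fps (assumption)
videoComposition.renderSize = videoAssetTrack1.naturalSize;  // or a fixed CGSize

// ... build compInstruction and the layer instructions as shown above,
// then assign videoComposition.instructions = @[ compInstruction ]; ...

// Hand the finished composition to the export session:
exporter.videoComposition = videoComposition;
```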