2017-07-25 34 views

Mac - Swift 3 - Queue audio files and play them

I would like to write an application in Swift 3 that plays queued audio files with no gap, crack or noise in the transition from one file to the next.

My first attempt was with AVAudioPlayer and AVAudioPlayerDelegate (AVAudioPlayer using array to queue audio files - Swift), but I don't know how to preload the next file to avoid a gap, and even if I did, I'm not sure it would be the best way to reach my goal. AVQueuePlayer looks like a better candidate for the job since it was made for this purpose, but I haven't found any example to help me. Maybe it's just a matter of preloading or buffering? I'm a bit lost in this sea of possibilities.

Any advice is welcome.
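Since the question mentions AVQueuePlayer without an example, here is a minimal sketch of what queueing the two files with it might look like. The file names are the ones from the answer below; whether AVQueuePlayer's pre-buffering is gapless enough for these samples is an assumption to be tested, not a guarantee.

```swift
import AVFoundation

// Build one AVPlayerItem per audio file in the bundle.
// (File names assumed from the answer's code; adjust as needed.)
let fileNames = ["audiofile1", "audiofile2"]
let items = fileNames.flatMap { name -> AVPlayerItem? in
    guard let url = Bundle.main.url(forResource: name, withExtension: "aif") else { return nil }
    return AVPlayerItem(url: url)
}

// AVQueuePlayer plays the items in order and pre-buffers the next one;
// how seamless the transition is depends on the files themselves.
let queuePlayer = AVQueuePlayer(items: items)
queuePlayer.play()
```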


Unless you smoothly cross-fade the audio files, there will always be cracks, because the waveforms won't match seamlessly and the "jump" corresponds to high frequencies in the perceived sound spectrum. Of course, if the audio files are segments of what was originally one uninterrupted sound, it should be fine... –


They are samples of a MIDI synthesis converted to audio files. They all start and end at "0". By widening my search I found a solution that works perfectly for me. It is based on this post: https://stackoverflow.com/questions/30479403/concatenate-two-audio-files-in-swift-and-play-them?rq=1. I will post the code soon. – Fredo

Answer


It is far from perfect, especially if you want to run it twice or more (the "file exists" error), but it can serve as a base.

What it does is take two files (mine are 4-second samples), encode them into a single file and play the resulting file. Imagine what you could do with hundreds of them, ordered randomly or not.

All credit for the mergeAudioFiles function goes to @Peyman and @Pigeon_39: Concatenate two audio files in Swift and play them.

斯威夫特3

import Cocoa 
import AVFoundation 

var action = AVAudioPlayer() 
// The two source files must be present in the app bundle (the paths are force-unwrapped). 
let path = Bundle.main.path(forResource: "audiofile1.aif", ofType: nil)! 
let url = URL(fileURLWithPath: path) 
let path2 = Bundle.main.path(forResource: "audiofile2.aif", ofType: nil)! 
let url2 = URL(fileURLWithPath: path2) 
let array1 = NSMutableArray(array: [url, url2]) 


class ViewController: NSViewController, AVAudioPlayerDelegate 
{ 

    @IBOutlet weak var LanceStop: NSButton! 

    override func viewDidLoad() 
    { 
     super.viewDidLoad() 
    } 
    override var representedObject: Any? 
    { 
     didSet 
     { 
     // Update the view, if already loaded. 
     } 
    } 

    @IBAction func Lancer(_ sender: NSButton) 
    { 
     mergeAudioFiles(audioFileUrls: array1) 
     // Play the same URL the merge writes to, instead of a hardcoded user path. 
     // Note: the export runs asynchronously, so on the very first click the 
     // merged file may not exist yet when play() is called. 
     let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first! 
     let url3 = documentsURL.appendingPathComponent("FinalAudio.m4a") 

     do 
     { 
      action = try AVAudioPlayer(contentsOf: url3) 
      action.delegate = self 
      action.numberOfLoops = 0 
      action.prepareToPlay() 
      action.volume = 1 
      action.play() 
     } 
     catch { print("Playback error: \(error)") } 

    } 


    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) 
    { 
     if flag == true 
     { 
      // Playback finished normally - hook for starting the next file if needed. 
     } 
    } 

    var mergeAudioURL = NSURL() 

    func mergeAudioFiles(audioFileUrls: NSArray) { 
     //audioFileUrls.adding(url) 
     //audioFileUrls.adding(url2) 
     let composition = AVMutableComposition() 

     for i in 0 ..< audioFileUrls.count { 

      // One composition track per source file, appended at the current end of the composition. 
      let compositionAudioTrack: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID()) 

      let asset = AVURLAsset(url: (audioFileUrls[i] as! NSURL) as URL) 

      let track = asset.tracks(withMediaType: AVMediaTypeAudio)[0] 

      // Insert the whole duration of the source track. 
      let timeRange = CMTimeRange(start: CMTimeMake(0, 600), duration: track.timeRange.duration) 

      try! compositionAudioTrack.insertTimeRange(timeRange, of: track, at: composition.duration) 
     } 

     let documentDirectoryURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first! 
     self.mergeAudioURL = documentDirectoryURL.appendingPathComponent("FinalAudio.m4a") as NSURL 

     // Remove any previous export, otherwise AVAssetExportSession fails with a "file exists" error. 
     try? FileManager.default.removeItem(at: mergeAudioURL as URL) 

     let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A) 
     assetExport?.outputFileType = AVFileTypeAppleM4A 
     assetExport?.outputURL = mergeAudioURL as URL 
     assetExport?.exportAsynchronously(completionHandler: 
      { 
       switch assetExport!.status 
       { 
       case AVAssetExportSessionStatus.failed: 
        print("failed \(assetExport?.error)") 
       case AVAssetExportSessionStatus.cancelled: 
        print("cancelled \(assetExport?.error)") 
       case AVAssetExportSessionStatus.unknown: 
        print("unknown \(assetExport?.error)") 
       case AVAssetExportSessionStatus.waiting: 
        print("waiting \(assetExport?.error)") 
       case AVAssetExportSessionStatus.exporting: 
        print("exporting \(assetExport?.error)") 
       default: 
        print("Audio Concatenation Complete") 
       } 
     }) 
    } 
}