2013-04-28 74 views
1

I have a strange problem playing a sound file (a WAV file) in the background on an iPhone using AVAudioPlayer: the app plays the sound fine in the foreground, but prepareToPlay returns NO in the background. I am using the following code:

 AVAudioPlayer* audioplayer; 
     NSError* error; 

     audioplayer = [[AVAudioPlayer alloc] initWithData:soundfile error:&error]; 
     if (error) { 
      NSLog(@"an error occurred while initializing the audioplayer..."); 
      NSLog(@"%@", [error localizedDescription]); 
     } 
     audioplayer.currentTime = 0; 
     if (![audioplayer prepareToPlay]) 
      NSLog(@"could not prepareToPlay"); 

     audioplayer.volume = 1.0; 

     [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil]; 
     [[AVAudioSession sharedInstance] setActive: YES error: &error]; 

     if (![audioplayer play]) 
      NSLog(@"could not play sound"); 

     audioplayer.delegate = [myApp sharedInstance]; 

This works fine while the app is in the foreground. However, when the app is moved to the background, [audioplayer prepareToPlay] returns NO.

This happens both with and without "App plays audio" added to the "Required background modes". Is there a way to get a more precise error report from [audioplayer prepareToPlay]? Or do you have any hints as to what I am doing wrong or have forgotten?

+0

In the first NSLog, do you see any error? I mean, was the audio player initialized successfully? Are you sure the 'soundFile' parameter is not nil? – 2013-04-28 18:54:12

+0

The audioplayer inits without any error. And yes, soundFile is not nil (I used the debugger to check this, and the app plays exactly the same file when in the foreground). – itsame69 2013-04-28 19:25:55

+0

What is the value of the 'error' object for 'AVAudioSession'? You are assuming your audio session started without checking for an error. Also, are you setting it up just once, or every time you play audio? It should only need to be done once, in the app delegate. – iwasrobbed 2013-04-28 23:19:53

Answers

1

You need to initialize the audio session before preparing the AVAudioPlayer instance. Ideally, move the audio session calls into the app delegate's didFinishLaunchingWithOptions method.
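Concretely, the ordering might look like this (a minimal sketch built from the question's own calls; `soundfile` and `[myApp sharedInstance]` are the question's placeholders, and real code should inspect the NSError results):

```objc
NSError *error = nil;

// 1. Configure and activate the session first -- ideally once,
//    in application:didFinishLaunchingWithOptions:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];

// 2. Only then create and prepare the player, so prepareToPlay
//    runs against an already-active playback session.
AVAudioPlayer *audioplayer = [[AVAudioPlayer alloc] initWithData:soundfile error:&error];
audioplayer.delegate = [myApp sharedInstance];
if (![audioplayer prepareToPlay])
    NSLog(@"could not prepareToPlay");
[audioplayer play];
```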

+0

Sorry, forgot to mention that I am initializing the audio session: 'AudioSessionInitialize(NULL, NULL, NULL, NULL); UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback; AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory); AudioSessionSetActive(true);' – itsame69 2013-04-29 11:39:04

0

I am not entirely sure this can be achieved with AVFoundation alone; you may need to use the AudioUnit framework and create a stream. Sending the content of the .WAV file to the audio buffer should be relatively simple.

This is what I have been doing in Piti Piti Pa. The other benefit is that you can better control the audio latency, in order to synchronize audio and video animations (more noticeable when using Bluetooth).

Below is the code I use to initialize the audio unit:

+(BOOL)_createAudioUnitInstance 
{ 
    // Describe audio component 
    AudioComponentDescription desc; 
    desc.componentType = kAudioUnitType_Output; 
    desc.componentSubType = kAudioUnitSubType_RemoteIO; 
    desc.componentFlags = 0; 
    desc.componentFlagsMask = 0; 
    desc.componentManufacturer = kAudioUnitManufacturer_Apple; 
    AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc); 

    // Get audio units 
    OSStatus status = AudioComponentInstanceNew(inputComponent, &_audioUnit); 
    [self _logStatus:status step:@"instantiate"]; 
    return (status == noErr); 
} 

+(BOOL)_setupAudioUnitOutput 
{ 
    UInt32 flag = 1; 
    OSStatus status = AudioUnitSetProperty(_audioUnit, 
           kAudioOutputUnitProperty_EnableIO, 
           kAudioUnitScope_Output, 
           _outputAudioBus, 
           &flag, 
           sizeof(flag)); 
    [self _logStatus:status step:@"set output bus"]; 
    return (status == noErr); 
} 

+(BOOL)_setupAudioUnitFormat 
{ 
    AudioStreamBasicDescription audioFormat = {0}; 
    audioFormat.mSampleRate   = 44100.00; 
    audioFormat.mFormatID   = kAudioFormatLinearPCM; 
    audioFormat.mFormatFlags  = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked; 
    audioFormat.mFramesPerPacket = 1; 
    audioFormat.mChannelsPerFrame = 2; 
    audioFormat.mBitsPerChannel  = 16; 
    audioFormat.mBytesPerPacket  = 4; 
    audioFormat.mBytesPerFrame  = 4; 

    OSStatus status = AudioUnitSetProperty(_audioUnit, 
             kAudioUnitProperty_StreamFormat, 
             kAudioUnitScope_Input, 
             _outputAudioBus, 
             &audioFormat, 
             sizeof(audioFormat)); 
    [self _logStatus:status step:@"set audio format"]; 
    return (status == noErr); 
} 


+(BOOL)_setupAudioUnitRenderCallback 
{ 
    AURenderCallbackStruct audioCallback; 
    audioCallback.inputProc = playbackCallback; 
    audioCallback.inputProcRefCon = (__bridge void *)(self); 
    OSStatus status = AudioUnitSetProperty(_audioUnit, 
             kAudioUnitProperty_SetRenderCallback, 
             kAudioUnitScope_Global, 
             _outputAudioBus, 
             &audioCallback, 
             sizeof(audioCallback)); 
    [self _logStatus:status step:@"set render callback"]; 
    return (status == noErr); 
} 


+(BOOL)_initializeAudioUnit 
{ 
    OSStatus status = AudioUnitInitialize(_audioUnit); 
    [self _logStatus:status step:@"initialize"]; 
    return (status == noErr); 
} 

+(void)start 
{ 
    [self clearFeeds]; 
    [self _startAudioUnit]; 
} 

+(void)stop 
{ 
    [self _stopAudioUnit]; 
} 

+(BOOL)_startAudioUnit 
{ 
    OSStatus status = AudioOutputUnitStart(_audioUnit); 
    [self _logStatus:status step:@"start"]; 
    return (status == noErr); 
} 

+(BOOL)_stopAudioUnit 
{ 
    OSStatus status = AudioOutputUnitStop(_audioUnit); 
    [self _logStatus:status step:@"stop"]; 
    return (status == noErr); 
} 

+(void)_logStatus:(OSStatus)status step:(NSString *)step 
{ 
    if(status != noErr) 
    { 
     NSLog(@"AudioUnit failed to %@, error: %d", step, (int)status); 
    } 
} 

#pragma mark - Mixer 

static OSStatus playbackCallback(void *inRefCon, 
          AudioUnitRenderActionFlags *ioActionFlags, 
          const AudioTimeStamp *inTimeStamp, 
          UInt32 inBusNumber, 
          UInt32 inNumberFrames, 
          AudioBufferList *ioData) { 

    @autoreleasepool { 
     AudioBuffer *audioBuffer = ioData->mBuffers; 

     _lastPushedFrame = _nextFrame; 
     [SIOAudioMixer _generateAudioFrames:inNumberFrames into:audioBuffer->mData]; 
    } 
    return noErr; 
} 

Now you just need to extract the content of the .wav files (it is easier if you export them to RAW format) and send it through the callback into the buffer.
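As a sketch of that extraction step, locating the raw PCM samples inside a WAV file is just a walk over RIFF chunks. This is plain C with no Core Audio dependency; it assumes a canonical little-endian PCM file and skips format validation beyond the RIFF/WAVE magic, so it is an illustration rather than production parsing:

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Minimal RIFF/WAVE walker: finds the "data" chunk in a WAV file
   held in memory and returns a pointer to the raw PCM samples
   (what the render callback needs), or NULL on failure. */
static const uint8_t *wav_pcm_data(const uint8_t *buf, size_t len,
                                   uint32_t *out_size)
{
    if (len < 12 || memcmp(buf, "RIFF", 4) || memcmp(buf + 8, "WAVE", 4))
        return NULL;                          /* not a RIFF/WAVE file */

    size_t pos = 12;                          /* first sub-chunk */
    while (pos + 8 <= len) {
        /* chunk sizes are stored little-endian after the 4-byte id */
        uint32_t chunk_size = buf[pos + 4] | (buf[pos + 5] << 8) |
                              (buf[pos + 6] << 16) |
                              ((uint32_t)buf[pos + 7] << 24);
        if (!memcmp(buf + pos, "data", 4)) {
            if (out_size) *out_size = chunk_size;
            return buf + pos + 8;             /* PCM samples start here */
        }
        /* skip this chunk; chunks are padded to even (word) boundaries */
        pos += 8 + chunk_size + (chunk_size & 1);
    }
    return NULL;                              /* no "data" chunk found */
}
```

The pointer and size this returns are what you would feed, frame by frame, into `audioBuffer->mData` from the render callback above.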

I hope that helps!

0

Set the AVAudioSession category in the AppDelegate as follows (Swift 2):

 do { 
     try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: AVAudioSessionCategoryOptions.MixWithOthers) 
 } catch { 
     self.fireAnAlert("Set Category Failed", theMessage: "Failed to set AVAudioSession Category") 
 } 

Setting the option to "Mix With Others" is an important piece!

Then, wherever you are going to play sound, make sure you call beginReceivingRemoteControlEvents and then set the AVAudioSession to active, like this:

 do { 
     UIApplication.sharedApplication().beginReceivingRemoteControlEvents() 
     try AVAudioSession.sharedInstance().setActive(true) 
 } catch { 
     let e = error as NSError 
     self.appDelegate?.fireAnAlert("Error", theMessage: "\(e.localizedDescription)") 
 } 