Core Audio - RemoteIO confusion

I can't make sense of the behavior of the RemoteIO audio unit callbacks on iOS. I'm setting up a RemoteIO unit with two callbacks: one as an input callback and one as a "render" callback. I'm doing a very similar RemoteIO setup to the one recommended in this Tasty Pixel tutorial. Here is the rather lengthy setup method:
- (void)setup {
    AudioUnit ioUnit;

    AudioComponentDescription audioCompDesc;
    audioCompDesc.componentType = kAudioUnitType_Output;
    audioCompDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    audioCompDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    audioCompDesc.componentFlags = 0;
    audioCompDesc.componentFlagsMask = 0;

    AudioComponent rioComponent = AudioComponentFindNext(NULL, &audioCompDesc);
    CheckError(AudioComponentInstanceNew(rioComponent, &ioUnit), "Couldn't get RIO unit instance");

    // i/o
    UInt32 oneFlag = 1;
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Output,
                                    kOutputBus,
                                    &oneFlag,
                                    sizeof(oneFlag)), "Couldn't enable RIO output");
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Input,
                                    kInputBus,
                                    &oneFlag,
                                    sizeof(oneFlag)), "Couldn't enable RIO input");

    AudioStreamBasicDescription myASBD;
    memset(&myASBD, 0, sizeof(myASBD));
    myASBD.mSampleRate = 44100;
    myASBD.mFormatID = kAudioFormatLinearPCM;
    myASBD.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    myASBD.mFramesPerPacket = 1;
    myASBD.mChannelsPerFrame = 1;
    myASBD.mBitsPerChannel = 16;
    myASBD.mBytesPerPacket = 2 * myASBD.mChannelsPerFrame;
    myASBD.mBytesPerFrame = 2 * myASBD.mChannelsPerFrame;

    // set stream format for both busses
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Input,
                                    kOutputBus,
                                    &myASBD,
                                    sizeof(myASBD)), "Couldn't set ASBD for RIO on input scope/bus 0");
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output,
                                    kInputBus,
                                    &myASBD,
                                    sizeof(myASBD)), "Couldn't set ASBD for RIO on output scope/bus 1");

    // set arbitrarily high for now
    UInt32 bufferSizeBytes = 10000 * sizeof(int);
    int offset = offsetof(AudioBufferList, mBuffers[0]);
    int bufferListSizeInBytes = offset + (sizeof(AudioBuffer) * myASBD.mChannelsPerFrame);

    // why need to cast to audioBufferList * ?
    self.inputBuffer = (AudioBufferList *)malloc(bufferListSizeInBytes);
    self.inputBuffer->mNumberBuffers = myASBD.mChannelsPerFrame;
    for (UInt32 i = 0; i < myASBD.mChannelsPerFrame; i++) {
        self.inputBuffer->mBuffers[i].mNumberChannels = 1;
        self.inputBuffer->mBuffers[i].mDataByteSize = bufferSizeBytes;
        self.inputBuffer->mBuffers[i].mData = malloc(bufferSizeBytes);
    }

    self.remoteIOUnit = ioUnit;

    /////////////////////////////////////////////// callback setup
    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc = inputCallback;
    callbackStruct.inputProcRefCon = (__bridge void * _Nullable)self;
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioOutputUnitProperty_SetInputCallback,
                                    kAudioUnitScope_Global,
                                    kInputBus,
                                    &callbackStruct,
                                    sizeof(callbackStruct)), "Couldn't set input callback");

    AURenderCallbackStruct callbackStruct2;
    callbackStruct2.inputProc = playbackCallback;
    callbackStruct2.inputProcRefCon = (__bridge void * _Nullable)self;
    CheckError(AudioUnitSetProperty(ioUnit,
                                    kAudioUnitProperty_SetRenderCallback,
                                    kAudioUnitScope_Global,
                                    kOutputBus,
                                    &callbackStruct,
                                    sizeof(callbackStruct)), "Couldn't set input callback");

    CheckError(AudioUnitInitialize(ioUnit), "Couldn't initialize input unit");
    CheckError(AudioOutputUnitStart(ioUnit), "AudioOutputUnitStart failed");
}
I'm getting strange behavior in the callbacks. First, the playbackCallback function never gets called at all, even though I set its property the same way the tutorial does (the tutorial is by the guy who wrote the Loopy apps). Second, the input callback has an ioData (AudioBufferList) parameter that is supposed to be NULL (according to the documentation), but it flips between NULL and a non-NULL value on every second callback. Does that make sense to anyone?

Additionally, calling AudioUnitRender in the input callback (the semantics of which I still don't understand in terms of API logic, lifecycle, etc.) leads to a -50 error, which is the very generic "bad parameter". This is most likely due to an invalid "topology" of the AudioBufferList, i.e. interleaved vs. deinterleaved, number of channels, and so on. However, I've tried a variety of topologies and none of them got rid of the error. And that also doesn't explain the strange ioData behavior. Here are the functions in question:
OSStatus inputCallback(void *inRefCon,
                       AudioUnitRenderActionFlags *ioActionFlags,
                       const AudioTimeStamp *inTimeStamp,
                       UInt32 inBusNumber,
                       UInt32 inNumberFrames,
                       AudioBufferList *ioData)
{
    MicController *myRefCon = (__bridge MicController *)inRefCon;

    CheckError(AudioUnitRender(myRefCon.remoteIOUnit,
                               ioActionFlags,
                               inTimeStamp,
                               inBusNumber,
                               inNumberFrames,
                               myRefCon.inputBuffer), "audio unit render");

    return noErr;
}
I believe my problems may come from a format error, from using the wrong scope on the wrong bus, or from some other trivial mistake (the kind that's easy to make in a Core Audio context). However, because I fundamentally have no intuition for the semantics and the lifecycle flow (scheme? I don't even know what word to use), I can't adequately debug this on my own. I would greatly appreciate some help from a more experienced Core Audio programmer who might shed some light on this situation.
What values are your kInputBus and kOutputBus assigned? Have you set the kAudioUnitProperty_ShouldAllocateBuffer property to anything? Which AudioSession or AVAudioSession category did you set before starting the RemoteIO? And are you testing on a newer iPhone 6s or 6s+, or on an older device? – hotpaw2

kOutputBus is zero and kInputBus is 1. I haven't set ...ShouldAllocateBuffer, and I haven't seen it required in any RemoteIO code I've come across. I also didn't know a non-default session was needed. I did grab the default AudioSession from AVFoundation and set the category to PlayAndRecord, but that didn't help. I'm testing this on an iPhone 6+. –

My apps set ShouldAllocateBuffer to false when allocating their own AudioBufferList for the RemoteIO. Did you ever activate your AVAudioSession? – hotpaw2