UnsatisfiedLinkError: AudioRecordNew

Using the code here, I am trying to merge these two classes so that I end up with an InputStream of u-law audio data. So I edited UlawEncoderInputStream like this:
private MicrophoneInputStream micIn;

public UlawEncoderInputStream() {
    mMax = 0;
    try {
        micIn = new MicrophoneInputStream(8000, 1);
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
Then I tried to use the UlawEncoderInputStream:
while (transmittingAudio) {
    if (micInStream.available() > 0) {
        // byte[] data = new byte[audioDump.available()];
        int bytesRead = micInStream.read(data);
        os.write(data, 0, bytesRead);
        os.flush();
        // ca.transmitAxisAudioPacket(data);
        // System.out.println("read " + bytesRead);
    }
}
However, there seems to be a problem with the JNI native declaration at the bottom of MicrophoneInputStream:
private static native int AudioRecordNew(int sampleRate, int fifoDepth);
It fails with this error:
05-14 14:46:48.544: W/dalvikvm(28658): No implementation found for native Lcom/avispl/nicu/audio/MicrophoneInputStream;.AudioRecordNew (II)I
05-14 14:46:48.552: W/dalvikvm(28658): threadid=10: thread exiting with uncaught exception (group=0x40018560)
05-14 14:46:48.552: E/AndroidRuntime(28658): FATAL EXCEPTION: Thread-12
05-14 14:46:48.552: E/AndroidRuntime(28658): java.lang.UnsatisfiedLinkError: AudioRecordNew
05-14 14:46:48.552: E/AndroidRuntime(28658): at com.avispl.nicu.audio.MicrophoneInputStream.AudioRecordNew(Native Method)
05-14 14:46:48.552: E/AndroidRuntime(28658): at com.avispl.nicu.audio.MicrophoneInputStream.(MicrophoneInputStream.java:27)
05-14 14:46:48.552: E/AndroidRuntime(28658): at com.avispl.nicu.audio.UlawEncoderInputStream.(UlawEncoderInputStream.java:111)
05-14 14:46:48.552: E/AndroidRuntime(28658): at com.avispl.nicu.audio.AudioTransmitter.run(AudioTransmitter.java:66)
Ugh, I really feel like I just don't know how to use the NDK, idk –
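For context on the stack trace: an `UnsatisfiedLinkError` on a native method means the JVM found no loaded shared library that implements it, which is usually because the class never calls `System.loadLibrary` (or the `.so` was not built and packaged for the device's ABI). Below is a minimal desktop-Java sketch of the usual pattern, not the actual project code; the library name `"nicuaudio"` and the class `NativeBackedStream` are placeholders for illustration:

```java
import java.io.IOException;
import java.io.InputStream;

// Sketch only: any class declaring native methods must first load the
// shared library that implements them, typically in a static initializer.
class NativeBackedStream extends InputStream {
    static {
        try {
            // Placeholder name; on Android this must match the NDK module
            // name (LOCAL_MODULE in Android.mk) that builds the JNI code.
            System.loadLibrary("nicuaudio");
        } catch (UnsatisfiedLinkError e) {
            // Without the library loaded, calling AudioRecordNew would
            // throw the same UnsatisfiedLinkError seen in the log above.
            System.out.println("native library not loaded: " + e.getMessage());
        }
    }

    // Mirrors the declaration in MicrophoneInputStream.
    private static native int AudioRecordNew(int sampleRate, int fifoDepth);

    @Override
    public int read() throws IOException {
        return -1; // stub; the real class reads PCM from the native FIFO
    }
}

public class Main {
    public static void main(String[] args) {
        new NativeBackedStream(); // touching the class runs the static block
        System.out.println("done");
    }
}
```

On Android, the equivalent static block would sit at the top of MicrophoneInputStream itself, and the `.so` has to be produced by `ndk-build` and packaged under `libs/<abi>/` so the loader can actually find it at runtime.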