Streaming voice between Android phones over WiFi

I am trying to stream audio from the microphone of one Android device to another over WiFi. After looking at some examples, I made a single activity in each app: one captures and sends the audio, the other receives it.

I have used the AudioRecord and AudioTrack classes for capture and playback. However, I only hear some crackling sounds (which have actually stopped now after some changes I made, even though I reverted them).
The activity that sends voice:
public class VoiceSenderActivity extends Activity {
private EditText target;
private TextView streamingLabel;
private Button startButton,stopButton;
public byte[] buffer;
public static DatagramSocket socket;
private int port=50005; //which port??
AudioRecord recorder;
//Audio Configuration.
private int sampleRate = 8000; //How much will be ideal?
private int channelConfig = AudioFormat.CHANNEL_IN_MONO; //CHANNEL_CONFIGURATION_MONO is deprecated
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
private boolean status = true;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
target = (EditText) findViewById (R.id.target_IP);
streamingLabel = (TextView) findViewById(R.id.streaming_label);
startButton = (Button) findViewById (R.id.start_button);
stopButton = (Button) findViewById (R.id.stop_button);
streamingLabel.setText("Press Start! to begin");
startButton.setOnClickListener (startListener);
stopButton.setOnClickListener (stopListener);
}
private final OnClickListener stopListener = new OnClickListener() {
@Override
public void onClick(View arg0) {
status = false;
recorder.release();
Log.d("VS","Recorder released");
}
};
private final OnClickListener startListener = new OnClickListener() {
@Override
public void onClick(View arg0) {
status = true;
startStreaming();
}
};
public void startStreaming() {
Thread streamThread = new Thread(new Runnable() {
@Override
public void run() {
try {
int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
DatagramSocket socket = new DatagramSocket();
Log.d("VS", "Socket Created");
byte[] buffer = new byte[minBufSize];
Log.d("VS","Buffer created of size " + minBufSize);
DatagramPacket packet;
final InetAddress destination = InetAddress.getByName(target.getText().toString());
Log.d("VS", "Address retrieved");
recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,sampleRate,channelConfig,audioFormat,minBufSize);
Log.d("VS", "Recorder initialized");
recorder.startRecording();
while (status) {
//read raw PCM from the mic; read() returns the number of bytes actually read
int bytesRead = recorder.read(buffer, 0, buffer.length);
//send only the bytes that were actually read, not the whole buffer
packet = new DatagramPacket(buffer, bytesRead, destination, port);
socket.send(packet);
}
} catch(UnknownHostException e) {
Log.e("VS", "UnknownHostException");
} catch (IOException e) {
Log.e("VS", "IOException");
}
}
});
streamThread.start();
}
}
The activity that receives voice:
public class VoiceReceiverActivity extends Activity {
private Button receiveButton,stopButton;
public static DatagramSocket socket;
private AudioTrack speaker;
//Audio Configuration.
private int sampleRate = 8000; //How much will be ideal?
private int channelConfig = AudioFormat.CHANNEL_CONFIGURATION_MONO;
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
private boolean status = true;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
receiveButton = (Button) findViewById (R.id.receive_button);
stopButton = (Button) findViewById (R.id.stop_button);
findViewById(R.id.receive_label);
receiveButton.setOnClickListener(receiveListener);
stopButton.setOnClickListener(stopListener);
}
private final OnClickListener stopListener = new OnClickListener() {
@Override
public void onClick(View v) {
status = false;
speaker.release();
Log.d("VR","Speaker released");
}
};
private final OnClickListener receiveListener = new OnClickListener() {
@Override
public void onClick(View arg0) {
status = true;
startReceiving();
}
};
public void startReceiving() {
Thread receiveThread = new Thread (new Runnable() {
@Override
public void run() {
try {
DatagramSocket socket = new DatagramSocket(50005);
Log.d("VR", "Socket Created");
//this is the playback side, so size the buffer with AudioTrack.getMinBufferSize
//rather than an arbitrary 256 bytes or AudioRecord's minimum
int minBufSize = AudioTrack.getMinBufferSize(sampleRate, AudioFormat.CHANNEL_OUT_MONO, audioFormat);
byte[] buffer = new byte[minBufSize];
speaker = new AudioTrack(AudioManager.STREAM_MUSIC,sampleRate,AudioFormat.CHANNEL_OUT_MONO,audioFormat,minBufSize,AudioTrack.MODE_STREAM);
speaker.play();
while (status) {
try {
DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
socket.receive(packet);
Log.d("VR", "Packet Received");
//play back only the bytes this packet actually carries, not a fixed buffer size
speaker.write(packet.getData(), 0, packet.getLength());
Log.d("VR", "Writing buffer content to speaker");
} catch(IOException e) {
Log.e("VR","IOException");
}
}
} catch (SocketException e) {
Log.e("VR", "SocketException");
}
}
});
receiveThread.start();
}
}
I used Wireshark to check whether packets are actually being sent, and I can see them. However, the source shown is the sending device's MAC address, and the destination is also a hardware address. Not sure whether that is relevant.

So, what is the problem?
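One detail worth checking when forwarding UDP payloads to AudioTrack: the safe length to use is DatagramPacket.getLength(), the number of bytes this datagram actually carries, rather than any fixed buffer size. A minimal, pure-Java loopback sketch (no Android APIs; the 320-byte payload is a hypothetical 20 ms of 8 kHz 16-bit mono PCM):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpLengthDemo {
    public static void main(String[] args) throws Exception {
        DatagramSocket receiver = new DatagramSocket(0); // bind any free port
        int port = receiver.getLocalPort();

        // Sender: a 320-byte payload (20 ms of 8 kHz 16-bit mono PCM)
        byte[] payload = new byte[320];
        DatagramSocket sender = new DatagramSocket();
        sender.send(new DatagramPacket(payload, payload.length,
                InetAddress.getLoopbackAddress(), port));

        // Receiver: the buffer is deliberately larger than the payload
        byte[] buffer = new byte[4096];
        DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
        receiver.receive(packet);

        // getLength() is the number of bytes actually received -- this,
        // not buffer.length, is what belongs in the write() call, e.g.
        // speaker.write(packet.getData(), 0, packet.getLength())
        System.out.println(packet.getLength()); // prints 320

        sender.close();
        receiver.close();
    }
}
```

Writing a fixed count of bytes from a larger receive buffer plays stale or zeroed samples after each packet, which is one common source of crackling.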
There are at least three issues to deal with: data latency (or loss), overall data throughput, and the possibility of slightly mismatched sample rates. Real IP-telephony implementations must have mechanisms for all three. Mismatched clocks are the trickiest: you can initially introduce a delay to give yourself some buffer slack, but if the sender is slower you will eventually drain that buffer and the receiver will starve for data, while if the sender is faster the buffer will eventually overflow with unplayed data. – 2012-12-01 17:32:50
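The clock-mismatch arithmetic above can be made concrete. With hypothetical numbers (a sender crystal running slightly slow at 7996 Hz against a receiver consuming at the nominal 8000 Hz, and a 500 ms prebuffer), the time until the buffer underruns is:

```java
public class ClockDriftDemo {
    public static void main(String[] args) {
        // Hypothetical rates for illustration only
        double senderRate = 7996.0;    // sender's actual sample clock, Hz
        double receiverRate = 8000.0;  // receiver consumes at nominal rate, Hz
        double prebufferMs = 500.0;    // initial jitter-buffer depth

        // Samples banked up front by the prebuffer
        double buffered = receiverRate * prebufferMs / 1000.0; // 4000 samples

        // Net drain: receiver consumes faster than sender produces
        double drainPerSec = receiverRate - senderRate;        // 4 samples/s

        // Time until playback starves
        double secondsToUnderrun = buffered / drainPerSec;
        System.out.println((long) secondsToUnderrun); // prints 1000
    }
}
```

A 0.05% rate mismatch therefore exhausts half a second of buffering in under 17 minutes, which is why long-running VoIP links need active rate matching rather than a one-time prebuffer.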
I did manage to get this working. Mismatched sample rates were not really a problem; data latency was. I used a small protocol of my own to match the receiver's and sender's clocks. In the end it did work, though with some lag (which grew with distance from the wireless router). – Alabhya 2012-12-02 18:37:41
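A protocol for lining up sender and receiver typically starts by tagging each datagram with a sequence number, so the receiver can detect loss and reordering before deciding how much to buffer. A hypothetical sketch of such a header (the 4-byte big-endian layout is my own choice for illustration, not Alabhya's actual protocol):

```java
import java.nio.ByteBuffer;

public class SeqHeaderDemo {
    // Prepend a 4-byte big-endian sequence number to a PCM chunk
    static byte[] wrap(int seq, byte[] pcm) {
        return ByteBuffer.allocate(4 + pcm.length).putInt(seq).put(pcm).array();
    }

    public static void main(String[] args) {
        byte[] pcm = new byte[320]; // hypothetical 20 ms of 8 kHz 16-bit mono
        byte[] packet = wrap(42, pcm);

        // Receiver side: read the header, then play only the remaining bytes
        ByteBuffer in = ByteBuffer.wrap(packet);
        int seq = in.getInt();          // gaps or regressions here mean loss/reordering
        int audioBytes = in.remaining(); // bytes to hand to AudioTrack.write()
        System.out.println(seq + " " + audioBytes); // prints "42 320"
    }
}
```

On the Android side, the receiver would pass packet.getData() with an offset of 4 to skip the header when writing to the AudioTrack.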
Hey there, I implemented a test app from the code above with all the necessary fixes suggested below, but I am still having problems. The communication between the two phones is fine, but I think the microphone is not recording properly, because I cannot hear anything on the other end. Could you link to a sample solution I could take a look at? – chuckliddell0 2013-03-11 14:48:16