Streaming an Android camera feed to a Java-based server

I am building a simple Android app that streams the camera's video feed to a custom Java-based server. I send the frames (basically byte[]) over TCP. [I know TCP is not a good fit for video streaming, but it works for my purposes.] Android is the client and the Java application is the server. I have implemented the client successfully, but I am having trouble with the Java application: the server must receive the byte[], convert it to an image, and display it in some image container. Here is the server's source code:
package dummyserver;

import java.awt.Image;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;
import javax.imageio.ImageIO;
import javax.media.jai.PlanarImage;

/**
 *
 * @author usama
 */
public class ServerListen implements Runnable {

    CameraFeed camera_feed_ui;
    ServerSocket server_socket = null;
    Socket client_socket = null;
    InputStream in = null;
    int port = 9000;
    DataInputStream dis = null;

    public ServerListen() {}

    public ServerListen(CameraFeed camera_feed_ui) {
        this.camera_feed_ui = camera_feed_ui;
    }

    public void run() {
        int len = 0;
        byte[] data;
        while (true) {
            try {
                System.out.println("Waiting");
                server_socket = new ServerSocket(port);
                client_socket = server_socket.accept();
                System.out.println("Client arrived");
                System.out.println("Reading Image");
                in = client_socket.getInputStream();
                data = new byte[client_socket.getReceiveBufferSize()];
                in.read(data, 0, client_socket.getReceiveBufferSize());
                Image image = getImageFromByteArray(data);
                BufferedImage bufferedImage = new BufferedImage(image.getWidth(null), image.getHeight(null), BufferedImage.TYPE_INT_RGB);
                PlanarImage planar_image = PlanarImage.wrapRenderedImage(bufferedImage);
                System.out.println("Converting byte[] to Image completed");
                camera_feed_ui.displayImage(planar_image);
                // client_socket.close();
                server_socket.close();
            } catch (Exception ex) {
                System.out.println("error: " + ex.toString());
            }
        }
    }

    public static Image getImageFromByteArray(byte[] byteArray) {
        InputStream is = new ByteArrayInputStream(byteArray);
        try {
            return ImageIO.read(is);
        } catch (IOException ex) {
            System.out.println("Unable to convert byte[] into image.");
            return null;
        }
    }
}
Explanation of the code: CameraFeed is the object of my JFrame, and it basically contains the image container on which the video is to be displayed (for displaying the video I am using JAI [Java Advanced Imaging]). The displayImage(PlanarImage) method simply shows the image in the container. I think the problem is either in converting the byte[] to an image, or that I am not extracting the byte[] from the socket correctly; right now I get a black image in the output. One more thing: on the client side I establish a TCP connection for every frame, and as is also clear from this code, I close the connection after receiving each frame (server_socket.close()). Is this a good approach? How can I make this streaming efficient? If you can, please describe the proper way to stream video from an Android phone to a server (I am asking about the algorithm).
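Two things in the posted server are likely culprits: a single in.read() into a receive-buffer-sized array is not guaranteed to return a complete frame, and the code then constructs a brand-new (all-black) BufferedImage instead of using the decoded image, which by itself would produce a black picture. A common approach for efficiency is to keep one TCP connection open and have the client prefix each encoded frame with a 4-byte length, so the server knows exactly how many bytes to read. The sketch below (a suggestion, not the original poster's protocol; class and method names are made up for illustration) simulates both sides in memory with byte-array streams instead of a real socket:

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

public class FramedReadDemo {
    // Reads one frame: a 4-byte length header followed by exactly that many
    // bytes of encoded image data. readFully() blocks until the whole frame
    // has arrived, unlike a single read(), which may return a partial buffer.
    static BufferedImage readFrame(DataInputStream dis) throws IOException {
        int len = dis.readInt();
        byte[] data = new byte[len];
        dis.readFully(data);
        return ImageIO.read(new ByteArrayInputStream(data));
    }

    public static void main(String[] args) throws IOException {
        // Simulate the client side: encode a small test image as PNG and
        // prefix it with its length, as the Android client would per frame.
        BufferedImage src = new BufferedImage(32, 24, BufferedImage.TYPE_INT_RGB);
        ByteArrayOutputStream encoded = new ByteArrayOutputStream();
        ImageIO.write(src, "png", encoded);
        ByteArrayOutputStream wire = new ByteArrayOutputStream();
        DataOutputStream dos = new DataOutputStream(wire);
        dos.writeInt(encoded.size());
        encoded.writeTo(dos);

        // Server side: read the complete frame back off the (simulated) stream.
        DataInputStream dis = new DataInputStream(
                new ByteArrayInputStream(wire.toByteArray()));
        BufferedImage frame = readFrame(dis);
        System.out.println(frame.getWidth() + "x" + frame.getHeight()); // 32x24
    }
}
```

With this framing, the server can loop calling readFrame() on one long-lived connection rather than opening and closing a ServerSocket per frame.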
Thanks in advance.
Regards,
Usama Bin
Edit:
C# code:
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using System.Net.Sockets;
using System.Threading;

namespace VideoServer
{
    public partial class Form1 : Form
    {
        TcpListener server_socket;
        Socket client_socket;
        Thread video_thread;
        NetworkStream ns;

        public Form1()
        {
            InitializeComponent();
            server_socket = null;
            client_socket = null;
        }

        private void startVideoConferencing()
        {
            try
            {
                server_socket = new TcpListener(System.Net.IPAddress.Parse("192.168.15.153"), 9000);
                server_socket.Start();
                client_socket = server_socket.AcceptSocket();
                ns = new NetworkStream(client_socket);
                pictureBoxVideo.Image = Image.FromStream(ns);
                server_socket.Stop();
                if (client_socket.Connected == true)
                {
                    while (true)
                    {
                        startVideoConferencing();
                    }
                    ns.Flush();
                }
            }
            catch (Exception ex)
            {
                button1.Enabled = true;
                video_thread.Abort();
            }
        }

        private void button1_Click(object sender, EventArgs e)
        {
            button1.Enabled = false;
            video_thread = new Thread(new ThreadStart(startVideoConferencing));
            video_thread.Start();
        }
    }
}
Basically the question is: C#'s Image.FromStream simply takes a stream and converts it into an image, abstracting away the low-level details of the conversion. Is there an equivalent of Image.FromStream in Java, i.e. any method that does the same thing? Any ideas?
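For what it's worth, the closest standard-library counterpart in Java appears to be javax.imageio.ImageIO.read(InputStream): it sniffs the format of the stream (JPEG, PNG, etc.) and decodes it into a BufferedImage, much like Image.FromStream. A minimal sketch, using an in-memory JPEG to stand in for the socket's InputStream (the class name is made up for illustration):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import javax.imageio.ImageIO;

public class FromStreamDemo {
    public static void main(String[] args) throws IOException {
        // Build an in-memory JPEG to stand in for the socket's InputStream.
        BufferedImage src = new BufferedImage(16, 16, BufferedImage.TYPE_INT_RGB);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(src, "jpg", baos);

        // ImageIO.read is the Java counterpart of C#'s Image.FromStream:
        // it detects the format and decodes the stream into a BufferedImage.
        InputStream in = new ByteArrayInputStream(baos.toByteArray());
        BufferedImage img = ImageIO.read(in);
        System.out.println(img.getWidth() + "x" + img.getHeight()); // 16x16
    }
}
```

One difference to note: where Image.FromStream throws on unrecognized data, ImageIO.read returns null, so a null check is needed when the incoming bytes may be a truncated frame.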
Could you post your C# implementation? –
I have posted the C# code. – uyaseen
I have the same problem; did you have any success? –