I am working on an application that displays a video streamed from a web server. I have the HTTP link for the video stream. To reach the web server, I connect to a specific Wi-Fi network, which gives me access to the video.
I have taken care of the Wi-Fi connectivity. The problem is displaying the video in my app using a VideoView. A network tool showed me that I am receiving data from the server, so my application is receiving the packets of video data but is unable to display them.
The code looks like this:
private VideoView videoView;
videoView = (VideoView) findViewById(R.id.VideoView_videoOnly);
....
....
// Connect to the router
String ip = "http://192.168.2.250";
URL url = null;
try {
    url = new URL(ip);
} catch (MalformedURLException e) {
    e.printStackTrace();
}
URLConnection connection = null;
try {
    connection = url.openConnection();
} catch (IOException e) {
    e.printStackTrace();
}

String vidLink = "http://XXXX:XXXX@192.168.2.250/axis-cgi/mjpg/video.cgi";

// Set up video streaming
MediaController mc = new MediaController(this);
mc.setAnchorView(videoView);
mc.setMediaPlayer(videoView);
Uri video = Uri.parse(vidLink);
videoView.setMediaController(mc);
videoView.setVideoURI(video);
videoView.start();
XXXX:XXXX stands for the username and password used to access the video stream.
Judging by the URL you are using, one can guess that you are receiving the stream in MJPEG format. MJPEG is not natively supported by Android.
You will have to break the stream into individual JPEGs, create a Bitmap from each JPEG, and then display the bitmaps one by one.
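A minimal sketch of that approach, assuming the camera serves a standard multipart MJPEG stream over HTTP. showFrame is a hypothetical method that posts the Bitmap to an ImageView on the UI thread; a robust parser would use the multipart headers rather than scanning for markers:

import java.io.BufferedInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.net.URL;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

// Run on a worker thread: scan the HTTP stream for the JPEG start/end markers
// (FF D8 / FF D9), decode each complete frame, and hand it to the UI.
void readMjpeg(String streamUrl) throws Exception {
    InputStream in = new BufferedInputStream(new URL(streamUrl).openStream());
    ByteArrayOutputStream jpeg = new ByteArrayOutputStream();
    boolean inFrame = false;
    int prev = -1, cur;
    while ((cur = in.read()) != -1) {
        if (!inFrame && prev == 0xFF && cur == 0xD8) { // start-of-image marker
            inFrame = true;
            jpeg.reset();
            jpeg.write(0xFF);
        }
        if (inFrame) {
            jpeg.write(cur);
            if (prev == 0xFF && cur == 0xD9) { // end-of-image marker: frame complete
                inFrame = false;
                byte[] bytes = jpeg.toByteArray();
                Bitmap frame = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
                showFrame(frame); // hypothetical: post to the ImageView on the UI thread
            }
        }
        prev = cur;
    }
}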
I am developing an app that has the functionality of sharing its screen with other apps.
I used the MediaProjection API for this, along with MediaMuxer to combine the audio and video outputs for screen sharing.
I know the MediaProjection API is meant for screen recording, but all I want is to share the screen while recording.
For this, I modified the writeSampleData method of my MediaMuxer wrapper class to send the bytes via a socket to the other device over the network.
Below is the code for that:
OutputStream outStream;
outStream = ScreenRecordingActivity.getInstance().socket.getOutputStream();

void writeSampleData(final int trackIndex, final ByteBuffer byteBuf, final MediaCodec.BufferInfo bufferInfo) {
    if (mStartedCount > 0) {
        mMediaMuxer.writeSampleData(trackIndex, byteBuf, bufferInfo);
        if (bufferInfo.size != 0) {
            byteBuf.position(bufferInfo.offset);
            byteBuf.limit(bufferInfo.offset + bufferInfo.size);
            if (outStream != null) {
                try {
                    byte[] bytes = new byte[byteBuf.remaining()];
                    byteBuf.get(bytes);
                    // Send the data
                    outStream.write(bytes);
                    outStream.flush();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
The bytes are transferred successfully via the socket, and I am able to receive them at the receiver's end.
Below is the code for receiving bytes at the receiver's end:
private class SocketThread implements Runnable {
    @Override
    public void run() {
        Socket socket;
        try {
            serverSocket = new ServerSocket(SERVER_PORT);
        } catch (IOException e) {
            e.printStackTrace();
        }
        if (null != serverSocket) {
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    socket = serverSocket.accept();
                    CommunicationThread commThread = new CommunicationThread(socket);
                    new Thread(commThread).start();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }

    class CommunicationThread implements Runnable {
        InputStream in;
        DataInputStream dis;

        public CommunicationThread(Socket clientSocket) {
            updateMessage("Server Started...");
        }

        public void run() {
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    byte[] data = new byte[512];
                } catch (Exception e) {
                    e.printStackTrace();
                    try {
                        fos.close();
                    } catch (Exception e1) {
                        e1.printStackTrace();
                    }
                }
            }
        }
    }
}
I followed these links for screen sharing:
Screen capture
screenrecorder
Screen recording with mediaProjection
I used some code from the above examples to make an app.
All I want to know is how to handle the bytes at the receiver's end. How do I format these bytes to play a live stream from the sender's side?
Am I following the correct approach for sending and receiving byte data?
Does MediaProjection allow streaming the screen to another application while recording?
Any help will be deeply appreciated.
Generally for streaming, including screen sharing, the audio and video tracks are not muxed. Instead, each video frame and each audio sample is sent using a protocol such as RTP/RTSP, in which every data chunk is wrapped with metadata such as a timestamp.
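To illustrate the idea over a plain socket, here is a minimal hand-rolled framing (not real RTP, just a small header so the receiver can split the byte stream back into chunks and pace them; sendChunk and its header layout are invented for this sketch):

import java.io.DataOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.ByteBuffer;

// Per-chunk header: the receiver reads the header first, then exactly
// 'size' payload bytes, and uses the timestamp to schedule playback.
void sendChunk(OutputStream out, int trackIndex, long ptsUs, ByteBuffer payload)
        throws IOException {
    DataOutputStream dos = new DataOutputStream(out);
    byte[] bytes = new byte[payload.remaining()];
    payload.get(bytes);
    dos.writeInt(trackIndex);   // which track: video or audio
    dos.writeLong(ptsUs);       // presentation timestamp in microseconds
    dos.writeInt(bytes.length); // payload size, so the receiver can reframe it
    dos.write(bytes);
    dos.flush();
}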
You can take a look at spyadroid, which is a good starting point for streaming audio and video over RTSP to a browser or VLC. It streams the camera and microphone, but you can adapt it for your own use case.
If you want to go with sockets for the moment, you have to get rid of the MediaMuxer and send frames/samples directly from the encoder output, with at least a timestamp appended so playback can be synchronized on the receiver side. Before any frames, send the codec-specific data (CSD) to the receiver's decoder: assuming you encode in H.264, that is the SPS and PPS (aka csd-0 and csd-1), which you can grab when the encoder's output format changes. Configure the receiver's decoder with an output Surface to render your stream.
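A rough sketch of what that looks like with MediaCodec, reusing the hypothetical sendChunk helper above (TIMEOUT_US, VIDEO_TRACK, and the Surface are placeholders you would wire up yourself; getOutputBuffer requires API 21):

import java.io.IOException;
import java.io.OutputStream;
import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

static final int VIDEO_TRACK = 0;  // made-up track id for the framing header
static final long TIMEOUT_US = 10000;

// Sender: drain one encoder output buffer and ship it with its timestamp.
// On INFO_OUTPUT_FORMAT_CHANGED the H.264 SPS/PPS appear as csd-0/csd-1.
void drainEncoderOnce(MediaCodec encoder, OutputStream out) throws IOException {
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int index = encoder.dequeueOutputBuffer(info, TIMEOUT_US);
    if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        MediaFormat fmt = encoder.getOutputFormat();
        sendChunk(out, VIDEO_TRACK, 0, fmt.getByteBuffer("csd-0")); // SPS
        sendChunk(out, VIDEO_TRACK, 0, fmt.getByteBuffer("csd-1")); // PPS
    } else if (index >= 0) {
        ByteBuffer buf = encoder.getOutputBuffer(index);
        buf.position(info.offset);
        buf.limit(info.offset + info.size);
        sendChunk(out, VIDEO_TRACK, info.presentationTimeUs, buf);
        encoder.releaseOutputBuffer(index, false);
    }
}

// Receiver: configure a decoder with the received CSDs and an output Surface,
// then feed the following chunks into its input buffers in arrival order.
MediaCodec buildDecoder(ByteBuffer sps, ByteBuffer pps, int w, int h, Surface surface)
        throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", w, h);
    format.setByteBuffer("csd-0", sps);
    format.setByteBuffer("csd-1", pps);
    MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
    decoder.configure(format, surface, null, 0); // frames render straight to the Surface
    decoder.start();
    return decoder;
}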
Some extra links:
android-h264-stream-demo
RTMP Java Muxer for Android
RTSP
RTP
WebRTC
Hello, I'm trying to stream video from the camera over Google Nearby Connections.
The connection is established and messaging works, but when I send the stream, the server device's SurfaceView shows the camera preview while the client receives nothing, with no error. This is how I stream it:
Server:
SurfaceView surface;

public void sendCam() {
    try {
        surface = findViewById(R.id.surface);
        ParcelFileDescriptor[] payloadPipe = ParcelFileDescriptor.createPipe();
        ParcelFileDescriptor readFD = payloadPipe[0];
        ParcelFileDescriptor writeFD = payloadPipe[1];
        mCamera = Camera.open();
        MediaRecorder rec = new MediaRecorder();
        mCamera.unlock(); // the camera must be unlocked before handing it to MediaRecorder
        rec.setCamera(mCamera);
        rec.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        rec.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        rec.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        rec.setPreviewDisplay(surface.getHolder().getSurface());
        rec.setOutputFile(writeFD.getFileDescriptor());
        send(Payload.fromStream(readFD));
        rec.prepare();
        rec.start();
    } catch (Exception e) {
        e.printStackTrace(); // don't swallow setup errors silently
    }
}
Client receive:
@Override
protected void onReceive(Endpoint endpoint, Payload payload) {
    if (payload.getType() == Payload.Type.STREAM) {
        try {
            MediaPlayer mMediaPlayer = new MediaPlayer();
            // mMediaPlayer.setDataSource(payload.asStream().asParcelFileDescriptor().getFileDescriptor()); // did not work either
            FileInputStream inputStream = new FileInputStream(payload.asStream().asParcelFileDescriptor().getFileDescriptor());
            mMediaPlayer.setDataSource(inputStream.getFD());
            mMediaPlayer.setDisplay(surface.getHolder());
            mMediaPlayer.prepare();
            mMediaPlayer.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
I'm not even sure whether I can stream video using Nearby; the sample projects only show how to stream audio. Is this API unable to stream video, or is there a problem with my code, and if so, what could it be?
UPDATE:
The } catch (IOException e) { block gives me: setDataSource failed.: status=0x80000000
There's nothing in Nearby Connections that precludes you from sending video streams. If yours is a 1:1 scenario (one client device connected to one server device), it would be better to use the P2P_STAR Strategy (if you aren't already), since it provides a higher-bandwidth connection, which should help with video.
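For reference, the strategy is picked when you start advertising/discovering; here is a sketch with a made-up service id (the exact builder API may vary with your Play Services version):

import android.content.Context;
import android.util.Log;
import com.google.android.gms.nearby.Nearby;
import com.google.android.gms.nearby.connection.AdvertisingOptions;
import com.google.android.gms.nearby.connection.ConnectionLifecycleCallback;
import com.google.android.gms.nearby.connection.Strategy;

static final String SERVICE_ID = "com.example.camstream"; // hypothetical

// Advertise with P2P_STAR; 'callback' is your existing ConnectionLifecycleCallback.
void startAdvertising(Context context, ConnectionLifecycleCallback callback) {
    AdvertisingOptions options =
            new AdvertisingOptions.Builder().setStrategy(Strategy.P2P_STAR).build();
    Nearby.getConnectionsClient(context)
            .startAdvertising("serverDevice", SERVICE_ID, callback, options)
            .addOnSuccessListener(unused -> Log.d("Nearby", "Advertising started"))
            .addOnFailureListener(e -> Log.w("Nearby", "Advertising failed", e));
}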
To make sure that you're at least receiving the stream's raw bytes on the client, you can check whether the onPayloadTransferUpdate() callback is firing (see the sketch below).
If it isn't, check the same callback on the server to see whether the bytes are being sent out at all.
If no bytes are leaving the server, then the problem is likely in your application-level video capture code.
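A minimal logging sketch for that check, using the standard PayloadCallback:

import android.util.Log;
import com.google.android.gms.nearby.connection.Payload;
import com.google.android.gms.nearby.connection.PayloadCallback;
import com.google.android.gms.nearby.connection.PayloadTransferUpdate;

// Pass this callback to acceptConnection(); the same shape works on both sides.
PayloadCallback payloadCallback = new PayloadCallback() {
    @Override
    public void onPayloadReceived(String endpointId, Payload payload) {
        Log.d("Nearby", "Payload received, type=" + payload.getType());
    }

    @Override
    public void onPayloadTransferUpdate(String endpointId, PayloadTransferUpdate update) {
        // For stream payloads this fires repeatedly as bytes move.
        Log.d("Nearby", "Transferred " + update.getBytesTransferred()
                + " of " + update.getTotalBytes()
                + " (status=" + update.getStatus() + ")");
    }
};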
Good luck!
I am trying to stream video to my Android application from an IP webcam over RTSP. The stream plays successfully in the VLC Android app, but I cannot replicate this in my own application.
if (mediaControls == null) {
    mediaControls = new MediaController(VideoActivity.this);
}
myVideoView = (VideoView) findViewById(R.id.stream);
myVideoView.setOnErrorListener(this);

progressDialog = new ProgressDialog(VideoActivity.this);
progressDialog.setTitle("Camera Stream");
progressDialog.setMessage("Loading...");
progressDialog.setCancelable(false);
//progressDialog.show();

try {
    myVideoView.setMediaController(mediaControls);
    myVideoView.setVideoURI(Uri.parse(videoSrc));
} catch (Exception e) {
    Log.e("Error", e.getMessage());
    e.printStackTrace();
}

myVideoView.requestFocus();
myVideoView.setOnPreparedListener(new OnPreparedListener() {
    // Close the progress bar and play the video
    public void onPrepared(MediaPlayer mp) {
        //progressDialog.dismiss();
        myVideoView.seekTo(position);
        if (position == 0) {
            myVideoView.start();
        } else {
            myVideoView.pause();
        }
    }
});
The activity shows a "Cannot Play This Video" dialog. According to the desktop VLC client, the stream is MJPG, 320x256, displayed at 320x240, in the decoded format "Planar 4:2:0 YUV full scale." According to the camera's data sheet, it runs at 30 fps.
I have been able to play other RTSP streams with this code, but not the one I need.
I have been trying to figure out how to use libVLC but haven't had much luck so far. Can anyone help me get this stream into something my application can use?
I'm trying to play an RTSP stream using MediaPlayer on Android, and the application always seems to get stuck on MediaPlayer.prepare().
The URL is valid, as I tested it with VLC on my desktop.
Any ideas why the application is not preparing the stream?
class InitializeService extends Thread {
    @Override
    public void run() {
        try {
            player.prepare();
            Log.d("Play", "Player prepared");
        } catch (IOException e) {
            e.printStackTrace();
            fallback();
        } catch (IllegalStateException e) {
            e.printStackTrace();
            fallback();
        }
    }
}
The log statement is never reached.
Update 1:
Sorry, I forgot to mention that the stream will always be in 3GP format. Here is a URL: rtsp://r2---sn-p5qlsu76.c.youtube.com/CiILENy73wIaGQnTXOVs7Kwo8xMYESARFEgGUgZ2aWRlb3MM/0/0/0/video.3gp
Your stream might not be in a format supported by Android.
Check http://developer.android.com/guide/appendix/media-formats.html to see whether Android supports it.
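If you want to confirm on the device (API 17+ for these extra codes), you can attach an error listener to the player from your code and inspect the extra value, e.g. MEDIA_ERROR_UNSUPPORTED versus MEDIA_ERROR_IO:

import android.media.MediaPlayer;
import android.util.Log;

// Attach before prepare(); 'player' is the MediaPlayer from the question.
player.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // extra == MediaPlayer.MEDIA_ERROR_UNSUPPORTED (-1010) suggests a codec
        // or container the device cannot handle; MEDIA_ERROR_IO points at the network.
        Log.e("Stream", "MediaPlayer error: what=" + what + " extra=" + extra);
        return true; // handled: suppress the default error path
    }
});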
Turns out it was Android L that wasn't able to play the streams.
I have an online radio (Shoutcast) stream on the web. I want to develop an Android app for listening to this stream, so I want to know how to play an online stream in Android using a URL. I am not going to broadcast audio from Android; I only want to listen to the web stream in the app.
How do I do that?
Thanks.
Something like this:
private void init() throws IOException {
    try {
        mediaPlayer = new MediaPlayer();
        String streamPath = ""; // enter the stream URL here
        mediaPlayer.setDataSource(streamPath);
        mediaPlayer.prepare();
    } catch (MalformedURLException ex) {
        throw new RuntimeException("Wrong url for mediaplayer! " + ex);
    } catch (IllegalStateException ex) {
        ex.printStackTrace(); // don't swallow this silently
    } catch (IllegalArgumentException ex) {
        throw new RuntimeException("Wrong url for mediaplayer! " + ex);
    }
}

private void play() {
    mediaPlayer.start();
}
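One note: for a network stream, prepare() blocks until enough data is buffered, so it is usually better to prepare asynchronously. A variant of the same init using prepareAsync() (a sketch, same mediaPlayer field as above):

import java.io.IOException;
import android.media.AudioManager;
import android.media.MediaPlayer;

private void initAsync(String streamUrl) throws IOException {
    mediaPlayer = new MediaPlayer();
    mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC); // route as music audio
    mediaPlayer.setDataSource(streamUrl);
    // prepareAsync() returns immediately; playback starts once buffering finishes.
    mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start();
        }
    });
    mediaPlayer.prepareAsync();
}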
Please refer to this link; it may help: http://developer.android.com/guide/topics/media/index.html