Unable to play H.264 RTP stream in Android

I'm currently trying to get my Android device to play an H.264-encoded video stream. I pass the URI of my RTSP server (and the respective media layer) to a VideoView instance, and in Wireshark I see the following:
Source Destination Protocol Info
192.168.1.104 192.168.1.91 RTSP DESCRIBE rtsp://192.168.1.91:554/mytransmitteroutput1
192.168.1.91 192.168.1.104 RTSP Reply: RTSP/1.0 200 OK [SDP]
Funny thing is, the media player never gets further than this and never issues a SETUP request. It just stops, the VideoView shows "Can't play this video", and in Eclipse I see the generic error (1, -2147483648), so I assume the Android media player isn't happy with the SDP I'm sending back after the DESCRIBE. However, players like VLC or MX Player are completely satisfied and able to decode the stream.
Media format parameters in SDP
fmtp:96 profile-level-id=42C016; packetization-mode=1; sprop-parameter-sets=Z0KAHpZSgWh7IA==,aMkjUg==

I'm stupid: the H.264 was encoded using the High profile. According to https://developer.android.com/guide/appendix/media-formats.html, Android only supports the Baseline profile.
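If you want to double-check what the device's decoders actually advertise, here is a minimal sketch (assuming API 21+; the class name is just an illustration) that walks MediaCodecList and reports whether a given AVC profile is supported:

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;

    public class AvcProfileCheck {

        // Returns true if any decoder on the device advertises support for the
        // given AVC profile constant, e.g. CodecProfileLevel.AVCProfileHigh.
        public static boolean isAvcProfileSupported(int wantedProfile) {
            MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
            for (MediaCodecInfo info : list.getCodecInfos()) {
                if (info.isEncoder()) {
                    continue; // only decoders matter for playback
                }
                for (String type : info.getSupportedTypes()) {
                    if (!type.equalsIgnoreCase("video/avc")) {
                        continue;
                    }
                    MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                    for (MediaCodecInfo.CodecProfileLevel pl : caps.profileLevels) {
                        if (pl.profile == wantedProfile) {
                            return true;
                        }
                    }
                }
            }
            return false;
        }
    }

For example, isAvcProfileSupported(MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline) should return true on virtually every device, while AVCProfileHigh may not, which matches the behaviour described above.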

Related

WebView getVideo/Audio tracks or MediaStream and send it to the server via WebRTC

I have to get the video/audio tracks or, if possible, a MediaStream object from the Android WebView, which is playing an HLS stream (.m3u8).
I load the HLS stream in the WebView using loadUrl("...m3u8"). It runs without issues, but I can't figure out how to get the live video and audio tracks from the HLS stream. I have read a lot of articles and was not able to find any solution. So my question is: is it possible to get the audio and video tracks from the HLS stream running in the WebView? I need the audio and video tracks because I have to send them via a PeerConnection (WebRTC), which accepts a MediaStream or individual audio and video tracks. Any ideas and examples will be helpful. Thank you in advance.
HLS works by defining a playlist (your .m3u8 file) which points to chunks of the video (say segment00000.ts, segment00001.ts, ...). It can contain different streams for different resolutions, but the idea (very simplified) is that your player can download a chunk, play it right away, and download the next one in the meantime.
A WebRTC video track takes RTP packets. I don't know exactly what the Android API exposes, but it feels like either you can pass it an m3u8 as a source (though that may be a bit weird for a generic API), or you can't, and you have to read the HLS stream yourself, extract RTP frames, and pass them to WebRTC.
For instance, gstreamer's hlsdemux seems to be able to read from an HLS source. From there you could extract RTP frames and feed them to your WebRTC track.
Probably that's a bit lower-level than you hoped, but if the Android API doesn't do it for you, you have to do it yourself (or find a library that does it).
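If you do end up bridging the streams yourself, one possible shape for the WebRTC side is to push already-decoded frames into a VideoSource through its CapturerObserver. The sketch below assumes the org.webrtc Android library; the class name is hypothetical and the HLS demuxing/decoding part is deliberately left out:

    import android.content.Context;

    import org.webrtc.CapturerObserver;
    import org.webrtc.JavaI420Buffer;
    import org.webrtc.PeerConnectionFactory;
    import org.webrtc.VideoFrame;
    import org.webrtc.VideoSource;
    import org.webrtc.VideoTrack;

    // Hypothetical helper: pushes already-decoded I420 frames into a WebRTC VideoTrack.
    public class HlsToWebRtcBridge {

        private final PeerConnectionFactory factory;
        private final VideoSource source;
        private final VideoTrack track;
        private final CapturerObserver observer;

        public HlsToWebRtcBridge(Context appContext) {
            PeerConnectionFactory.initialize(
                    PeerConnectionFactory.InitializationOptions.builder(appContext)
                            .createInitializationOptions());
            factory = PeerConnectionFactory.builder().createPeerConnectionFactory();
            source = factory.createVideoSource(false /* isScreencast */);
            track = factory.createVideoTrack("hls_video", source);
            observer = source.getCapturerObserver();
            observer.onCapturerStarted(true);
        }

        public VideoTrack getTrack() {
            return track; // add this track to your PeerConnection
        }

        // Call this for every frame you manage to decode from the HLS segments.
        // Filling the I420 buffer from your decoder's output is up to you.
        public void pushFrame(JavaI420Buffer i420, long timestampNs) {
            VideoFrame frame = new VideoFrame(i420, 0 /* rotation */, timestampNs);
            observer.onFrameCaptured(frame);
            frame.release();
        }
    }

Feeding raw RTP packets straight into a track, as mentioned above, would need a lower-level injection point that the standard org.webrtc Android API does not really expose, which is why this sketch works at the decoded-frame level.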

Can flowplayer handle rtsp stream?

Flowplayer can play RTMP and HTTP live streams, but can I use the same player to play an RTSP stream? I have an RTSP stream for Android which can be played using an external player, but that player opens in fullscreen mode. I thought of putting it inside a frame, but the external player opens outside of the frame on the Android device. So I want to use Flowplayer to play the RTSP stream on Android. Is it possible, and if not, what should I use?
I am fairly certain that Flowplayer, while a great solution for many things, cannot be extended to accept a straight RTSP stream. In any case, I don't believe there is a supported mobile version or plugin of Flowplayer for Android at this point. I have even seen reports that embedded Flowplayer instances viewed on Android have been sketchy at best.
I have, however, used ffserver and ffmpeg (http://ffmpeg.org/) to transcode an RTSP stream into .flv to be played with Flowplayer, so if a transcoded stream could be broadcast on your system, you'd be well on your way!
Mason
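To illustrate the transcoding approach, here is a rough server-side sketch that spawns ffmpeg from Java; the URL, output target and codec choices are placeholders, and it assumes ffmpeg is installed on the machine doing the transcoding:

    import java.io.IOException;

    public class RtspToFlvTranscoder {

        // Spawns an ffmpeg process that pulls the RTSP stream and re-encodes it
        // into an FLV that Flowplayer can consume (for example via an RTMP
        // server). All URLs and paths here are placeholders.
        public static Process start(String rtspUrl, String flvTarget) throws IOException {
            ProcessBuilder pb = new ProcessBuilder(
                    "ffmpeg",
                    "-i", rtspUrl,       // e.g. "rtsp://camera.example/stream"
                    "-c:v", "libx264",   // video to H.264
                    "-c:a", "aac",       // audio to AAC
                    "-f", "flv",         // container Flowplayer understands
                    flvTarget);          // e.g. "rtmp://localhost/live/cam"
            pb.redirectErrorStream(true);
            return pb.start();
        }
    }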

Problem playing media from RTSP URL in Android

I have made an Android streaming application that plays media from online URLs. For playback I am using the standard MediaPlayer class.
As per the Android documentation, it supports the RTSP protocol for audio and video playback:
http://developer.android.com/guide/appendix/media-formats.html
But when I try to play media from an RTSP URL, it connects, but I cannot hear any audio.
Following is one of those RTSP URLs:
rtsp://sfera.live24.gr/sfera4132
Does anybody have an idea how to play RTSP URLs through the Android MediaPlayer?
Thanks
That link you provided has 3 audio tracks; the first and last tracks appear to be silent and don't contain any valid audio.
The middle track has audio (as per VLC). I don't know how Android deals with multiple audio tracks. I imagine you may get better results if you use links that contain at most 1 audio and 1 video track.
I expect that for an RTSP stream with multiple audio tracks, Android is only going to play the first one, as there is no user interface to select a specific audio stream, hence why you aren't hearing any audio.
If this is a stream from your own server, to hear the audio you should adjust the SDP file so that the valid audio track comes first. If this is not from your server, I don't know what your options are.
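If you can target API 16 or later, another thing worth trying (no guarantee it works with every RTSP source) is to enumerate the tracks once the player is prepared and select an audio track explicitly; a minimal sketch:

    import android.media.MediaPlayer;

    public class AudioTrackSelector {

        // After the MediaPlayer is prepared, find the n-th audio track and
        // select it explicitly (requires API 16+). audioTrackToPick is the
        // zero-based index among audio tracks only, e.g. 1 for the middle
        // track of the three mentioned above.
        public static void selectAudioTrack(MediaPlayer player, int audioTrackToPick) {
            MediaPlayer.TrackInfo[] tracks = player.getTrackInfo();
            int audioTracksSeen = 0;
            for (int i = 0; i < tracks.length; i++) {
                if (tracks[i].getTrackType() == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_AUDIO) {
                    if (audioTracksSeen == audioTrackToPick) {
                        player.selectTrack(i); // index into getTrackInfo(), not the audio-only index
                        return;
                    }
                    audioTracksSeen++;
                }
            }
        }
    }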

Android Mediaplayer :: How to detect streaming content type (audio or video)

I am completely new to playing streamed audio or video content using MediaPlayer.
Somehow, using a post available here, I am able to display RTSP content (a .3gp video) in my MediaPlayer implementation.
How can I identify whether the streaming content is an audio-only stream or an audio/video stream, using the MediaPlayer class or the streaming link?
I could be wrong here, but I believe there is only the MediaPlayer.OnInfoListener API available in Java to get information about the content stream being played. I'm not sure how helpful that API actually is, though. You might also want to try stream scrapers (that is what I believe they are called) to get stream metadata and see whether there are both audio and video channels to make a determination.
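Another heuristic worth trying (not guaranteed for every container or protocol) is to check the reported video size once the player is prepared: audio-only streams typically report a 0x0 video size. A minimal sketch:

    import android.media.MediaPlayer;

    public class StreamTypeDetector {

        // Call this from OnPreparedListener. Audio-only streams typically report
        // a video size of 0x0, while audio/video streams report real dimensions.
        // This is a heuristic, not a guarantee for every stream.
        public static boolean hasVideo(MediaPlayer preparedPlayer) {
            return preparedPlayer.getVideoWidth() > 0 && preparedPlayer.getVideoHeight() > 0;
        }
    }

On API 16 and later you can also inspect MediaPlayer.getTrackInfo() and look for a track of type MEDIA_TRACK_TYPE_VIDEO, which is more explicit than relying on the video size.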

Android VideoView save RTSP stream

On my Android Nexus One, I'm playing video from a few Cisco cameras using a VideoView. While this works fine, I'm unsure whether it's possible to save the video to a file.
I'm opening an URL like rtsp://192.168.1.22:554/live.sdp
How can I save it to the SD card? Handle it like a file, maybe ... Is that possible?
You can implement your own RTSP client (or use a library) which will pipe the incoming RTP packets into a file.
If you also want to play the video stream, you can give the media player the address of a local RTSP server backed by your own RTSP client and pipe the same RTP packets to the media player as well.
If you need one, you can find an open source RTSP server/client here.
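To show just the file-writing half of that idea, here is a minimal sketch. It assumes your RTSP client has already done the SETUP/PLAY negotiation and that the server is sending RTP over UDP to localRtpPort; note that a raw RTP dump is not directly playable, you would still need to depacketize or remux it afterwards:

    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.SocketTimeoutException;

    public class RtpDumper implements Runnable {

        private final int localRtpPort;  // the client_port you negotiated in SETUP
        private final File outputFile;   // e.g. a file under getExternalFilesDir()
        private volatile boolean running = true;

        public RtpDumper(int localRtpPort, File outputFile) {
            this.localRtpPort = localRtpPort;
            this.outputFile = outputFile;
        }

        public void stop() {
            running = false;
        }

        @Override
        public void run() {
            DatagramSocket socket = null;
            FileOutputStream out = null;
            byte[] buf = new byte[65536];
            try {
                socket = new DatagramSocket(localRtpPort);
                socket.setSoTimeout(2000); // so we notice stop() being called
                out = new FileOutputStream(outputFile);
                while (running) {
                    DatagramPacket packet = new DatagramPacket(buf, buf.length);
                    try {
                        socket.receive(packet);
                    } catch (SocketTimeoutException e) {
                        continue;
                    }
                    // Raw RTP packets written back to back; depacketize/remux later
                    // to turn this into a playable H.264 file.
                    out.write(packet.getData(), 0, packet.getLength());
                }
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                if (socket != null) {
                    socket.close();
                }
                if (out != null) {
                    try { out.close(); } catch (IOException ignored) { }
                }
            }
        }
    }

Run it on a background thread, and remember the storage permission if you write to shared external storage rather than the app's own directory.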
