I need to stream RTSP links within a VideoView. RTSP links with a .mov output, such as rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov, work fine. However, one RTSP link I got from a confidential source has H.264 output, according to what VLC Player says.
TL;DR: How do you implement playback of these types of RTSP links, and if there's no clear way to code it, are there any external libraries for Android that handle this easily? I'm kind of at a loss here.
EDIT: Changed title. "Streaming" can be interpreted differently, as in sending RTSP video from your Android device. That's NOT what I want to accomplish. A lot of examples on GitHub are heavily focused on reading from SD card storage and sending outward, but I am still looking for a way to play RTSP videos with H.264 output in my application.
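For reference, the baseline that works for the .mov link is just the stock VideoView API. A minimal sketch (the URL is the public test stream from above; the error handling is illustrative):

    import android.app.Activity;
    import android.net.Uri;
    import android.os.Bundle;
    import android.widget.MediaController;
    import android.widget.VideoView;

    public class PlayerActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            VideoView videoView = new VideoView(this);
            setContentView(videoView);

            videoView.setMediaController(new MediaController(this));
            videoView.setVideoURI(
                    Uri.parse("rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov"));
            videoView.setOnPreparedListener(mp -> videoView.start());
            // Fires with framework error codes when the built-in codecs
            // can't handle the stream (the H.264-only link hits this path).
            videoView.setOnErrorListener((mp, what, extra) -> true);
        }
    }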
Try MediaPlayerSDK from VXG. It seems to be the only one that works, and it is open source.
https://github.com/VideoExpertsGroup/MediaPlayerSDK
However, as per some posts this can be done using libFFMPEG as well. I didn't try it, but you can give it a shot.
I have to get the video/audio tracks, or if possible a MediaStream object, from an Android WebView that plays an HLS stream (.m3u8).
I load the HLS stream in the WebView using loadUrl("...m3u8"). It runs without issues, but I can't figure out how to get the live video and audio tracks from the HLS stream. I have read a lot of articles and was not able to find any solution. So my question is: is it possible to get the audio and video tracks from an HLS stream running in a WebView? I need them because I have to send them via a PeerConnection (WebRTC), which accepts a MediaStream or audio tracks and video tracks. Any ideas and examples would be helpful. Thank you in advance.
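For context, the loading side is nothing special. A minimal sketch of the setup described above (the URL is a placeholder):

    import android.app.Activity;
    import android.os.Bundle;
    import android.webkit.WebView;

    public class HlsWebViewActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            WebView webView = new WebView(this);
            setContentView(webView);

            // JavaScript is only needed if the page wraps the stream in a
            // player script; some devices play .m3u8 natively in the WebView.
            webView.getSettings().setJavaScriptEnabled(true);
            webView.loadUrl("https://example.com/live/playlist.m3u8"); // placeholder
        }
    }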
HLS works by defining a playlist (your .m3u8 file) which points to chunks of the video (say segment00000.ts, segment00001.ts, ...). It can contain different streams for different resolutions, but the idea (very simplified) is that your player can download a chunk, play it right away, and download the next one in the meantime.
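To make the playlist structure concrete, here is a minimal sketch in plain Java (no libraries) that fetches an .m3u8 and resolves its segment URIs. It assumes a simple media playlist with no variant streams or encryption, and the URL is a placeholder:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;

    public class M3u8Lister {
        public static void main(String[] args) throws Exception {
            URL playlist = new URL("https://example.com/live/playlist.m3u8"); // placeholder
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(playlist.openStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    line = line.trim();
                    // Lines starting with '#' are tags (#EXTINF, #EXT-X-...);
                    // everything else is a segment URI, relative or absolute.
                    if (!line.isEmpty() && !line.startsWith("#")) {
                        System.out.println(new URL(playlist, line)); // resolve relative URIs
                    }
                }
            }
        }
    }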
On the wire, WebRTC sends media as RTP packets. I don't know exactly what the Android API exposes, but it feels like either you can pass it an m3u8 as a source (though that may be a bit weird for a generic API), or you can't, and you have to read the HLS stream yourself, extract the frames, and pass them to WebRTC.
For instance, gstreamer's hlsdemux seems to be able to read from an HLS source. From there you could extract the frames and feed them to your WebRTC track.
Probably that's a bit lower-level than you hoped, but if the Android API doesn't do it for you, you have to do it yourself (or find a library that does it).
I am currently developing a video streaming feature for one of my Android apps, using the Android media framework. Videos are streamed from an nginx server. Android-recorded videos work fine, but iOS-recorded videos play only the video, not the sound.
This happens because Android has limited built-in codec support (e.g. MP3, MP4, MPEG), while the iPhone supports most codecs.
What is the way to resolve this?
MP4 for video and MP3 for audio are widely accepted and work on both platforms.
So you need to do some work on the server: use the FFmpeg library to convert all videos to MP4 and all audio to MP3.
We use the same mechanism to resolve this issue.
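A minimal server-side sketch of that conversion step (it assumes the ffmpeg binary is installed on the server; file names and codec choices are illustrative):

    import java.io.File;
    import java.io.IOException;

    public class TranscodeJob {
        // Shell out to FFmpeg to normalize an upload to MP4 video (H.264)
        // with MP3 audio, per the approach described above.
        public static void transcode(File input, File output)
                throws IOException, InterruptedException {
            Process p = new ProcessBuilder(
                    "ffmpeg",
                    "-i", input.getAbsolutePath(),
                    "-c:v", "libx264",     // H.264 video, playable on both platforms
                    "-c:a", "libmp3lame",  // MP3 audio
                    "-y",                  // overwrite the output if it exists
                    output.getAbsolutePath())
                    .inheritIO()
                    .start();
            if (p.waitFor() != 0) {
                throw new IOException("ffmpeg failed for " + input);
            }
        }
    }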
Some more information to help understand the problem:
Refer to the Stack Overflow answer here.
Hope this helps you get rid of the problem.
Happy Coding!
In my Android application, I need to display H.264 streams from a GrandStream IP camera. I saw some topics about decoding H.264 frames with MediaCodec on Android, but I really don't know where to start.
Before researching this topic, I thought there would be plenty of open source libraries for that purpose, but it seems there are not!
Can you show me where to start? Should I use Android's MediaCodec, or is there an open source Java library for that?
You can refer to this site; it has a very thorough discussion and sample code for Android MediaCodec.
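As a starting point, here is a minimal sketch of the MediaCodec decode loop for H.264 (video/avc). It assumes something upstream (e.g. an RTSP/RTP depacketizer) hands you complete encoded frames; FrameSource and readAccessUnit() are hypothetical, and that depacketizing is the hard part this sketch leaves out:

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.nio.ByteBuffer;

    public class H264Decoder {
        public void decode(Surface surface, FrameSource source) throws Exception {
            // SPS/PPS must arrive in-band with the frames, or be set on the
            // format as csd-0/csd-1 before configure().
            MediaFormat format = MediaFormat.createVideoFormat(
                    MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720); // camera resolution
            MediaCodec codec = MediaCodec.createDecoderByType(
                    MediaFormat.MIMETYPE_VIDEO_AVC);
            codec.configure(format, surface, null, 0); // render straight to the Surface
            codec.start();

            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            while (true) {
                int inIndex = codec.dequeueInputBuffer(10_000);
                if (inIndex >= 0) {
                    ByteBuffer in = codec.getInputBuffer(inIndex);
                    byte[] frame = source.readAccessUnit(); // hypothetical: one H.264 frame
                    if (frame == null) break;               // end of stream
                    in.clear();
                    in.put(frame);
                    codec.queueInputBuffer(inIndex, 0, frame.length,
                            source.presentationTimeUs(), 0);
                }
                int outIndex = codec.dequeueOutputBuffer(info, 10_000);
                if (outIndex >= 0) {
                    codec.releaseOutputBuffer(outIndex, true); // true = render to Surface
                }
            }
            codec.stop();
            codec.release();
        }

        /** Hypothetical source of complete H.264 access units. */
        public interface FrameSource {
            byte[] readAccessUnit();
            long presentationTimeUs();
        }
    }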
I am developing an application that includes a video player.
I got a sample, and it was working fine for me.
But when I paste an http:// link instead of an rtsp:// link, it doesn't work.
My problem is: how do I convert the HTTP link to an RTSP link?
I am not sure, nor do I know whether this will work, because I did not have much success myself, but there are solutions that can convert streams. Go through the following links and see if they help:
http://www.longtailvideo.com/support/forums/jw-player/video-encoding/14320/how-to-convert-wmv-to-mp4-in-real-time
Can you recommend a solution to convert real time stream from pc camera to the format of rtp/rtsp?
RTSP to RTMP streaming
http://real7ime-converter.en.softonic.com/
Any software that can inter-convert various stream formats should be able to convert HTTP to RTSP as well.
See if that helps.
I'm trying to play a video file on a remote server. The video format is FLV and the server is Flash Media Server 3.5.
I'm going to connect to the server over RTMP and implement playback of the video file using the Android MediaPlayer.
Is this really possible? Any help is appreciated.
http://www.aftek.com/afteklab/aftek-RTMP-library.shtml
I found this one, but haven't had much luck with it: there are very few docs, and after jigging it to try to support video (no examples that I can see), I found that the core method RtmpStreamFactory.getRtmpStream() failed.
This one has also cropped up, but I haven't looked at it yet.
http://code.google.com/p/android-rtmp-client/
It looks like, for me, the answer will be getting the media server to deliver RTSP instead, since that is supported by Android. You may also find that later versions of Android (3.x and above) support RTMP.
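If the server can be switched to RTSP, playback is just the stock Android MediaPlayer API. A minimal sketch (the URL is a placeholder; the SurfaceHolder comes from a SurfaceView in your layout):

    import android.media.MediaPlayer;
    import android.view.SurfaceHolder;

    public class RtspPlayback {
        public static MediaPlayer play(SurfaceHolder holder) throws Exception {
            MediaPlayer player = new MediaPlayer();
            player.setDisplay(holder);                              // render into a SurfaceView
            player.setDataSource("rtsp://example.com/vod/sample");  // placeholder URL
            player.setOnPreparedListener(MediaPlayer::start);       // start once buffered
            player.prepareAsync();                                  // don't block the UI thread
            return player;
        }
    }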