I have successfully streamed RTSP live video from a Hikvision IP camera to my Android app. Now I want to record the streamed video in the mobile app itself. How would I do that? Can I have some guidance?
I solved the issue myself. I was able to stream the RTSP feed in my app using pedroSG94/vlc-example-streamplayer, but I was unable to record the stream. After a week of trial and error with different techniques, I finally found the actual compiled vlc-android SDK published by VideoLAN themselves. That library has both the streaming and the recording feature, which helped me accomplish my goal. Thank you, VLC.
VLC library for Android implementation, added as a Gradle dependency:

    implementation 'org.videolan.android:libvlc-all:3.3.14'

Use this library for streaming and recording RTSP video.
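For reference, here is a minimal sketch of how playback and recording can run at the same time with libVLC. The camera URL, the output path, and the :sout duplicate chain below are illustrative assumptions, not the only way to do it; the duplicate chain is standard VLC sout syntax that renders one branch on screen while muxing the other to a file.

    import android.net.Uri;
    import org.videolan.libvlc.LibVLC;
    import org.videolan.libvlc.Media;
    import org.videolan.libvlc.MediaPlayer;
    import java.util.ArrayList;

    // Sketch only: 'context' is your Activity, 'videoLayout' is an
    // org.videolan.libvlc.util.VLCVideoLayout from your layout XML.
    ArrayList<String> args = new ArrayList<>();
    args.add("--rtsp-tcp"); // many IP cameras behave better over TCP
    LibVLC libVlc = new LibVLC(context, args);
    MediaPlayer mediaPlayer = new MediaPlayer(libVlc);
    mediaPlayer.attachViews(videoLayout, null, false, false);

    Media media = new Media(libVlc, Uri.parse("rtsp://192.168.1.64:554/Streaming/Channels/101"));
    // Display one copy of the stream and mux the other into an MP4 file.
    media.addOption(":sout=#duplicate{dst=display,dst=standard{access=file,mux=mp4,dst=/sdcard/record.mp4}}");
    media.addOption(":network-caching=300");
    mediaPlayer.setMedia(media);
    media.release();
    mediaPlayer.play();

The sample camera URL follows the usual Hikvision channel layout, but yours may differ, and the output path needs to point somewhere your app is allowed to write.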
I'm working on an Android app which can stream video to Facebook via the compiled VLC library. After a recent change in Facebook policy (https://developers.facebook.com/blog/post/v2/2019/04/16/live-video-uploads-rtmps/) VLC stopped streaming video. There is a message in the log:
standard stream out: no suitable sout access module for
'rtmp/flv://rtmps://live-api-s.facebook.com:443/rtmp/xxxxxxxxx.....'
Can anyone help me understand what should be done to re-enable streaming? My guess was to compile VLC with the --enable-gnutls flag, but I'm not sure how to do this with the current VLC sources.
Direct use of a Network Stream is one option you can try. See the overview of the VideoLAN streaming solution in the VideoLAN documentation; it also relates to your doubt about streaming video to Facebook with an RTMP preset.
Which version of VLC on Android are you using?
Could you provide a longer version of the logs?
According to this issue: https://code.videolan.org/videolan/vlc-android/issues/158, setting the --enable-sout flag in compile-libvlc.sh should maybe get it working.
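As a rough sketch of that change, assuming you are building from a vlc-android checkout (the exact spot where the configure flags live in compile-libvlc.sh differs between versions, so treat this as a pointer rather than a recipe):

    # In compile-libvlc.sh, add --enable-sout to the VLC configure flags,
    # then rebuild libvlc for your target ABI, e.g.:
    ./compile.sh -l -a armeabi-v7a

The -l (build libvlc only) and -a (ABI) switches are the ones described in the vlc-android build instructions; check the README of your checkout for the exact invocation.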
I am currently developing a video streaming feature for one of my Android apps, using the Android media framework. Videos are streamed from an nginx server. Android-recorded videos work fine, but iOS-recorded videos play only the video, not the sound.
It happens because Android has built-in support for only a limited set of codecs, such as MP3, MP4, and MPEG, while the iPhone supports most codecs.
What is the way to resolve this?
MP4 for video and MP3 for audio are widely accepted and work on both platforms.
So you need to do some work on the server: implement the ffmpeg library to convert all the videos to MP4 and the audio to MP3. We use the same mechanism to resolve this issue.
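As a sketch of that server-side step (the file names are placeholders; note that inside an MP4 container AAC is the usual audio codec, so MP3 is best kept for standalone audio files):

    # Re-encode an uploaded video to an Android-friendly MP4 (H.264 + AAC).
    ffmpeg -i upload.mov -c:v libx264 -c:a aac -movflags +faststart out.mp4

    # Convert a standalone audio file to MP3.
    ffmpeg -i upload.m4a -c:a libmp3lame -q:a 2 out.mp3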
Some more information to understand the problem can be found in this Stack Overflow answer.
Hope this helps you get rid of your problem.
Happy coding!
I am trying to achieve HLS streaming on Android.
I have set up an HLS streaming server (apache2) on an Ubuntu desktop and am able to play the stream using the VLC player on the desktop.
But when I try to play the stream using the VLC player on Android, the video does not play, and I don't get any error either.
If anyone has tried similar streaming, please provide your inputs.
Thanks
Following some further investigation, I've found the following information that can hopefully help other people get HLS streaming on Android working.
Encoding - The video encoding and the segmentation setup can have a large impact on which Android versions the video plays on. I ended up creating a video using HandBrake, with the following settings:
MP4 file
H.264; Baseline Profile; Level 3
AAC audio; 44.1 kHz; 128 kbit/s (Note: I found that Jelly Bean was a lot more picky about the audio than ICS/Honeycomb. Some audio bitrates would produce videos that Jelly Bean would not play at all. In general, mono and low-bitrate audio seemed to work better on Jelly Bean.)
Segmentation - Using the Apple MediaFileSegmenter, I found that adding the "-no-floating-point-duration" and "-z none" flags allowed me to create a video that worked across Android 3.0-4.2 (see the example invocation after this list).
Gingerbread - I was unable to get Android 2.3 to work with HLS out of the box, but I did find that using the Vitamio library worked pretty well (see this question for further info)
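As an example of the segmenter invocation mentioned above, a sketch (the target duration, output directory, and input name are placeholders; -t and -f are mediafilesegmenter's target-duration and output-path options):

    mediafilesegmenter -no-floating-point-duration -z none \
        -t 10 -f /var/www/hls/ video.mp4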
I want to play back an RTP video session that is being sent to my tablet through a socket port, but the media player in Android 3.0 only supports RTSP or file sources. What is the best way to implement this?
Unfortunately this is not possible with the Android SDK. There is even an open issue asking for this: http://code.google.com/p/android/issues/detail?id=8959&q=rtp&colspec=ID%20Type%20Status%20Owner%20Summary%20Stars
Your best option would be to explore using ffmpeg under Android.
The easy way: download VideoLAN (VLC) and check this guide on how to stream with VideoLAN; a sketch follows below.
The "complicated" way is creating a server with GStreamer's RTSP server, Darwin Streaming Server, Real, etc.
I'm trying to play a video file on a remote server. The video format is FLV and the server is Flash Media Server 3.5.
I'm going to connect to the server over RTMP and implement the playback of the video file using the Android MediaPlayer.
Really, is it possible? Any help is appreciated.
http://www.aftek.com/afteklab/aftek-RTMP-library.shtml
I found this one, but haven't had much luck; there are very few docs, and after jigging it to try and support video (no examples as far as I can see) I found that the core method RtmpStreamFactory.getRtmpStream() failed.
This one has also cropped up, but I haven't looked at it yet.
http://code.google.com/p/android-rtmp-client/
It looks like for me it will come down to getting the media server to deliver RTSP instead, which is supported by Android. You may also find that later versions of Android (i.e. 3.x and above) support RTMP.
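If you do go the RTSP route, playback with the stock Android MediaPlayer is straightforward. A minimal sketch, assuming a placeholder rtsp:// URL for your server (for video you also need to hand the player a surface via setDisplay()):

    import android.media.MediaPlayer;
    import java.io.IOException;

    // Sketch only: play an RTSP stream with the stock MediaPlayer.
    MediaPlayer player = new MediaPlayer();
    try {
        player.setDataSource("rtsp://your-server:554/vod/sample");
    } catch (IOException e) {
        // handle an unreachable server or a bad URL
    }
    player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override public void onPrepared(MediaPlayer mp) {
            mp.start(); // begin playback once the stream is prepared
        }
    });
    player.prepareAsync(); // connect without blocking the UI thread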