vlc-android-sdk - cannot view RTSP live video - android

I've been working on an Android application that shows live streaming video via RTSP.
Assume I have a well-functioning RTSP server that sends H.264 packets, and that to view the stream we connect to rtsp://1.2.3.4:5555/stream.
So I tried the native MediaPlayer/VideoView, but no luck (the video got stuck after 2-3 seconds of playback), so I loaded mrmaffen's vlc-android-sdk (can be found here) and used the following code:
ArrayList<String> options = new ArrayList<String>();
options.add("--no-drop-late-frames");
options.add("--no-skip-frames");
options.add("-vvv");
videoVlc = new LibVLC(options);
newVideoMediaPlayer = new org.videolan.libvlc.MediaPlayer(videoVlc);
final IVLCVout vOut = newVideoMediaPlayer.getVLCVout();
vOut.addCallback(this);
vOut.setVideoView(videoView); //videoView is a pre-defined view which is part of the layout
vOut.attachViews();
newVideoMediaPlayer.setEventListener(this);
Media videoMedia = new Media(videoVlc, Uri.parse(mVideoPath));
newVideoMediaPlayer.setMedia(videoMedia);
newVideoMediaPlayer.play();
The problem is that I see a blank screen.
Keep in mind that when I use an RTSP link with an audio stream only, it works fine.
Is anyone familiar with this SDK who has an idea about this issue?
Thanks in advance.

Try adding this option:
--rtsp-tcp
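For example, a minimal sketch of where the option would go in the question's setup code (the per-media form via Media.addOption should work as well):
ArrayList<String> options = new ArrayList<String>();
options.add("--rtsp-tcp"); // force RTP over RTSP/TCP instead of UDP
videoVlc = new LibVLC(options);
// or, per media:
videoMedia.addOption(":rtsp-tcp");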

I play RTSP streams with the following code:
try {
Uri rtspUri=Uri.parse("rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov");
final MediaWrapper mw = new MediaWrapper(rtspUri);
mw.removeFlags(MediaWrapper.MEDIA_FORCE_AUDIO);
mw.addFlags(MediaWrapper.MEDIA_VIDEO);
MediaWrapperListPlayer.getInstance().getMediaList().add(mw);
VLCInstance.getMainMediaPlayer().setEventListener(this);
VLCInstance.get().setOnHardwareAccelerationError(this);
final IVLCVout vlcVout = VLCInstance.getMainMediaPlayer().getVLCVout();
vlcVout.addCallback(this);
vlcVout.setVideoView(mSurfaceView);
vlcVout.attachViews();
final SharedPreferences pref = PreferenceManager.getDefaultSharedPreferences(this);
final String aout = VLCOptions.getAout(pref);
VLCInstance.getMainMediaPlayer().setAudioOutput(aout);
MediaWrapperListPlayer.getInstance().playIndex(this, 0);
} catch (Exception e) {
Log.e(TAG, e.toString());
}
When you receive the playing event, you need to enable the video track:
private void onPlaying() {
stopLoadingAnimation();
VLCInstance.getMainMediaPlayer().setVideoTrackEnabled(true);
}
This may be helpful for you.

Related

How to record WebRTC local video stream to file on Android?

I am using WebRTC and I want to record the local video stream to a file.
I'd appreciate a hint on how to approach this.
Thank you for reading.
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
//sigConnect("http://10.54.36.19:8000/");
sigConnect("http://unwebrtc.herokuapp.com/");
initWebRTC();
Log.i(TAG, "VideoCapturerAndroid.getDeviceCount() = " + VideoCapturerAndroid.getDeviceCount());
String nameOfFrontFacingDevice = VideoCapturerAndroid.getNameOfFrontFacingDevice();
String nameOfBackFacingDevice = VideoCapturerAndroid.getNameOfBackFacingDevice();
Log.i(TAG, "VideoCapturerAndroid.getNameOfFrontFacingDevice() = " + nameOfFrontFacingDevice);
Log.i(TAG, "VideoCapturerAndroid.getNameOfBackFacingDevice() = " + nameOfBackFacingDevice);
VideoCapturerAndroid capturer = VideoCapturerAndroid.create(nameOfFrontFacingDevice);
MediaConstraints videoConstraints = new MediaConstraints();
VideoSource videoSource = peerConnectionFactory.createVideoSource(capturer, videoConstraints);
localVideoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, videoSource);
glview = (GLSurfaceView) findViewById(R.id.glview);
VideoRendererGui.setView(glview, null);
try {
rendereRemote = VideoRendererGui.createGui(0, 0, 100, 100, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
rendereLocal = VideoRendererGui.createGui(72, 72, 25, 25, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
localVideoTrack.addRenderer(rendereLocal);
} catch (Exception e) {
e.printStackTrace();
}
mediaStream = peerConnectionFactory.createLocalMediaStream(LOCAL_MEDIA_STREAM_ID);
mediaStream.addTrack(localVideoTrack);
}
The libjingle library uses GLSurfaceView for rendering video. You may try using the FFmpeg library to save video frames from that view. Not sure about audio, though.
You have to create a video container, like MP4, and manually encode and write every raw frame. The latest WebRTC version also provides access to recording audio from the microphone; you should encode and mux those audio samples as well.
For access to the raw video frames of both remote and local peers, see the VideoSink interface (and the ProxyVideoSink class in apprtc/CallActivity.java).
For getting audio samples, see the RecordedAudioToFileController class.
For creating/muxing video files you can use the MediaMuxer class from the Android SDK.
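As a rough sketch of the VideoSink part (assumptions: a newer libwebrtc than the VideoRendererGui-era code in the question; the RecordingSink name is hypothetical, and the actual MediaCodec/MediaMuxer encoding is omitted):
// Hypothetical sketch: wrap the on-screen renderer so every frame can
// also be handed to your own encoder (MediaCodec -> MediaMuxer).
class RecordingSink implements org.webrtc.VideoSink {
    private final org.webrtc.VideoSink target; // e.g. the on-screen renderer

    RecordingSink(org.webrtc.VideoSink target) {
        this.target = target;
    }

    @Override
    public void onFrame(org.webrtc.VideoFrame frame) {
        // frame.getBuffer() exposes the raw (usually I420) pixel data;
        // convert it and feed a MediaCodec encoder here, then mux the
        // encoded samples with MediaMuxer.
        target.onFrame(frame); // keep rendering as usual
    }
}
// usage: localVideoTrack.addSink(new RecordingSink(localRenderer));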

Set min-buffer threshold for http/network playback

In a video player project, I'd like to use LibVLC to stream over HTTP from a slow source.
However, I cannot get it to stream first and then continuously download data; the player always stalls in between.
I'm using vlc-android from Git.
This is the media player setup code:
ArrayList<String> options = new ArrayList<>();
options.add("--no-sub-autodetect-file");
options.add("--swscale-mode=0");
options.add("--network-caching=60000");
if (BuildConfig.DEBUG) {
options.add("-vvv"); // verbosity
}
libVLC = new LibVLC(options);
mediaPlayer = new org.videolan.libvlc.MediaPlayer(libVLC);
mediaPlayer.setEventListener(this);
final IVLCVout vout = mediaPlayer.getVLCVout();
vout.setVideoView(videoView);
vout.setSubtitlesView(subtitleView);
vout.addCallback(this);
vout.attachViews();
final Media media = new Media(libVLC, getIntent().getData());
media.setHWDecoderEnabled(true, false);
media.addOption(":network-caching=60000");
media.addOption(":clock-jitter=0");
media.addOption(":clock-synchro=0");
mediaPlayer.setMedia(media);
mediaPlayer.play();
I was hoping that setting :network-caching on the media object would be enough, but it still seems to run out of data the whole time.
How can I configure LibVLC so that the stutter is eliminated? Some buffer time is OK.
The stream is a MOV file served via HTTP.
Try these LibVLC options:
ArrayList<String> options = new ArrayList<String>();
options.add("--audio-time-stretch"); // time stretching
options.add("-vvv"); // verbosity
options.add("--no-audio"); // no audio
options.add("--aout=none");
options.add("--no-sub-autodetect-file");
options.add("--swscale-mode=0");
options.add("--network-caching=400");
options.add("--no-drop-late-frames");
options.add("--no-skip-frames");
options.add("--avcodec-skip-frame");
options.add("--avcodec-hw=any");
And try these options on the Media object:
Media m = new Media(libvlc, Uri.parse(URL));
m.setHWDecoderEnabled(true, true);
m.addOption(":network-caching=5000");
m.addOption(":clock-jitter=0");
m.addOption(":clock-synchro=0");
m.addOption(":codec=all");
mMediaPlayer.setMedia(m);
mMediaPlayer.play();

How to use ExoPlayer

I want to use ExoPlayer in my app. Could you please point me to the simplest example? I tried to follow https://github.com/google/ExoPlayer/ but it's not easy for me. I tried to import the library as a module, but then I received a bintray-release error.
As stated in the main Readme.md, you can import ExoPlayer as you would any other dependency.
In your app's build.gradle, under dependencies, add:
compile 'com.google.android.exoplayer:exoplayer:rX.X.X'
The current version is r1.5.1 as of October 27, 2015; see here.
Old question, but since there are too few simple ExoPlayer tutorials out there, I wrote this up. I recently converted an app of mine from Android's default media player to ExoPlayer. The performance gains are amazing, and it works on a wider range of devices. It is a bit more complicated, however.
This example is tailored specifically to playing an HTTP audio stream, but by experimenting you can probably adapt it to anything else. This example uses the latest 1.x release of ExoPlayer, currently r1.5.11:
First, put this in your build.gradle (Module: app) file, under "dependencies":
compile 'com.google.android.exoplayer:exoplayer:r1.5.11'
Also, your class should implement ExoPlayer.Listener:
...implements ExoPlayer.Listener
Now here's the relevant code to play an HTTP audio stream:
private static final int RENDERER_COUNT = 1; //since we want to render simple audio
private static final int BUFFER_SEGMENT_SIZE = 64 * 1024; // for http mp3 audio stream use these values
private static final int BUFFER_SEGMENT_COUNT = 256; // for http mp3 audio stream use these values
private ExoPlayer exoPlayer;
// for http mp3 audio stream, use these values
int minBufferMs = 1000;
int minRebufferMs = 5000;
// Prepare ExoPlayer
exoPlayer = ExoPlayer.Factory.newInstance(RENDERER_COUNT, minBufferMs, minRebufferMs);
// String with the url of the stream to play
String stream_location = "http://audio_stream_url";
// Convert String URL to Uri
Uri streamUri = Uri.parse(stream_location);
// Settings for ExoPlayer
Allocator allocator = new DefaultAllocator(BUFFER_SEGMENT_SIZE);
String userAgent = Util.getUserAgent(ChicagoPoliceRadioService.this, "ExoPlayer_Test");
DataSource dataSource = new DefaultUriDataSource(ChicagoPoliceRadioService.this, null, userAgent);
ExtractorSampleSource sampleSource = new ExtractorSampleSource(
streamUri, dataSource, allocator, BUFFER_SEGMENT_SIZE * BUFFER_SEGMENT_COUNT);
MediaCodecAudioTrackRenderer audioRenderer = new MediaCodecAudioTrackRenderer(sampleSource, MediaCodecSelector.DEFAULT);
// Attach listener we implemented in this class to this ExoPlayer instance
exoPlayer.addListener(this);
// Prepare ExoPlayer
exoPlayer.prepare(audioRenderer);
// Set full volume
exoPlayer.sendMessage(audioRenderer, MediaCodecAudioTrackRenderer.MSG_SET_VOLUME, 1f);
// Play!
exoPlayer.setPlayWhenReady(true);
There are three callback methods:
@Override
public void onPlayWhenReadyCommitted() {
// No idea what would go here, I left it empty
}
// Called when ExoPlayer state changes
@Override
public void onPlayerStateChanged(boolean playWhenReady, int playbackState) {
// If playbackState equals STATE_READY (4), that means ExoPlayer is set to
// play and there are no errors
if (playbackState == ExoPlayer.STATE_READY) {
// ExoPlayer prepared and ready, no error
// Put code here, same as "onPrepared()"
}
}
// Called on ExoPlayer error
@Override
public void onPlayerError(ExoPlaybackException error) {
// ExoPlayer error occurred
// Put your error code here
}
And when you're done playing do the usual:
if (exoPlayer != null) {
exoPlayer.stop();
exoPlayer.release();
}
NOTE: I'm still not 100% sure about the details of all of the ExoPlayer settings, and I've never tried playing video. Note that this is for version 1.5.x of ExoPlayer; 2.0 changed a lot, and I still haven't figured it out. I do highly recommend this code to anyone who has an app that streams audio from the web, as the performance gains are incredible, and for my app it fixed an issue with Samsung phones that would only play about 30 seconds of audio before stopping.
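For anyone on ExoPlayer 2.x, a rough equivalent of the setup above might look like the sketch below. Treat it as an assumption-laden sketch rather than a definitive recipe: the APIs shown (ExoPlayerFactory, ExtractorMediaSource) come from the pre-2.12 releases of 2.x and were later deprecated.
// build.gradle: compile 'com.google.android.exoplayer:exoplayer:2.X.X'
SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(context, new DefaultTrackSelector());
String userAgent = Util.getUserAgent(context, "ExoPlayer_Test");
DataSource.Factory dataSourceFactory = new DefaultHttpDataSourceFactory(userAgent);
MediaSource mediaSource = new ExtractorMediaSource.Factory(dataSourceFactory)
        .createMediaSource(Uri.parse("http://audio_stream_url"));
player.prepare(mediaSource);   // buffering starts here
player.setPlayWhenReady(true); // start playback as soon as ready
// ... and when you're done:
player.release();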

How to play video in VideoView from a URL based on buffer percentage in Android?

I have a link to a video on an S3 server and I am playing this video in a VideoView. The video plays properly, but the problem is that it first downloads the entire video and then plays it.
I want it to buffer instead: if 20% of the video has downloaded, it should start playing that part while the rest keeps downloading (like on YouTube). Here is my code; what I have done is:
FFmpegMediaMetadataRetriever mediaMetadataRetriever = new FFmpegMediaMetadataRetriever();
AWSCredentials myCredentials = new BasicAWSCredentials(
"AKIAIGOIY4LLB7EMACGQ",
"7wNQeY1JC0uyMaGYhKBKc9V7QC7X4ecBtyLimt2l");
AmazonS3 s3client = new AmazonS3Client(myCredentials);
GeneratePresignedUrlRequest request = new GeneratePresignedUrlRequest(
"mgvtest", videoUrl);
URL objectURL = s3client.generatePresignedUrl(request);
try {
mediaMetadataRetriever.setDataSource(videoUrl);
} catch (Exception e) {
utilDialog.showDialog("Unable to load this video",
utilDialog.ALERT_DIALOG);
pb.setVisibility(View.INVISIBLE);
}
videoView.setVideoURI(Uri.parse(videoUrl));
MediaController myMediaController = new MediaController(this);
// myMediaController.setMediaPlayer(videoView);
videoView.setMediaController(myMediaController);
videoView.setOnCompletionListener(myVideoViewCompletionListener);
videoView.setOnPreparedListener(MyVideoViewPreparedListener);
videoView.setOnErrorListener(myVideoViewErrorListener);
videoView.requestFocus();
videoView.start();
Listeners
MediaPlayer.OnCompletionListener myVideoViewCompletionListener = new MediaPlayer.OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer arg0) {
// Toast.makeText(PlayRecordedVideoActivity.this, "End of Video",
// Toast.LENGTH_LONG).show();
}
};
MediaPlayer.OnPreparedListener MyVideoViewPreparedListener = new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mp) {
pb.setVisibility(View.INVISIBLE);
imgScreenshot.setVisibility(View.VISIBLE);
tvScreenshot.setVisibility(View.VISIBLE);
// final Animation in = new AlphaAnimation(0.0f, 1.0f);
// in.setDuration(3000);
// tvScreenshot.startAnimation(in);
Animation animation = AnimationUtils.loadAnimation(
getApplicationContext(), R.anim.zoom_in);
tvScreenshot.startAnimation(animation);
new Handler().postDelayed(new Runnable() {
@Override
public void run() {
tvScreenshot.setVisibility(View.INVISIBLE);
}
}, 3000);
}
};
MediaPlayer.OnErrorListener myVideoViewErrorListener = new MediaPlayer.OnErrorListener() {
@Override
public boolean onError(MediaPlayer mp, int what, int extra) {
// Toast.makeText(PlayRecordedVideoActivity.this, "Error!!!",
// Toast.LENGTH_LONG).show();
return true;
}
};
To be able to start playing an MP4 video before it has fully downloaded, the video has to have its metadata at the start of the file rather than the end; unfortunately, with standard MP4 the default is usually to have it at the end.
The metadata lives in an 'atom' or 'box' (basically a data structure within the MP4 file) and can be moved to the start. This is usually referred to as faststart, and tools such as ffmpeg will allow you to do this. The following is an extract from the ffmpeg documentation:
The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4 file has all the metadata about all packets stored in one location (written at the end of the file, it can be moved to the start for better playback by adding faststart to the movflags, or using the qt-faststart tool).
There are other tools and software which will allow you to do this as well, e.g. the one mentioned in the ffmpeg extract above:
http://multimedia.cx/eggs/improving-qt-faststart/
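For example, with ffmpeg itself the remux is a single pass that just copies the streams and relocates the moov atom to the front:
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4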
If you actually want full streaming, where the server breaks the file into chunks that the client downloads one by one, then you probably want one of the adaptive bit rate protocols (Apple's HLS, Microsoft's SmoothStreaming, Adobe's Adaptive Streaming, or the new open standard DASH). These also let you offer different bit rates for different network conditions, but you will need a server that supports the chosen protocol. This may be overkill if you just want a simple site with a single video and will not have much traffic.
Actually, you have to set up CloudFront in front of S3 so you can stream S3 videos.
Check out this link for more information:
http://www.miracletutorials.com/s3-streaming-video-with-cloudfront/

Android screen sharing using WebRTC

I have heard about screen sharing on desktop using WebRTC, but for Android there doesn't seem to be much information.
My questions are:
Is it possible to use WebRTC for screen sharing on Android? I mean casting the current phone's screen to another phone's screen.
If 1 is yes, how can I achieve this?
Thanks.
It is possible!
It can be done using the directions below.
I've used ScreenShareRTC in conjunction with ProjectRTC to stream the contents of the screen to a browser with decent quality and fairly low latency (~100 ms).
I've added an example below that shows how to configure a screen share as a video source and add it as a track on a stream.
Get the VideoCapturer
@TargetApi(21)
private VideoCapturer createScreenCapturer() {
if (mMediaProjectionPermissionResultCode != Activity.RESULT_OK) {
report("User didn't give permission to capture the screen.");
return null;
}
return new ScreenCapturerAndroid(
mMediaProjectionPermissionResultData, new MediaProjection.Callback() {
@Override
public void onStop() {
report("User revoked permission to capture the screen.");
}
});
}
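The mMediaProjectionPermissionResultCode/mMediaProjectionPermissionResultData fields used above come from the standard MediaProjection permission flow. A minimal sketch of obtaining them (the request-code constant is arbitrary):
private static final int CAPTURE_PERMISSION_REQUEST_CODE = 1;

private void requestScreenCapturePermission() {
    // Ask the user for permission to capture the screen
    MediaProjectionManager manager =
            (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
    startActivityForResult(manager.createScreenCaptureIntent(), CAPTURE_PERMISSION_REQUEST_CODE);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == CAPTURE_PERMISSION_REQUEST_CODE) {
        mMediaProjectionPermissionResultCode = resultCode;
        mMediaProjectionPermissionResultData = data;
        // now it is safe to call createScreenCapturer() and initScreenCaptureStream()
    }
}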
Initialize the capturer and add the tracks to the local media stream
private void initScreenCaptureStream() {
mLocalMediaStream = factory.createLocalMediaStream("ARDAMS");
MediaConstraints videoConstraints = new MediaConstraints();
videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxHeight", Integer.toString(mPeerConnParams.videoHeight)));
videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxWidth", Integer.toString(mPeerConnParams.videoWidth)));
videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxFrameRate", Integer.toString(mPeerConnParams.videoFps)));
videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("minFrameRate", Integer.toString(mPeerConnParams.videoFps)));
mVideoSource = factory.createVideoSource(videoCapturer);
videoCapturer.startCapture(mPeerConnParams.videoWidth, mPeerConnParams.videoHeight, mPeerConnParams.videoFps);
VideoTrack localVideoTrack = factory.createVideoTrack(VIDEO_TRACK_ID, mVideoSource);
localVideoTrack.setEnabled(true);
mLocalMediaStream.addTrack(localVideoTrack); // add the track created above rather than creating a second one
AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
mLocalMediaStream.addTrack(factory.createAudioTrack("ARDAMSa0", audioSource));
mListener.onStatusChanged("STREAMING");
}
For more information, this might be a good place to start. It's an Android project that connects to a ProjectRTC signalling server and shares the screen as video. I found it very helpful!
Android screen sharing project(Android client - Java)
https://github.com/Jeffiano/ScreenShareRTC
ProjectRTC(Node server)
https://github.com/pchab/ProjectRTC
