In a video player project, I'd like to use LibVLC to play an HTTP stream from a slow source.
However, I cannot get it to buffer first and then keep downloading continuously while playing; the player always stalls in between.
I'm using vlc-android from Git.
This is the media player setup code:
ArrayList<String> options = new ArrayList<>();
options.add("--no-sub-autodetect-file");
options.add("--swscale-mode=0");
options.add("--network-caching=60000");
if (BuildConfig.DEBUG) {
options.add("-vvv"); // verbosity
}
libVLC = new LibVLC(options);
mediaPlayer = new org.videolan.libvlc.MediaPlayer(libVLC);
mediaPlayer.setEventListener(this);
final IVLCVout vout = mediaPlayer.getVLCVout();
vout.setVideoView(videoView);
vout.setSubtitlesView(subtitleView);
vout.addCallback(this);
vout.attachViews();
final Media media = new Media(libVLC, getIntent().getData());
media.setHWDecoderEnabled(true, false);
media.addOption(":network-caching=60000");
media.addOption(":clock-jitter=0");
media.addOption(":clock-synchro=0");
mediaPlayer.setMedia(media);
mediaPlayer.play();
I was hoping that setting :network-caching on the media object would be enough, but the player still seems to run out of data constantly.
How do I configure LibVLC so that the stuttering is eliminated? Some up-front buffering time is fine.
The stream type is a MOV file served via HTTP.
Try these LibVLC options:
ArrayList<String> options = new ArrayList<String>();
options.add("--audio-time-stretch"); // time stretching
options.add("-vvv"); // verbosity
options.add("--no-audio"); // no audio
options.add("--aout=none");
options.add("--no-sub-autodetect-file");
options.add("--swscale-mode=0");
options.add("--network-caching=400");
options.add("--no-drop-late-frames");
options.add("--no-skip-frames");
options.add("--avcodec-skip-frame");
options.add("--avcodec-hw=any");
And try these options on the Media object:
Media m = new Media(libvlc, Uri.parse(URL));
m.setHWDecoderEnabled(true, true);
m.addOption(":network-caching=5000");
m.addOption(":clock-jitter=0");
m.addOption(":clock-synchro=0");
m.addOption(":codec=all");
mMediaPlayer.setMedia(m);
mMediaPlayer.play();
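If the source delivers data more slowly than the stream's bitrate, no cache size alone can prevent underruns; the cache will eventually drain. One workaround, shown as a minimal sketch below (not from the original answer; the gating logic is an assumption), is to use libvlc-android's Buffering event, which reports the cache fill level in percent, and hold playback until the cache is full:
@Override
public void onEvent(MediaPlayer.Event event) {
    if (event.type == MediaPlayer.Event.Buffering) {
        float cachePercent = event.getBuffering(); // cache fill level, 0..100
        if (cachePercent < 100f && mediaPlayer.isPlaying()) {
            mediaPlayer.pause(); // hold playback while the network cache refills
        } else if (cachePercent >= 100f && !mediaPlayer.isPlaying()) {
            mediaPlayer.play(); // resume once the cache is full again
        }
    }
}
Combined with a large :network-caching value this trades longer pauses for fewer of them.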
Related
I am using setMaxVideoBitrate, provided by DefaultTrackSelector, to set the maximum bitrate when the user changes video quality.
val parameters = defaultTrackSelector.buildUponParameters()
.setMaxVideoBitrate(bitrate)
.build()
defaultTrackSelector.parameters = parameters
But as soon as this is called, the current buffer is discarded and a re-buffering indicator appears right away. Is there any way to keep playing from the old buffer and only load the new buffer with the new bitrate settings, the way YouTube does?
This issue has been discussed here:
https://github.com/google/ExoPlayer/issues/3522
https://github.com/google/ExoPlayer/issues/2250
But there doesn't seem to be any solution yet. Any help regarding this issue would be appreciated. Thanks in advance.
You can do this fairly easily: you are already using ExoPlayer, and ExoPlayer provides a seekTo() method.
You pass it the position the player had reached before you switched quality.
Step 1
When the user changes quality (for example from 144p to 720p), first store the player's current position:
private long currentPosition = player.getCurrentPosition(); // position in milliseconds (a long, not an int)
Step 2
Then rebuild your ExoPlayer media source:
// Measures bandwidth during playback. Can be null if not required.
DefaultBandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
// Produces DataSource instances through which media data is loaded.
DataSource.Factory dataSourceFactory = new DefaultDataSourceFactory(this, Util.getUserAgent(this, getString(R.string.app_name)), bandwidthMeter);
// This is the MediaSource representing the media to be played.
MediaSource videoSource = new ExtractorMediaSource.Factory(dataSourceFactory).createMediaSource(Uri.parse(videoUrl)); // videoUrl: the URL of your video
// Prepare the player with the source.
player.prepare(videoSource);
Step 3
Check this condition:
if (this.currentPosition > 0) {
    player.seekTo(this.currentPosition);
    this.currentPosition = 0;
}
player.setPlayWhenReady(true);
This works well; the video resumes from where the user left off.
Step 4
If you want to choose a quality based on connection strength, use this method:
public int getWifiLevel()
{
    WifiManager wifiManager = (WifiManager) context.getSystemService(Context.WIFI_SERVICE);
    int rssi = wifiManager.getConnectionInfo().getRssi(); // signal strength in dBm (the original misleadingly named this linkSpeed)
    return WifiManager.calculateSignalLevel(rssi, 5);     // map the RSSI to a 0..4 level
}
Based on the Wi-Fi level (or link speed) you can decide whether the connection is slow or fast and cap the video bitrate accordingly, as in the sketch below.
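A minimal sketch of that mapping, tying getWifiLevel() back to the question's DefaultTrackSelector; the bitrate table is an illustrative assumption, not a recommended mapping:
private static final int[] MAX_BITRATES = { 200_000, 400_000, 800_000, 1_500_000, Integer.MAX_VALUE };

private void applyBitrateForWifiLevel(DefaultTrackSelector selector) {
    int level = getWifiLevel(); // 0..4 from calculateSignalLevel(..., 5)
    // Cap the video bitrate to match the measured signal level
    selector.setParameters(selector.buildUponParameters()
            .setMaxVideoBitrate(MAX_BITRATES[level])
            .build());
}
Note that calling this still triggers the re-buffering the question complains about; it only automates the quality choice.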
Currently, I have a server that streams four RTMP MediaSources, one with 720p video source, one with 360p video source, one with 180p video source, and one audio-only source. If I wanted to switch resolutions, I have to stop the ExoPlayer instance, prepare the other track I wanted to switch to, then play.
The code I use to prepare the ExoPlayer instance:
TrackSelection.Factory adaptiveTrackSelectionFactory = new AdaptiveTrackSelection.Factory(bandwidthMeter);
TrackSelector trackSelector = new DefaultTrackSelector(adaptiveTrackSelectionFactory);
RtmpDataSourceFactory rtmpDataSourceFactory = new RtmpDataSourceFactory(bandwidthMeter);
ExtractorsFactory extractorsFactory = new DefaultExtractorsFactory();
factory = new AVControlExtractorMediaSource.Factory(rtmpDataSourceFactory);
factory.setExtractorsFactory(extractorsFactory);
createSource();
//noinspection deprecation
mPlayer = ExoPlayerFactory.newSimpleInstance(mActivity, trackSelector, new DefaultLoadControl(
new DefaultAllocator(true, C.DEFAULT_BUFFER_SEGMENT_SIZE),
1000, // min buffer
2000, // max buffer
1000, // playback
1000, //playback after rebuffer
DefaultLoadControl.DEFAULT_TARGET_BUFFER_BYTES,
true
));
vwExoPlayer.setPlayer(mPlayer);
mPlayer.addAnalyticsListener(mAnalyticsListener);
With createSource() being:
private void createSource() {
factory.setTrackPlaybackFlag(AVControlExtractorMediaSource.PLAYBACK_BOTH_AV);
mMediaSource180 = factory.createMediaSource(Uri.parse(API.GAME_VIDEO_STREAM_URL_180()));
mMediaSource180.addEventListener(getHandler(), new MSourceDebuggerListener("GameMediaSource180"));
mMediaSource360 = factory.createMediaSource(Uri.parse(API.GAME_VIDEO_STREAM_URL_360()));
mMediaSource360.addEventListener(getHandler(), new MSourceDebuggerListener("GameMediaSource360"));
mMediaSource720 = factory.createMediaSource(Uri.parse(API.GAME_VIDEO_STREAM_URL_720()));
mMediaSource720.addEventListener(getHandler(), new MSourceDebuggerListener("GameMediaSource720"));
factory.setTrackPlaybackFlag(AVControlExtractorMediaSource.PLAYBACK_AUDIO_ONLY);
mMediaSourceAudio = factory.createMediaSource(Uri.parse(API.GAME_AUDIO_STREAM_URL()));
mMediaSourceAudio.addEventListener(getHandler(), new MSourceDebuggerListener("GameMediaSourceAudio"));
}
private void releaseSource() {
mMediaSource180.releaseSource(null);
mMediaSource360.releaseSource(null);
mMediaSource720.releaseSource(null);
mMediaSourceAudio.releaseSource(null);
}
And the code I currently use to switch between these MediaSources is:
private void changeTrack(MediaSource source) {
if (currentMediaSource == source) return;
try {
this.currentMediaSource = source;
mPlayer.stop(true);
mPlayer.prepare(source, true, true);
mPlayer.setPlayWhenReady(true);
if (source == mMediaSourceAudio) {
if (!audioOnly) {
try {
TransitionManager.beginDelayedTransition(rootView);
} catch (Exception ignored) {
}
layAudioOnly.setVisibility(View.VISIBLE);
vwExoPlayer.setVisibility(View.INVISIBLE);
audioOnly = true;
try {
GameQnAFragment fragment = findFragment(GameQnAFragment.class);
if (fragment != null) {
fragment.signAudioOnly();
}
} catch (Exception e) {
Trace.e(e);
}
try {
GamePollingFragment fragment = findFragment(GamePollingFragment.class);
if (fragment != null) {
fragment.signAudioOnly();
}
} catch (Exception e) {
Trace.e(e);
}
}
} else {
if (audioOnly) {
TransitionManager.beginDelayedTransition(rootView);
layAudioOnly.setVisibility(View.GONE);
vwExoPlayer.setVisibility(View.VISIBLE);
audioOnly = false;
}
}
} catch (Exception ignore) {
}
}
I wanted to implement seamless switching between these MediaSources so that I don't need to stop and re-prepare, but that doesn't appear to be supported by ExoPlayer.
In addition, logging each MediaSource structure with the following code:
MappingTrackSelector.MappedTrackInfo info = ((DefaultTrackSelector)trackSelector).getCurrentMappedTrackInfo();
if(info != null) {
for (int i = 0; i < info.getRendererCount(); i++) {
TrackGroupArray trackGroups = info.getTrackGroups(i);
if (trackGroups.length != 0) {
for(int j = 0; j < trackGroups.length; j++) {
TrackGroup tg = trackGroups.get(j);
for(int k = 0; k < tg.length; k++) {
Log.i("track_info_"+i+"-"+j+"-"+k, tg.getFormat(k)+"");
}
}
}
}
}
just nets me one video format and one audio format each.
My current workaround is to prepare another ExoPlayer instance in the background, swap it in for the running instance once preparation completes, and release the old one. That reduces the lag between the MediaSources somewhat, but doesn't come close to the seamless resolution changes of YouTube.
Should I implement my own TrackSelector and jam-pack all the 4 sources into that, should I implement another MediaSource that handles all 4 sources, or should I just tell the colleague who maintains the streams to switch to just one RTMP MediaSource with a sort of manifest that lists all the resolutions available for the AdaptiveTrackSelection to switch between them?
Adaptive bitrate streaming is designed to allow easy switching between different bitrate streams, but it requires the streams to be segmented, with the player downloading the video segment by segment.
This way the player can decide which bitrate to choose for the next segment based on current network conditions (and the device display size and type), and can move from one bitrate to another seamlessly, apart from the visible change in quality.
See here for some more info: https://stackoverflow.com/a/42365034/334402
All the above relies on a delivery protocol which supports this segmentation and different bit rate streams. The most common ones today are HLS and MPEG-DASH.
The easiest way to support what I think you are looking for would be for your colleague who supplies the stream to publish it via HLS and/or DASH.
Note that at the moment both HLS and DASH are needed in practice: Apple devices require HLS, while other devices tend to default to DASH. Traditionally HLS used TS as the segment container and DASH used fragmented MP4, but both are now moving to CMAF, which is essentially fragmented MP4.
So in theory a single set of bitrate renditions can now serve both HLS and DASH. In practice this depends on whether your content is encrypted: Apple and HLS historically used one encryption mode and everyone else another. This is changing as well, but it will take time before all devices support the common encryption mode, so if your streams are encrypted this remains an added complication.
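For illustration, a minimal sketch of the player side once the stream is republished as HLS (the manifest URL is hypothetical, and this uses the same ExoPlayer 2.x APIs as the question): with an adaptive track selection, ExoPlayer switches renditions by itself, with no stop()/prepare() cycle:
DefaultBandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
TrackSelection.Factory adaptiveFactory = new AdaptiveTrackSelection.Factory(bandwidthMeter);
DefaultTrackSelector trackSelector = new DefaultTrackSelector(adaptiveFactory);
DataSource.Factory dataSourceFactory = new DefaultDataSourceFactory(
        context, Util.getUserAgent(context, "app"), bandwidthMeter);
// A single manifest advertises every rendition; the selector picks per segment.
MediaSource hlsSource = new HlsMediaSource.Factory(dataSourceFactory)
        .createMediaSource(Uri.parse("https://example.com/game/master.m3u8"));
SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(context, trackSelector);
player.prepare(hlsSource);
player.setPlayWhenReady(true);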
I am using WebRTC. I want to save the local video stream to a file.
I'd appreciate a hint on how to approach this.
Thank you for reading.
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
//sigConnect("http://10.54.36.19:8000/");
sigConnect("http://unwebrtc.herokuapp.com/");
initWebRTC();
Log.i(TAG, "VideoCapturerAndroid.getDeviceCount() = " + VideoCapturerAndroid.getDeviceCount());
String nameOfFrontFacingDevice = VideoCapturerAndroid.getNameOfFrontFacingDevice();
String nameOfBackFacingDevice = VideoCapturerAndroid.getNameOfBackFacingDevice();
Log.i(TAG, "VideoCapturerAndroid.getNameOfFrontFacingDevice() = " + nameOfFrontFacingDevice);
Log.i(TAG, "VideoCapturerAndroid.getNameOfBackFacingDevice() = " + nameOfBackFacingDevice);
VideoCapturerAndroid capturer = VideoCapturerAndroid.create(nameOfFrontFacingDevice);
MediaConstraints videoConstraints = new MediaConstraints();
VideoSource videoSource = peerConnectionFactory.createVideoSource(capturer, videoConstraints);
localVideoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, videoSource);
glview = (GLSurfaceView) findViewById(R.id.glview);
VideoRendererGui.setView(glview, null);
try {
rendereRemote = VideoRendererGui.createGui(0, 0, 100, 100, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
rendereLocal = VideoRendererGui.createGui(72, 72, 25, 25, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
localVideoTrack.addRenderer(rendereLocal);
} catch (Exception e) {
e.printStackTrace();
}
mediaStream = peerConnectionFactory.createLocalMediaStream(LOCAL_MEDIA_STREAM_ID);
mediaStream.addTrack(localVideoTrack);
}
The libjingle library uses a GLSurfaceView for rendering video. You may try using the FFmpeg library to save video frames from that view. Not sure about audio, though.
You have to create a video container, like MP4, and manually encode and write every raw frame. The latest WebRTC version also provides access to the recorded microphone audio; you should encode and mux the audio samples as well.
For access to the raw video frames of both remote and local peers, see the VideoSink interface (and the ProxyVideoSink class in apprtc/CallActivity.java).
For getting audio samples, see the RecordedAudioToFileController class.
For creating/muxing the video file you can use the MediaMuxer class from the Android SDK.
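A minimal sketch of the VideoSink half of this (the class name is ours, and the MediaCodec/MediaMuxer encoding step is elided since it needs a full encoder pipeline):
class RecordingVideoSink implements VideoSink {
    @Override
    public void onFrame(VideoFrame frame) {
        frame.retain(); // keep the buffer alive while we read it
        VideoFrame.I420Buffer i420 = frame.getBuffer().toI420();
        // ...feed the Y/U/V planes into a MediaCodec encoder here, then write
        // the encoded samples with MediaMuxer.writeSampleData(...)...
        i420.release();
        frame.release();
    }
}
// Attach it to the local track alongside the on-screen renderer:
// localVideoTrack.addSink(new RecordingVideoSink());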
We would like to integrate libVLC in our project, replacing the Android MediaPlayer.
The project compiles and libVLC initializes without error, but nothing happens.
Instead of rendering video into an Android widget such as a SurfaceView, GLSurfaceView or TextureView, we must render to a SurfaceTexture whose GL texture is created in the JNI part. We are using this texture to render a sky dome in our game.
Right now we are decoding video to that texture with the Android MediaPlayer and it runs perfectly, so the issue is related to the libVLC integration. No trace logs are written, so it is quite difficult to find where the problem is.
To summarize our code:
1 - Set up the libVLC library
ArrayList<String> options = new ArrayList<String>();
options.add("-vvv"); // verbosity
libVLC = new LibVLC(options);
libVLC.setOnNativeCrashListener(this);
mediaPlayer = new MediaPlayer(libVLC);
mediaPlayer.setEventListener(this);
2 - Load movie source
if (mediaPath.startsWith("http")) {
media = new Media(libVLC, mediaUri);
this.isLoaded = (media.parse(Media.Parse.FetchNetwork) && getMediaInformation()) ||
(media.parse(Media.Parse.ParseNetwork) && getMediaInformation());
} else if (mediaPath.startsWith("android.resource://")) {
media = new Media(libVLC, mediaUri);
} else {
media = new Media(libVLC, mediaPath);
this.isLoaded = (media.parse(Media.Parse.FetchLocal) && getMediaInformation()) ||
(media.parse(Media.Parse.ParseLocal) && getMediaInformation());
}
3 - Update texture in render loop
synchronized(this) {
if(isFrameNew) {
if(surfaceTexture != null) surfaceTexture.updateTexImage();
isFrameNew = false;
isMoviedone = false;
return true;
}
return false;
}
But nothing happens: the texture stays empty and it seems libVLC is not doing anything internally. (See the sketch below for the vout wiring we expected to need.)
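For reference, libvlc-android's IVLCVout can target a SurfaceTexture directly; a minimal sketch of that wiring, with placeholder variable names, assuming the standard IVLCVout API:
// surfaceTexture wraps the GL texture id created in the JNI part
final IVLCVout vout = mediaPlayer.getVLCVout();
vout.setVideoSurface(surfaceTexture);
vout.setWindowSize(videoWidth, videoHeight); // dimensions from getMediaInformation()
vout.attachViews();
mediaPlayer.setMedia(media);
mediaPlayer.play();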
Has anyone hit the same issue?
Thanks in advance
I've been working on an Android application that shows live streaming video via RTSP.
Assuming I have a well-functioning RTSP server that passes h264 packets, and to view the stream we should connect to rtsp://1.2.3.4:5555/stream
I first tried the native MediaPlayer/VideoView, but no luck (the video froze after 2-3 seconds of playback), so I loaded mrmaffen's vlc-android-sdk (can be found here) and used the following code:
ArrayList<String> options = new ArrayList<String>();
options.add("--no-drop-late-frames");
options.add("--no-skip-frames");
options.add("-vvv");
videoVlc = new LibVLC(options);
newVideoMediaPlayer = new org.videolan.libvlc.MediaPlayer(videoVlc);
final IVLCVout vOut = newVideoMediaPlayer.getVLCVout();
vOut.addCallback(this);
vOut.setVideoView(videoView); //videoView is a pre-defined view which is part of the layout
vOut.attachViews();
newVideoMediaPlayer.setEventListener(this);
Media videoMedia = new Media (videoVlc, Uri.parse(mVideoPath));
newVideoMediaPlayer.setMedia(videoMedia);
newVideoMediaPlayer.play();
The problem is that I only see a blank screen.
Keep in mind that when I use an RTSP link with an audio-only stream, it works fine.
Is anyone familiar with this SDK who has an idea about this issue?
Thanks in advance
Try adding this option:
--rtsp-tcp
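This forces RTP over RTSP (TCP) instead of UDP, which often fixes a blank video when UDP packets are being dropped. Using the question's variable names, it can go in either place:
// Globally, when creating the LibVLC instance:
options.add("--rtsp-tcp");
videoVlc = new LibVLC(options);

// Or per media (note the leading ':' for media options):
videoMedia.addOption(":rtsp-tcp");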
I play RTSP streams with the following code:
try {
Uri rtspUri=Uri.parse("rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov");
final MediaWrapper mw = new MediaWrapper(rtspUri);
mw.removeFlags(MediaWrapper.MEDIA_FORCE_AUDIO);
mw.addFlags(MediaWrapper.MEDIA_VIDEO);
MediaWrapperListPlayer.getInstance().getMediaList().add(mw);
VLCInstance.getMainMediaPlayer().setEventListener(this);
VLCInstance.get().setOnHardwareAccelerationError(this);
final IVLCVout vlcVout = VLCInstance.getMainMediaPlayer().getVLCVout();
vlcVout.addCallback(this);
vlcVout.setVideoView(mSurfaceView);
vlcVout.attachViews();
final SharedPreferences pref = PreferenceManager.getDefaultSharedPreferences(this);
final String aout = VLCOptions.getAout(pref);
VLCInstance.getMainMediaPlayer().setAudioOutput(aout);
MediaWrapperListPlayer.getInstance().playIndex(this, 0);
} catch (Exception e) {
Log.e(TAG, e.toString());
}
When you get the Playing event, you need to enable the video track:
private void onPlaying() {
stopLoadingAnimation();
VLCInstance.getMainMediaPlayer().setVideoTrackEnabled(true);
}
This may be helpful for you.