Does Chromecast support Apple HLS? (Android)

Chromecast's documentation states that it supports HLS streams, but I cannot make it work.
I created a simple app that has one button to initiate playback. It works great for MP4 files, but not for HLS streams. This is how I start the stream:
btnStart.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        String url = "http://planeta-online.tv:1935/vod/mp4:tvt.russia.time_laps_in_st_piter.flv/manifest.m3u8";
        MediaMetadata movieMetadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
        movieMetadata.putString(MediaMetadata.KEY_SUBTITLE, "Subtitle");
        movieMetadata.putString(MediaMetadata.KEY_TITLE, "Title");
        movieMetadata.putString(MediaMetadata.KEY_STUDIO, "Studio");
        movieMetadata.addImage(new WebImage(Uri.parse("http://cdn.planeta-online.tv/kp/app/icons/i5.png")));
        movieMetadata.addImage(new WebImage(Uri.parse("http://cdn.planeta-online.tv/kp/app/icons/i5.png")));
        MediaInfo mSelectedMedia = new MediaInfo.Builder(url)
                .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                .setContentType("application/vnd.apple.mpegurl")
                .setMetadata(movieMetadata)
                .build();
        mVideoCastManager.startCastControllerActivity(MainActivity.this, mSelectedMedia, 0, true);
    }
});
I enabled CORS for this stream on Wowza, but it didn't help. Changing StreamType from MediaInfo.STREAM_TYPE_BUFFERED to MediaInfo.STREAM_TYPE_LIVE doesn't have any effect either. Any other ideas?

Which receiver are you using? You can use either the Default receiver or a Styled receiver; both support HLS. Adaptive streams are handled by the Media Player Library (MPL), so if you want to write your own custom receiver, you need to use that library; we have a sample that does that.

I've found that setting the contentType of the stream to "video/m3u" or "video/m3u8" (depending on your format) worked for my HLS streams with the Default receiver.
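Applied to the code in the question, only the setContentType() argument changes; a minimal sketch, keeping the rest of the builder as-is:
MediaInfo mSelectedMedia = new MediaInfo.Builder(url)
        .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
        .setContentType("video/m3u8") // instead of "application/vnd.apple.mpegurl"
        .setMetadata(movieMetadata)
        .build();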

ExoPlayer cast extension with live streams

I'm trying to implement Cast support in my Android app. When I use normal non-live links such as MP4, and even non-live HLS streams, it works perfectly; but when I use a live stream, it just won't play on the Chromecast.
This is how I create my MediaInfo:
MediaMetadata movieMetadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
movieMetadata.putString(MediaMetadata.KEY_TITLE, "Test");
MediaInfo mediaInfo = new MediaInfo.Builder("HLS link")
        .setStreamType(MediaInfo.STREAM_TYPE_LIVE)
        .setContentType(MimeTypes.APPLICATION_M3U8)
        .setMetadata(movieMetadata)
        .build();
This is how I load the items:
castPlayer.loadItem(new MediaQueueItem.Builder(setCastMedia()).build(), 0);
This is the logcat I get:
W/MediaControlChannel: received unexpected error: Invalid Request.
W/MediaQueue: Error fetching queue item ids, statusCode=2100, statusMessage=null
Any ideas what I'm doing wrong?
The same happens with TS streams...
With the following code, you can cast live streams using ExoPlayer.
MediaMetadata movieMetadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_TV_SHOW);
movieMetadata.putString(MediaMetadata.KEY_TITLE, "Title");
MediaInfo mediaInfo = new MediaInfo.Builder("your_link.m3u8")
        .setStreamType(MediaInfo.STREAM_TYPE_LIVE)
        .setContentType(MimeTypes.APPLICATION_M3U8)
        .setMetadata(movieMetadata)
        .build();
MediaQueueItem mediaQueueItem = new MediaQueueItem.Builder(mediaInfo).build();
castPlayer.loadItem(mediaQueueItem, mediaQueueItem.getMedia().getStreamDuration());
castPlayer.setPlayWhenReady(true);
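One thing worth checking, separate from the answer above: CastPlayer can only service load requests while a Cast session is available, so it can help to defer loadItem() until the session callback fires. A minimal sketch:
castPlayer.setSessionAvailabilityListener(new SessionAvailabilityListener() {
    @Override
    public void onCastSessionAvailable() {
        // mediaQueueItem built as shown above; 0 = start position in ms
        castPlayer.loadItem(mediaQueueItem, 0);
        castPlayer.setPlayWhenReady(true);
    }

    @Override
    public void onCastSessionUnavailable() {
        // e.g. fall back to local ExoPlayer playback here
    }
});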

How to programmatically set a video codec on LinphoneCore in Android?

I am currently developing a VoIP Android application, and for VoIP support I am using the open-source library Linphone.
Voice calling works, but video calling does not. After analyzing for a while, I found that by default, when the app is loaded, the LinphoneCore library uses the H264 video codec.
But the VoIP Asterisk server is configured with the VP8 video codec, and I cannot change the codec configured on the server. Hence, due to the codec mismatch, no video data goes through.
So how can I manually set the video codec to VP8 in my app once LinphoneCore is loaded?
To set the video codec on LinphoneCore, once your LinphoneCore is ready you can retrieve the video codec payload types it supports, then enable the one you want and disable the others, as shown in the code below.
private void enableVp8Codec() {
    LinphoneCore lc = LinphoneManager.getLcIfManagerNotDestroyedOrNull();
    if (lc != null) {
        // Walk all supported video codec payload types; keep only VP8 enabled.
        PayloadType[] lPayLoadArr = lc.getVideoCodecs();
        for (final PayloadType pt : lPayLoadArr) {
            try {
                if (pt.getMime().equals("VP8")) {
                    lc.enablePayloadType(pt, true);
                } else {
                    lc.enablePayloadType(pt, false);
                }
            } catch (LinphoneCoreException e) {
                Log.e("tag", e.getMessage());
            }
        }
    }
}
You can call this method, for example, in onResume() of your Activity.
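A minimal sketch of that call site, assuming the method above lives in your Activity:
@Override
protected void onResume() {
    super.onResume();
    // Re-apply the codec selection whenever the Activity returns to the foreground
    enableVp8Codec();
}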

vlc-android-sdk - cannot view RTSP live video

I've been working on an Android application that shows live streaming video via RTSP.
Assume I have a well-functioning RTSP server that sends H264 packets, and that to view the stream we should connect to rtsp://1.2.3.4:5555/stream.
I first tried the native MediaPlayer/VideoView, but had no luck (the video got stuck after 2-3 seconds of playback), so I loaded mrmaffen's vlc-android-sdk and used the following code:
ArrayList<String> options = new ArrayList<String>();
options.add("--no-drop-late-frames");
options.add("--no-skip-frames");
options.add("-vvv");
videoVlc = new LibVLC(options);
newVideoMediaPlayer = new org.videolan.libvlc.MediaPlayer(videoVlc);
final IVLCVout vOut = newVideoMediaPlayer.getVLCVout();
vOut.addCallback(this);
vOut.setVideoView(videoView); // videoView is a pre-defined view which is part of the layout
vOut.attachViews();
newVideoMediaPlayer.setEventListener(this);
Media videoMedia = new Media(videoVlc, Uri.parse(mVideoPath));
newVideoMediaPlayer.setMedia(videoMedia);
newVideoMediaPlayer.play();
The problem is that I see a blank screen.
Keep in mind that when I use an RTSP link with an audio-only stream, it works fine.
Is anyone familiar with this SDK who has an idea about this issue?
Thanks in advance
Try adding this option:
--rtsp-tcp
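In the question's code that would go next to the other LibVLC options; a one-line sketch:
options.add("--rtsp-tcp"); // force RTP over RTSP (TCP) instead of UDP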
I play RTSP streams with the following code:
try {
    Uri rtspUri = Uri.parse("rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov");
    final MediaWrapper mw = new MediaWrapper(rtspUri);
    mw.removeFlags(MediaWrapper.MEDIA_FORCE_AUDIO);
    mw.addFlags(MediaWrapper.MEDIA_VIDEO);
    MediaWrapperListPlayer.getInstance().getMediaList().add(mw);
    VLCInstance.getMainMediaPlayer().setEventListener(this);
    VLCInstance.get().setOnHardwareAccelerationError(this);
    final IVLCVout vlcVout = VLCInstance.getMainMediaPlayer().getVLCVout();
    vlcVout.addCallback(this);
    vlcVout.setVideoView(mSurfaceView);
    vlcVout.attachViews();
    final SharedPreferences pref = PreferenceManager.getDefaultSharedPreferences(this);
    final String aout = VLCOptions.getAout(pref);
    VLCInstance.getMainMediaPlayer().setAudioOutput(aout);
    MediaWrapperListPlayer.getInstance().playIndex(this, 0);
} catch (Exception e) {
    Log.e(TAG, e.toString());
}
When you get the playing event, you need to enable the video track.
private void onPlaying() {
    stopLoadingAnimation();
    VLCInstance.getMainMediaPlayer().setVideoTrackEnabled(true);
}
This may be helpful for you

How to play video in VideoView from a URL based on buffer percentage in Android?

I have a link to a video on an S3 server and I am playing this video in a VideoView. The video plays properly, but the problem is that it first downloads the entire video and then plays it.
I want it to buffer instead: if 20% of the video has downloaded, it should start playing and download the rest in the background (like YouTube). Here is my code; this is what I have done:
FFmpegMediaMetadataRetriever mediaMetadataRetriever = new FFmpegMediaMetadataRetriever();
AWSCredentials myCredentials = new BasicAWSCredentials(
        "YOUR_AWS_ACCESS_KEY",
        "YOUR_AWS_SECRET_KEY");
AmazonS3 s3client = new AmazonS3Client(myCredentials);
GeneratePresignedUrlRequest request = new GeneratePresignedUrlRequest(
        "mgvtest", videoUrl);
URL objectURL = s3client.generatePresignedUrl(request);
try {
    mediaMetadataRetriever.setDataSource(videoUrl);
} catch (Exception e) {
    utilDialog.showDialog("Unable to load this video",
            utilDialog.ALERT_DIALOG);
    pb.setVisibility(View.INVISIBLE);
}
videoView.setVideoURI(Uri.parse(videoUrl));
MediaController myMediaController = new MediaController(this);
// myMediaController.setMediaPlayer(videoView);
videoView.setMediaController(myMediaController);
videoView.setOnCompletionListener(myVideoViewCompletionListener);
videoView.setOnPreparedListener(MyVideoViewPreparedListener);
videoView.setOnErrorListener(myVideoViewErrorListener);
videoView.requestFocus();
videoView.start();
Listeners
MediaPlayer.OnCompletionListener myVideoViewCompletionListener = new MediaPlayer.OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer arg0) {
        // Toast.makeText(PlayRecordedVideoActivity.this, "End of Video",
        // Toast.LENGTH_LONG).show();
    }
};

MediaPlayer.OnPreparedListener MyVideoViewPreparedListener = new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        pb.setVisibility(View.INVISIBLE);
        imgScreenshot.setVisibility(View.VISIBLE);
        tvScreenshot.setVisibility(View.VISIBLE);
        // final Animation in = new AlphaAnimation(0.0f, 1.0f);
        // in.setDuration(3000);
        // tvScreenshot.startAnimation(in);
        Animation animation = AnimationUtils.loadAnimation(
                getApplicationContext(), R.anim.zoom_in);
        tvScreenshot.startAnimation(animation);
        new Handler().postDelayed(new Runnable() {
            @Override
            public void run() {
                tvScreenshot.setVisibility(View.INVISIBLE);
            }
        }, 3000);
    }
};

MediaPlayer.OnErrorListener myVideoViewErrorListener = new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // Toast.makeText(PlayRecordedVideoActivity.this, "Error!!!",
        // Toast.LENGTH_LONG).show();
        return true;
    }
};
To be able to start playing an MP4 video before it has fully downloaded, the video has to have its metadata at the start of the file rather than at the end; unfortunately, with standard MP4 the default is usually to have it at the end.
The metadata is in an 'atom' or 'box' (basically a data structure within the MP4 file) and can be moved to the start. This is usually referred to as faststart, and tools such as ffmpeg will allow you to do this. The following is an extract from the ffmpeg documentation:
The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4 file has all the metadata about all packets stored in one location (written at the end of the file, it can be moved to the start for better playback by adding faststart to the movflags, or using the qt-faststart tool).
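For example, assuming ffmpeg is installed and using placeholder file names, a faststart re-mux (no re-encoding) looks like:
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4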
There are other tools and software which will allow you to do this also - e.g. the one mentioned in the ffmpeg extract above:
http://multimedia.cx/eggs/improving-qt-faststart/
If you actually want full streaming, where the server breaks the file into chunks that the client downloads one by one, then you probably want one of the adaptive-bit-rate protocols (Apple's HLS, Microsoft's Smooth Streaming, Adobe's HTTP Dynamic Streaming, or the new open standard DASH). These also let you offer different bit rates to allow for different network conditions. You will need a server that supports this functionality, which may be overkill if you just want a simple site with a single video and will not have much traffic.
Actually, you have to put CloudFront in front of S3; then you can stream S3 videos.
Check out this link for more information:
http://www.miracletutorials.com/s3-streaming-video-with-cloudfront/

Android screen sharing using WebRTC

I have heard about screen sharing on desktop using WebRTC, but for Android there does not seem to be much information.
My questions are:
1. Is it possible to use WebRTC for screen sharing on Android? I mean, can I cast the current screen to another phone's screen?
2. If 1 is yes, how can I achieve this?
Thanks.
It is possible!
It can be done using the directions below.
I've used ScreenShareRTC in conjunction with ProjectRTC to stream the contents of the screen to a browser with decent quality and fairly low latency (~100 ms).
I've added an example below that shows how to configure a screen share as a video source and add it as a track on a stream.
Get the VideoCapturer
@TargetApi(21)
private VideoCapturer createScreenCapturer() {
    if (mMediaProjectionPermissionResultCode != Activity.RESULT_OK) {
        report("User didn't give permission to capture the screen.");
        return null;
    }
    return new ScreenCapturerAndroid(
            mMediaProjectionPermissionResultData, new MediaProjection.Callback() {
                @Override
                public void onStop() {
                    report("User revoked permission to capture the screen.");
                }
            });
}
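For context, the mMediaProjectionPermissionResultCode/Data fields above come from Android's standard MediaProjection permission flow. A minimal sketch, where REQUEST_CODE_CAPTURE is a hypothetical constant:
private static final int REQUEST_CODE_CAPTURE = 1; // hypothetical request code

private void requestScreenCapturePermission() {
    MediaProjectionManager manager =
            (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
    startActivityForResult(manager.createScreenCaptureIntent(), REQUEST_CODE_CAPTURE);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == REQUEST_CODE_CAPTURE) {
        // Stash the result so createScreenCapturer() can use it later
        mMediaProjectionPermissionResultCode = resultCode;
        mMediaProjectionPermissionResultData = data;
    }
}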
Initialize the capturer and add the tracks to the local media stream
private void initScreenCapturStream() {
    mLocalMediaStream = factory.createLocalMediaStream("ARDAMS");
    MediaConstraints videoConstraints = new MediaConstraints();
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxHeight", Integer.toString(mPeerConnParams.videoHeight)));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxWidth", Integer.toString(mPeerConnParams.videoWidth)));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxFrameRate", Integer.toString(mPeerConnParams.videoFps)));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("minFrameRate", Integer.toString(mPeerConnParams.videoFps)));
    // videoCapturer is the capturer returned by createScreenCapturer() above
    mVideoSource = factory.createVideoSource(videoCapturer);
    videoCapturer.startCapture(mPeerConnParams.videoWidth, mPeerConnParams.videoHeight, mPeerConnParams.videoFps);
    VideoTrack localVideoTrack = factory.createVideoTrack("ARDAMSv0", mVideoSource);
    localVideoTrack.setEnabled(true);
    mLocalMediaStream.addTrack(localVideoTrack);
    AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
    mLocalMediaStream.addTrack(factory.createAudioTrack("ARDAMSa0", audioSource));
    mListener.onStatusChanged("STREAMING");
}
For more information, this might be a good place to start: it's an Android project that connects to a ProjectRTC signalling server and shares the screen as video. I found it very helpful!
Android screen sharing project(Android client - Java)
https://github.com/Jeffiano/ScreenShareRTC
ProjectRTC(Node server)
https://github.com/pchab/ProjectRTC
