I'm using the webrtc-android-framework module provided by the Ant Media official website. I was able to make a connection and I can see the published video on the other side without any issues. However, I'm unable to switch from the camera to screen sharing.
I'm using the code below to switch from camera capture to screen sharing.
public void MakeScreenCaptureReady() {
    final EglBase.Context eglBaseContext = eglBase.getEglBaseContext();
    PeerConnectionFactory peerConnectionFactory = peerConnectionClient.factory;

    // create AudioSource
    AudioSource audioSource = peerConnectionFactory.createAudioSource(new MediaConstraints());
    this.audioTrack = peerConnectionFactory.createAudioTrack("101", audioSource);

    surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", eglBaseContext);

    // create VideoCapturer
    videoCapturer = createScreenCapturer();
    VideoSource videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
    localVideoTrack = peerConnectionFactory.createVideoTrack("118", videoSource);
    videoCapturer.initialize(surfaceTextureHelper, context, videoSource.getCapturerObserver());
    videoCapturer.startCapture(720, 1280, 30);

    peerConnectionClient.setLocalVideoTrack(localVideoTrack);
    peerConnectionClient.localVideoSender.setTrack(localVideoTrack, true); // true = take ownership and replace the existing track
}
The screen-share video buffers for 2-3 seconds and then stops, throwing a source error at the subscriber's end; basically, no further chunks are available on the server to buffer.
I have already obtained the required screen-capture permission before the above code runs:
startActivityForResult(
    mMediaProjectionManager!!.createScreenCaptureIntent(),
    SCREEN_RECORD_REQUEST_CODE
)
This is the code I'm using to call the above method from onActivityResult:
intent.putExtra(CallActivity.EXTRA_SCREENCAPTURE, true)
webRTCClient.setMediaProjectionParams(resultCode, data)
webRTCClient.MakeScreenCaptureReady()
How do I achieve switching between camera and screen capture? Any help is much appreciated. Thanks!
I am using setMaxVideoBitrate, provided by DefaultTrackSelector, to set the maximum bitrate when the user changes the video quality.
val parameters = defaultTrackSelector.buildUponParameters()
    .setMaxVideoBitrate(bitrate)
    .build()
defaultTrackSelector.parameters = parameters
But as soon as this is applied, the current buffer is discarded and re-buffering starts right away. Is there any way to keep playing from the old buffer and only load new buffer with the new bitrate settings, the way YouTube does?
This issue has been discussed here:
https://github.com/google/ExoPlayer/issues/3522
https://github.com/google/ExoPlayer/issues/2250
But there doesn't seem to be any solution yet. Any help regarding this issue would be appreciated. Thanks in advance.
You can do this easily. You're already using ExoPlayer, and ExoPlayer provides a seekTo() method. Pass it the position at which playback stopped.
Step 1
When the user changes the quality (e.g. from 144p to 720p), store the player's current position:
private long currentPosition = player.getCurrentPosition();
Step 2
Then rebuild your ExoPlayer media source:
// Measures bandwidth during playback. Can be null if not required.
DefaultBandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
// Produces DataSource instances through which media data is loaded.
DataSource.Factory dataSourceFactory = new DefaultDataSourceFactory(this,
        Util.getUserAgent(this, getString(R.string.app_name)), bandwidthMeter);
// This is the MediaSource representing the media to be played.
MediaSource videoSource = new ExtractorMediaSource.Factory(dataSourceFactory)
        .createMediaSource(Uri.parse("your-video-url")); // createMediaSource takes a Uri, not a String
// Prepare the player with the source.
player.prepare(videoSource);
Step 3
Check this condition:
if (this.currentPosition > 0) {
    player.seekTo(this.currentPosition);
    this.currentPosition = 0;
    player.setPlayWhenReady(true);
} else {
    player.setPlayWhenReady(true);
}
This works well; your video resumes from where you left off.
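Putting the steps together, a rough sketch of the whole quality switch might look like this (player is your ExoPlayer instance, and buildMediaSource() is a hypothetical helper wrapping the source-building code from step 2):
private long currentPosition = 0;

// Call this when the user picks a new quality.
private void switchQuality(String videoUrl) {
    // Step 1: remember where playback was.
    currentPosition = player.getCurrentPosition();

    // Step 2: rebuild the media source for the same URL.
    MediaSource videoSource = buildMediaSource(videoUrl);
    player.prepare(videoSource);

    // Step 3: jump back to the stored position and resume.
    if (currentPosition > 0) {
        player.seekTo(currentPosition);
        currentPosition = 0;
    }
    player.setPlayWhenReady(true);
}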
Step 4
If you also want to pick a quality based on connection strength, use this method to read the Wi-Fi signal level:
public int getWifiLevel() {
    WifiManager wifiManager = (WifiManager) context.getSystemService(Context.WIFI_SERVICE);
    int rssi = wifiManager.getConnectionInfo().getRssi();
    int level = WifiManager.calculateSignalLevel(rssi, 5);
    return level;
}
Based on the Wi-Fi level (or link speed) you can decide whether the connection is slow or fast.
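For example, you could map that level to a maximum bitrate and then pass the result to setMaxVideoBitrate() exactly as in the question. A minimal sketch; the threshold values here are made up for the example:
// Illustrative mapping from Wi-Fi signal level (0-4) to a max video bitrate in bps.
private int pickMaxBitrate(int wifiLevel) {
    switch (wifiLevel) {
        case 4: return 4000000; // strong signal
        case 3: return 2500000;
        case 2: return 1200000;
        case 1: return 600000;
        default: return 300000; // very weak signal
    }
}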
I've been working on an Android application that shows live streaming video via RTSP.
Assuming I have a well-functioning RTSP server that serves H.264 packets, and that to view the stream we should connect to rtsp://1.2.3.4:5555/stream:
I first tried the native MediaPlayer/VideoView, but no luck (the video froze after 2-3 seconds of playback), so I loaded mrmaffen's vlc-android-sdk (can be found here) and used the following code:
ArrayList<String> options = new ArrayList<String>();
options.add("--no-drop-late-frames");
options.add("--no-skip-frames");
options.add("-vvv");
videoVlc = new LibVLC(options);
newVideoMediaPlayer = new org.videolan.libvlc.MediaPlayer(videoVlc);
final IVLCVout vOut = newVideoMediaPlayer.getVLCVout();
vOut.addCallback(this);
vOut.setVideoView(videoView); //videoView is a pre-defined view which is part of the layout
vOut.attachViews();
newVideoMediaPlayer.setEventListener(this);
Media videoMedia = new Media(videoVlc, Uri.parse(mVideoPath));
newVideoMediaPlayer.setMedia(videoMedia);
newVideoMediaPlayer.play();
The problem is that I see a blank screen.
Keep in mind that when I use an RTSP link with an audio stream only, it works fine.
Is anyone familiar with this SDK who has an idea about this issue?
Thanks in advance
Try adding this option:
--rtsp-tcp
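In the code from the question, that is just one more entry in the options list passed to LibVLC; it forces the RTSP stream over TCP instead of UDP, which often fixes blank or stalled video when UDP packets are being dropped:
ArrayList<String> options = new ArrayList<String>();
options.add("--no-drop-late-frames");
options.add("--no-skip-frames");
options.add("--rtsp-tcp"); // stream RTSP over TCP instead of UDP
options.add("-vvv");
videoVlc = new LibVLC(options);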
I play RTSP streams with the following code:
try {
    Uri rtspUri = Uri.parse("rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov");
    final MediaWrapper mw = new MediaWrapper(rtspUri);
    mw.removeFlags(MediaWrapper.MEDIA_FORCE_AUDIO);
    mw.addFlags(MediaWrapper.MEDIA_VIDEO);
    MediaWrapperListPlayer.getInstance().getMediaList().add(mw);
    VLCInstance.getMainMediaPlayer().setEventListener(this);
    VLCInstance.get().setOnHardwareAccelerationError(this);
    final IVLCVout vlcVout = VLCInstance.getMainMediaPlayer().getVLCVout();
    vlcVout.addCallback(this);
    vlcVout.setVideoView(mSurfaceView);
    vlcVout.attachViews();
    final SharedPreferences pref = PreferenceManager.getDefaultSharedPreferences(this);
    final String aout = VLCOptions.getAout(pref);
    VLCInstance.getMainMediaPlayer().setAudioOutput(aout);
    MediaWrapperListPlayer.getInstance().playIndex(this, 0);
} catch (Exception e) {
    Log.e(TAG, e.toString());
}
When you get the playing event, you need to enable the video track:
private void onPlaying() {
    stopLoadingAnimation();
    VLCInstance.getMainMediaPlayer().setVideoTrackEnabled(true);
}
This may be helpful for you
I want to use ExoPlayer in my app. Could you please point me to the simplest example? I have tried to follow https://github.com/google/ExoPlayer/ but it's not easy for me. I tried to import the library as a module, but then I received a bintray-release error.
As stated in the main Readme.md, you can import ExoPlayer as you would any other dependency:
In your app's build.gradle, under dependencies, add:
compile 'com.google.android.exoplayer:exoplayer:rX.X.X'
The current version is r1.5.1 as of October 27, 2015 (see here).
Old question but since there are too few simple ExoPlayer tutorials out there, I wrote this up. I recently converted an app I have from using Android's default media player to ExoPlayer. The performance gains are amazing and it works on a wider range of devices. It is a bit more complicated, however.
This example is tailored specifically to playing an http audio stream but by experimenting you can probably adapt it easily to anything else. This example uses the latest v1.xx of ExoPlayer, currently v1.5.11:
First, put this in your build.gradle (Module: app) file, under "dependencies":
compile 'com.google.android.exoplayer:exoplayer:r1.5.11'
Also your class should implement ExoPlayer.Listener:
...implements ExoPlayer.Listener
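So the enclosing class ends up looking roughly like this (the class name is just for the example; the callback bodies are shown further down):
public class AudioStreamPlayer implements ExoPlayer.Listener {
    private ExoPlayer exoPlayer;

    // setup code from below goes here...

    @Override
    public void onPlayWhenReadyCommitted() { }

    @Override
    public void onPlayerStateChanged(boolean playWhenReady, int playbackState) { }

    @Override
    public void onPlayerError(ExoPlaybackException error) { }
}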
Now here's the relevant code to play an http audio stream:
private static final int RENDERER_COUNT = 1; //since we want to render simple audio
private static final int BUFFER_SEGMENT_SIZE = 64 * 1024; // for http mp3 audio stream use these values
private static final int BUFFER_SEGMENT_COUNT = 256; // for http mp3 audio stream use these values
private ExoPlayer exoPlayer;
// for http mp3 audio stream, use these values
int minBufferMs = 1000;
int minRebufferMs = 5000;
// Prepare ExoPlayer
exoPlayer = ExoPlayer.Factory.newInstance(RENDERER_COUNT, minBufferMs, minRebufferMs);
// String with the url of the stream to play
String stream_location = "http://audio_stream_url";
// Convert String URL to Uri
Uri streamUri = Uri.parse(stream_location);
// Settings for ExoPlayer
Allocator allocator = new DefaultAllocator(BUFFER_SEGMENT_SIZE);
String userAgent = Util.getUserAgent(ChicagoPoliceRadioService.this, "ExoPlayer_Test");
DataSource dataSource = new DefaultUriDataSource(ChicagoPoliceRadioService.this, null, userAgent);
ExtractorSampleSource sampleSource = new ExtractorSampleSource(
streamUri, dataSource, allocator, BUFFER_SEGMENT_SIZE * BUFFER_SEGMENT_COUNT);
MediaCodecAudioTrackRenderer audioRenderer = new MediaCodecAudioTrackRenderer(sampleSource, MediaCodecSelector.DEFAULT);
// Attach listener we implemented in this class to this ExoPlayer instance
exoPlayer.addListener(this);
// Prepare ExoPlayer
exoPlayer.prepare(audioRenderer);
// Set full volume
exoPlayer.sendMessage(audioRenderer, MediaCodecAudioTrackRenderer.MSG_SET_VOLUME, 1f);
// Play!
exoPlayer.setPlayWhenReady(true);
There are three callback methods:
@Override
public void onPlayWhenReadyCommitted() {
    // No idea what would go here, I left it empty
}

// Called when ExoPlayer state changes
@Override
public void onPlayerStateChanged(boolean playWhenReady, int playbackState) {
    // If playbackState equals STATE_READY (4), that means ExoPlayer is set to
    // play and there are no errors
    if (playbackState == ExoPlayer.STATE_READY) {
        // ExoPlayer prepared and ready, no error
        // Put code here, same as "onPrepared()"
    }
}

// Called on ExoPlayer error
@Override
public void onPlayerError(ExoPlaybackException error) {
    // ExoPlayer error occurred
    // Put your error code here
}
And when you're done playing do the usual:
if (exoPlayer != null) {
    exoPlayer.stop();
    exoPlayer.release();
}
NOTE: I'm still not 100% sure about the details of all of the ExoPlayer settings. I've never tried playing video. Note this is for version 1.5.x of ExoPlayer, 2.0 changed a lot and I still haven't figured it out. I do highly recommend this code to anyone who has an app that streams audio from the web as the performance gains are incredible and for my app it fixed an issue with Samsung phones that would only play about 30sec of audio before stopping.
Background
Android got new APIs in KitKat and Lollipop for capturing video of the screen. You can do it either via the ADB tool or via code (starting from Lollipop).
Ever since the new API came out, many apps have appeared that use this feature to record the screen, and Microsoft even made its own Google-Now-On-Tap competitor app.
Using ADB, you can use:
adb shell screenrecord /sdcard/video.mp4
You can even do it from within Android Studio itself.
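For the ADB route, screenrecord also accepts flags such as --size, --bit-rate and --time-limit to customize the capture, for example:
adb shell screenrecord --size 720x1280 --bit-rate 4000000 --time-limit 30 /sdcard/video.mp4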
The problem
I can't find any tutorial or explanation about how to do it using the API, meaning in code.
What I've found
The only place I've found is the documentation (here, under "Screen capturing and sharing"), which tells me this:
Android 5.0 lets you add screen capturing and screen sharing capabilities to your app with the new android.media.projection APIs. This functionality is useful, for example, if you want to enable screen sharing in a video conferencing app.

The new createVirtualDisplay() method allows your app to capture the contents of the main screen (the default display) into a Surface object, which your app can then send across the network. The API only allows capturing non-secure screen content, and not system audio. To begin screen capturing, your app must first request the user's permission by launching a screen capture dialog using an Intent obtained through the createScreenCaptureIntent() method.

For an example of how to use the new APIs, see the MediaProjectionDemo class in the sample project.
Thing is, I can't find any "MediaProjectionDemo" sample. Instead, I found the "Screen Capture" sample, but I don't understand how it works: when I ran it, all I saw was a blinking screen, and I don't think it saves the video to a file. The sample seems very buggy.
The questions
How do I perform those actions using the new API:
Start recording, optionally including audio (mic/speaker/both).
Stop recording.
Take a screenshot instead of video.
Also, how do I customize it (resolution, requested fps, colors, time...)?
The first step, which Ken White rightly suggested and which you may have already covered, is the official example code.
I have used these APIs before. I agree that taking a screenshot is pretty straightforward, but screen recording also follows similar lines.
I will answer your questions in 3 sections and will wrap it up with a link. :)
1. Start Video Recording
private void startScreenRecord(final Intent intent) {
    if (DEBUG) Log.v(TAG, "startScreenRecord:sMuxer=" + sMuxer);
    synchronized (sSync) {
        if (sMuxer == null) {
            final int resultCode = intent.getIntExtra(EXTRA_RESULT_CODE, 0);
            // get MediaProjection
            final MediaProjection projection = mMediaProjectionManager.getMediaProjection(resultCode, intent);
            if (projection != null) {
                final DisplayMetrics metrics = getResources().getDisplayMetrics();
                final int density = metrics.densityDpi;
                if (DEBUG) Log.v(TAG, "startRecording:");
                try {
                    sMuxer = new MediaMuxerWrapper(".mp4"); // if you record audio only, ".m4a" is also OK.
                    if (true) {
                        // for screen capturing
                        new MediaScreenEncoder(sMuxer, mMediaEncoderListener,
                                projection, metrics.widthPixels, metrics.heightPixels, density);
                    }
                    if (true) {
                        // for audio capturing
                        new MediaAudioEncoder(sMuxer, mMediaEncoderListener);
                    }
                    sMuxer.prepare();
                    sMuxer.startRecording();
                } catch (final IOException e) {
                    Log.e(TAG, "startScreenRecord:", e);
                }
            }
        }
    }
}
2. Stop Video Recording
private void stopScreenRecord() {
    if (DEBUG) Log.v(TAG, "stopScreenRecord:sMuxer=" + sMuxer);
    synchronized (sSync) {
        if (sMuxer != null) {
            sMuxer.stopRecording();
            sMuxer = null;
            // you should not wait here
        }
    }
}
2.5. Pause and Resume Video Recording
private void pauseScreenRecord() {
    synchronized (sSync) {
        if (sMuxer != null) {
            sMuxer.pauseRecording();
        }
    }
}

private void resumeScreenRecord() {
    synchronized (sSync) {
        if (sMuxer != null) {
            sMuxer.resumeRecording();
        }
    }
}
Hope the code helps. Here is the original link to the code I referred to, from which this implementation (video recording) is also derived.
3. Take a Screenshot Instead of Video
I think capturing the image in bitmap format is fairly straightforward. You can still go ahead with the MediaProjectionDemo example to capture a screenshot.
[EDIT]: Code snippet for screenshot
a. Create a virtual display based on the device width/height:
mImageReader = ImageReader.newInstance(mWidth, mHeight, PixelFormat.RGBA_8888, 2);
mVirtualDisplay = sMediaProjection.createVirtualDisplay(SCREENCAP_NAME, mWidth, mHeight, mDensity,
        VIRTUAL_DISPLAY_FLAGS, mImageReader.getSurface(), null, mHandler);
mImageReader.setOnImageAvailableListener(new ImageAvailableListener(), mHandler);
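The snippet above assumes a VIRTUAL_DISPLAY_FLAGS constant, which is not shown; one common choice for a capture-only display, using the constants from android.hardware.display.DisplayManager, is:
// Flags assumed by the snippet above: a public virtual display that
// shows only its own (mirrored) content.
private static final int VIRTUAL_DISPLAY_FLAGS =
        DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY
                | DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC;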
b. Then start the screen capture based on an intent or action:
startActivityForResult(mProjectionManager.createScreenCaptureIntent(), REQUEST_CODE);
Stop the media projection:
sMediaProjection.stop();
c. Then convert the capture to an image:
// Process the media capture
image = mImageReader.acquireLatestImage();
Image.Plane[] planes = image.getPlanes();
ByteBuffer buffer = planes[0].getBuffer();
int pixelStride = planes[0].getPixelStride();
int rowStride = planes[0].getRowStride();
int rowPadding = rowStride - pixelStride * mWidth;

// Create bitmap
bitmap = Bitmap.createBitmap(mWidth + rowPadding / pixelStride, mHeight, Bitmap.Config.ARGB_8888);
bitmap.copyPixelsFromBuffer(buffer);
image.close(); // release the image back to the ImageReader

// Write the bitmap to a file in some path on the phone
fos = new FileOutputStream(STORE_DIRECTORY + "/myscreen_" + IMAGES_PRODUCED + ".png");
bitmap.compress(CompressFormat.PNG, 100, fos);
fos.close();
There are several full implementations (complete code) of the Media Projection API available.
Some other links that can help you in your development:
Video Recording with MediaProjectionManager - website
android-ScreenCapture - github as per android developer's observations :)
screenrecorder - github
Capture and Record Android Screen using MediaProjection APIs - website
Hope it helps :) Happy coding and screen recording!
PS: Can you please tell me the Microsoft app you are talking about? I have not used it. Would like to try it :)
I have heard about screen sharing on the desktop using WebRTC. But for Android, there doesn't seem to be much information.
My question is:
Is it possible to use WebRTC for screen sharing on Android? I mean, can I cast the current screen to the other phone's screen?
If 1 is yes, how can I achieve it?
Thanks.
It is possible!
It can be done using the directions below.
I've used ScreenShareRTC in conjunction with ProjectRTC to stream the contents of the screen to a browser with decent quality and fairly low latency (~100 ms).
I've added an example below that shows how to configure a screen share as a video source and add it as a track on a stream.
Get the VideoCapturer
@TargetApi(21)
private VideoCapturer createScreenCapturer() {
    if (mMediaProjectionPermissionResultCode != Activity.RESULT_OK) {
        report("User didn't give permission to capture the screen.");
        return null;
    }
    return new ScreenCapturerAndroid(
            mMediaProjectionPermissionResultData, new MediaProjection.Callback() {
        @Override
        public void onStop() {
            report("User revoked permission to capture the screen.");
        }
    });
}
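Note that createScreenCapturer() reads mMediaProjectionPermissionResultCode and mMediaProjectionPermissionResultData; these are assumed to have been stored when the screen-capture permission dialog returned, roughly like this (the request-code constant is just an example):
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == CAPTURE_PERMISSION_REQUEST_CODE) {
        // Store the result for createScreenCapturer() above.
        mMediaProjectionPermissionResultCode = resultCode;
        mMediaProjectionPermissionResultData = data;
    }
}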
Initialize the capturer and add the tracks to the local media stream
private void initScreenCaptureStream() {
    mLocalMediaStream = factory.createLocalMediaStream("ARDAMS");

    MediaConstraints videoConstraints = new MediaConstraints();
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxHeight", Integer.toString(mPeerConnParams.videoHeight)));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxWidth", Integer.toString(mPeerConnParams.videoWidth)));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxFrameRate", Integer.toString(mPeerConnParams.videoFps)));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("minFrameRate", Integer.toString(mPeerConnParams.videoFps)));

    mVideoSource = factory.createVideoSource(videoCapturer);
    videoCapturer.startCapture(mPeerConnParams.videoWidth, mPeerConnParams.videoHeight, mPeerConnParams.videoFps);

    VideoTrack localVideoTrack = factory.createVideoTrack(VIDEO_TRACK_ID, mVideoSource);
    localVideoTrack.setEnabled(true);
    mLocalMediaStream.addTrack(localVideoTrack);

    AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
    mLocalMediaStream.addTrack(factory.createAudioTrack("ARDAMSa0", audioSource));

    mListener.onStatusChanged("STREAMING");
}
For more information, this might be a good place to start. It's an Android project that connects to a ProjectRTC signaling server and shares the screen as video. I found it very helpful!
Android screen sharing project(Android client - Java)
https://github.com/Jeffiano/ScreenShareRTC
ProjectRTC(Node server)
https://github.com/pchab/ProjectRTC