Android WebRTC local AudioTrack is not streaming audio

I am developing an Android app with an audio call feature using WebRTC, built against the latest Android build of the library. When I receive a call from the client, I am able to connect successfully. The problem is that I can hear the client's audio, but my audio is not streaming to the client. Below is the code.
factory = PeerConnectionFactory.builder()
        .setAudioDeviceModule(adm)
        .setOptions(options)
        .createPeerConnectionFactory();

peerConnection = createPeerConnection(factory);
if (peerConnection != null) {
    remoteAudioSource = factory.createAudioSource(new MediaConstraints());
    remoteAudioTrack = factory.createAudioTrack("5555", remoteAudioSource);

    // Local audio track
    List<String> mediaStreamLabels = Collections.singletonList("ARDAMS");
    peerConnection.addTrack(createAudioTrack(), mediaStreamLabels);
    peerConnection.setAudioRecording(true);
    peerConnection.setAudioPlayout(true);
}

private AudioTrack createAudioTrack() {
    localAudioSource = factory.createAudioSource(new MediaConstraints());
    localAudioTrack = factory.createAudioTrack(AUDIO_TRACK_ID, localAudioSource);
    localAudioTrack.setEnabled(true); // Enable audio
    return localAudioTrack;
}
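
For context, the snippet above never shows how `adm` is built. Below is a minimal sketch of the usual JavaAudioDeviceModule setup, assuming `appContext` is the application context; note that audio capture typically fails silently unless the RECORD_AUDIO runtime permission has been granted.

// Sketch only: typical audio device module setup for the current org.webrtc SDK.
// Requires <uses-permission android:name="android.permission.RECORD_AUDIO" /> to be
// granted at runtime, otherwise the local track usually carries silence.
AudioDeviceModule adm = JavaAudioDeviceModule.builder(appContext)
        .setUseHardwareAcousticEchoCanceler(true)
        .setUseHardwareNoiseSuppressor(true)
        .createAudioDeviceModule();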

Related

Android Ant Media WebRTC: switch camera to screen sharing and vice versa

I'm using the webrtc-android-framework module provided by Ant Media's official website. I was able to make a connection and can see the video published on the other side without any issues. However, I'm unable to switch from the camera to screen sharing.
I'm using the code below to switch from camera capture to screen sharing.
public void MakeScreenCaptureReady() {
    final EglBase.Context eglBaseContext = eglBase.getEglBaseContext();
    PeerConnectionFactory peerConnectionFactory = peerConnectionClient.factory;

    // create AudioSource
    AudioSource audioSource = peerConnectionFactory.createAudioSource(new MediaConstraints());
    this.audioTrack = peerConnectionFactory.createAudioTrack("101", audioSource);

    surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", eglBaseContext);

    // create VideoCapturer
    videoCapturer = createScreenCapturer();
    VideoSource videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
    localVideoTrack = peerConnectionFactory.createVideoTrack("118", videoSource);
    videoCapturer.initialize(surfaceTextureHelper, context, videoSource.getCapturerObserver());
    videoCapturer.startCapture(720, 1280, 30);

    peerConnectionClient.setLocalVideoTrack(localVideoTrack);
    peerConnectionClient.localVideoSender.setTrack(localVideoTrack, true); // true to take ownership and replace the existing track
}
It buffers the screen-sharing video for 2-3 seconds, then stops and throws a source error at the subscriber's end; basically there are no chunks available on the server to buffer further.
I have already obtained the required screen-sharing permission before hitting the above code.
startActivityForResult(
    mMediaProjectionManager!!.createScreenCaptureIntent(),
    SCREEN_RECORD_REQUEST_CODE
)
This is the code I'm using to call the above method in onActivityResult:
intent.putExtra(CallActivity.EXTRA_SCREENCAPTURE, true)
webRTCClient.setMediaProjectionParams(resultCode, data)
webRTCClient.MakeScreenCaptureReady()
How do I achieve switching between camera and screen capture? Any help is much appreciated. Thanks!
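
Not an Ant Media-specific fix, but in plain org.webrtc the usual pattern is to stop the running camera capturer and feed the existing VideoSource from the new screen capturer, rather than creating a second source and track. A rough sketch, assuming `cameraCapturer` and `localVideoSource` are the objects already driving the camera track:

// Sketch only: switch the existing, already-negotiated video track to screen content.
try {
    cameraCapturer.stopCapture(); // stop the camera before starting the screen capturer
} catch (InterruptedException e) {
    Thread.currentThread().interrupt();
}
VideoCapturer screenCapturer = createScreenCapturer(); // same helper as in the question
screenCapturer.initialize(surfaceTextureHelper, context, localVideoSource.getCapturerObserver());
screenCapturer.startCapture(720, 1280, 30);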

Custom audio device in Android WebRTC using libjingle

I am developing a native Android WebRTC client that is supposed to stream audio from a custom device (I am getting the audio stream via Bluetooth from that device). I am using the libjingle library to implement WebRTC, and I wonder if and how it is possible to hook up a custom audio stream to an audio track.
Currently I am adding the default audio track like this:
localMS = factory.createLocalMediaStream("ARDAMS");
AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
localMS.addTrack(factory.createAudioTrack("ARDAMSa0", audioSource));
I saw that there is WebRtcAudioRecord (https://github.com/pristineio/webrtc-android/blob/master/libjingle_peerconnection/src/main/java/org/webrtc/voiceengine/WebRtcAudioRecord.java) - is it possible to override it?
Has anybody tried doing something like that?
Your post led me to the code below. I am trying to send one audio stream to the Watson API and one to WebRTC, but Android only lets a single InputStream read from the microphone; I will update you if I get it to work.
private org.webrtc.MediaStream createMediaStream() {
    org.webrtc.MediaStream mediaStream = mFactory.createLocalMediaStream(ARDAMS);
    if (mEnableVideo) {
        mVideoCapturer = createVideoCapturer();
        if (mVideoCapturer != null) {
            mediaStream.addTrack(createVideoTrack(mVideoCapturer));
        } else {
            mEnableVideo = false;
        }
    }
    if (mEnableAudio) {
        createAudioCapturer();
        mediaStream.addTrack(mFactory.createAudioTrack(
                AUDIO_TRACK_ID,
                mFactory.createAudioSource(mAudioConstraints)));
    }
    return mediaStream;
}

/**
 * Creates an instance of WebRtcAudioRecord.
 */
private void createAudioCapturer() {
    if (mOption.getAudioType() == PeerOption.AudioType.EXTERNAL_RESOURCE) {
        WebRtcAudioRecord.setAudioRecordModuleFactory(new WebRtcAudioRecordModuleFactory() {
            @Override
            public WebRtcAudioRecordModule create() {
                AudioCapturerExternalResource module = new AudioCapturerExternalResource();
                module.setUri(mOption.getAudioUri());
                module.setSampleRate(mOption.getAudioSampleRate());
                module.setBitDepth(mOption.getAudioBitDepth());
                module.setChannel(mOption.getAudioChannel());
                return module;
            }
        });
    } else {
        WebRtcAudioRecord.setAudioRecordModuleFactory(null);
    }
}
Source:
https://www.programcreek.com/java-api-examples/?code=DeviceConnect/DeviceConnect-Android/DeviceConnect-Android-master/dConnectDevicePlugin/dConnectDeviceWebRTC/app/src/main/java/org/deviceconnect/android/deviceplugin/webrtc/core/MediaStream.java

Set min-buffer threshold for HTTP/network playback

In a video player project, I'd like to use LibVLC for HTTP streaming from a slow source.
However, I cannot get it to buffer first and then keep downloading data continuously; the player always stalls in between.
I'm using vlc-android from Git.
This is the media player setup code:
ArrayList<String> options = new ArrayList<>();
options.add("--no-sub-autodetect-file");
options.add("--swscale-mode=0");
options.add("--network-caching=60000");
if (BuildConfig.DEBUG) {
options.add("-vvv"); // verbosity
}
libVLC = new LibVLC(options);
mediaPlayer = new org.videolan.libvlc.MediaPlayer(libVLC);
mediaPlayer.setEventListener(this);
final IVLCVout vout = mediaPlayer.getVLCVout();
vout.setVideoView(videoView);
vout.setSubtitlesView(subtitleView);
vout.addCallback(this);
vout.attachViews();
final Media media = new Media(libVLC, getIntent().getData());
media.setHWDecoderEnabled(true, false);
media.addOption(":network-caching=60000");
media.addOption(":clock-jitter=0");
media.addOption(":clock-synchro=0");
mediaPlayer.setMedia(media);
mediaPlayer.play();
I was hoping that setting :network-caching on the media object would be enough, but it still seems to run out of data the whole time.
How can LibVLC be configured so that the stutter is eliminated? Some buffer time is OK.
The stream type is a MOV file served via HTTP.
Try these LibVLC options:
ArrayList<String> options = new ArrayList<String>();
options.add("--audio-time-stretch"); // time stretching
options.add("-vvv"); // verbosity
options.add("--no-audio"); // no audio
options.add("--aout=none");
options.add("--no-sub-autodetect-file");
options.add("--swscale-mode=0");
options.add("--network-caching=400");
options.add("--no-drop-late-frames");
options.add("--no-skip-frames");
options.add("--avcodec-skip-frame");
options.add("--avcodec-hw=any");
And try these Media options:
Media m = new Media(libvlc, Uri.parse(URL));
m.setHWDecoderEnabled(true, true);
m.addOption(":network-caching=5000");
m.addOption(":clock-jitter=0");
m.addOption(":clock-synchro=0");
m.addOption(":codec=all");
mMediaPlayer.setMedia(m);
mMediaPlayer.play();
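
If raw caching values alone don't help, another approach is to defer playback until LibVLC reports that buffering is complete, using the Buffering event. A rough sketch, assuming the event listener from the question's setup and that the initial play() call is removed:

// Sketch only: start playback once the network cache is reported full.
@Override
public void onEvent(MediaPlayer.Event event) {
    if (event.type == MediaPlayer.Event.Buffering
            && event.getBuffering() >= 100f
            && !mediaPlayer.isPlaying()) {
        mediaPlayer.play();
    }
}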

vlc-android-sdk - cannot view RTSP live video

I've been working on an Android application that shows live streaming video via RTSP.
Assume I have a well-functioning RTSP server that sends H.264 packets, and that to view the stream we connect to rtsp://1.2.3.4:5555/stream.
I first tried the native MediaPlayer/VideoView, but no luck (the video got stuck after 2-3 seconds of playback), so I loaded mrmaffen's vlc-android-sdk (can be found here) and used the following code:
ArrayList<String> options = new ArrayList<String>();
options.add("--no-drop-late-frames");
options.add("--no-skip-frames");
options.add("-vvv");
videoVlc = new LibVLC(options);
newVideoMediaPlayer = new org.videolan.libvlc.MediaPlayer(videoVlc);
final IVLCVout vOut = newVideoMediaPlayer.getVLCVout();
vOut.addCallback(this);
vOut.setVideoView(videoView); //videoView is a pre-defined view which is part of the layout
vOut.attachViews();
newVideoMediaPlayer.setEventListener(this);
Media videoMedia = new Media (videoVlc, Uri.parse(mVideoPath));
newVideoMediaPlayer.setMedia(videoMedia);
newVideoMediaPlayer.play();
The problem is that I see a blank screen.
Keep in mind that when I use an RTSP link with an audio stream only, it works fine.
Is anyone familiar with this SDK and has an idea about this issue?
Thanks in advance.
Try adding this option:
--rtsp-tcp
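
In the setup from the question, that would look like this (either globally or per media):

// As a LibVLC option...
options.add("--rtsp-tcp"); // force RTSP over TCP instead of UDP
videoVlc = new LibVLC(options);
// ...or per Media object:
videoMedia.addOption(":rtsp-tcp");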
I play RTSP streams with the following code:
try {
    Uri rtspUri = Uri.parse("rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov");
    final MediaWrapper mw = new MediaWrapper(rtspUri);
    mw.removeFlags(MediaWrapper.MEDIA_FORCE_AUDIO);
    mw.addFlags(MediaWrapper.MEDIA_VIDEO);
    MediaWrapperListPlayer.getInstance().getMediaList().add(mw);

    VLCInstance.getMainMediaPlayer().setEventListener(this);
    VLCInstance.get().setOnHardwareAccelerationError(this);

    final IVLCVout vlcVout = VLCInstance.getMainMediaPlayer().getVLCVout();
    vlcVout.addCallback(this);
    vlcVout.setVideoView(mSurfaceView);
    vlcVout.attachViews();

    final SharedPreferences pref = PreferenceManager.getDefaultSharedPreferences(this);
    final String aout = VLCOptions.getAout(pref);
    VLCInstance.getMainMediaPlayer().setAudioOutput(aout);

    MediaWrapperListPlayer.getInstance().playIndex(this, 0);
} catch (Exception e) {
    Log.e(TAG, e.toString());
}
When you get the playing event, you need to enable the video track:
private void onPlaying() {
    stopLoadingAnimation();
    VLCInstance.getMainMediaPlayer().setVideoTrackEnabled(true);
}
This may be helpful for you

Android screen sharing using WebRTC

I have heard about screen sharing on desktop using WebRTC, but for Android there doesn't seem to be much information.
My questions are:
1. Is it possible to use WebRTC for screen sharing on Android? I mean casting the current screen to the other phone's screen.
2. If 1 is yes, how can I achieve this?
Thanks.
It is possible!
It can be done using the directions below.
I've used ScreenShareRTC in conjunction with ProjectRTC to stream the contents of the screen to a browser with decent quality and fairly low latency (~100 ms).
I've added an example below that shows how to configure a screen share as a video source and add it as a track on a stream.
Get the VideoCapturer
@TargetApi(21)
private VideoCapturer createScreenCapturer() {
    if (mMediaProjectionPermissionResultCode != Activity.RESULT_OK) {
        report("User didn't give permission to capture the screen.");
        return null;
    }
    return new ScreenCapturerAndroid(
            mMediaProjectionPermissionResultData, new MediaProjection.Callback() {
                @Override
                public void onStop() {
                    report("User revoked permission to capture the screen.");
                }
            });
}
Initialize the capturer and add the tracks to the local media stream
private void initScreenCapturStream() {
    mLocalMediaStream = factory.createLocalMediaStream("ARDAMS");

    MediaConstraints videoConstraints = new MediaConstraints();
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxHeight", Integer.toString(mPeerConnParams.videoHeight)));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxWidth", Integer.toString(mPeerConnParams.videoWidth)));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxFrameRate", Integer.toString(mPeerConnParams.videoFps)));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("minFrameRate", Integer.toString(mPeerConnParams.videoFps)));

    mVideoSource = factory.createVideoSource(videoCapturer);
    videoCapturer.startCapture(mPeerConnParams.videoWidth, mPeerConnParams.videoHeight, mPeerConnParams.videoFps);

    VideoTrack localVideoTrack = factory.createVideoTrack(VIDEO_TRACK_ID, mVideoSource);
    localVideoTrack.setEnabled(true);
    mLocalMediaStream.addTrack(factory.createVideoTrack("ARDAMSv0", mVideoSource));

    AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
    mLocalMediaStream.addTrack(factory.createAudioTrack("ARDAMSa0", audioSource));

    mListener.onStatusChanged("STREAMING");
}
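
For completeness, the mMediaProjectionPermissionResultCode/Data fields used in createScreenCapturer() come from the screen-capture permission flow. A minimal sketch inside the host Activity (the request code value is arbitrary):

// Sketch only: obtain the MediaProjection permission result used by ScreenCapturerAndroid.
private static final int CAPTURE_PERMISSION_REQUEST_CODE = 1; // arbitrary request code

private void requestScreenCapturePermission() {
    MediaProjectionManager manager =
            (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
    startActivityForResult(manager.createScreenCaptureIntent(), CAPTURE_PERMISSION_REQUEST_CODE);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == CAPTURE_PERMISSION_REQUEST_CODE) {
        mMediaProjectionPermissionResultCode = resultCode;
        mMediaProjectionPermissionResultData = data;
    }
}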
For more information, this might be a good place to start. It's an Android project that connects to a ProjectRTC signalling server and shares the screen as video. I found it very helpful!
Android screen sharing project(Android client - Java)
https://github.com/Jeffiano/ScreenShareRTC
ProjectRTC(Node server)
https://github.com/pchab/ProjectRTC
