I want to be able to enable/disable sound during a call in an Android client that uses WebRTC.
I tried to do it like this:
LinkedList<org.webrtc.AudioTrack> tracks = audioStream.audioTracks;
for (int i = 0; i < tracks.size(); i++) {
    Log.d(TAG, "track: " + i);
    tracks.get(i).setEnabled(false);
}
But this doesn't work.
Does anyone know how to do it?
It seems this approach doesn't work in a native app.
Here is an example of enabling/disabling audio tracks based on the PeerConnection.
private void createHold(boolean action, int connectionId) {
    for (RtpReceiver rtpReceiver : peerConnections[connectionId].getReceivers()) {
        rtpReceiver.track().setEnabled(action);
    }
    for (RtpSender rtpSender : peerConnections[connectionId].getSenders()) {
        rtpSender.track().setEnabled(action);
    }
}
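For example, a hold/resume toggle built on that helper might look like this (just a sketch; connectionId is whatever index you use for the peerConnections array above, and it assumes every sender and receiver actually has a track):

// Sketch: disable all sent and received tracks to go on hold,
// re-enable them to resume, using the createHold() helper above.
createHold(false, connectionId);  // hold / mute everything
// ...
createHold(true, connectionId);   // resume / unmute

If you only want to mute the local microphone, it is enough to iterate over getSenders() and disable just the audio tracks.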
I can't find much documentation on how to get all of the media tracks (video, audio, and subtitles) using libVLC on Android.
From what I understand, I have to parse the media, and I'm doing it like this:
Media media = new Media(libVLC, Uri.parse(url));
media.setEventListener(new IMedia.EventListener() {
    @Override
    public void onEvent(IMedia.Event event) {
        switch (event.type) {
            case IMedia.Event.ParsedChanged:
                if (event.getParsedStatus() == IMedia.ParsedStatus.Done) {
                    Log.i("App", "Parse done, track count " + media.getTrackCount());
                    Gson gson = new Gson();
                    for (int i = 0; i < media.getTrackCount(); i++) {
                        Log.i("App", "Track " + i + ": " + gson.toJson(media.getTrack(i)));
                    }
                }
                break;
        }
    }
});
media.parseAsync();
vlc.setMedia(media);
vlc.play();
The results I get from this are odd: sometimes I get one track only, the video track, but sometimes I also get the audio track, so two tracks total.
The problem is that the media also has a subtitle track, so there must be a way for me to get all three tracks (playing the exact same media with VLC on Windows does show all three tracks).
What am I doing wrong?
Edit: I need a way to dynamically get all tracks; the media could have n tracks, so I don't know the exact number in advance. This is just a test where I know there are three tracks.
Thanks
If you are not able to get the tracks from the Media object, use the VLC MediaPlayer object instead; it provides methods to get the audio, video, and subtitle tracks.
mMediaPlayer!!.setEventListener { event ->
    when (event.type) {
        MediaPlayer.Event.Opening -> {
            val audioTracks = mMediaPlayer!!.audioTracks
            val subtitleTracks = mMediaPlayer!!.spuTracks
            val videoTracks = mMediaPlayer!!.videoTracks
        }
    }
}
You can iterate over the lists to get individual tracks.
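For example, listing the audio tracks might look roughly like this (a Java sketch, assuming libVLC 3.x where these getters return MediaPlayer.TrackDescription arrays with public id and name fields; the same calls are available from Kotlin):

// Sketch: log every audio track reported by the MediaPlayer.
MediaPlayer.TrackDescription[] audioTracks = mMediaPlayer.getAudioTracks();
if (audioTracks != null) {
    for (MediaPlayer.TrackDescription track : audioTracks) {
        Log.i("App", "Audio track " + track.id + ": " + track.name);
    }
}

The id can then be passed to setAudioTrack(id) (or setSpuTrack(id) for subtitles) to select a track.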
I am developing a native Android WebRTC client that is supposed to stream audio from a custom device (I am getting the audio stream via Bluetooth from that device). I am using the libjingle library to implement WebRTC, and I wonder if and how it is possible to hook up a custom audio stream to the audio track.
Currently I am adding the default audio track like this:
localMS = factory.createLocalMediaStream("ARDAMS");
AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
localMS.addTrack(factory.createAudioTrack("ARDAMSa0", audioSource));
I saw that there is WebRtcAudioRecord (https://github.com/pristineio/webrtc-android/blob/master/libjingle_peerconnection/src/main/java/org/webrtc/voiceengine/WebRtcAudioRecord.java) - is it possible to override it?
Has anybody tried doing something like that?
Your post led me to the code below. I am going to try it and let you know if I get it to work. I am trying to send one audio stream to the Watson API and one to WebRTC, but Android only allows one InputStream to read from the microphone. I will update you if I get it to work.
private org.webrtc.MediaStream createMediaStream() {
    org.webrtc.MediaStream mediaStream = mFactory.createLocalMediaStream(ARDAMS);
    if (mEnableVideo) {
        mVideoCapturer = createVideoCapturer();
        if (mVideoCapturer != null) {
            mediaStream.addTrack(createVideoTrack(mVideoCapturer));
        } else {
            mEnableVideo = false;
        }
    }
    if (mEnableAudio) {
        createAudioCapturer();
        mediaStream.addTrack(mFactory.createAudioTrack(
                AUDIO_TRACK_ID,
                mFactory.createAudioSource(mAudioConstraints)));
    }
    return mediaStream;
}
/**
 * Creates an instance of WebRtcAudioRecord.
 */
private void createAudioCapturer() {
    if (mOption.getAudioType() == PeerOption.AudioType.EXTERNAL_RESOURCE) {
        WebRtcAudioRecord.setAudioRecordModuleFactory(new WebRtcAudioRecordModuleFactory() {
            @Override
            public WebRtcAudioRecordModule create() {
                AudioCapturerExternalResource module = new AudioCapturerExternalResource();
                module.setUri(mOption.getAudioUri());
                module.setSampleRate(mOption.getAudioSampleRate());
                module.setBitDepth(mOption.getAudioBitDepth());
                module.setChannel(mOption.getAudioChannel());
                return module;
            }
        });
    } else {
        WebRtcAudioRecord.setAudioRecordModuleFactory(null);
    }
}
Source:
https://www.programcreek.com/java-api-examples/?code=DeviceConnect/DeviceConnect-Android/DeviceConnect-Android-master/dConnectDevicePlugin/dConnectDeviceWebRTC/app/src/main/java/org/deviceconnect/android/deviceplugin/webrtc/core/MediaStream.java
I have a receiver app (V2) that works fine when you show the first video, but when you go to show a second video I get this:
[cast.receiver.platform.WebSocket] PlatformChannel Already open
I am unloading and loading the player each time. I can't see any way to explicitly ask the PlatformChannel to close. Here's the relevant code from the function that starts play:
this.receiverManager.start()
this.host = new cast.player.api.Host({'mediaElement': this.refs.video, 'url': source})
this.host.onError = function(errorCode) {
    console.log("Fatal Error - " + errorCode)
    if (window.player) {
        window.player.unload()
        window.player = null
    }
}
this.host.updateSegmentRequestInfo = function(requestInfo) {
    requestInfo.withCredentials = false;
}
if (!window.player) {
    window.player = new cast.player.api.Player(this.host)
}
this.receiverManager.setApplicationState('Ready To Cast');
this.protocol = cast.player.api.CreateDashStreamingProtocol(this.host)
window.player.load(this.protocol, 0)
We highly recommend that you move to a CAF receiver. Also, CAF has a new queueing API that will handle a playlist of videos.
We have a native Android app that uses WebRTC, and we need to find out what video codecs are supported by the host device. (VP8 is always supported but H.264 is subject to the device having a compatible chipset.)
The idea is to create an offer and get the supported video codecs from the SDP. We can do this in a web app as follows:
const pc = new RTCPeerConnection();
if (pc.addTransceiver) {
    pc.addTransceiver('video');
    pc.addTransceiver('audio');
}
pc.createOffer(...);
Is there a way to do something similar on Android? It's important that we don't need to request camera access to create the offer.
Create a VideoEncoderFactory object and call getSupportedCodecs(). This will return a list of codecs that can be used. Be sure to initialize the PeerConnectionFactory first.
PeerConnectionFactory.InitializationOptions initializationOptions =
        PeerConnectionFactory.InitializationOptions.builder(this)
                .setEnableVideoHwAcceleration(true)
                .createInitializationOptions();
PeerConnectionFactory.initialize(initializationOptions);

VideoEncoderFactory videoEncoderFactory =
        new DefaultVideoEncoderFactory(eglBase.getEglBaseContext(), true, true);

for (int i = 0; i < videoEncoderFactory.getSupportedCodecs().length; i++) {
    Log.d("Codecs", "Supported codecs: " + videoEncoderFactory.getSupportedCodecs()[i].name);
}
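Since the question was specifically about H.264, the returned list can also be checked for it directly (a small sketch building on the snippet above; it assumes the codec name is reported as "H264", which is how the org.webrtc VideoCodecInfo entries are typically named):

// Sketch: look for an H.264 entry in the supported encoder list.
boolean h264Supported = false;
for (VideoCodecInfo codecInfo : videoEncoderFactory.getSupportedCodecs()) {
    if ("H264".equalsIgnoreCase(codecInfo.name)) {
        h264Supported = true;
        break;
    }
}
Log.d("Codecs", "H.264 encoder supported: " + h264Supported);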
I think this is what you are looking for:
private static void codecs() {
    MediaCodecInfo[] codecInfos = new MediaCodecList(MediaCodecList.ALL_CODECS).getCodecInfos();
    for (MediaCodecInfo codecInfo : codecInfos) {
        Log.i("Codec", codecInfo.getName());
        for (String supportedType : codecInfo.getSupportedTypes()) {
            Log.i("Codec", supportedType);
        }
    }
}
You can check the example at https://developer.android.com/reference/android/media/MediaCodecInfo.html
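If you only care about whether the device exposes an H.264 encoder, you can filter that list by MIME type, for example (a sketch using the standard MediaCodecList/MediaCodecInfo APIs; note this reports MediaCodec support, which is a reasonable proxy but not exactly the same as what WebRTC's encoder factory will choose):

// Sketch: list only encoders that advertise H.264 ("video/avc") support.
MediaCodecInfo[] codecInfos = new MediaCodecList(MediaCodecList.ALL_CODECS).getCodecInfos();
for (MediaCodecInfo codecInfo : codecInfos) {
    if (!codecInfo.isEncoder()) {
        continue;
    }
    for (String type : codecInfo.getSupportedTypes()) {
        if (type.equalsIgnoreCase("video/avc")) {
            Log.i("Codec", "H.264 encoder: " + codecInfo.getName());
        }
    }
}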
After struggling for a few hours to make my app detect this QR code:
I realized that the problem was in the QR code's appearance. After inverting the colors, the detection worked perfectly.
Is there a way to make the Vision API detect the first QR code? I tried enabling all symbologies, but it did not work. I guess it is possible, because the QR Code Reader app detects it.
I improved Google's example app "barcode-reader" to detect both color-inverted barcodes and regular ones.
Here is a link to Google's example app:
https://github.com/googlesamples/android-vision/tree/master/visionSamples/barcode-reader
I did so by editing the "CameraSource" class in the package "com.google.android.gms.samples.vision.barcodereader.ui.camera".
I added a field: private boolean isInverted = false;
and changed the function void setNextFrame(byte[] data, Camera camera):
void setNextFrame(byte[] data, Camera camera) {
    synchronized (mLock) {
        if (mPendingFrameData != null) {
            camera.addCallbackBuffer(mPendingFrameData.array());
            mPendingFrameData = null;
        }

        if (!mBytesToByteBuffer.containsKey(data)) {
            Log.d(TAG,
                    "Skipping frame. Could not find ByteBuffer associated with the image " +
                    "data from the camera.");
            return;
        }

        mPendingTimeMillis = SystemClock.elapsedRealtime() - mStartTimeMillis;
        mPendingFrameId++;

        // Invert the frame bytes on every other frame, so regular codes are detected
        // on one frame and color-inverted codes on the next.
        if (!isInverted) {
            for (int y = 0; y < data.length; y++) {
                data[y] = (byte) ~data[y];
            }
            isInverted = true;
        } else {
            isInverted = false;
        }

        mPendingFrameData = mBytesToByteBuffer.get(data);

        // Notify the processor thread if it is waiting on the next frame (see below).
        mLock.notifyAll();
    }
}
I think this is still an open issue; please see the link for details. One workaround, as stated by a developer:
Right, the barcode API generally doesn't support color-inverted codes. There's no parameter or option to control this at the moment. Though some APIs support them, I don't believe it's a common feature.
For a workaround, you could preprocess the colors in the bitmap before passing them to the barcode API (perhaps inverting colors on alternate frames).
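A rough sketch of that preprocessing idea, assuming you already have the frame as a Bitmap and are using the com.google.android.gms.vision Frame/BarcodeDetector API (the ColorMatrix values here are just one way to invert the colors):

// Sketch: invert a bitmap's colors before handing it to the barcode detector.
private static Bitmap invertBitmap(Bitmap src) {
    Bitmap inverted = Bitmap.createBitmap(src.getWidth(), src.getHeight(), Bitmap.Config.ARGB_8888);
    ColorMatrix matrix = new ColorMatrix(new float[] {
            -1,  0,  0, 0, 255,   // red   -> 255 - red
             0, -1,  0, 0, 255,   // green -> 255 - green
             0,  0, -1, 0, 255,   // blue  -> 255 - blue
             0,  0,  0, 1,   0    // alpha unchanged
    });
    Paint paint = new Paint();
    paint.setColorFilter(new ColorMatrixColorFilter(matrix));
    new Canvas(inverted).drawBitmap(src, 0, 0, paint);
    return inverted;
}

// Usage: detect on the inverted frame (or alternate between original and inverted frames).
// Frame frame = new Frame.Builder().setBitmap(invertBitmap(bitmap)).build();
// SparseArray<Barcode> barcodes = barcodeDetector.detect(frame);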
Hope this helps.