I'm using AudioContext in an Ionic app to stream raw PCM audio data to a backend. The resulting audio has a lot of stutters when the client is Google Chrome on Android. All other Android browsers work fine (I tried Firefox, Edge and Samsung Internet), and all desktop browsers work fine too, including Google Chrome.
startRecording() {
  navigator.mediaDevices
    .getUserMedia({
      audio: {
        echoCancellation: true,
      },
    })
    .then((s) => {
      this.stream = s;
      this.record();
      this.startRecordingStream(this.stream);
    });
}

startRecordingStream(s) {
  let audioContext = new AudioContext();
  // 2048-sample buffer, 2 input channels, 1 output channel
  this.scriptProcessor = audioContext.createScriptProcessor(2048, 2, 1);
  let input = audioContext.createMediaStreamSource(s);
  input.connect(this.scriptProcessor);
  this.scriptProcessor.connect(audioContext.destination);
  this.scriptProcessor.addEventListener("audioprocess", this.streamAudioData);
}
I tried logging the buffer on the client side; it has zeros scattered throughout it in a periodic pattern.
I have also tested this example https://mozdevs.github.io/MediaRecorder-examples/filter-and-record-live-audio.html on my device; it uses AudioContext and seems to suffer from the same audio stutter in Google Chrome on Android.
Are there any workarounds for this?
Device in question:
DJI RC Pro Enterprise, used for controlling enterprise DJI drones
The OS is Android 10, probably running stock Android, as there are no Google Play services on the device
When establishing a WebRTC session as a master from the Android 10 mobile device, the audio that should be captured via the microphone is not getting through to the viewer side. When configuring the transceiver on the viewer side, I enforce both send and receive for the audio part via:
viewer.peerConnection.addTransceiver('audio', {direction: 'sendrecv'});
Also, looking at the chrome://webrtc-internals tab, it clearly shows (in the inbound-rtp stats for audio) that the audio channel is open and a small amount of data is coming through. But we can clearly spot that:
Bytes received (in bits/s) is only around 3k, whereas another Android device that runs a newer Android version, and whose microphone sound actually comes through on the viewer side, hits around 30k bits/s.
Audio level stays at 0 regardless of how loud I speak into the microphone.
Here are also the inbound-rtp stats in text form:
inbound-rtp (kind=audio, mid=4, ssrc=3185575416, [codec]=opus (111, minptime=10;useinbandfec=1), id=IT31A3185575416)
Statistics IT31A3185575416
timestamp 2/16/2023, 2:51:44 PM
ssrc 3185575416
kind audio
trackId DEPRECATED_TI8
transportId T31
codecId CIT31_111_minptime=10;useinbandfec=1
[codec] opus (111, minptime=10;useinbandfec=1)
mediaType audio
jitter 0.063
packetsLost 0
trackIdentifier efe06737-ed24-448c-bf32-d002cef9171b
mid 4
packetsReceived 170
[packetsReceived/s] 13.550688631363338
packetsDiscarded 0
fecPacketsReceived 0
fecPacketsDiscarded 0
bytesReceived 5524
[bytesReceived_in_bits/s] 3523.179044154468
headerBytesReceived 4760
[headerBytesReceived_in_bits/s] 3035.354253425388
lastPacketReceivedTimestamp 1676555504833
[lastPacketReceivedTimestamp] 2/16/2023, 2:51:44 PM
jitterBufferDelay 77020.8
[jitterBufferDelay/jitterBufferEmittedCount_in_ms] 0
jitterBufferTargetDelay 270316.8
jitterBufferMinimumDelay 270240
jitterBufferEmittedCount 153600
totalSamplesReceived 665760
[totalSamplesReceived/s] 48782.47907290802
concealedSamples 505242
[concealedSamples/s] 48782.47907290802
[concealedSamples/totalSamplesReceived] 1
silentConcealedSamples 482592
[silentConcealedSamples/s] 48782.47907290802
concealmentEvents 14
insertedSamplesForDeceleration 6960
[insertedSamplesForDeceleration/s] 0
removedSamplesForAcceleration 0
[removedSamplesForAcceleration/s] 0
audioLevel 0
totalAudioEnergy 0
[Audio_Level_in_RMS] 0
totalSamplesDuration 13.869999999999749
jitterBufferFlushes* 2
delayedPacketOutageSamples* 452322
relativePacketArrivalDelay* 718.31
interruptionCount* 10
totalInterruptionDuration* 9.197
remoteId ROA3185575416
We have confirmed that the microphone works on this device, both with a third-party microphone test app and simply by doing a screen recording. We've also made sure on the master side that the correct mode is set and that the microphone is not muted via:
audioManager.requestAudioFocus(null, AudioManager.STREAM_VOICE_CALL,
AudioManager.AUDIOFOCUS_GAIN_TRANSIENT);
// Start by setting MODE_IN_COMMUNICATION as default audio mode. It is
// required to be in this mode when playout and/or recording starts for
// best possible VoIP performance.
audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
audioManager.setMicrophoneMute(false);
Also, using this method to check whether the microphone is available, just before the streaming starts, returns true:
public static boolean getMicrophoneAvailable(Context context) {
    // Try to actually start a short recording; prepare()/start() will throw
    // if the microphone is busy or unavailable.
    MediaRecorder recorder = new MediaRecorder();
    recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
    recorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
    recorder.setOutputFile(new File(context.getCacheDir(), "MediaUtil#micAvailTestFile").getAbsolutePath());
    boolean available = true;
    try {
        recorder.prepare();
        recorder.start();
    } catch (Exception exception) {
        available = false;
    }
    recorder.release();
    return available;
}
The questions are therefore the following:
Does running stock Android without Google Play services somehow impact the usage of the microphone in a WebRTC streaming session? I'm basing this assumption on the fact that Google is the main developer behind WebRTC, and perhaps they've built some features around their Google Play services library.
Given that the microphone is clearly available, could I manually start the recording and send the audio bits via the open WebRTC audio channel?
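Regarding the second question, with the standard org.webrtc Android API the microphone is normally captured by the audio device module inside the PeerConnectionFactory rather than fed in by hand. A minimal sketch of how the master's audio track is usually created and attached (the method name attachMicrophoneTrack and the track id "audio0" are illustrative, not from our code base):

import org.webrtc.AudioSource;
import org.webrtc.AudioTrack;
import org.webrtc.MediaConstraints;
import org.webrtc.PeerConnection;
import org.webrtc.PeerConnectionFactory;

// Sketch only: assumes an already-initialized factory and peer connection.
void attachMicrophoneTrack(PeerConnectionFactory factory, PeerConnection peerConnection) {
    // The audio device module inside the factory does the actual recording;
    // the AudioSource/AudioTrack pair just represents that capture in the API.
    AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
    AudioTrack localAudioTrack = factory.createAudioTrack("audio0", audioSource);
    localAudioTrack.setEnabled(true);
    peerConnection.addTrack(localAudioTrack);
}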
I'm trying to use the native WebRTC SDK (libjingle) for Android. So far I can send streams from Android to the web (or other platforms) just fine. I can also receive the MediaStream from a peer (in the onAddStream callback).
The project I'm working on requires only audio streams; no video tracks are created or sent to anyone.
My question is: how do I play the MediaStream object that I get from remote peers?
@Override
public void onAddStream(MediaStream mediaStream) {
    Log.d(TAG, "onAddStream: got remote stream");
    // Need to play the audio //
}
Again, the question is about audio. I'm not using video.
Apparently all the native WebRTC examples use video tracks, so I had no luck finding any documentation or examples on the web.
Thanks in advance!
We can get the remote audio track using the code below:
import org.webrtc.AudioTrack;

@Override
public void onAddStream(final MediaStream stream) {
    if (stream.audioTracks.size() > 0) {
        remoteAudioTrack = stream.audioTracks.get(0);
    }
}
Apparently all the native WebRTC examples use video tracks, so I had no luck finding any documentation or examples on the web.
Yes, as app developers we only have to take care of video rendering.
If we have received a remote audio track, by default it will play through the default output (ear speaker/loudspeaker/wired headset) based on proximity settings.
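If you do need to control that playback (for example, mute it or change its gain), the reference captured in onAddStream is enough. A small sketch, assuming the remoteAudioTrack field from the snippet above (the helper method names here are illustrative):

// Sketch: remoteAudioTrack is the org.webrtc.AudioTrack saved in onAddStream.
void setRemoteAudioMuted(boolean muted) {
    if (remoteAudioTrack != null) {
        remoteAudioTrack.setEnabled(!muted); // a disabled track is played back as silence
    }
}

void setRemoteAudioVolume(double volume) {
    if (remoteAudioTrack != null) {
        remoteAudioTrack.setVolume(volume); // gain for the underlying source; 1.0 is nominal
    }
}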
Check the code below from AppRTCAudioManager.java to enable/disable the speaker:
/** Sets the speaker phone mode. */
private void setSpeakerphoneOn(boolean on) {
    boolean wasOn = audioManager.isSpeakerphoneOn();
    if (wasOn == on) {
        return;
    }
    audioManager.setSpeakerphoneOn(on);
}
Reference Source: AppRTCAudioManager.java
I am currently using an app that uses the method exemplified in libstreaming-example-1 (libstreaming) to stream the camera from an Android device to an Ubuntu server (using OpenCV and libVLC). This way, my Android device acts as a server and waits for the client (the Ubuntu server) to send the play signal over RTSP, and then starts the streaming over UDP.
The problem I am facing is that I am getting a delay of approximately 1.1 s during the transmission, and I want to get it down to 150 ms maximum.
I tried to implement libstreaming-example-2 from libstreaming-examples, but I couldn't: I don't have access to detailed documentation and couldn't figure out how to send the right signal to display the stream on my server. Other than that, I have been trying to see what I can do with example 1 to get the delay down, but nothing has worked so far.
PS: I am using a LAN, so network/bandwidth is not the problem.
Here come the questions:
Which way is the best to get the lowest latency possible while streaming video from the camera?
How can I implement example 2?
Is the example-2 method of streaming better for getting the latency down to 150 ms?
Is this latency related to the decompression of the video on the server side? (No frames are dropped; FPS: 30)
Thank you!
I had the same issue as you, with a huge stream delay (around 1.5-1.6 sec).
My setup is an Android device which streams its camera over RTSP using libstreaming; the receiving side is an Android device using libVLC as the media player. I found a solution to decrease the delay to 250-300 ms. It was achieved by setting up libVLC with the following parameters.
mLibvlc = new LibVLC();
mLibvlc.setVout(LibVLC.VOUT_ANDROID_WINDOW);
mLibvlc.setDevHardwareDecoder(LibVLC.DEV_HW_DECODER_AUTOMATIC);
mLibvlc.setHardwareAcceleration(LibVLC.HW_ACCELERATION_DISABLED);
mLibvlc.setNetworkCaching(150);
mLibvlc.setFrameSkip(true);
mLibvlc.setChroma("YV12");
restartPlayer();
private void restartPlayer() {
    if (mLibvlc != null) {
        try {
            mLibvlc.destroy();
            mLibvlc.init(this);
        } catch (LibVlcException lve) {
            throw new IllegalStateException("LibVLC initialisation failed: " + LibVlcUtil.getErrorMsg());
        }
    }
}
You can play with setNetworkCaching(int networkCaching) to customize the delay a bit.
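For example (the value is only a starting point to experiment with):

mLibvlc.setNetworkCaching(200); // milliseconds of buffering: lower means less delay but more risk of stutter
restartPlayer();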
Please let me know if this was helpful for you, or if you found a better solution with this or another environment.
I have a website/webapp which is highly dependent on jPlayer (audio playback only).
I have different audios and audio live streams on the webapp.
I am using the "preload" option of jPlayer and have tested it with both the "metadata" and "auto" options.
On an iOS device I am able to preload the audio, and it starts playing as soon as I hit play, but on Android tablets and phones using Chrome I am not able to preload the audio, as the request gets cancelled.
When I click play, the audio first starts buffering and only then plays, causing a delay between clicking play and actually hearing the audio.
I also tried changing the jPlayer "solution" from 'html, flash' to 'flash, html'. The wmode option is also set to 'window'.
The same code works fine on iOS and desktop, but causes a delay in Chrome on Android phones/tablets.
$("#jquery_jplayer_1").jPlayer({
ready: function(event) {
$(this).jPlayer("setMedia", {
wav: c,
mp3: c,
m4a: c,
}).jPlayer('play').jPlayer('stop');
$(this).jPlayer("volume", 0.9);
}, cssSelector: {
mute: '.jp-mute',
unmute: '.jp-unmute',
},
solution: "flash, html",
swfPath: "_/js/_lib/",
supplied: "mp3, m4a, wav, oga",
wmode: "window",
preload: "metadata",
nativeVideoControls: true,
volumechange: function(event) {
if (event.jPlayer.options.muted) {
$(".jp-volume-bar").slider("value", 0);
} else {
$(".jp-volume-bar").slider("value", event.jPlayer.options.volume);
}
}});
Although it is not intended for this, you can achieve preloading by using the browser application cache. Create a manifest that declares your audio file URLs as part of the 'application'.
https://developer.mozilla.org/en-US/docs/Web/HTML/Using_the_application_cache
http://alistapart.com/article/application-cache-is-a-douchebag
If your audio files are not hosted on your own server, make sure you do not use HTTPS: files from other origins do not get cached in that case...
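As an illustration only (the file names below are placeholders), a minimal manifest referenced from the page via <html manifest="audio.appcache"> could look like this:

CACHE MANIFEST
# v1 - bump this comment to force the browser to re-download the cached files

# Audio URLs you want fetched up front:
audio/clip1.mp3
audio/clip1.m4a

NETWORK:
*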
I'm working on an app to stream audio from SHOUTcast. The streaming works well on my computer and when debugging with an Android configuration.
But when I deploy it to an APK file and run it on a Samsung Galaxy S, the streaming doesn't work.
My code for streaming the audio:
var urlRequest:URLRequest = new URLRequest("http://.../;");
var soundContext:SoundLoaderContext = new SoundLoaderContext(2000, true);
soundChannel = new SoundChannel();
sound = new Sound(urlRequest, soundContext);
soundChannel = sound.play();
I've also tried adding an addEventListener(Event.COMPLETE); it seems my app never gets there when running on the Samsung Galaxy.
Please check that Internet access is enabled in the application descriptor file:
http://help.adobe.com/en_US/air/build/WSfffb011ac560372f-5d0f4f25128cc9cd0cb-7ffc.html
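In an AIR application descriptor, the Android INTERNET permission is declared inside the manifestAdditions block; a minimal sketch (to be merged into your existing descriptor, not used verbatim):

<android>
    <manifestAdditions><![CDATA[
        <manifest>
            <uses-permission android:name="android.permission.INTERNET"/>
        </manifest>
    ]]></manifestAdditions>
</android>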
Refer also to:
http://cookbooks.adobe.com/post_Playing_mp3_files_on_Flex_Mobile-19106.html