Android Pie 9.0 not playing the audio

I am getting the following errors when I run my app on Android 9.0. It does not play the mp3 audio on 9.0, but it works fine on lower versions.
W/AudioTrack( 5492): Use of stream types is deprecated for operations other than volume control
W/AudioTrack( 5492): See the documentation of AudioTrack() for what to use instead with android.media.AudioAttributes to qualify your playback use case
W/MediaPlayer( 5492): Couldn't open http://myDomain/Intro_1543483066.mp3: java.io.FileNotFoundException: No content provider: http://myDomain/Intro_1543483066.mp3

It may be related to how you request audio focus: can you share the code you use?
Here is a working sample of how to do it on Android O and above:
var audioAttributes = new AudioAttributes.Builder()
    .SetLegacyStreamType(Stream.Music)
    .Build();
var focusRequest = new AudioFocusRequestClass.Builder(AudioFocus.Gain)
    .SetAudioAttributes(audioAttributes)
    .SetOnAudioFocusChangeListener(this)
    .Build();
// audioManager is your AudioManager system-service instance.
audioManager.RequestAudioFocus(focusRequest);
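For what it's worth, the "No content provider" FileNotFoundException is often harmless by itself: MediaPlayer probes the ContentResolver before falling back to the network. The more likely culprit on Android 9 is that cleartext HTTP is blocked by default from API 28, so the http:// URL is never fetched. Below is a minimal Kotlin sketch of both fixes, assuming a plain MediaPlayer; setting AudioAttributes also silences the deprecated stream-type warnings.
import android.media.AudioAttributes
import android.media.MediaPlayer

// AndroidManifest.xml must permit cleartext HTTP (or serve the file over HTTPS):
// <application android:usesCleartextTraffic="true" ...>

val player = MediaPlayer().apply {
    // Qualify the playback use case instead of using a deprecated stream type.
    setAudioAttributes(
        AudioAttributes.Builder()
            .setUsage(AudioAttributes.USAGE_MEDIA)
            .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
            .build()
    )
    setDataSource("http://myDomain/Intro_1543483066.mp3")
    setOnPreparedListener { mp -> mp.start() }
    prepareAsync()  // network source: prepare asynchronously
}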

Related

AudioPlaybackCaptureConfiguration not working on an Android TV box (using HDMI)

I am using the Google Audio Playback Capture API on Android API level 29. It works fine on Android mobile devices: I can easily capture internal audio. But this app is a TV box app where I have to capture internal audio, and when I test it on an Android TV box connected over HDMI, I don't receive any sound. I am using the usage attributes USAGE_MEDIA, USAGE_UNKNOWN, and USAGE_GAME; they all work fine in the mobile app but not when I run on the Android box. I have done everything from my side; now I just need help getting audio from HDMI.
Thanks
// Capture playback tagged with these usages; requires a user-granted MediaProjection.
val config = AudioPlaybackCaptureConfiguration.Builder(AudioCaptureService.mediaProjection!!)
    .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
    .addMatchingUsage(AudioAttributes.USAGE_UNKNOWN)
    .addMatchingUsage(AudioAttributes.USAGE_GAME)
    .build()
// 16-bit mono PCM at the source's sample rate.
val audioFormat = AudioFormat.Builder()
    .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
    .setSampleRate(buffer.sampleRate)
    .setChannelMask(AudioFormat.CHANNEL_IN_MONO)
    .build()
val record = AudioRecord.Builder()
    .setAudioFormat(audioFormat)
    .setAudioPlaybackCaptureConfig(config)
    .setBufferSizeInBytes(buffer.size)
    .build()
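For context, here is a minimal sketch of the read loop that usually follows this setup, assuming the MediaProjection was already granted via MediaProjectionManager.createScreenCaptureIntent() and that isCapturing is your own stop flag. Note that whether audio routed to HDMI is visible to playback capture depends on the vendor's audio HAL routing, which may be why the same code is silent on the box:
val pcm = ShortArray(1024)
record.startRecording()
while (isCapturing) {
    // read() returns the number of PCM samples actually delivered
    val read = record.read(pcm, 0, pcm.size)
    if (read > 0) {
        // hand `read` samples to your file writer or encoder
    }
}
record.stop()
record.release()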

Recorded AAC audio in Android does not play on iOS devices

AAC audio recorded on Android does not play on iOS devices. Here is a code example of how we are recording audio on Android. When we try to play that audio on an iOS device using react-native-sound, it fails with the error below.
Failed to load the sound
The iOS exception reads as follows.
Object {
  code: "ENSOSSTATUSERRORDOMAIN1937337955",
  domain: "NSOSStatusErrorDomain",
  message: "The operation couldn't be completed. (OSStatus error 1937337955.)",
  nativeStackIOS: Array[17],
  userInfo: Object,
  __proto__: Object
}
AudioRecorder.prepareRecordingAtPath('/path/to/audio/test.aac', {
SampleRate: 22050,
Channels: 1,
AudioQuality: "Low",
AudioEncoding: "aac",
AudioEncodingBitRate: 32000
});
We are using RN 0.41 and react-native-audio 3.2.1.
Did anyone face a similar issue? Let me know if there is any misconfiguration.
I had the same error message. My solution was to add OutputFormat: 'aac_adts'.
AudioRecorder.prepareRecordingAtPath(completePath, {
SampleRate: 22050,
Channels: 1,
AudioQuality: "Low",
AudioEncoding: "aac",
OutputFormat: 'aac_adts'
})
OutputFormat: 'aac_adts' is only needed on Android. It overrides the default OutputFormat: 'mpeg_4'.
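For readers working outside React Native: react-native-audio drives Android's MediaRecorder under the hood, so the fix corresponds roughly to the native Kotlin sketch below (the output path is illustrative; AAC_ADTS requires API 16+). ADTS framing gives each AAC frame its own header, producing a raw .aac stream that iOS players can parse:
import android.media.MediaRecorder

val recorder = MediaRecorder()
recorder.setAudioSource(MediaRecorder.AudioSource.MIC)
recorder.setOutputFormat(MediaRecorder.OutputFormat.AAC_ADTS)  // instead of MPEG_4
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
recorder.setAudioSamplingRate(22050)
recorder.setAudioChannels(1)
recorder.setAudioEncodingBitRate(32000)
recorder.setOutputFile("/path/to/audio/test.aac")  // illustrative path
recorder.prepare()
recorder.start()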

How can we distinguish between audio and video under Android AudioFlinger

We can distinguish between audio and video when we use the standard Android API to implement an APK that plays music or movies, whether under libaudioflinger or in the decoder's lib. When decoding audio/video in awesomeplayer.cpp, we can judge the source data's type: audio or video. We can also distinguish the app's type under libaudioflinger by using getCallingPid().
Question:
How can we distinguish a third party's data source type (audio? video?) under AudioFlinger?
Yes, AudioFlinger processes the PCM data.
However, if you want to set some parameters from the application, you can use AudioManager's setParameters API and then add handling for that parameter in AudioFlinger.
AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
// Parameters are semicolon-separated "key=value" pairs; the keys themselves
// are platform-specific (the ones below are placeholders).
am.setParameters("key1=value1;key2=value2");

RTSP 1080p live-streaming android client gets error (100,0)

My new surveillance camera just arrived, so I'm trying to write an app to live stream the video from it.
Since it came with basically no documentation, I installed the 'onvifer' Android app, which allows you to browse the camera's capabilities. This app works fine - gets the video and allows PTZ controls, etc. It reports the streaming URL as:
rtsp://192.1.0.193:554/mpeg4
I tested the stream in the VLC Windows client, and it is able to stream video from that URL as well. This makes me comfortable that the network is working OK.
The camera states the feed will be 1920x1080; VLC confirms this.
The basic code in my activity:
VideoView videoView = (VideoView)this.findViewById(R.id.VideoView);
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4"));
videoView.requestFocus();
videoView.start();
I've also given the app INTERNET permissions in AndroidManifest.xml, disabled authentication on the camera, and am running on a real device (not the emulator).
When I run the app, LogCat shows this immediately:
setDataSource IOException happend :
java.io.FileNotFoundException: No content provider: rtsp://192.1.0.193:554/mpeg4
at android.content.ContentResolver.openTypedAssetFileDescriptor (ContentResolver.java).
About 15 seconds later, the app shows a "Can't play this video" modal dialog box and this is added to LogCat:
MediaPlayer error (100, 0)
AudioSystem AudioFlinger server died!
MediaPlayer error (100, 0)
VideoView Error: 100,0
I've googled everything I can think of, but haven't found anything useful.
Any thoughts?
Wild-ass guess based on your logcat and the RC=100: no SDP file, or no RTSP equivalent of the 'moov atom' block required to negotiate details of the stream / container / codec / format. You can get the AOSP code for MediaPlayer/VideoView and grep for the RC value in the source.
RTSP is gnarly to debug (note the tools links) and not assured to run inside a NAT'd network due to UDP issues. So, to get a better result, you may have to look into forcing your config to do the data channel over TCP and not UDP. Or it could be other issues, of which there are many.
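One concrete way to force TCP, if you are willing to swap players (this is not from the answer above; it assumes the com.google.android.exoplayer:exoplayer-rtsp artifact), is ExoPlayer's RTSP media source, which can interleave RTP over the RTSP TCP connection:
import com.google.android.exoplayer2.ExoPlayer
import com.google.android.exoplayer2.MediaItem
import com.google.android.exoplayer2.source.rtsp.RtspMediaSource

// context: your Activity or Application context.
val mediaSource = RtspMediaSource.Factory()
    .setForceUseRtpTcp(true)  // sidesteps the NAT/UDP issues described above
    .createMediaSource(MediaItem.fromUri("rtsp://192.1.0.193:554/mpeg4"))
val player = ExoPlayer.Builder(context).build()
player.setMediaSource(mediaSource)
player.prepare()
player.play()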
If you really want to investigate, some possible tools below:
Use command line and CURL client to request your stream:
Android - Java RTSP Session Mgmt package on Git
Protocol dumps for CLI RTSP sessions to Youtube RTSP/SDP streams
To pursue the issue, you may need to get into the weeds with debug tools that track the details of the protocol negotiation that precedes the MediaPlayer actually starting to play the stream. That would include learning the RFC and the protocol details.
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4"));
Try your app on another phone; you may find the problem is with the mobile device.
Also try this path:
"rtsp://218.204.223.237:554/mobile/1/4C024DFE77DC717D/onnuvesj43xj7t26.sdp"
to see whether the code itself has something wrong.

Recording voice calls in android

I want to make an app that records incoming and outgoing calls. I am using MediaRecorder to do so, and I am using a service/BroadcastReceiver to detect phone state changes. I have set the audio source to AudioSource.VOICE_CALL. I am able to record the voice at my end but not from the other end. The same happens when the audio source is set to AudioSource.MIC.
Please suggest a solution.
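For reference, a minimal sketch of the recorder setup described above (the output path is illustrative; a real app would start this from the phone-state receiver):
import android.media.MediaRecorder

val recorder = MediaRecorder()
// VOICE_CALL is meant to mix both call directions, but on most stock
// devices the public SDK only yields the local microphone side.
recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_CALL)
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP)
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB)
recorder.setOutputFile("/sdcard/call_recording.3gp")  // illustrative path
recorder.prepare()
recorder.start()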
You cannot access the in-call audio stream on Android using the public SDK. Call audio is not exposed to apps; you can only do this if you modify Android at the source level and then build and install a new image from the customized source on your device.
