I am building a System app for Android that runs some audio processing on call audio. I'm using the Azure Speech SDK for speech transcription, but I'm facing issues when using the AudioRecord API to access the audio stream. I'm aware of the security restrictions on Android regarding call audio access, and of the fact that only System apps are now eligible to do that.
Problems faced -
I want to use the VOICE_CALL audio source, but using this source throws the error: Invalid capture preset 4 for AudioAttributes
Digging further, I found that the AudioAttributes.java class is blocking access to this audio source. Reference - Android Recording Call parameters
Using the MediaRecorder API for the same purpose also fails with java.lang.RuntimeException: start failed. (a sketch of the attempt is below).
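For reference, the MediaRecorder attempt looked roughly like this minimal sketch (the output path and format here are illustrative, not the exact values from the app):

MediaRecorder mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.VOICE_CALL);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mRecorder.setOutputFile(context.getExternalFilesDir(null) + "/call.3gp"); // illustrative path
try {
    mRecorder.prepare();
    mRecorder.start(); // this is where java.lang.RuntimeException: start failed is thrown
} catch (IOException e) {
    e.printStackTrace();
}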
The Setup -
Rooted Android devices [Motorola G4 Plus and Realme Narzo 20A]
App converted to a System app by placing the APK inside the /system/app directory (alternatively, I also tried placing it inside /system/priv-app)
Permissions granted -
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<uses-permission android:name="android.permission.CAPTURE_AUDIO_OUTPUT" />
<uses-permission android:name="android.permission.READ_CALL_LOG" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.READ_PHONE_STATE" />
<uses-permission android:name="android.permission.READ_CONTACTS" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CALL_PHONE" />
Code -
You can find the code in this repository: https://github.com/ashishpatel16/RealtimeCallTranscription (it was made for reporting problems with the Azure Speech SDK, but the setup is exactly the same)
CustomAudioStream.java
public class CustomAudioStream extends PullAudioInputStreamCallback {
    private final static int SAMPLE_RATE = 16000;
    private final AudioStreamFormat format;
    private AudioRecord recorder;
    private static final String TAG = "CustomAudioStream";
    MediaRecorder mRecorder;
    private Context context;

    public CustomAudioStream(Context context) {
        this.format = AudioStreamFormat.getWaveFormatPCM(SAMPLE_RATE, (short) 16, (short) 1);
        this.context = context;
        this.initMic();
    }

    public AudioStreamFormat getFormat() {
        return this.format;
    }

    @Override
    public int read(byte[] bytes) {
        long ret = this.recorder.read(bytes, 0, bytes.length);
        return (int) ret;
    }

    @Override
    public void close() {
        this.recorder.release();
        this.recorder = null;
    }

    private void initMic() {
        AudioFormat audioFormat = new AudioFormat.Builder()
                .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                .setSampleRate(SAMPLE_RATE)
                .setChannelMask(AudioFormat.CHANNEL_IN_MONO)
                .build();
        this.recorder = new AudioRecord.Builder()
                .setAudioFormat(audioFormat)
                .setAudioSource(MediaRecorder.AudioSource.VOICE_CALL)
                .setBufferSizeInBytes(2048)
                .build();
        this.recorder.startRecording();
    }
}
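For context, this callback is wired into the Azure Speech SDK roughly as in the following sketch (assuming the standard Java SDK entry points; the subscription key and region are placeholders):

// Sketch: plugging CustomAudioStream into the Azure Speech SDK recognizer.
SpeechConfig speechConfig = SpeechConfig.fromSubscription("YourSubscriptionKey", "YourServiceRegion");
AudioConfig audioConfig = AudioConfig.fromStreamInput(new CustomAudioStream(getApplicationContext()));
SpeechRecognizer recognizer = new SpeechRecognizer(speechConfig, audioConfig);
recognizer.recognized.addEventListener((s, e) ->
        Log.d("Transcription", e.getResult().getText()));
recognizer.startContinuousRecognitionAsync();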
Questions -
Is there any alternate method to access the call audio stream on a rooted device?
I could technically build a custom AOSP image from source, modify the AudioAttributes.java class, and flash it to the device, but that's a lot of effort. Do I have to do this, or can I achieve it in an easier way?
I am also unsure whether the CAPTURE_AUDIO_OUTPUT permission is actually granted (see the quick check sketched below). Are there any more steps involved in gaining these permissions on a rooted device?
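One way to at least verify the third point is a quick runtime check (a minimal sketch; it only tells whether the system considers the permission granted, not whether AudioFlinger will actually honour it):

// Sketch: check whether CAPTURE_AUDIO_OUTPUT is granted to this (system) app, API 23+.
int status = context.checkSelfPermission("android.permission.CAPTURE_AUDIO_OUTPUT");
Log.d(TAG, "CAPTURE_AUDIO_OUTPUT granted: " + (status == PackageManager.PERMISSION_GRANTED));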
Related
I'm trying to create a C++ Android native camera wrapper using the NDK camera2 API (available from API level 24). I created a snippet based on an example I found, compiled it for target API level 24, and ran it on an Android 7.1 phone:
ACameraManager *cameraManager = ACameraManager_create();
VB(cameraManager!=nullptr, "Could not create CameraManager.");
camera_status = ACameraManager_getCameraIdList(cameraManager, &m_camera_id_list);
if (camera_status != ACAMERA_OK) {
LOGE("Failed to get camera id list (reason: %d)\n", camera_status);
return ERR_CAMERAAPI_UNKNOWN_ERROR;
}
if (m_camera_id_list->numCameras < 1) {
LOGE("No camera device detected.\n");
return ERR_CAMERAAPI_UNKNOWN_ERROR;
}
When I run this naive code on a Xiaomi Mi 4c (Android 7.1) phone, I get an empty camera list.
On the same phone, I also tried to run a snippet created with the Java camera2 API that does the same thing:
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;

Activity activity = getActivity();
CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
String[] cameraIds = manager.getCameraIdList();
manager.openCamera(cameraIds[0], mStateCallback, mBackgroundHandler);
This time I can see in logcat that it actually finds two cameras and prints their resolutions.
My manifest of course contains these lines:
<uses-sdk android:minSdkVersion="24" />
<uses-feature android:name="android.hardware.camera2" android:required="true" />
<uses-feature android:name="android.hardware.sensor.gyroscope" android:required="false" />
<uses-permission android:name="android.permission.CAMERA"/>
And I approve the permission requests.
Does anyone know why it finds the phone's cameras when using the Java camera2 API, but does not find them when using the NDK camera2 API?
The NDK camera2 support does not work if
CameraCharacteristics.get(INFO_SUPPORTED_HARDWARE_LEVEL) == INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY
This is probably the case for the Xiaomi Mi 4c.
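A quick way to check this from the Java side before relying on the NDK path (a minimal sketch using the camera2 Java API):

// Sketch: a LEGACY-only camera will not be usable through the NDK camera2 API.
CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
try {
    for (String id : manager.getCameraIdList()) {
        CameraCharacteristics chars = manager.getCameraCharacteristics(id);
        Integer level = chars.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
        boolean legacyOnly = level != null
                && level == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY;
        Log.d("CameraCheck", "Camera " + id + " LEGACY-only: " + legacyOnly);
    }
} catch (CameraAccessException e) {
    e.printStackTrace();
}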
I'm trying to show a WebRTC chat in a WebView.
According to this documentation, WebView v36 supports WebRTC. For my test I'm using a device with Chrome/39.0.0.0, and I have added permissions to the AndroidManifest.xml file:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<user-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
but when I enter the chat, I see a Chromium error log (the device doesn't show or render anything, only a 'loading' progress bar):
W/AudioManagerAndroid: Requires MODIFY_AUDIO_SETTINGS and RECORD_AUDIO
W/AudioManagerAndroid: No audio device will be available for recording
E/chromium: [ERROR:web_contents_delegate.cc(178)] WebContentsDelegate::CheckMediaAccessPermission: Not supported.
E/chromium: [ERROR:web_contents_delegate.cc(178)] WebContentsDelegate::CheckMediaAccessPermission: Not supported.
W/AudioManagerAndroid: Requires MODIFY_AUDIO_SETTINGS and RECORD_AUDIO
W/AudioManagerAndroid: No audio device will be available for recording
D/ChromiumCameraInfo: Camera enumerated: front
Tested on a real device, Android 5.1.1.
An additional runtime request for permissions is needed:
webView.setWebChromeClient(new WebChromeClient() {
    @TargetApi(Build.VERSION_CODES.LOLLIPOP)
    @Override
    public void onPermissionRequest(final PermissionRequest request) {
        request.grant(request.getResources());
    }
});
Update: this is not working for audio capture, though.
UPDATE: found working Google sample code here
You need these permissions to access Camera and Microphone
<uses-permission android:name="android.permission.CAMERA"/>
<uses-feature android:name="android.hardware.camera" android:required="true"/>
<uses-feature android:name="android.hardware.camera.autofocus"/>
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<!-- don't miss this one -->
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
Next, you need to grant permissions to your WebView; check this link for more details:
webView.setWebChromeClient(new WebChromeClient() {
    @Override
    public void onPermissionRequest(PermissionRequest request) {
        runOnUiThread(() -> {
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
                String[] PERMISSIONS = {
                        PermissionRequest.RESOURCE_AUDIO_CAPTURE,
                        PermissionRequest.RESOURCE_VIDEO_CAPTURE
                };
                request.grant(PERMISSIONS);
            }
        });
    }
});
If audio playback is not working, use this:
webView.getSettings().setMediaPlaybackRequiresUserGesture(false);
My experience with this in 2022:
CAMERA and RECORD_AUDIO permissions need to be declared in Manifest
WebChromeClient.onPermissionRequest should check whether those permissions have already been granted. If not, use registerForActivityResult(new RequestMultiplePermissions()) to ask the user to grant them (a sketch of this flow is below).
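A minimal sketch of that flow (assuming a ComponentActivity/AppCompatActivity and the androidx Activity Result API; names here are illustrative):

// Fields in the Activity.
private PermissionRequest pendingRequest;

private final ActivityResultLauncher<String[]> permissionLauncher =
        registerForActivityResult(new ActivityResultContracts.RequestMultiplePermissions(), result -> {
            if (pendingRequest != null && !result.containsValue(false)) {
                pendingRequest.grant(pendingRequest.getResources());
            }
            pendingRequest = null;
        });

// In onCreate(), after the WebView is set up.
webView.setWebChromeClient(new WebChromeClient() {
    @Override
    public void onPermissionRequest(PermissionRequest request) {
        boolean granted = checkSelfPermission(Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED
                && checkSelfPermission(Manifest.permission.RECORD_AUDIO) == PackageManager.PERMISSION_GRANTED;
        if (granted) {
            request.grant(request.getResources());
        } else {
            pendingRequest = request;
            permissionLauncher.launch(new String[]{Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO});
        }
    }
});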
This is mostly an issue with reloading the WebView: when we request the audio and camera permissions from the WebView, we need to refresh the web page after the user accepts them.
if (permission.equals("android.webkit.resource.AUDIO_CAPTURE")) {
    // demandForPermission is a custom helper that requests the Android permission first.
    demandForPermission(request.getOrigin().toString(), Manifest.permission.RECORD_AUDIO, MY_PERMISSIONS_REQUEST_RECORD_AUDIO);
} else {
    request.grant(request.getResources());
}
I was also stuck on this problem for many days, but the code at the link below works 100%: Android Webview
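demandForPermission above is the poster's own helper; a hypothetical version of the idea (request the Android permission, then grant the pending WebView request and reload the page once the user accepts) could look like this sketch:

// Hypothetical sketch, not the poster's actual helper.
private PermissionRequest pendingWebRequest;

private void demandForPermission(String origin, String permission, int requestCode) {
    ActivityCompat.requestPermissions(this, new String[]{permission}, requestCode);
}

@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == MY_PERMISSIONS_REQUEST_RECORD_AUDIO
            && grantResults.length > 0
            && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        if (pendingWebRequest != null) {
            pendingWebRequest.grant(pendingWebRequest.getResources());
        }
        webView.reload(); // as noted above, the page needs a refresh after the grant
    }
}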
public class AudioRecorderActivity extends Activity {
    private static final int RECORDER_SAMPLERATE = 8000;
    private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_MONO;
    private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;
    private AudioRecord recorder = null;
    private static final String TAG = "AudioRecorderActivity";
    short[][] buffers = new short[256][160];
    int ix = 0;
    private boolean stopped = false;

    private void startRecording() {
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
        try {
            int N = AudioRecord.getMinBufferSize(
                    RECORDER_SAMPLERATE,
                    RECORDER_CHANNELS,
                    RECORDER_AUDIO_ENCODING) * 20;
            recorder = new AudioRecord(AudioSource.MIC,
                    RECORDER_SAMPLERATE,
                    RECORDER_CHANNELS,
                    RECORDER_AUDIO_ENCODING, N);
            recorder.startRecording();
            while (!stopped) {
                short[] buffer = buffers[ix++ % buffers.length];
                N = recorder.read(buffer, 0, buffer.length);
            }
        } catch (Throwable x) {
            Log.v(TAG, "Error reading voice audio", x);
            x.printStackTrace();
        } finally {
            stopped = true;
            stopRecording();
        }
    }
}
Question: though the code snippet is based on an example from Stack Overflow, it is not working.
Please let me know what the mistake could be.
Here is the error message:
12-20 03:44:32.271: E/AudioRecord(224): AudioFlinger could not create record track, status: -1
12-20 03:44:32.271: E/AudioRecord-JNI(224): Error creating AudioRecord instance: initialization check failed.
12-20 03:44:32.271: E/AudioRecord-Java(224): [ android.media.AudioRecord ] Error code -20 when initializing native AudioRecord object.
Add the RECORD_AUDIO permission to AndroidManifest.xml:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
On recent Android versions, it seems that re-pushing the APK from Android Studio after changing the permissions in the manifest doesn't actually change the permission in the app settings. This is why the above answer worked for me. I had built an application without the RECORD_AUDIO permission in my manifest and had seen the permission error in my log (along with the errors in the OP). I added the permission to my manifest and reran the application on the device; the permission error in the log was gone, but I still got the errors in the OP. I went to Settings -> Applications -> Application Manager -> MyApp -> Permissions and saw that Record Audio was listed but still turned off. Manually turning it on fixed the problem.
I imagine (but haven't confirmed) that if I had built the correct permission into my manifest in the first place, this would have just worked. Alternatively, uninstalling my app from the device and then reinstalling it might fix it too, but I haven't tried that either.
The solution for pre-Marshmallow versions of Android is the same as Olexij mentioned:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
However, starting with Android API 23 (Marshmallow), this is considered a dangerous permission and you have to request it at runtime from within the Activity where it is used.
Dave's experience is a result of that: if you declare the permission in the manifest but do not request it at runtime (at least the first time), it will show up in App Permissions in the Application Manager but will stay turned off.
Check:
Requesting Permissions at Run Time
ActivityCompat | Android Developers
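A minimal sketch of the runtime request (assuming an Activity, an arbitrary request code, and the OP's startRecording() from above):

// Sketch: request RECORD_AUDIO at runtime (API 23+) before creating the AudioRecord.
private static final int REQUEST_RECORD_AUDIO = 1; // arbitrary request code

if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.RECORD_AUDIO}, REQUEST_RECORD_AUDIO);
} else {
    startRecording();
}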
The solution for API 23 (Marshmallow) is simple: set targetSdkVersion 22 in the build.gradle file after adding the permission below.
Giving Explicit Permission fixed the issue for me.
Try these steps
Android Version 7.0
Settings -> Applications -> <MyAPP> -> Permissions
-> [Turn Camera and Microphone ON]
I am trying to implement a VoIP application using the AudioGroup and AudioStream classes of the android.net.rtp package, but my application is not functioning properly. After joining the AudioGroup object with the AudioStream object, it sends UDP packets successfully (I checked this with a packet analyzer), but no voice can be heard on the phones. I am running my application on two phones and trying to communicate voice between them.
My source code is below.
public class MainActivity extends Activity {
    private AudioStream audioStream;
    private AudioGroup audioGroup;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        try {
            audioGroup = new AudioGroup();
            audioGroup.setMode(AudioGroup.MODE_NORMAL);
            audioStream = new AudioStream(InetAddress.getByAddress(new byte[] {(byte) 192, (byte) 168, (byte) 1, (byte) 4}));
            audioStream.setCodec(AudioCodec.PCMU);
            audioStream.setMode(RtpStream.MODE_NORMAL);
            audioStream.associate(InetAddress.getByAddress(new byte[] {(byte) 192, (byte) 168, (byte) 1, (byte) 2}), 5004);
            audioStream.join(audioGroup);

            AudioManager audio = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
            audio.setMode(AudioManager.MODE_IN_COMMUNICATION);
        } catch (SocketException e) {
            e.printStackTrace();
        } catch (UnknownHostException e) {
            e.printStackTrace();
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
}
I set these permissions in the manifest file:
<uses-permission android:name="android.permission.USE_SIP" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.VIBRATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.WAKE_LOCK" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-feature android:name="android.hardware.sip.voip" android:required="true" />
<uses-feature android:name="android.hardware.wifi" android:required="true" />
<uses-feature android:name="android.hardware.microphone" android:required="true" />
I am using a Samsung Galaxy S3 phone with Android 4.0.
The trick is to get the port mapping correct. You need to use the port number from audioStream.getLocalPort() and send this port number to the peer in the SDP packet via SIP signalling.
Check out this example application, which implements SIP functionality:
https://github.com/Mobicents/restcomm-android-sdk/tree/master/Examples/JAIN%20SIP
I used the same code you submitted and got it working with minor changes. Basically, I found the problem was getting the port number correct.
When creating the AudioStream, the port number seems to be random. In the Android developer documentation I found: "Note that the local port is assigned automatically to conform with RFC 3550."
What I did was start the application on one phone first and use audioStream.getLocalPort() to find the port number. Then I connected to this port from the other phone. This resulted in two-way communication, even though I only had the correct port number on one phone.
Hope this helps.
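A minimal sketch of that port handling (meant to live inside the OP's existing try block; how the port value travels between the phones, SIP/SDP or anything else, is up to you):

// The local RTP port is assigned automatically, so read it and tell the peer.
audioStream = new AudioStream(InetAddress.getByAddress(new byte[] {(byte) 192, (byte) 168, (byte) 1, (byte) 4}));
int localRtpPort = audioStream.getLocalPort(); // advertise this value to the peer (e.g. in the SDP)
Log.d("VoIP", "Local RTP port: " + localRtpPort);

// Associate with the port the peer advertised instead of a hard-coded 5004.
int peerRtpPort = 5004; // placeholder: replace with the value received from the peer
audioStream.associate(InetAddress.getByAddress(new byte[] {(byte) 192, (byte) 168, (byte) 1, (byte) 2}), peerRtpPort);
audioStream.join(audioGroup);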
I think you should turn the speakerphone on.
Maybe you can use the following method:
audioManager.setSpeakerphoneOn(true);
I am using the Android audio FX Visualizer class in my demo app to read the FFT, but when I try to create an object of that class it throws a runtime exception (java.lang.RuntimeException: Cannot initialize Visualizer engine, error: -1). Player is my custom class for playback control; using the same Player class I have implemented an equalizer, and that works fine. Do I need to add any permission to the manifest file?
Player mediaPlayer = Player.GetInstance();
mediaPlayer.LoadFile("song.mp3");
mediaPlayer.Play();
try{
visual = new Visualizer(mediaPlayer.GetAudioSessionID()); // this line causing Exception
visual.setEnabled(true);
visual.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
}
catch(Exception ex)
{
Log.e("Visual Ex", ex.getMessage());
}
That was due to my foolish mistake: that feature requires the <uses-permission android:name="android.permission.RECORD_AUDIO"></uses-permission> permission. Thanks.
I know this is a very late answer, but I also struggled with this problem and want to share my experience.
First, as the answer above mentioned, the permissions
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
and, if audio source 0 is used (Visualizer(0); //system mix),
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS"/>
are needed. After adding the permissions to my app and installing the newly compiled app again, it still crashed. I found out that the device has to be restarted before the Visualizer can be used without any exception (for whatever reason). So if you develop an app and get this exception, a restart may be required after adding the permissions to the app.
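For completeness, capturing the system mix looks roughly like this (a minimal sketch; it needs both permissions above and, as noted, possibly a reboot after they are first granted):

// Sketch: Visualizer on audio session 0 (system output mix).
Visualizer visualizer = new Visualizer(0); // session 0 = system mix
visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
visualizer.setEnabled(true);

byte[] fft = new byte[visualizer.getCaptureSize()];
visualizer.getFft(fft); // raw FFT data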