Call Audio Stream Modification in Android 4.0 ICS

I've been working on a project that would greatly benefit from call-stream modification. This has repeatedly been said or assumed to be unachievable, as most people believe that the hardware loop for the in-call audio is completely disconnected from the main MCU of the device.
Questions like Stream audio to a phone call Android have received answers stating that it is impossible to access the audio. I agree that this is definitely impossible from the Android API, but it remains unclear whether the hardware actually is completely disconnected.
The stackoverflow user 'artsylar' said that they were able to modify the 'framework layer' of Android OS to inject recorded audio into call streams, which would be a huge step forward (see Play an audio clip onto an ongoing call, artsylar's comment on the selected answer). Assuming artsylar's success is valid, there definitely is a way to control the call stream audio by modifying the framework (I assume the telephony base framework in the Android source).
Basically, I completely agree that modifying or controlling the call stream is impossible from the application layer. However, I am interested in customizing the Android OS in the framework or Radio Interface Layer; artsylar seems to have had success, but there is no explanation in the open literature of how. Given the current state of Android technology, could anyone clarify the above and establish whether controlling call audio is possible by modifying the core Android OS, and what a good path to accomplish this goal would be?
I believe that a final clarification on this issue would be of great value to the open-source community.
Thanks!

It's technically possible to inject audio into the voice call uplink on some platforms (Qualcomm's MSM8960 and APQ8064, for example). Support exists at the hardware level and at the device driver level. But to make that functionality available to normal applications you'd have to create a custom Android ROM where you've added all the necessary user-space parts in both the Java layers and native layers of Android.
So the short answer is: no, there's no standard way of doing this as an app developer (doesn't matter if you use the SDK or NDK).
If you're working for an OEM or by some other means are able to build and flash your own Android ROMs you can probably get the information you need by asking your platform vendor.
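To make the "device driver level" part a bit more concrete: on those Qualcomm platforms the voice path is routed through ALSA mixer controls, and a privileged native process inside a custom ROM can poke them with tinyalsa. The sketch below is illustrative only; the mixer control name is an assumption (Qualcomm's in-call music route goes by different names across MSM8960/APQ8064 builds), so you would list the real controls with tinymix first.

```cpp
// Hedged sketch: enable a voice-uplink routing switch on a rooted
// Qualcomm device using tinyalsa. Must run as a privileged (root or
// system) native process inside a custom ROM; it is not callable from
// a normal SDK/NDK app.
#include <tinyalsa/asoundlib.h>
#include <cstdio>

int main() {
    struct mixer *mixer = mixer_open(0);          // sound card 0
    if (!mixer) {
        fprintf(stderr, "cannot open mixer (are you root?)\n");
        return 1;
    }
    // ASSUMPTION: control name differs per device; check with `tinymix`.
    struct mixer_ctl *ctl =
        mixer_get_ctl_by_name(mixer, "Incall_Music Audio Mixer MultiMedia2");
    if (!ctl) {
        fprintf(stderr, "control not found on this device\n");
        mixer_close(mixer);
        return 1;
    }
    mixer_ctl_set_value(ctl, 0, 1);  // route the media front-end into the uplink
    // ... play PCM into the matching front-end device with pcm_open()/
    // pcm_write() here; which PCM device maps to that front-end is also
    // device specific.
    mixer_close(mixer);
    return 0;
}
```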

It is very difficult to do, because it involves working with the Linux kernel inside the Android OS.
Not only is there no API support, but the security model also disallows it.
As professionals in the software engineering field, especially as programmers, we should never assume that anyone's claimed success on an invention or related project is valid until the project has been tested.
Also, streaming audio during a call may raise privacy and security issues for smartphone users and telephony service providers.

Related

Android as a UVC Camera

I'm stuck at home with a rather bad webcam. I was considering upgrading, but then it struck me: phones these days have really good cameras embedded in them. So why not use it as a webcam?
However, as I was researching this further I was really disappointed with the available apps for this. As far as I was able to find, we have Android apps that work roughly as follows:
Present the phone camera as a network-attached camera. You can then use local software to consume that feed as a webcam. See e.g., IP Webcam. This may be sufficient, but it's a complicated setup, and network latency makes this far from ideal.
The Android app sends the camera feed to a custom host application that in turn creates a virtual web camera. See e.g., DroidCam. This mostly solves the latency problem, but it is still rather complicated, and requiring us to install a specific third-party application is troublesome in regard to user privacy, especially since the applications are closed source.
So, I took the engineering approach and tried to see if it was even possible to improve the situation. As far as I was able to find, Android supports being used as a custom USB accessory. And looking over the USB video class documentation, it strikes me that it should be possible to create an Android app that presents the phone as a generic UVC webcam, such that we do not have to resort to tricks such as the ones above.
Ideally, I would have liked Android to add another USB device option ("Use USB connection as webcam") in addition to debug mode, file-transfer, etc. This seems quite unlikely to happen in the short term however.
So, my question is this: Does an application that does the above already exist? My searching thus far hasn't yielded any results, but I might be missing something, as googling for this turned out to be harder than I expected.
Alternatively, am I wrong in my assumption above, such that there is some fundamental issue why an Android application cannot be made to work in that way?
There does not seem to be any complete app yet as of 2020-10, but the parts are mostly there (see the rough configfs sketch after the sources below):
https://github.com/tejado/android-usb-gadget has code to switch the Android device into gadget mode (but no UVC yet)
https://git.ideasonboard.org/uvc-gadget.git feeds v4l2 into the uvc gadget output
Sources:
http://www.davidhunt.ie/raspberry-pi-zero-with-pi-camera-as-usb-webcam/
https://www.raspberrypi.org/forums/viewtopic.php?t=148361
https://www.reddit.com/r/androiddev/comments/iabc2o/can_i_use_my_android_as_wired_camera_ie_as_a/g1nrijl/
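For a feel of what the gadget-side glue involves, here is a rough C++ sketch of the configfs steps the Raspberry Pi guides above perform with mkdir/echo. It assumes root, a kernel built with CONFIG_USB_CONFIGFS_F_UVC, and configfs mounted at /sys/kernel/config; the VID/PID, frame size, and UDC name are placeholders, and the control/streaming header setup is omitted for brevity.

```cpp
// Rough sketch of the configfs plumbing for a UVC gadget: the same
// steps the Raspberry Pi guides above do with mkdir/echo.
// NOTE: a complete gadget also needs the control/streaming header
// directories and their symlinks (see Documentation/usb/gadget-testing.rst).
#include <filesystem>
#include <fstream>
#include <string>

namespace fs = std::filesystem;

// Each configfs attribute is a tiny pseudo-file we write a value into.
static void write_attr(const fs::path &p, const std::string &v) {
    std::ofstream(p) << v;
}

int main() {
    const fs::path g = "/sys/kernel/config/usb_gadget/webcam";
    fs::create_directories(g);
    write_attr(g / "idVendor",  "0x1d6b");   // placeholder VID (Linux Foundation)
    write_attr(g / "idProduct", "0x0104");   // placeholder PID

    fs::create_directories(g / "strings/0x409");
    write_attr(g / "strings/0x409/manufacturer", "DIY");
    write_attr(g / "strings/0x409/product", "Android UVC Webcam");

    // One uncompressed 640x360 frame descriptor (sizes are assumptions).
    const fs::path func  = g / "functions/uvc.usb0";
    const fs::path frame = func / "streaming/uncompressed/u/360p";
    fs::create_directories(frame);
    write_attr(frame / "wWidth",  "640");
    write_attr(frame / "wHeight", "360");

    fs::create_directories(g / "configs/c.1/strings/0x409");
    write_attr(g / "configs/c.1/strings/0x409/configuration", "UVC");
    fs::create_directory_symlink(func, g / "configs/c.1/uvc.usb0");

    // Bind to the hardware UDC; the name varies, check /sys/class/udc.
    write_attr(g / "UDC", "ci_hdrc.0");      // ASSUMPTION: device specific
    return 0;
}
```

After binding, something like the uvc-gadget tool linked above still has to feed v4l2 frames into the gadget's video node.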
It appears Google has started to take notice of this issue and is currently working on a "DeviceAsWebcam" service, which is exactly the solution to this problem, as seen in the Android code review below:
https://android-review.googlesource.com/c/platform/system/sepolicy/+/2410788
Naturally though, this is an Android 14 feature, so it will likely take a while before it is usable on a lot of devices. Hopefully, someone is able to backport this feature to older versions of Android.
If Android, or the version of Android that ships on your target phone, provides and permits use of the USB gadget driver, then libguvc,
https://developer.ridgerun.com/wiki/index.php?title=USB_Video_Class_Gadget_Library_-_libguvc
can be used to "make an application appear as a USB webcam".
Potentially relevant to get you started would be https://stackoverflow.com/search?q=Android+USB+gadget (other SO references to the use of the USB gadget driver on Android).

Raw ECG signals from QardioCore via Bluetooth (Android)

Is it possible to retrieve raw ECG signals programmatically from QardioCore via Bluetooth?
I only have an Android device, and as the Android Qardio app doesn't work with QardioCore, I wanted to know if anyone has already tried to write their own app for use with Android, and whether it worked.
The manufacturer wrote me in an email:
Qardio Core will be compatible with iOS only and we are focused on providing
a smooth experience on Apple devices that use iOS 10.0 or later.
[...] there are no immediate plans to bring QardioCore to the Android
platform [...]
This apparently means that there is no SDK either.
I have also emailed the manufacturer and got a similar answer:
[...]
Qardio Core will only be compatible with iOS for the foreseeable
future. We strive to provide the best experience and have not been
able to guarantee flawless operation on the multitude of Android
devices.
An SDK is also not available at this time.
[...]

Why are SBC codec capabilities in BlueZ 5.35 initialized in android/hal-audio-sbc.c and not in avdtp.c?

I am updating BlueZ 4.97 to 5.35 on my embedded device.
For an A2DP connection, we have to share the SBC codec capabilities. In the ideal case the capability will look like figure 1. In the BlueZ 4.97 code, I get the SBC codec capability from the sbc_getcap_ind() function in the AVDTP layer. In sbc_getcap_ind(), both sbc_codec_cap and avdtp_media_codec_capability are initialized, so I can send this capability packet back to the phone.
In 5.35, the sbc_getcap_ind() function is not available. avdtp_media_codec_capability is set in the endpoint_getcap_ind() function in the AVDTP layer, which is as per my expectation. But sbc_codec_cap is not initialized, so I am getting a packet like in figure 2.
BlueZ 5.35 introduces the new package android/hal-audio-sbc.c, and in this package the SBC codec capabilities are set.
My embedded device is RTOS based and has nothing to do with Android, so I have the following doubts:
1) Why is there a new android package in the BlueZ stack? What is the development idea behind it?
2) Why are the SBC capabilities initialized in android/hal-audio-sbc.c, and how will a non-Android device access the SBC capabilities?
3) How, in my embedded environment, can I use android/hal-audio-sbc.c to get the SBC capabilities?
I think I am not able to resolve this issue because I lack an understanding of the new 5.35 architecture, and there are not enough documents to understand the BlueZ architecture. I hope that by getting answers to these questions I can understand the significance of the android folder in the 5.35 BlueZ package.
Before answering your questions, I would like to share a couple of URLs.
Porting guide
Management interface
Coming to your questions:
BlueZ now supports both the Android and Linux platforms. The directory "android" contains only sources related to the Android platform, which can't be used in a Linux environment. The idea behind this is to share the common development code between Linux and Android (mostly under the "src", "gdbus" and "profile" directories) and to develop platform-specific functionality separately.
As part of the BlueZ 4 to BlueZ 5 migration, all the audio-related implementations were moved out of BlueZ. Now it's the responsibility of the audio application to implement the whole stack on its own and register with BlueZ (doc/profile-api.txt ==> RegisterProfile() method). BlueZ will only act as the mediator between your application and the devices. As far as Linux is concerned, there is no audio implementation inside BlueZ. I am not sure about the android directory under BlueZ. So non-Android platforms need to implement it on their own.
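To illustrate the RegisterProfile() route from doc/profile-api.txt, here is a minimal sketch using GDBus against a running bluetoothd. The object path and profile name are made up, and a real profile must also export the org.bluez.Profile1 interface (NewConnection, RequestDisconnection, Release) at that path; for A2DP codec capabilities specifically, the analogous RegisterEndpoint() call from doc/media-api.txt follows the same pattern.

```cpp
// Hedged sketch: register an external profile with BlueZ 5 over D-Bus
// (org.bluez.ProfileManager1.RegisterProfile).
// Build: g++ sketch.cpp $(pkg-config --cflags --libs gio-2.0)
#include <gio/gio.h>

int main() {
    GError *err = nullptr;
    GDBusConnection *bus = g_bus_get_sync(G_BUS_TYPE_SYSTEM, nullptr, &err);
    if (!bus) { g_printerr("%s\n", err->message); return 1; }

    GVariantBuilder opts;
    g_variant_builder_init(&opts, G_VARIANT_TYPE("a{sv}"));
    g_variant_builder_add(&opts, "{sv}", "Name",
                          g_variant_new_string("My A2DP Sink"));

    // 0000110b-... is the standard Audio Sink (A2DP) UUID; the object
    // path "/my/audio/profile" is a placeholder we must also serve.
    GVariant *ret = g_dbus_connection_call_sync(
        bus, "org.bluez", "/org/bluez", "org.bluez.ProfileManager1",
        "RegisterProfile",
        g_variant_new("(osa{sv})", "/my/audio/profile",
                      "0000110b-0000-1000-8000-00805f9b34fb", &opts),
        nullptr, G_DBUS_CALL_FLAGS_NONE, -1, nullptr, &err);
    if (!ret) { g_printerr("RegisterProfile: %s\n", err->message); return 1; }
    g_variant_unref(ret);
    // ... run a GMainLoop here and handle the NewConnection() fds ...
    return 0;
}
```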
As mentioned, you need to implement your own audio-related profiles for BlueZ. One working piece of software is PulseAudio: you can load module-bluez5-discover in PulseAudio (pactl), and PulseAudio takes care of the audio.
There is also another open-source solution, bluealsa, which is currently under active development. After using it, I saw a lot of audio delay and lower quality. If you want a perfect solution, implement it on your own or use PulseAudio (not so real-time).
In simple words, migrating an application from BlueZ 4.x to BlueZ 5.x is not easy!

Qt mobile video call streaming

I'm completely new to Qt mobile, and I don't even have solid mobile dev experience, so sorry if I'm asking something obvious.
I need to develop a mobile app that should have the ability to receive call-like functionality (over the internet, not a GSM call). When answered, it should start streaming audio and video from our server. The call should be one way only, meaning that the stream goes from server to device, but never from device to server.
So my questions are:
Is this possible in Qt? I chose Qt because I'm familiar with it and I want to support desktop, Android and iOS, and maybe Windows Phone later.
Is it possible to receive a call when the screen is off and my app is not running? I mean, this is a mobile device; the app won't be running all the time, and it should be started only when a call is made from the server to the device. How can I achieve that? I think Viber, Skype and other messaging apps do that.
Many thanks in advance :)
1 - Well, sure it can, although it might not be as easy and straightforward as you'd want. Qt Multimedia does provide the necessary classes, but you do have to check how supported they are on the platforms you need to target.
However, the classes Qt provides are either too high level to serve any purpose but their intended purpose, or too low level, so you must implement pretty much everything by hand. In this aspect, the benefit of Qt being capable of producing portable apps may not outweigh the ease of using certain platform specific libraries that offer video streaming out of the box. In other words, it might be easier to write separate Android and iOS apps using Android and iOS libraries than a single Qt app that will work on both.
But just in case you decide to go with Qt, as I mentioned for the time being you are left with one option - do most of the work yourself. This means you should record audio using QAudioRecorder and capture frames periodically from a QCamera in a buffer of given length, compress that data (and preferably encrypt it if security is a concern), send it to the client over a QTcpSocket connection, decompress (and decrypt) the data and play it back in sync. It is certainly doable, but as already mentioned, it will be much harder since Android and iOS libraries offer pretty much "out of the box" solutions. Alternatively, you might decide to use a third party solution that offers support for all the platforms you target.
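As a starting point for that do-it-yourself path, here is a minimal Qt 5 sketch that probes camera frames, JPEG-compresses them, and pushes them over a QTcpSocket with naive length-prefixed framing. The host and port are placeholders; audio (QAudioRecorder), A/V sync, and encryption are all left out, and QVideoProbe is not supported by every camera backend.

```cpp
// Minimal sketch: grab camera frames, JPEG-compress, stream over TCP.
#include <QGuiApplication>
#include <QCamera>
#include <QVideoProbe>
#include <QVideoFrame>
#include <QAbstractVideoBuffer>
#include <QImage>
#include <QBuffer>
#include <QTcpSocket>

int main(int argc, char *argv[]) {
    QGuiApplication app(argc, argv);

    QCamera camera;
    QVideoProbe probe;
    QTcpSocket socket;
    socket.connectToHost("192.0.2.1", 5000);   // placeholder server

    // Not every backend supports probing; check the return value.
    if (!probe.setSource(&camera))
        qWarning("QVideoProbe not supported on this backend");

    QObject::connect(&probe, &QVideoProbe::videoFrameProbed,
                     [&socket](const QVideoFrame &frame) {
        QVideoFrame f(frame);                  // shallow copy we can map
        if (!f.map(QAbstractVideoBuffer::ReadOnly))
            return;
        // Direct wrap works for RGB formats; YUV frames need conversion.
        QImage img(f.bits(), f.width(), f.height(), f.bytesPerLine(),
                   QVideoFrame::imageFormatFromPixelFormat(f.pixelFormat()));
        QByteArray jpeg;
        QBuffer buf(&jpeg);
        buf.open(QIODevice::WriteOnly);
        img.save(&buf, "JPEG", 60);            // 60 = quality/size trade-off
        const quint32 len = jpeg.size();       // naive length-prefixed framing
        socket.write(reinterpret_cast<const char *>(&len), sizeof(len));
        socket.write(jpeg);
        f.unmap();
    });

    camera.start();
    return app.exec();
}
```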
2 - Whether your screen is on or off - that will be a call to a platform-specific API, and so are requests to turn it on or to keep it on for a given duration. Whether your app is running on the device or not is easy - just try a TCP connection with the client on the device; if it succeeds, the client is running. If you want to receive calls while your app is not running, you will have to implement a platform-specific service that runs all the time instead and starts your application when a call is received.
Qt Mobility does not have a framework for supporting VoIP, as you can see from the reference:
http://doc-snapshot.qt-project.org/qt-mobility/
You could create the VoIP framework of your app natively (which is going to require a good understanding of the various audio and video frameworks available), but another way to go
would be to use a VoIP SDK that supports both iOS and Android, such as the Twilio mobile client:
https://www.twilio.com/client/mobile
Qt mobile will help you with your application's UI; however, you will have to write some native code for each platform you are going to target. Note that Qt is expanding fast, so you might not need to get your hands dirty with platform-specific native code in upcoming versions of Qt.
Yes, you can receive a VoIP call when your application is closed by creating a background service (but as far as I know Qt doesn't do the job for you, so you'll need to do it natively); this is the way Skype and Viber work.
As far as I know, the new BlackBerry 10 OS uses Qt for development. There is a source code example available for VoIP calling in Qt; I am still searching for video calling.
Check the links below, they may be helpful:
1) BlackBerry Developer Blog
2) PJSIP Blog
3) Download Source Code
I don't know how to develop apps for Android, iOS and desktop using Qt. But I suggest developing the app in each native language instead of Qt.

Low-level communication on Android/other mobile platforms

Android has its NDK to handle native code. I was just wondering how deep you can go with low-level programming on this platform - whether you can, for example, control what the phone transmits through the GSM/UMTS network or see the raw data that is received.
Is working with the embedded phone modem generally possible on this/other platforms within the scope of their APIs?
I presume that interfering with the lower communication layers (like adding compression to the voice data, changing the encoding and so on) means playing with the firmware of the communication modules and is generally something that the phone manufacturers don't support officially.
Anyway, is there something about these topics you can recommend to read or search for?
A public API package (it's not really rich in terms of your requirements):
http://developer.android.com/reference/android/telephony/package-summary.html
About the low-level APIs (the Radio Interface Layer of Android), you can read here:
http://www.netmite.com/android/mydroid/development/pdk/docs/telephony.html
About implementing (modifying) the RIL:
http://www.netmite.com/android/mydroid/development/pdk/docs/telephony.html#androidTelephonyRILImplementing
In this case, however, you need to touch the Android source code, so for application development, it is not an option.
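For a sense of what "touching the Android source" means at the RIL level, here is a hedged skeleton of a vendor RIL built against AOSP's hardware/ril/include/telephony/ril.h. rild loads such a shared library and resolves RIL_Init(); this sketch only answers every request with "not supported" and is nowhere near a working radio implementation.

```cpp
// Hedged skeleton of a vendor RIL, the library rild dlopen()s.
#include <telephony/ril.h>

static const struct RIL_Env *s_env;

// rild calls onRequest() for every Java-layer telephony request
// (RIL_REQUEST_DIAL, RIL_REQUEST_HANGUP, ...). A real vendor RIL
// forwards these to the baseband, e.g. as AT commands.
static void onRequest(int request, void *data, size_t datalen, RIL_Token t) {
    (void)request; (void)data; (void)datalen;
    // Answer everything with "not supported" in this sketch.
    s_env->OnRequestComplete(t, RIL_E_REQUEST_NOT_SUPPORTED, nullptr, 0);
}

static RIL_RadioState onStateRequest() { return RADIO_STATE_OFF; }
static int supports(int requestCode) { (void)requestCode; return 0; }
static void onCancel(RIL_Token t) { (void)t; }
static const char *getVersion() { return "sketch-ril 0.1"; }

static const RIL_RadioFunctions s_callbacks = {
    RIL_VERSION, onRequest, onStateRequest, supports, onCancel, getVersion
};

// Entry point that rild looks up with dlsym() in the vendor RIL .so.
extern "C" const RIL_RadioFunctions *
RIL_Init(const struct RIL_Env *env, int argc, char **argv) {
    (void)argc; (void)argv;
    s_env = env;
    return &s_callbacks;
}
```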
