Adobe AIR for mobile: Using Bluetooth audio as "Microphone" - android

I'm developing an AIR for Android application, and am currently sending audio to FMS servers via the standard NetStream/Microphone options. I (ignorantly) assumed that attaching a Bluetooth device would be pretty simple, and that connecting it would make it show up as a native "Microphone". Unfortunately, it does not.
I don't think it is even possible to use NetStream.publish to publish raw bytes, so the only hope is that there's a way to use NativeProcess + Java to create a native microphone "handle" that AIR can pick up on.
Has anyone run into this issue?

I think one possible solution would be using NetConnection.send() instead of NetStream.publish().
You would need to get the sound data from your BT microphone. I am not sure if you can get it using AIR. You may need to use an Android service that gets the sound data and feeds your AIR app via a file, a UDP port, an invoke, etc. (a rough sketch of such a service follows these steps).
When you get some sound data, encode it so Flash can play it (Speex, Nellymoser, etc.). You can do the encoding in your Android service as well.
Whenever your AIR app receives sound data, send it to your streaming server via NetConnection.send().
Extend your streaming server to process the sound data it receives. You can embed it into an FLV stream, or send it to other Flash clients if it is a chat app.
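For the capture-and-forward step, here is a minimal sketch of such an Android service, assuming the input should be routed through a Bluetooth headset via SCO and that the AIR app listens on a local UDP port (127.0.0.1:5555 is an arbitrary placeholder). Encoding to Speex/Nellymoser would still have to happen before NetConnection.send():

```java
import android.app.Service;
import android.content.Intent;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.IBinder;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Sketch: capture audio from a Bluetooth headset microphone via SCO and
// forward the raw PCM frames to a local UDP port for the AIR app to read.
public class BtMicForwardService extends Service {
    private static final int SAMPLE_RATE = 8000; // SCO links are narrowband
    private volatile boolean running;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        AudioManager am = (AudioManager) getSystemService(AUDIO_SERVICE);
        am.startBluetoothSco();      // route microphone input through the headset
        am.setBluetoothScoOn(true);
        running = true;
        new Thread(this::pumpAudio).start();
        return START_STICKY;
    }

    private void pumpAudio() {
        int bufSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufSize);
        byte[] buf = new byte[bufSize];
        try (DatagramSocket socket = new DatagramSocket()) {
            // Placeholder address/port: the AIR app would bind a listener here.
            InetAddress target = InetAddress.getByName("127.0.0.1");
            rec.startRecording();
            while (running) {
                int n = rec.read(buf, 0, buf.length);
                if (n > 0) {
                    socket.send(new DatagramPacket(buf, n, target, 5555));
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            rec.release();
        }
    }

    @Override
    public void onDestroy() {
        running = false;
        ((AudioManager) getSystemService(AUDIO_SERVICE)).stopBluetoothSco();
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
```

A production version should also wait for the ACTION_SCO_AUDIO_STATE_UPDATED broadcast to confirm the SCO link is actually up before recording, and the app needs the RECORD_AUDIO and BLUETOOTH permissions.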
Other than that, I can't find a way to have a "microphone handle" for your BT microphone. I once thought of creating a virtual device on Android, but I couldn't find any solution.

Related

Android sending message to BLE A2DP device

I have a device (ESP32) that has a Bluetooth module. I found a library for it that acts as an A2DP sink; it works perfectly with the Spotify app. My objective is to be able to send some small packets of data to it from my Android phone (device name, notifications, etc.) while the Spotify app is streaming media to it. What would be the best way of doing that? I tried to approach this with the MediaSession.setMetadata method, but it seems it only works with actual media playback.

Android Live video streaming from one device to another

I am building an Android application which will stream video calls from one device to another Android device. For that I am using the Wowza video streaming API (media engine). With that I am able to stream video from the Android app to the web, but is device-to-device video streaming possible?
If you are planning to develop all the infrastructure yourself, then these are the points to evaluate.
What Technology is used
WebRTC is the technology used to support video calling. WebRTC is a free, open project that provides browsers and mobile applications with Real-Time Communications (RTC) capabilities via APIs. Check out the WebRTC details here. It was open-sourced by Google in 2011. It allows real-time communication between two browsers/mobiles.
Concepts involved
1. Data streams and Hardware
WebRTC handles setting up and identifying the hardware (microphone, camera, and speakers) and identifying the network path with a STUN server (see "What is a STUN server"). On mobiles this hardware is built in.
2. Audio Video CODECS
Google has open-sourced the audio/video codecs required for these features. Generally the audio codec is G.711 for phones (it still varies in specific cases), and the video codec is VP8 or VP9.
3. Peer Discovery
To make a call, each peer's address is required, but on the internet most IPs are dynamic. To solve this, a server needs to keep track of who is online. This can be done using XMPP, SIP, or a custom protocol. So for anyone to receive a call, the caller has to check with the server, or the other way around.
4. STUN Server
Once signalling (peer discovery) is done, a STUN server is required. This server helps determine each device's external IP address, as well as whether two or more devices can talk to each other directly.
5. TURN Server
If a peer-to-peer session is not possible, then a TURN server is required. The TURN server basically relays the bits for you through open holes in the firewall between the two clients. This is needed because of asymmetric firewalls and the limits of punching holes on different ports in firewalls (see the sketch below for how STUN and TURN are wired into an Android client).
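To make points 4 and 5 concrete, here is a rough sketch of configuring a PeerConnection with STUN and TURN servers using the org.webrtc Android library. The TURN URL and credentials are placeholders, and the signalling channel from point 3 is left out entirely:

```java
import android.content.Context;
import org.webrtc.PeerConnection;
import org.webrtc.PeerConnectionFactory;
import java.util.ArrayList;
import java.util.List;

public class RtcSetup {
    // Sketch: configure a PeerConnection with a STUN server (point 4) and a
    // TURN fallback (point 5). Signalling/peer discovery (point 3) happens
    // over your own channel (XMPP, SIP, WebSockets, ...) and is not shown.
    public static PeerConnection createPeerConnection(
            Context appContext, PeerConnection.Observer observer) {
        PeerConnectionFactory.initialize(
                PeerConnectionFactory.InitializationOptions.builder(appContext)
                        .createInitializationOptions());
        PeerConnectionFactory factory =
                PeerConnectionFactory.builder().createPeerConnectionFactory();

        List<PeerConnection.IceServer> iceServers = new ArrayList<>();
        // Public STUN server: lets the device discover its external IP address.
        iceServers.add(PeerConnection.IceServer
                .builder("stun:stun.l.google.com:19302").createIceServer());
        // TURN relay (placeholder URL/credentials): used only when a direct
        // peer-to-peer path cannot be established.
        iceServers.add(PeerConnection.IceServer
                .builder("turn:turn.example.com:3478")
                .setUsername("user")
                .setPassword("secret")
                .createIceServer());

        PeerConnection.RTCConfiguration config =
                new PeerConnection.RTCConfiguration(iceServers);
        return factory.createPeerConnection(config, observer);
    }
}
```

The returned PeerConnection then goes through the usual offer/answer and ICE candidate exchange over whatever signalling channel you chose.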
Alternatively, you can use a provider like SINCH, which already handles and configures these basic requirements, so you only need to concentrate on the mobile front end.
Check out SINCH ANDROID SAMPLE as well

Chrome web audio api - how to play sound in a phone call

I can't find anything online, but how can I use a Chrome tab's Web Audio API in an Android app so I can play sound during a phone call?
I went to this site, but when I play the sound during a phone call, the far end doesn't hear anything. I thought one feature of Web Audio was that it could change the sound of someone's voice in a phone call, so I thought it had access to the phone call's audio stream.
Even here the tech says it's ready for Android, but I can't even get the audio recorder demo to work on Android.
While you do (with the user's permission) have access to the input of the device, you only have access to the main output of the device (internal speakers or headphones). This is represented as AudioContext.destination. The audio buffers in a call are (probably) a different output that you simply don't have access to in Web Audio (and that's probably a good thing; imagine the security issues we'd have if apps were allowed to hijack calls!).

stream voice through wifi - android

I'm working on an entry phone. The entry phone sends me voice and video over the RTSP protocol, so I can simply get voice and video from the camera on my device. But I have no idea how to send voice to that device. Is there anything that would help me send and receive audio at the same time (something like a call)?
If I understand correctly, you want video calls, just like the sipdroid app. It is an open-source project; look at the VideoCamera.java class in this project:
http://code.google.com/p/sipdroid/

initiating an RTSP connection over cellular with android

The majority of SIM accounts are public dynamic. Most if not all cellular providers do not allow incoming connections to public dynamic IP addresses (3G anyway, maybe not 4G/LTE).
The issue of connecting is not one of dynamic IPs, but rather of blocked incoming ports.
So, if I wanted to stream video from an Android phone on demand (based on information gleaned from this conversation (Streaming video from Android camera to server)), what would be the chain of events to properly initiate a connection?
My idea of this (roughly):
The app on the Android phone initiates, and keeps open, some sort of connection to a media server (Wowza or something).
At some point, when the server wants video from the phone, it uses the open connection to request a video stream.
The Android phone pushes an RTSP stream to the server.
Is this correct, and if so, what type of connection should I use as the permanent control connection? Also, is it possible to push RTSP, or would I have to do something else?
Thanks!
I know this is an old question, but if anybody else is searching for something similar, the following is now available:
http://developer.android.com/guide/google/gcm/index.html
This essentially allows a message to be sent from a server to an app on an Android device (it replaces C2DM which did a similar thing).
Update
Google GCM has now been replaced in turn by Google Firebase Cloud Messaging:
https://firebase.google.com/docs/cloud-messaging/
Using a cloud-based app messaging service like this, the steps would be:
Add a message subscription service to your app (e.g. Firebase)
The app registers with the cloud messaging service when it starts up
When the server wants video from the phone (as noted in the question above) the server sends a message to the app
The app opens a connection to the streaming server and starts to stream video to the server (a minimal sketch of the app side follows).
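As a rough illustration of steps 2-4 on the app side, a Firebase messaging service could look like the sketch below. The "command" and "stream_url" payload keys, startStreaming(), and sendTokenToServer() are hypothetical names for app-specific pieces, not Firebase APIs:

```java
import com.google.firebase.messaging.FirebaseMessagingService;
import com.google.firebase.messaging.RemoteMessage;

// Sketch: receive a "start streaming" push from the server (step 3)
// and open an outgoing connection to the streaming server (step 4).
public class StreamTriggerService extends FirebaseMessagingService {

    @Override
    public void onMessageReceived(RemoteMessage message) {
        // The server includes a data payload telling the app what to do.
        // The payload keys here are app-defined, not part of Firebase.
        String command = message.getData().get("command");
        if ("start_stream".equals(command)) {
            String streamUrl = message.getData().get("stream_url");
            startStreaming(streamUrl); // hypothetical: opens the outgoing stream
        }
    }

    @Override
    public void onNewToken(String token) {
        // Step 2: register the device token with your own server so it
        // knows where to send the "start streaming" message later.
        sendTokenToServer(token); // hypothetical app-specific helper
    }

    private void startStreaming(String url) { /* open connection, publish video */ }

    private void sendTokenToServer(String token) { /* app-specific */ }
}
```

The service also has to be declared in the app manifest with the com.google.firebase.MESSAGING_EVENT intent filter.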
Note: there is a comment below about how this approach does not allow an incoming connection from the server to the Android phone.
This, in fact, is not how streaming from a phone typically works. The phone actually makes an 'outgoing' connection to a streaming server, which it then streams the video to. Other devices wanting to see the video then stream it from here.
There are several reasons why this is the preferred approach. One of the key ones is that supporting a quality streaming service that will play back on most common devices, browsers, OSes, etc. requires transcoding the video into multiple bit rates (and even multiple encodings in some cases) and packaging and serving it in the appropriate streaming packaging format. Doing all this on the mobile device would be very compute- and storage-intensive.
