I want to make an Android media player using FFmpeg
(receive an MPEG-2 TS multicast stream over a Wi-Fi network and decode it).
I have checked the following:
My ipTIME AP supports the Wi-Fi multicast protocol
(a wired PC sends the multicast stream, and a Wi-Fi-connected PC can receive it).
My Android phone can receive the multicast stream over Wi-Fi.
I wrote NDK socket code that joins the UDP multicast group and receives packets
(I added the multicast permission to AndroidManifest.xml).
The FFmpeg library is ported to Android, and it can play a local media file.
But when I try to open the network stream using the FFmpeg library, avformat_open_input() fails.
gFormatCtx = avformat_alloc_context();
av_register_all();
avcodec_register_all();
avformat_network_init();

if (avformat_open_input(&gFormatCtx, "udp://#239.100.100.100:4000", NULL, NULL) != 0)
    return -2;
This code always returns -2.
If I use the av_dict_set() API, which option should I use?
av_dict_set(&options, "udp_multicast", "mpegtsraw", 0);
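One thing worth double-checking, as an aside: the `#` in `udp://#239.100.100.100:4000` is parsed as the start of a URL fragment, so FFmpeg sees an empty host, while the address itself is a perfectly valid multicast group. A quick illustration in plain Python (just the address logic, not the FFmpeg API):

```python
# IPv4 multicast (class D) addresses span 224.0.0.0 - 239.255.255.255,
# i.e. the first octet is in [224, 239].
def is_multicast(addr):
    return 224 <= int(addr.split(".")[0]) <= 239

url = "udp://#239.100.100.100:4000"
# Everything after '#' is a URL fragment, so the host part comes out empty:
host = url.split("://")[1].split("#")[0].split(":")[0]
print(repr(host))                        # '' (empty host, so the open fails)
print(is_multicast("239.100.100.100"))   # True: a valid group to join
```

If dropping the `#` still fails, the udp protocol options such as `buffer_size`, `fifo_size`, and `overrun_nonfatal` (passed via av_dict_set) are the usual next things to try.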
Please let me know what I should check for the avformat_open_input() error.
Thanks.
I have a headphone that connects to my phone via USB-C. It has a USB DAC in it, attached to the Type-C connector over the USB protocol.
To control its sidetone volume, I do:
UsbInterface usbInterface = theDevice.getInterface(0);
UsbDeviceConnection connection = usbManager.openDevice(theDevice);
connection.claimInterface(usbInterface, true);
byte[] buffer = intTo2Bytes(volume);
connection.controlTransfer(0x21, 0x01, 0x0100, 0x0200, new byte[]{0x00}, 1, 100);
connection.controlTransfer(0x21, 0x01, 0x0200, 0x0200, buffer, 2, 100);
connection.controlTransfer(0x21, 0x01, 0x0100, 0x0500, new byte[]{0x00}, 1, 100);
connection.controlTransfer(0x21, 0x01, 0x0201, 0x0500, new byte[]{0x00, 0x00}, 2, 100);
connection.controlTransfer(0x21, 0x01, 0x0202, 0x0500, new byte[]{0x00, 0x00}, 2, 100);
connection.releaseInterface(usbInterface);
connection.close();
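For reference on what those magic numbers mean: bmRequestType 0x21 is a host-to-device, class-type request addressed to an interface, and bRequest 0x01 is SET_CUR in the USB Audio Class 1.0 spec; multi-byte values such as the volume travel little-endian, low byte first. A sketch of what an `intTo2Bytes` helper presumably does (plain Python, the names are mine):

```python
def int_to_2_bytes(value):
    # USB is little-endian on the wire: low byte first.
    return bytes([value & 0xFF, (value >> 8) & 0xFF])

# bmRequestType 0x21 decoded bit by bit:
bm = 0x21
direction = bm >> 7            # 0 = host-to-device (OUT)
req_type  = (bm >> 5) & 0x3    # 1 = class request
recipient = bm & 0x1F          # 1 = interface
print(direction, req_type, recipient)   # 0 1 1
print(list(int_to_2_bytes(0x1234)))     # [52, 18]
```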
I captured those addresses and commands with Wireshark on my USB port when I connected the headset to my Windows laptop and changed the sidetone volume in its settings. They work, as I can hear my own voice in the headset.
But after I do that, other applications can no longer play sound on this device.
How can I change the sidetone volume as shown above (or with another API, in case I don't know of one) and still let sound play properly?
Any potential help is appreciated.
I have a college assignment to build an Android app that communicates with Ubuntu (or any other Linux distribution) and streams audio via the microphone and speakers on both the PC and the phone. Switching the direction of communication should be done on Android, and the script listening on the Bluetooth port on the PC should be written in Python or another lightweight language. It does not have to be full-duplex; single-duplex is enough.
Is the answer in the BluetoothA2dp Android profile or is there something else?
I'm comfortable with making simple Android apps.
Thanks a lot!
Not sure if you still need the answer, but I am working on something similar.
Basically, I am working with Python on Windows to record streaming audio from the laptop's microphone, process the sound for ANC (active noise cancellation), pass it through a band-pass filter, and then output the audio stream to a Bluetooth device.
I would ultimately like to port this to a smartphone, but for now I am prototyping in Python, as that is a lot easier.
While I am still at an early stage of the project, here are two pieces that may be helpful:
1) Stream audio from the microphone to the speakers using sounddevice
Record external audio and play it back.
Installation details for the sounddevice module are here:
http://python-sounddevice.readthedocs.org/en/0.3.1/
import sounddevice as sd

fs = 44100    # sampling rate in Hz (undefined in the original snippet)
duration = 5  # seconds

print("Recording Audio")
myrecording = sd.rec(int(duration * fs), samplerate=fs, channels=2, dtype='float64')
sd.wait()
print("Audio recording complete, play audio")
sd.play(myrecording, fs)
sd.wait()
print("Play Audio Complete")
2) Communicate over Bluetooth
Refer to the details here:
https://people.csail.mit.edu/albert/bluez-intro/c212.html
import bluetooth

target_name = "My Phone"
target_address = None

nearby_devices = bluetooth.discover_devices()

for bdaddr in nearby_devices:
    if target_name == bluetooth.lookup_name(bdaddr):
        target_address = bdaddr
        break

if target_address is not None:
    print("found target bluetooth device with address", target_address)
else:
    print("could not find target bluetooth device nearby")
I know I am simply quoting examples from these sites; you may refer to them to gain more insight.
Once I have a working prototype, I will try to post it here.
I'm in the process of creating an Android VPN app and I'm hitting a brick wall whenever I try to deliver inbound multicast packets to a test app. When my VPN app writes an inbound multicast IP packet (destination address field of IP header has an address in the multicast range) to the file descriptor side of the TUN device I don't see the packet show up on the test app I'm using to receive multicast packets. However, I do see unicast packets being received by the test app. I also see my test app receiving multicast packets when the VPN app is not in the mix, so I know it's capable of receiving multicast.
I have my suspicion that the TUN device created by my VPN app does not enable multicast functionality by default, and I have not found any means exposed by Android to do so. Does anyone know if it's possible to enable multicast for a TUN device? Or, am I possibly missing something else entirely?
I use the following code to create the TUN device using the VpnService.Builder class:
Builder builder = new Builder();
builder.setMtu( 1250 );
builder.addAddress( "2.3.1.1", 32 );
I use the following code to create the file descriptor used to send IP packets to the TUN device:
ParcelFileDescriptor parcelFileDescriptor = builder.setSession( "my_session" ).setConfigureIntent( myConfigureIntent ).establish();
FileOutputStream tunOut = new FileOutputStream( parcelFileDescriptor.getFileDescriptor() );
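Not an answer, but a debugging aid: before suspecting the TUN device, it is worth confirming that the packets being written really carry a class-D destination. The destination address sits at bytes 16-19 of the IPv4 header (RFC 791); a plain-Python check of the sort I use:

```python
import socket

def ipv4_dst(packet):
    # Destination address occupies bytes 16..19 of the IPv4 header (RFC 791).
    return socket.inet_ntoa(packet[16:20])

def is_multicast(addr):
    # Class D: 224.0.0.0/4, i.e. first octet in [224, 239].
    return 224 <= int(addr.split(".")[0]) <= 239

# Minimal 20-byte header with only the destination filled in:
hdr = bytearray(20)
hdr[16:20] = socket.inet_aton("239.1.2.3")
dst = ipv4_dst(bytes(hdr))
print(dst, is_multicast(dst))   # 239.1.2.3 True
```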
Basically, what I am trying to do right now is use an Android device as an A2DP receiver: once pairing is established, Android plays the sound received from a transmitter. I am worried that using the SPP profile may cause streaming delay, so I want to use A2DP. Is it possible to use an Android device as an A2DP receiver, and if so, how?
Since Android L the BlueDroid stack does support A2DP sink, but it is disabled by default.
To enable it do the following:
/* Enable bluetooth av sink. */
#define BTA_AV_SINK_INCLUDED TRUE
in /external/bluetooth/bluedroid/include/bt_target.h.
This enables sink support in the bluetooth stack.
Also you have to do this change:
<!-- Enable sink support. -->
<bool name="profile_supported_a2dp_sink">true</bool>
in /packages/apps/Bluetooth/res/values/config.xml. This enables the corresponding UI.
Now you can pair your devices and start streaming. Unfortunately you will hear no sound although you'll receive the packets. The reason is that there is no audio route for A2DP sink. In the commit message of this patch https://android-review.googlesource.com/#/c/98161/ you can find a sample implementation on how to fix this.
Here is a list of these changes:
https://android-review.googlesource.com/#/c/97832/
https://android-review.googlesource.com/#/c/97853/
https://android-review.googlesource.com/#/c/97833/
https://android-review.googlesource.com/#/c/98130/
https://android-review.googlesource.com/#/c/98131/
Yes, it is possible. I have done it on Jelly Bean.
Android internally uses the "Bluedroid" stack from Broadcom for Bluetooth. Previously this stack had no support for the A2DP sink role (which you referred to as the receiver). As of the Lollipop release, the A2DP sink role profile has been added to Bluedroid.
However, it is not enabled for use by the framework/upper layer (applications).
You need to make changes in the framework to enable and use it.
You may refer to the following files (and related files) in the Android source code to enable it:
audio.h - put a new audio source
audio_policy.conf - put a new input source for a2dp 'inputs'
AudioManager.java
AudioPolicyManagerBase.cpp
AudioService.java
AudioSystem.java
BluetoothA2dp.java
MediaRecorder.java
A2DPStateMachine.java
etc., and implement it (this file list is not comprehensive, but you can figure the rest out if you have experience in this area).
When a stream connection is established, you will get a callback in the A2DP state machine; from there you have to start a thread that reads the decoded PCM bytes from the 'new' audio source and sends them to your media player.
SBC-to-PCM decoding is done in the 'bluedroid' SBC decoder layer.
Build it, flash it to your phone, and enjoy the music.
EDIT:
Also, you may have to change the A2DP SDP record in the Bluedroid stack to advertise the A2DP sink role.
You may not be able to do this manually between two phones, because to stream, one device needs to be the A2DP sink and the other the A2DP source. Phones are typically only source devices (sources of a stream that can stream to sink devices); sinks are headsets or Bluetooth speakers.
I want to create an Android application capable of receiving an audio stream. I thought of using the A2DP profile, but it seems Android doesn't support A2DP sink, and a lot of people appear to be searching for a solution to this problem. But what about receiving an ordinary bit stream and then converting the data into audio in the application? I was thinking of receiving a PCM or MP3 data stream via RFCOMM (the SPP Bluetooth profile) and then playing it using AudioTrack.
First, how do I receive a bit stream on my Android phone via RFCOMM? And is it possible to receive that bit stream as a PCM or MP3 stream?
Second, if it isn't possible to receive it as a PCM or MP3 stream, how do I convert the received bit stream into audio?
Third, how do I convert the received data into audio AND play the audio simultaneously, in "real time"? Can I just use onDataReceived?
To be clear, I'm not interested in using the A2DP profile! I want to stream the data via RFCOMM (the SPP Bluetooth profile). The received data stream will be PCM or MP3. I thought of writing my own app, but if anyone knows of an app that solves this, I'd be glad to hear about it! I'm using Android 2.3 Gingerbread.
/Johnny
No. Writing an Android application that handles this will not be the solution, at least if you want to use the A2DP sink role.
The fact is that Android, as you mentioned, does not implement the API calls to BlueZ (the Bluetooth stack Android used until Jelly Bean 4.1) for A2DP sink capabilities. You have to implement them yourself. I will try to guide you, as I was also interested in doing this myself in the recent past.
Your Bluetooth-enabled Android device advertises itself as an A2DP source device by default. You have to change this first, so that nearby devices can recognize yours as a sink. To do this, modify the audio.conf file (usually located in /etc/bluetooth/) and make sure the Enable key exists with the value Source attached to it, so you get something like:
Enable=Source
Reboot, and nearby devices should now recognize your device as an A2DP sink.
Now you will have to interact with BlueZ to react appropriately when an A2DP source device starts streaming audio to your phone.
Android and BlueZ talk to each other via D-Bus. In fact, Android connects to the DBUS_SYSTEM channel and listens to every BlueZ advertisement, such as events and file descriptors.
I remember successfully binding to this D-Bus channel myself using a native application and getting access to the various events BlueZ was posting. This is relatively easy to achieve using the BlueZ API, available here, as a reference. If you go this way, you will have to build a native (C/C++) application and compile it for your platform. You should be able to do this using the Android NDK.
If you find it difficult to use D-Bus, you can try this Java library I just found that handles the communication with D-Bus for you: http://jbluez.sourceforge.net/. I have never used it, but it is worth a try in my opinion.
What you really have to do is find out when an A2DP source device is paired to your phone and when it starts to stream music. You can retrieve these events through D-Bus. Once somebody tries to stream music, you need to tell BlueZ that your native application is going to handle it. There is a pretty good document that explains the flow of events you should handle; it is accessible here. The part you're interested in starts on page 7. The sink application in the given example is PulseAudio, but it could just as well be your application.
BlueZ will hand you a UNIX socket when you call the org.bluez.MediaTransport.Acquire method. Reading from this socket gives you the data currently being streamed by the remote device. But I remember being told by someone working on the BlueZ stack that the data read from this socket is not pure PCM audio but encoded audio content, generally in a format called SBC (Low Complexity Subband Coding).
Decoding SBC is not very difficult, you can find a decoder right here.
The ultimate step would be to forward the PCM audio to your speakers.
To keep you from getting stuck, and to let you test your application more easily, you can use the dbus-send binary that should be available on your Android system; it is located in /system/bin.
Quick tests you can make before doing any of the above:
Get Devices list :
dbus-send --system --dest=org.bluez --print-reply \
  / org.bluez.Manager.GetProperties
This returns an array of adapters with their paths. Once you have these path(s) you can retrieve the list of all the bluetooth devices paired with your adapter(s).
Get paired devices :
dbus-send --system --print-reply --dest=org.bluez \
  /org/bluez/{pid}/hci0 org.bluez.Adapter.GetProperties
This gives you the list of paired devices within the Devices array field.
Once you have the list of devices paired to your Bluetooth Adapter, you can know if it is connected to the AudioSource interface.
Get the devices connected to the AudioSource interface :
dbus-send --system --print-reply --dest=org.bluez \
  /org/bluez/{pid}/hci0/dev_XX_XX_XX_XX_XX_XX \
  org.bluez.AudioSource.GetProperties
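The same queries are easy to drive from a script; a small sketch that just assembles the dbus-send argument vectors shown above (it assumes the binary is on the PATH when you actually run them):

```python
def dbus_send_argv(object_path, method, dest="org.bluez"):
    # Mirrors: dbus-send --system --print-reply --dest=<dest> <path> <method>
    return ["dbus-send", "--system", "--print-reply",
            "--dest=" + dest, object_path, method]

cmd = dbus_send_argv("/", "org.bluez.Manager.GetProperties")
print(" ".join(cmd))
```

Pass the result to subprocess.run() and parse the reply to walk from adapters to paired devices.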
Hope this helps.
Another workaround is to use the Hands-Free Profile.
On Android, BluetoothHeadset covers this.
Wait until the state changes to BluetoothHeadset.STATE_AUDIO_CONNECTED;
then you can record audio from the Bluetooth headset.
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mMediaRecorder.setOutputFile(mFilename);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
try {
    mMediaRecorder.prepare();
} catch (IllegalStateException | IOException e) {
    e.printStackTrace();
}
mMediaRecorder.start();
[Irrelevant, but it works] This hack only serves MP3 streaming via a Wi-Fi hotspot (I use it in my car, which has only an AUX input):
Install the app AirSong,
Turn on wifi hotspot,
Connect the other device to that hotspot,
Access 192.168.43.1:8088 from that device's browser and you are on.
(Wondering why it is always 192.168.43.1? Because that is the default gateway of any device connected to an Android hotspot.)
audio.conf seems to be missing in Android 4.2.2?
To receive a PCM audio stream via RFCOMM, you can use the code flow explained in "Reading Audio file in C and forwarding over bluetooth to play in Android Audio track" as a hint, with one change: the frequency used during initialization goes from 44100 to 22050.
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 22050,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT,
        10000, AudioTrack.MODE_STREAM);
Note: this streaming still contains some noise, but your
"receiving a PCM data stream via RFCOMM (the SPP Bluetooth profile), and then playing it using AudioTrack"
will work.