Android: Implementing a VoIP program

I have some design questions that I want to discuss with people interested in helping me. I am planning to develop a simple VoIP program that allows two Android phones on the same network to talk to each other. My goal is simply to capture sound, send the data over UDP, receive UDP data, and play the sound back.
My current design is to have two threads: one captures the microphone and sends the data; the other receives bytes and plays them.
I started implementing this with MediaPlayer and MediaRecorder. The issue that came up is how to record and play the sound: do I need to go through a file (which seems slow), or is there any way to have the recording sent straight to my UDP socket?
Basically, I wonder whether I have to record to a file in order to play anything, or whether I can just pass a socket for both recording and playing.
Does anyone have any suggestions?
Thank you very much

MediaRecorder needs an FD, so you can use sockets as well. I don't see any issues with that; it all depends on how you design your system.

Don't use those classes for streaming audio - use AudioTrack and AudioRecord instead.
They provide the functionality you need for playing and recording raw audio data, without dealing with an FD.
When you record a frame (either byte[] or short[]), wrap it in a UDP packet and send it.
When you receive a UDP packet, unpack the byte[] or short[] and play it.
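A minimal sketch of that loop, assuming 8 kHz mono 16-bit PCM and placeholder values for the peer address and port (PEER_HOST, PORT); a real app would add a jitter buffer and a codec, and needs the RECORD_AUDIO and INTERNET permissions:

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioRecord;
    import android.media.AudioTrack;
    import android.media.MediaRecorder;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    public class UdpAudio {
        static final int SAMPLE_RATE = 8000;          // 8 kHz mono PCM keeps packets small
        static final int PORT = 50005;                 // placeholder port
        static final String PEER_HOST = "192.168.1.2"; // placeholder peer address

        // Sender thread: capture from the mic, push each buffer into a UDP packet.
        public static void sendLoop() throws Exception {
            int bufSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, bufSize);
            DatagramSocket socket = new DatagramSocket();
            InetAddress peer = InetAddress.getByName(PEER_HOST);
            byte[] buf = new byte[bufSize];
            recorder.startRecording();
            while (!Thread.interrupted()) {
                int read = recorder.read(buf, 0, buf.length);   // blocking read of raw PCM
                if (read > 0) {
                    socket.send(new DatagramPacket(buf, read, peer, PORT));
                }
            }
            recorder.stop(); recorder.release(); socket.close();
        }

        // Receiver thread: unpack each UDP payload and write it straight to the track.
        public static void playLoop() throws Exception {
            int bufSize = AudioTrack.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    bufSize, AudioTrack.MODE_STREAM);
            DatagramSocket socket = new DatagramSocket(PORT);
            byte[] buf = new byte[bufSize];
            track.play();
            while (!Thread.interrupted()) {
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                socket.receive(packet);                          // blocking receive
                track.write(packet.getData(), 0, packet.getLength());
            }
            track.stop(); track.release(); socket.close();
        }
    }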

Related

How to receive a video stream from source in my android app

I have a source that transmits video in H264 format (in this case Colasoft Packet Player, which transmits the video stream over IP:PORT), and I want to be able to listen for and receive the video stream in my Android app.
I read a lot of things across the internet about Socket, DatagramSocket and AudioManager, but I'm totally confused about what exactly I need and how to implement it.
What I need is to be able to capture the video frame by frame.
I would love to get some help.
You can use the VLC Android library.
There's also an explanation available on how to embed it into your app.
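A rough sketch of the embedding, assuming the org.videolan.libvlc Android bindings and a placeholder stream URL; the exact setup differs between libVLC versions:

    import android.net.Uri;
    import org.videolan.libvlc.LibVLC;
    import org.videolan.libvlc.Media;
    import org.videolan.libvlc.MediaPlayer;
    import java.util.ArrayList;

    // Inside an Activity: play a network stream with libVLC.
    ArrayList<String> options = new ArrayList<>();
    options.add("--rtsp-tcp");                       // example option; tune for your source
    LibVLC libVlc = new LibVLC(this, options);
    MediaPlayer player = new MediaPlayer(libVlc);
    Media media = new Media(libVlc, Uri.parse("rtsp://192.0.2.1:5554/stream")); // placeholder URL
    player.setMedia(media);
    media.release();
    // For video output you would also attach a view first (e.g. player.attachViews(...) in libVLC 3.x).
    player.play();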
You can let ffmpeg do this job.
Have a look at:
https://github.com/alphons/SegmentationStreaming

Audio streaming using Nearby Connections API

I'm trying to stream audio across several devices using the Nearby Connections API, but I'm not really sure whether this is actually possible or advisable.
What I want to do is broadcast audio files (both songs stored on the phone and audio from apps such as Google Music, Spotify, ...) to the other connected devices, so they can start playing the songs while they are still receiving the data chunks.
I think the Nearby Connections API can only send 4KB payload chunks per call to Nearby.Connections.sendReliableMessage(), so what I'm doing so far is calling that function as many times as required, sending 4KB chunks each time, until the entire file has been delivered (sketched below). In the onMessageReceived() listener I store every chunk I receive in a byte array, so that once all the chunks have been transferred I can play the song back from that array.
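A minimal sketch of that chunking, assuming the GoogleApiClient-based Nearby Connections API that sendReliableMessage() belongs to (since deprecated), with CHUNK_SIZE and the endpoint ID as placeholders; reliable messages should arrive in order:

    import com.google.android.gms.common.api.GoogleApiClient;
    import com.google.android.gms.nearby.Nearby;
    import java.io.ByteArrayOutputStream;
    import java.util.Arrays;

    static final int CHUNK_SIZE = 4 * 1024;          // stay under the ~4KB payload limit

    // Sender: split the song's bytes into 4KB chunks and send them in order.
    static void sendInChunks(GoogleApiClient client, String endpointId, byte[] song) {
        for (int offset = 0; offset < song.length; offset += CHUNK_SIZE) {
            byte[] chunk = Arrays.copyOfRange(song, offset,
                    Math.min(offset + CHUNK_SIZE, song.length));
            Nearby.Connections.sendReliableMessage(client, endpointId, chunk);
        }
    }

    // Receiver: accumulate the chunks until the whole file has arrived.
    final ByteArrayOutputStream received = new ByteArrayOutputStream();

    public void onMessageReceived(String remoteEndpointId, byte[] payload, boolean isReliable) {
        received.write(payload, 0, payload.length);
        // once the sender signals completion, received.toByteArray() holds the song
    }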
With this approach I guess I'd be able to play a song once it has been transferred in its entirety, but I'd like to play the songs while I'm actually receiving the data chunks, and in a synchronized manner across all the devices.
Does this make sense to you guys? Is it the right approach? Is there any more effective way of doing this? (I already know about the option of streaming audio over Wi-Fi Direct, but I'd like to use Nearby.)
The guy in this tutorial had a similar problem with chunks of audio.
He shows how to play the song while the bytes are still being downloaded and the audio file is still being built.
Maybe you can use the "Incremental Media Download" part of the tutorial.
Quote:
This is where the magic happens as we download media content from the URL stream until we have enough content buffered to start the MediaPlayer. We then let the MediaPlayer play in the background while we download the remaining audio. If the MediaPlayer reaches the end of the buffered audio, then we transfer any newly downloaded audio to the MediaPlayer and let it start playing again.
Things get a little tricky here because:
(a) The MediaPlayer seems to lock the file so we can’t simply append our content to the existing file.
...
If this doesn't work, I would just use the Nearby connection to exchange IP addresses and go for a Wi-Fi Direct solution.
I hope this helps and I'd love to hear what your final solution looks like!
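A condensed sketch of the incremental pattern quoted above, assuming the received chunks are appended to a local file; the threshold, file names and helper methods here are made up for illustration:

    import android.media.MediaPlayer;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;

    MediaPlayer player = new MediaPlayer();
    File cacheDir;                                  // e.g. Context.getCacheDir()
    File download;                                  // the file the received chunks are appended to

    // Called once enough bytes have arrived to risk starting playback.
    void startWhenBuffered() throws Exception {
        if (download.length() < 128 * 1024) return; // arbitrary "enough buffered" threshold
        beginPlayback(copyOf(download), 0);
    }

    // Wired to player.setOnCompletionListener: the player ran out of buffered
    // audio, so hand it a fresh copy that includes the newly downloaded bytes.
    void onPlayerCompleted() throws Exception {
        int pos = player.getCurrentPosition();
        player.reset();
        beginPlayback(copyOf(download), pos);
    }

    void beginPlayback(File src, int resumeAt) throws Exception {
        FileInputStream fis = new FileInputStream(src);
        player.setDataSource(fis.getFD());          // MediaPlayer locks its file, hence the copy
        fis.close();
        player.prepare();
        if (resumeAt > 0) player.seekTo(resumeAt);  // resume where the old buffer ended
        player.start();
    }

    // Snapshot the growing download so the player never holds the live file.
    File copyOf(File src) throws Exception {
        File dst = File.createTempFile("play", ".mp3", cacheDir);
        FileInputStream in = new FileInputStream(src);
        FileOutputStream out = new FileOutputStream(dst);
        byte[] buf = new byte[8192];
        for (int n; (n = in.read(buf)) > 0; ) out.write(buf, 0, n);
        in.close(); out.close();
        return dst;
    }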
I implemented this several years ago for live audio/video packets sent in a serial stream to an Android 4.0 device. It would work the same for audio (or video) packets being streamed over the Nearby connections API.
The solution was to run a http streaming server from within the Android App, then consume this using the Android media player API with its http streaming capabilities (or you can embed ExoPlayer in your app if you prefer as it also supports http streaming).
This was achieved by piping the data stream directly into a FFSERVER process running on the device. The Android NDK was used to create and manage the named pipe required as input into FFSERVER.
As this was done a few years ago, I have not tested it on Android 4.1+. Anyone who does this will need to adhere to the FFmpeg GPL/LGPL license when building and distributing FFSERVER.
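The sketch below is not FFSERVER itself, just a toy illustration of the same shape in plain Java: a single-client HTTP server inside the app that relays a live byte stream, which MediaPlayer (or ExoPlayer) consumes from a localhost URL; the port, MIME type and source InputStream are placeholders:

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    // Minimal single-client HTTP server: the player connects to
    // http://127.0.0.1:8080/stream and we relay the live data to it.
    static void serve(InputStream liveData) throws Exception {
        ServerSocket server = new ServerSocket(8080);    // placeholder port
        Socket client = server.accept();                 // the player's connection
        OutputStream out = client.getOutputStream();
        // No Content-Length: the body is delimited by closing the connection,
        // which suits a live stream of unknown length.
        out.write(("HTTP/1.1 200 OK\r\n"
                 + "Content-Type: video/mp2t\r\n"        // placeholder MIME type
                 + "Connection: close\r\n\r\n").getBytes("US-ASCII"));
        byte[] buf = new byte[16 * 1024];
        int n;
        while ((n = liveData.read(buf)) > 0) {
            out.write(buf, 0, n);                        // relay packets as they arrive
            out.flush();
        }
        client.close();
        server.close();
    }

The player would then be pointed at the local URL, e.g. setDataSource("http://127.0.0.1:8080/stream"); this only works with containers that can be played progressively.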

Usage of RTP packets in Android Streaming application

Updating my previous question...
Currently I am working on a live media streaming application. I have been able to use the MediaRecorder class to record media and store it in a file. Now I am trying to stream this media from Android to an Adobe Media Server using RTP. What I have done is use a ParcelFileDescriptor to package the video content captured from the device camera. I now want to use the RTPPackets class provided by Sipdroid to convert these packets into RTP packets.
I am having problems integrating RTPPackets into my application, as I see there are multiple steps to be done:
1) Creating RTP packets. Even though I have the code for the RTP packetizer, I am not exactly sure how or where to use it, because I am not sure what the default payload type, SSRC and CSRC values should be. Maybe they carry default values the first time, but from the second time onward, what would set these parameters? Would it be the parcel file descriptor? (See the header sketch after this list.)
2) Creating simple server-like code on the mobile which keeps creating RTP packets and sending them to the Adobe Media Server.
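For what it's worth, the fixed 12-byte RTP header from RFC 3550 can be built by hand. A sketch: the payload type is typically a dynamic value (96-127) agreed with the server, the SSRC is a random value chosen once per stream, and CSRC entries only appear when a mixer combines streams, which is why the CSRC count stays 0 here:

    import java.nio.ByteBuffer;
    import java.util.Random;

    // Build one RTP packet: 12-byte RFC 3550 header followed by the payload.
    static final int SSRC = new Random().nextInt();  // pick once, reuse for the whole stream
    static int sequence = 0;                         // increments by 1 per packet

    static byte[] rtpPacket(int payloadType, long timestamp, byte[] payload) {
        ByteBuffer buf = ByteBuffer.allocate(12 + payload.length); // big-endian by default
        buf.put((byte) 0x80);                        // V=2, no padding/extension, CC=0 (no CSRCs)
        buf.put((byte) (payloadType & 0x7F));        // marker bit 0, 7-bit payload type
        buf.putShort((short) sequence++);            // sequence number, wraps at 16 bits
        buf.putInt((int) timestamp);                 // media clock timestamp
        buf.putInt(SSRC);                            // synchronization source identifier
        buf.put(payload);
        return buf.array();
    }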
Any help would be much appreciated.
Thanks,
K.Saravanan.

How do I play synthesized sounds on Android?

I wrote an iPhone app some time ago that creates sound programmatically. It uses an AudioQueue to generate sound. With the AudioQueue, I can register for a callback whenever the system needs sound, and respond by filling a buffer with raw audio data. The buffers are small, so the sound can respond to user input with reasonably low latency.
I'd like to build a similar app on Android, but I'm not sure how. The MediaPlayer and SoundPool classes seem to be for playing canned media from files, which is not what I need. The JetPlayer appears to be some sort of MIDI playback engine.
Is there an equivalent to AudioQueue in the Android Java API? Do I have to use native code to accomplish what I want?
Thanks.
With the AudioQueue, I can register for a callback whenever the system needs sound, and respond by filling a buffer with raw audio data.
The closest analogy to this in Android is AudioTrack. Rather than the callback (pull) mechanism you are using, AudioTrack is more of a push model, where you keep writing to the track (presumably in a background thread) using blocking calls.
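A minimal sketch of that push model, generating a 440 Hz sine wave on a background thread (the frequency and buffer sizing here are arbitrary):

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    // Synthesize a tone and push it to AudioTrack with blocking writes.
    new Thread(new Runnable() {
        @Override public void run() {
            final int sampleRate = 44100;
            int bufSize = AudioTrack.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    bufSize, AudioTrack.MODE_STREAM);
            short[] buffer = new short[bufSize / 2];    // small buffer keeps latency low
            double phase = 0, step = 2 * Math.PI * 440 / sampleRate;
            track.play();
            while (!Thread.interrupted()) {
                for (int i = 0; i < buffer.length; i++) {
                    buffer[i] = (short) (Math.sin(phase) * Short.MAX_VALUE);
                    phase += step;                      // advance the oscillator
                }
                track.write(buffer, 0, buffer.length);  // blocks until there is room
            }
            track.stop();
            track.release();
        }
    }).start();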

Read an audio stream during (GSM) phone call

Is it possible to read the audio stream during a (GSM) phone call? I would like to write an encoding application, and I do not want to go the SIP/VoIP route. Thank you.
This will be phone- and OS-dependent. There are several apps that claim to record call audio (Total Recall, Record My Call on Android), but they generally seem to record via the microphone, meaning the far-end sound is poor.
I don't believe either the Apple or Android APIs support access to the raw voice stream today.
Something to be aware of is that in many places it is not legal to do this without informing the other party (i.e. the person on the other end of the call) that you plan to 'capture' the voice stream somehow; this may not be relevant for your particular plans, but it is worth mentioning anyway.
If you have the option of doing the work in the network or on a PABX, then you can create a basic (if not very efficient) solution by simply setting up a three-way (or conference) call.
