I am working on an Android video conferencing application. For recording we use AudioRecord and read the buffer from it with
"read(buffer, readBufferSize, size - readBufferSize);"
For playback we use AudioTrack. While playing and recording simultaneously we hear our own echo.
How can I remove this echo programmatically?
I think you should be setting the source as VOICE_COMMUNICATION. This setting should enable Android to use its internal AEC (acoustic echo canceller). If you need to support multiple devices and Android versions, test the AEC results on several devices and Android versions, since results might not be the same across the board. You can also take a look at this blog post.
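As a minimal sketch of that setup (assuming a 16 kHz mono 16-bit PCM capture; the sample rate and buffer sizing below are placeholders), the recorder could be created like this, with the API 16+ AcousticEchoCanceler attached when the device reports it as available:

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.media.audiofx.AcousticEchoCanceler;

public class EchoFreeRecorder {
    private static final int SAMPLE_RATE = 16000; // assumption: 16 kHz mono 16-bit PCM

    public AudioRecord createRecorder() {
        int minSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);

        // VOICE_COMMUNICATION marks this capture as part of a two-way call,
        // which lets the platform route it through its echo-cancellation path.
        AudioRecord recorder = new AudioRecord(
                MediaRecorder.AudioSource.VOICE_COMMUNICATION,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minSize * 2);

        // On API 16+ you can additionally attach the AEC effect to this session.
        if (AcousticEchoCanceler.isAvailable()) {
            AcousticEchoCanceler aec =
                    AcousticEchoCanceler.create(recorder.getAudioSessionId());
            if (aec != null) {
                aec.setEnabled(true);
            }
        }
        return recorder;
    }
}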
Working on Android 4.0 and above.
I am in the process of analyzing ways to live-stream my camera video to a Windows PC using RTP, encoded as MPEG-2.
Is there a readily available "rtp-server" in Android 4.0+?
Is following true:: "The Android platform lacks support for
streaming protocol, which makes it difficult to stream live audio /
video to Android enabled devices." extracted from website
So far I have tried ffserver from the ffmpeg libraries, but the FPS is < 5, which is far too slow. Has anyone explored another solution that achieves a higher FPS?
Has anybody tried using StageFright for the same? That is, capturing raw data from the camera, feeding it to the StageFright framework for encoding, and then streaming the result using RTP?
Many Thanks.
The answers to your questions are below. Though the links relate to Android 4.2.2, the same holds for Android 4.0 as well.
Yes, there is an RTP transmitter available. You could look at this example in MyTransmitter as a starting point, or you could consider using the standard recorder as in startRTPRecording.
You can stream data via RTP from an Android device to an external sink, or you could have a different use case as in Miracast, a.k.a. Wi-Fi Display. However, streaming from one Android device to another over Wi-Fi Direct is still not completely enabled; the latter statement mainly applies to the Miracast scenario.
You can use the standard Android software stack, which is capable of high-resolution recording and transmission. This is mainly dependent on the underlying hardware, as the overhead from the software stack is not very high.
Yes. This is already answered in Q1 above.
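If you do end up writing your own transmitter instead of reusing the platform code, the core of RTP is just a 12-byte header (RFC 3550) prepended to each payload chunk before it goes out over UDP. A rough, illustrative sketch only: the payload type 96 and the SSRC value are arbitrary placeholders, and a real sender also needs a proper payload format (e.g. RFC 6184 for H.264):

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class SimpleRtpSender {
    private final DatagramSocket socket = new DatagramSocket();
    private final InetAddress dest;
    private final int port;
    private int seq = 0;
    private static final int SSRC = 0x12345678;   // arbitrary stream identifier
    private static final int PAYLOAD_TYPE = 96;   // dynamic payload type (placeholder)

    public SimpleRtpSender(String host, int port) throws Exception {
        this.dest = InetAddress.getByName(host);
        this.port = port;
    }

    // Wraps one encoded chunk in an RTP header and sends it over UDP.
    public void send(byte[] payload, int len, long timestamp, boolean marker) throws Exception {
        byte[] packet = new byte[12 + len];
        packet[0] = (byte) 0x80;                               // version 2, no padding/extension/CSRC
        packet[1] = (byte) ((marker ? 0x80 : 0) | PAYLOAD_TYPE);
        packet[2] = (byte) (seq >> 8);
        packet[3] = (byte) seq;
        packet[4] = (byte) (timestamp >> 24);
        packet[5] = (byte) (timestamp >> 16);
        packet[6] = (byte) (timestamp >> 8);
        packet[7] = (byte) timestamp;
        packet[8] = (byte) (SSRC >> 24);
        packet[9] = (byte) (SSRC >> 16);
        packet[10] = (byte) (SSRC >> 8);
        packet[11] = (byte) SSRC;
        System.arraycopy(payload, 0, packet, 12, len);
        socket.send(new DatagramPacket(packet, packet.length, dest, port));
        seq = (seq + 1) & 0xFFFF;
    }
}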
I'm trying to build an Android application that takes in an RTSP stream and displays the audio/video. I've been using VideoView and still getting a delay between 3 and 10 seconds from real time. I need this delay to be under 3 seconds.
On a PC, I can see the same RTSP stream using VLC with only a 1-2 second delay. How can I replicate this on Android? Even when I use other apps like MoboPlayer/RockPlayer, the delay is still 3 to 10 seconds. (If it matters, I'm connecting to the RTSP stream wirelessly on both PC and Android)
I've started looking into using FFmpeg for Android as well as the Gstreamer SDK for Android as alternatives to MediaPlayer, but they're both hard to work with for a novice like myself and I'm running into multiple problems.
Any and all information would be greatly appreciated!
First, I think the delay comes from the initial buffer size, which is not controllable on the Android hardware MediaPlayer. This can be fixed by using a custom software media player such as FFmpeg (avconv now) or GStreamer, but it will cost you performance (depending on the video stream's compression, resolution and FPS) and device battery life.
And of course, native development using FFmpeg or a similar C/C++ framework is complex and requires a lot of time and experience.
Another option: you can try the new Android 4.1 Java API (MediaCodec) to access the video and audio decoders, and build your own RTSP player using this API plus some code to load the video stream from the RTSP server (some Java library for this probably already exists).
Android 4.1 Multimedia
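A minimal sketch of that decoder path, assuming some depacketizer already hands you H.264 access units (the H264Source interface and its readAccessUnit/getTimestampUs methods below are hypothetical placeholders for whatever RTSP/RTP code you use):

import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

public class LowLatencyDecoder {
    // Decodes H.264 access units straight into a Surface using the API 16 MediaCodec API.
    public void decode(Surface surface, H264Source source) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
        MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
        codec.configure(format, surface, null, 0);
        codec.start();

        ByteBuffer[] inputBuffers = codec.getInputBuffers();
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();

        while (!source.isDone()) {
            int inIndex = codec.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                ByteBuffer buf = inputBuffers[inIndex];
                buf.clear();
                int size = source.readAccessUnit(buf);      // hypothetical RTSP depacketizer
                codec.queueInputBuffer(inIndex, 0, size, source.getTimestampUs(), 0);
            }
            int outIndex = codec.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                codec.releaseOutputBuffer(outIndex, true);  // render to the Surface immediately
            }
        }
        codec.stop();
        codec.release();
    }

    // Placeholder for whatever feeds you demuxed H.264 frames from the RTSP session.
    public interface H264Source {
        boolean isDone();
        int readAccessUnit(ByteBuffer dst);
        long getTimestampUs();
    }
}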
I'm trying to take a stream from a webcam and stream it to an Android device. I use GStreamer to grab the video and stream it out through a TCP server; that part works fine. The trouble I'm running into is that I need to write a custom app to receive the stream on Android, and I can't get gst-android to compile (for reasons unknown to me, adb is not runnable, so I can't set up the flinger sinks). Any suggestions? Is there something other than gst-android that I can use for this?
Which Android version are you targeting? As far as I know, the NDK version of GStreamer will still have problems rendering video, as no one has contributed a working video sink. The surfaceflinger API is not available to NDK apps :/
Is there any easy way in Android 4.0 (ICS) to stream video from the camera to another phone, especially with API functions?
If not, can you show me an example of how to capture the camera stream into memory and send it via UDP (e.g. via a FileDescriptor)?
Thanks for the answers.
You can have MediaRecorder write the stream to a socket instead of a file, but none of the selectable container formats works for streaming:
they all have headers that are written only after the recording is done.
MPEG2-TS is not officially supported and doesn't work on most devices.
No idea how the Ustream app does it.
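For reference, the socket trick itself looks roughly like this (a sketch only; the host, port and preview surface are placeholders, and as noted above the resulting 3GP stream is not directly playable because the container header is only written at the end):

import java.net.Socket;
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;
import android.view.Surface;

public class SocketRecorder {
    public MediaRecorder startStreaming(String host, int port, Surface previewSurface) throws Exception {
        // Wrap a TCP socket in a file descriptor that MediaRecorder can write to.
        Socket socket = new Socket(host, port);
        ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setOutputFile(pfd.getFileDescriptor());   // socket instead of a file
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setPreviewDisplay(previewSurface);         // most devices require a preview surface
        recorder.prepare();
        recorder.start();
        return recorder;   // caller must stop()/release() it and close the socket afterwards
    }
}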
Is it possible to perform adaptive (multi-bitrate) streaming on an Android device? If yes, how?
If you have 4.0 or 3.2, you just access the adaptive stream as you would any other video. Literally.
It's an HTTP access.
So if you would use //mywebsite/video1.mp4 as a data source, you would instead use the equivalent //mywebsite/video1.m3u8. Now, I'm not including any discussion of how you create your streaming files, only of how you access them.
All the magic happens within the client (e.g. MediaPlayer, VideoView) supported on 4.0 and 3.2. For the record, you may be able to access and run streaming segments (.m3u8 files) on earlier versions of Android because the manufacturers have sometimes played around with the code, but I haven't found any that actually adapt. They usually stick with the first segment they run, or default to the lowest-bitrate segment in the bunch and stay there regardless of bitrate.
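In code that really is the only change; a minimal sketch using the stock VideoView (the URL reuses the placeholder from above, with an http scheme assumed):

import android.net.Uri;
import android.widget.VideoView;

public class HlsPlayback {
    // On Android 3.2+/4.0+ the stock player follows the HLS playlist itself;
    // the only change from a plain .mp4 is the data-source URI.
    public void play(VideoView videoView) {
        videoView.setVideoURI(Uri.parse("http://mywebsite/video1.m3u8")); // placeholder URL from the answer above
        videoView.start();
    }
}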