I am trying to build a kind of walkie-talkie on Android. I did it using the AudioRecord and AudioTrack classes, but raw PCM data is not well suited for transmission. I would like to use another codec such as AMR, which consumes less bandwidth. Can you tell me how to implement this, i.e. which classes or methods to use to convert PCM to another codec?
Many thanks for your support.
There is no built-in solution for this one in Android, and you will have to bring your own.
Such a codec will usually run in the NDK, so some stitching using JNI will be required as well.
There should be several such libraries available, both free and commercial.
Search Google for an AMR codec, focusing on implementations that run on ARM CPUs or that are specific to Android.
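To give an idea of the JNI stitching involved, a hypothetical Java-side wrapper might look like the sketch below. The library name and all native method signatures are placeholders, not a real API; they have to match whatever AMR encoder you actually build with the NDK.

    // Hypothetical Java-side wrapper around an NDK-built AMR encoder.
    // "amrwrap" and the native signatures below are assumptions, not a real library.
    public class AmrEncoder {
        static {
            System.loadLibrary("amrwrap"); // assumed name of your native wrapper library
        }

        // Create a native encoder instance; returns an opaque handle.
        public static native long nativeInit(int mode);

        // Encode one 20 ms frame of 8 kHz mono 16-bit PCM (160 samples);
        // returns the number of bytes written into 'out'.
        public static native int nativeEncodeFrame(long handle, short[] pcm, byte[] out);

        // Free the native encoder.
        public static native void nativeRelease(long handle);
    }

The matching C functions would be implemented on the NDK side and call into whichever codec library you pick.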
What is the best way (performance-wise) to capture and stream video from an Android device's camera to a PC?
I have seen this question asked here before, and there are a few open-source programs that do just that, but there are multiple approaches and I don't know which one is best.
For example:
Should the Android part be written in C++ or Java (performance/API-wise)?
Which API should I use to get the video from the camera?
What is the best way to stream the video?
I don't intend to support old Android versions (<4.x), so if the best way/API is relatively new, that's fine by me.
I'm not familiar with Android development but I'll try to answer.
I suppose that the actual encoding of the raw image data is done on a hardware chip (software encoding would probably kill your battery), and it looks like the MediaCodec class is exactly what you need. I assume you want to implement some kind of live-streaming service where latency matters. If so, you should stick to UDP-based transmission; using the RTP protocol or the MPEG-TS container format would be the best choice for this purpose. Of course you can also use TCP-based streaming methods such as HLS or DASH (both of them run over HTTP).
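For illustration, configuring a hardware H.264 encoder through MediaCodec could look roughly like the sketch below. The resolution and bitrate are placeholder values, and the Surface-input path shown requires API 18+; on API 16/17 you would queue raw YUV buffers instead.

    import java.io.IOException;

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.view.Surface;

    public class AvcEncoderSketch {
        // Returns a started H.264 encoder; feed camera frames into its input Surface
        // and drain encoded NAL units with dequeueOutputBuffer() for packetizing/streaming.
        public static MediaCodec createEncoder() throws IOException {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);      // placeholder bitrate
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface); // Surface input, API 18+
            MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            Surface input = encoder.createInputSurface(); // connect the camera preview to this Surface
            encoder.start();
            return encoder;
        }
    }

The encoded output buffers can then be wrapped in RTP packets or an MPEG-TS container for transmission.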
You should also take a look at Table 1, "Core media format and codec support", in the Android documentation:
It tells us, for example, that the H.264 AVC encoder supports the MPEG-TS container, and that HLS version 3 is supported on Android 4.0 and above.
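If you prefer to check at runtime what a particular device actually supports instead of relying on the table alone, something along these lines should work on API 16+ (a sketch using the old MediaCodecList API):

    import java.util.Arrays;

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;
    import android.util.Log;

    public class CodecDump {
        // Log every encoder present on the device together with its supported MIME types.
        public static void logEncoders() {
            for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
                if (info.isEncoder()) {
                    Log.d("CodecDump", info.getName() + " -> "
                            + Arrays.toString(info.getSupportedTypes()));
                }
            }
        }
    }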
I'm trying to decode a raw H.264 stream on "older" Android versions.
I've tried the MediaPlayer class, but it does not seem to support the stream format.
I can see the stream in other cam-viewer apps from the market, so I figure there must be a way to do it, probably using the NDK.
I've read about OpenMAX and Stagefright, but couldn't find a working example about streaming.
Could someone please point me in the right direction?
Also, I'm reading in several places about "frameworks/av/include/media/stagefright/MediaSource.h" and other sources, but they don't seem to be in either the regular SDK or the NDK.
Where is this source located? Is there another SDK?
Thanks in advance.
Update: I'm receiving the stream over an RTSP connection.
If you wish to perform only a simple experiment to verify certain functionality, you can consider using the command-line stagefright utility. Please do keep in mind that your streaming input may not be supported.
If you wish to build a more comprehensive player pipeline, you can consider the handling for RTSP as in here or HTTP as in here. Please note that the NuCachedSource2 implementation is essential for streaming input, as it provides a page-cache implementation that acts as a jitter buffer for the streaming data.
Please do note one critical point: the command-line stagefright utility doesn't render to the screen. Hence, if you wish to render, you will have to implement the complete playback pipeline, including rendering.
On a related note, if your input is a streaming input, the standard player implementation does have support for streaming inputs, as can be observed here. Did you face any issues with it?
As fadden has already pointed out, your work is made far simpler by the introduction of MediaCodec in Android 4.x.
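As a rough sketch of that MediaCodec route (API 16+), assuming you have already split the incoming H.264 stream into complete access units and have a Surface to render to (both of which are left out here):

    import java.io.IOException;
    import java.nio.ByteBuffer;

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;

    public class AvcDecoderSketch {
        // Feed raw H.264 access units into a MediaCodec decoder and render to a Surface.
        // Error handling, SPS/PPS handling and end-of-stream signalling are omitted.
        public static void decode(Surface surface, int width, int height,
                                  Iterable<byte[]> accessUnits) throws IOException {
            MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
            decoder.configure(format, surface, null, 0);
            decoder.start();

            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            long ptsUs = 0;
            for (byte[] unit : accessUnits) {
                int inIndex = decoder.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    ByteBuffer in = decoder.getInputBuffers()[inIndex]; // pre-API-21 style
                    in.clear();
                    in.put(unit);
                    decoder.queueInputBuffer(inIndex, 0, unit.length, ptsUs, 0);
                    ptsUs += 33333; // assume ~30 fps when the raw stream carries no timestamps
                }
                int outIndex = decoder.dequeueOutputBuffer(info, 10000);
                if (outIndex >= 0) {
                    decoder.releaseOutputBuffer(outIndex, true); // true = render to the Surface
                }
            }
            decoder.stop();
            decoder.release();
        }
    }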
You could use third-party libraries like android-h264-decoder, which uses JNI to improve performance. Also have a look at this library from Intel.
Update: MediaCodec wasn't exposed until API 16 (Android 4.1), so that won't work on a 2.3.3 device.
Stagefright and OpenMAX IL were (and still are) internal components of Android. You can find the code (including MediaSource.h) at https://android.googlesource.com/platform/frameworks/av/+/master. Note that the media framework moved to the separate frameworks/av tree only recently; before that it was part of frameworks/base, e.g. https://android.googlesource.com/platform/frameworks/base/+/gingerbread/media/
I've read a lot of questions on Stack Overflow and other pages about this topic, but didn't find a real, up-to-date solution:
In an Android app I've got two audio files (on the local file system), which are encoded as mp3, ogg or wav.
I just want to play them exactly synchronously, have seeking capabilities, and control the volume of each individual track. Using MediaPlayer this isn't possible because of the well-known latency issues on Android.
So I think having two audio-player instances (of whatever library) will always result in bad latency, so that doesn't seem to be the solution.
So in my opinion the only solution is to mix the audio inputs into a single mixed stream that can be played by one player. I've read a lot about Android's AudioTrack, buffers, and the OpenSL ES implementation, but always ended up with the same note: buffers only support raw PCM audio data. OK, so I have to decode the mp3/ogg myself?
My question now is: is there any library that can help me to a) do exactly what I want with a simple API, or b) decode mp3/ogg to memory so that I can use that data with AudioTrack or OpenSL?
Whether it's native or Java is unimportant, it just has to work.
The minimum API level is 15+ (Android 4.0.3, the most current version at the time of writing this question).
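To illustrate the mixing idea from above, here is a rough sketch of what I have in mind, assuming both tracks are already decoded to 16-bit mono PCM at the same sample rate (the decoding step is exactly the part I'm missing):

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    public class MixSketch {
        // Naively mix two decoded 16-bit mono PCM buffers and play them through one AudioTrack.
        public static void playMixed(short[] a, short[] b, int sampleRate) {
            int frames = Math.min(a.length, b.length);
            short[] mixed = new short[frames];
            for (int i = 0; i < frames; i++) {
                int sum = a[i] + b[i];             // per-track volume would be applied here
                mixed[i] = (short) Math.max(Short.MIN_VALUE, Math.min(Short.MAX_VALUE, sum));
            }
            int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    minBuf, AudioTrack.MODE_STREAM);
            track.play();
            track.write(mixed, 0, mixed.length);   // in MODE_STREAM this blocks until written
            track.stop();
            track.release();
        }
    }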
I am receiving MPEG-TS (MPEG transport stream) packets with multiplexed H.264 video and AAC audio streams. I need to be able to play the audio and video on an Android phone. My assumption is that I need:
MPEG-TS de-multiplexer
AAC decoder
H.264 decoder
Synchronization of the audio and video playback
Assuming I am right, then (in Android 2.x) the MPEG-TS de-multiplexer is not part of the OS and must be ported. Both the AAC and H.264 decoders are part of the Android OS, but I am not sure whether they expose an interface that allows passing data in via buffers, or whether they allow mutual timing synchronization. In the worst case those components must be ported as well.
Can you give me some advice on where to start? I was thinking about porting FFmpeg. Are there any other ways?
Regards,
STeN
Android 4.x has OpenMAX, which can play a TS stream with H.264 and AAC. You don't even need to worry about synchronization of audio and video.
Look at the nativemedia sample in the NDK.
If you want to support previous versions of Android, then ffmpeg might be a good choice, but the most it can give you is decoded video frames in RGB (or another format) and decoded audio in PCM. You will then have to implement the renderer and audio playback yourself. I would recommend reading this tutorial: http://dranger.com/ffmpeg/. It is not Android-specific, but it will give you an idea of how video playback works.
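As a very rough illustration of the "implement the renderer yourself" part, drawing one decoded frame onto a SurfaceView could look like the sketch below; the ARGB_8888 pixel format and how the frame arrives from the native ffmpeg side are assumptions.

    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.view.SurfaceHolder;

    public class FrameRenderer {
        // Draw one decoded frame (ARGB_8888 pixels, e.g. converted from ffmpeg's output)
        // onto the Surface backing a SurfaceView.
        public static void drawFrame(SurfaceHolder holder, int[] argbPixels, int width, int height) {
            Bitmap frame = Bitmap.createBitmap(argbPixels, width, height, Bitmap.Config.ARGB_8888);
            Canvas canvas = holder.lockCanvas();
            if (canvas != null) {
                canvas.drawBitmap(frame, 0, 0, null);
                holder.unlockCanvasAndPost(canvas);
            }
        }
    }

For real playback you would do this from a dedicated thread, paced by the frame timestamps, and use something faster than Canvas (e.g. OpenGL ES) if performance becomes an issue.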
You may refer to the android-ffmpeg project on GitHub.
https://github.com/guardianproject/android-ffmpeg
In Gingerbread (2.3) there actually is an MPEG-TS parser in the Stagefright framework that you could use, and I believe it is well integrated with the H.264 and AAC decoders. The MPEG-TS parser is not advertised anywhere, but the support is silently sitting there. I believe it was brought in to support Apple HTTP Live Streaming in Honeycomb or a later version, but the code is in the Gingerbread (2.3) codebase as well. With a minor modification to the framework, you can play back HTTP live streaming (which actually sends TS packets). I hope the above information is helpful for you.
Vibgyor
(DISCLAIMER: I'm personally involved in developing the free and open source program linked below)
A static build of FFmpeg (both the library and the command-line tool) is provided by ZShaolin (http://dyne.org/software/zshaolin), which also contains other media conversion tools.
Using it can facilitate scripting experiments without having to compile FFmpeg from scratch.
I'm very new to Android, and I need to work on adding my own codec to Android. I mean, I want to know what steps I need to take to add my own codec to the platform.
Since I'm completely new to this I need the basics, so can someone please explain the steps I need to take in order to add a new codec to Android?
This is virtually impossible to do in a portable way, as all audio and video codecs are compiled at the platform level (due to the fact that most of the time they require hardware-specific acceleration).
If you are only interested in this working on a specific hardware platform and have an unlocked bootloader (so you can boot a custom build of Android), you can compile the full Android platform from scratch using the AOSP as a base.
Depending on which version of Android you're targeting, you're looking at adding code to either OpenCore or Stagefright (the subsystems that Android uses for A/V decoding and parsing); there you can add audio decoders, audio encoders, video encoders, video decoders and container parsers.
Here is some discussion of adding to Stagefright:
http://freepine.blogspot.com/2010/01/overview-of-stagefrighter-player.html
http://groups.google.com/group/android-porting/msg/5d88e76845a22bbb
However, unless the encoding scheme you wish to support is very simple (what are you wanting to add?), it is likely to be too CPU-intensive for most Android devices to run without being able to offload some of the work to another system (like the radio chipset or the GPU).
In the Android framework, MediaCodec is implemented on top of Stagefright, a media playback engine at the native level that has built-in software-based codecs for popular media formats. Stagefright also supports integration with custom hardware codecs provided as OpenMAX components. Here is an article which summarizes the steps.
Check here.