I know we can play an MP3 file with MediaPlayer, but can we play MP3+G on Android?
I looked through the Android supported media formats documentation, but I didn't see it listed.
http://developer.android.com/guide/appendix/media-formats.html
Is there any workaround or library to do this?
Thanks
I don't "think" that Android is going to support mp3+g playback anytime soon. That being said an mp3+g "file" should either be one zipped file(with two files inside) or two separate files named the same with exception of the file extension. So other then playing the MP3 there is really nothing else that MediaPLayer can do, and changing MediaPlayer int the android framework to get this to work would not be portable from device to device.
Workaround 1
Use FFmpeg to transcode and mux these files into a different format that is supported, such as MP4. Here is an example of someone using FFmpeg to mux MP3+G into FLV.
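As a rough on-device sketch of that approach, assuming you bundle an ffmpeg binary with your app and that your ffmpeg build includes the CDG decoder (both are assumptions, and the paths and flags are illustrative):

    // Sketch: invoke a bundled ffmpeg binary to mux song.cdg + song.mp3 into an MP4.
    // Verify that your ffmpeg build actually decodes CDG graphics before relying on this.
    import java.io.File;
    import java.io.IOException;

    public class CdgMuxer {
        public static void muxToMp4(File ffmpegBin, File cdg, File mp3, File out)
                throws IOException, InterruptedException {
            ProcessBuilder pb = new ProcessBuilder(
                    ffmpegBin.getAbsolutePath(),
                    "-i", cdg.getAbsolutePath(),  // graphics track
                    "-i", mp3.getAbsolutePath(),  // audio track
                    "-c:v", "mpeg4",              // re-encode the graphics as video
                    "-c:a", "copy",               // keep the MP3 audio untouched
                    out.getAbsolutePath());
            pb.redirectErrorStream(true);
            Process p = pb.start();
            if (p.waitFor() != 0) {
                throw new IOException("ffmpeg exited with an error");
            }
        }
    }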
Workaround 2
Another option would be to use VLC for Android, which is in pre-alpha and can be found here. Now, I'm not sure that VLC for Android supports MP3+G, but libVLC does support decoding the two files, so I'm guessing it would work, or you could alter the code a bit to get it working. I checked out the VLC for Android code recently, and I have to say it's a CPU hog; but since MP3 and CDG files are generally small and not CPU-intensive, I think Android devices could handle the workload using VLC.
Workaround 3
Now, as for more complex options, you could use the Android NDK and write a decoder yourself (this would take a lot of time).
Hope some of this helps you.
I have found the solution:
http://code.google.com/p/cdg-toolkit/
It was written in Java, so you would first have to port it to Android if you want to use it.
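For reference, the CDG side is straightforward to step through yourself: a .cdg file is a stream of 24-byte packets played back at 300 packets per second, and only packets whose command byte is 9 carry graphics data. A minimal plain-Java sketch (rendering is omitted; that is the part the toolkit provides):

    import java.io.DataInputStream;
    import java.io.FileInputStream;
    import java.io.IOException;

    public class CdgReader {
        private static final int PACKET_SIZE = 24;
        private static final int PACKETS_PER_SECOND = 300;

        public static void main(String[] args) throws IOException {
            DataInputStream in = new DataInputStream(new FileInputStream(args[0]));
            byte[] packet = new byte[PACKET_SIZE];
            int index = 0;
            while (in.read(packet) == PACKET_SIZE) {
                int command = packet[0] & 0x3F;
                int instruction = packet[1] & 0x3F;
                if (command == 9) { // 9 marks a CDG graphics packet
                    long timeMs = index * 1000L / PACKETS_PER_SECOND;
                    // Compare timeMs against MediaPlayer.getCurrentPosition() on the
                    // MP3 and hand the packet to the renderer when it falls due.
                    System.out.printf("t=%dms instruction=%d%n", timeMs, instruction);
                }
                index++;
            }
            in.close();
        }
    }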
Related
I'm trying to decode a raw h264 stream on "older" Android versions.
I've tried the MediaPlayer class, and it does not seem to support the stream format.
I can see the stream on other Cam Viewer apps from the market, so I figure there must be a way to do it, probably using the NDK.
I've read about OpenMAX and Stagefright, but couldn't find a working streaming example.
Could someone please point me in the right direction?
Also, I'm reading in several places about "frameworks/av/include/media/stagefright/MediaSource.h" and other sources, but they don't seem to be in either the regular SDK or the NDK.
Where is this source located? Is there another SDK?
Thanks in advance.
Update: I'm receiving the stream over an RTSP connection.
If you only wish to perform a simple experiment to verify certain functionality, you can consider employing the command-line stagefright utility. Please do keep in mind that your streaming input may not be supported.
If you wish to build a more comprehensive player pipeline, you can look at how rtsp is handled here, or http here. Please note that the NuCachedSource2 implementation is essential for streaming input, as it provides a page-cache implementation which acts as a jitter buffer for the streaming data.
Please do note one critical point: the command-line stagefright utility doesn't render to the screen. Hence, if you wish to render, you will need to implement the complete playback pipeline, including rendering.
On a related note, if your input is a streaming input, the standard player implementation does have support for streaming inputs, as can be observed here. Did you face any issues with it?
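For a quick check from the SDK side, a bare-bones MediaPlayer probe with an error listener will at least tell you which error codes your RTSP stream triggers (the URL and surface handling here are placeholders):

    import android.media.MediaPlayer;
    import android.view.SurfaceHolder;

    public class RtspProbe {
        public static MediaPlayer play(String rtspUrl, SurfaceHolder holder) throws Exception {
            MediaPlayer mp = new MediaPlayer();
            mp.setDisplay(holder);      // SurfaceView to render into
            mp.setDataSource(rtspUrl);  // e.g. "rtsp://host/stream" (placeholder)
            mp.setOnErrorListener(new MediaPlayer.OnErrorListener() {
                @Override
                public boolean onError(MediaPlayer p, int what, int extra) {
                    // Log what/extra: they tell you why the stream was rejected.
                    return false;
                }
            });
            mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer p) {
                    p.start();
                }
            });
            mp.prepareAsync(); // network source: never call blocking prepare() on the UI thread
            return mp;
        }
    }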
As fadden has already pointed out, your work is made far simpler by the introduction of MediaCodec in Android 4.x.
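On Android 4.1+ the rough shape of a MediaCodec-based decoder looks like this (a sketch only: real code must also feed SPS/PPS, handle the INFO_* return values, and derive timestamps from your stream; buffer handling uses the pre-API-21 getInputBuffers() style):

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.nio.ByteBuffer;

    public class AvcDecoder {
        private final MediaCodec codec;
        private final ByteBuffer[] inputBuffers;

        public AvcDecoder(Surface surface, int width, int height) throws Exception {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
            codec = MediaCodec.createDecoderByType("video/avc");
            codec.configure(format, surface, null, 0); // decode straight onto the surface
            codec.start();
            inputBuffers = codec.getInputBuffers();
        }

        /** Queue one H.264 access unit (with start codes) at the given timestamp. */
        public void feed(byte[] accessUnit, long presentationUs) {
            int inIndex = codec.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                ByteBuffer buf = inputBuffers[inIndex];
                buf.clear();
                buf.put(accessUnit);
                codec.queueInputBuffer(inIndex, 0, accessUnit.length, presentationUs, 0);
            }
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int outIndex = codec.dequeueOutputBuffer(info, 0);
            if (outIndex >= 0) {
                codec.releaseOutputBuffer(outIndex, true); // true = render to the surface
            }
        }
    }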
You could use third-party libraries like android-h264-decoder, which uses JNI to increase performance. Also look at this library from Intel.
Update: MediaCodec wasn't exposed until API 16 (Android 4.1), so that won't work for a 2.3.3 device.
Stagefright and OpenMAX IL were (and still are) internal components of Android. You can find the code (including MediaSource.h) at https://android.googlesource.com/platform/frameworks/av/+/master. Note that the media framework moved to the separate tree frameworks/av only recently; before that it was part of frameworks/base, e.g. https://android.googlesource.com/platform/frameworks/base/+/gingerbread/media/
In Android ICS and later, a new OpenMAX IL API version is in use, making the old binary blobs useless/unused. This leads older devices that otherwise run ICS just fine and dandy to have broken video playback (YouTube HQ and IMDb, for example), because Android's fallback software decoder is weak compared to what ffmpeg can do on the same device (I tested MX Player with the arm6vfp ffmpeg codec, and a 720p movie played back great).
I am trying to dig through the Android source code to see where exactly I could add or replace code to put the ffmpeg library's capabilities to use. The problem is that I don't know exactly what code is used to decode video in, for example, the YouTube app, or how that is decided.
So I have two options as far as I can tell:
Figure out the current software decoder being used, and try to wrap ffmpeg behind its external interface, effectively replacing the slow software decoder currently in use. The end result would be a single .so I could push to the device.
Figure out how to trick Android into treating an OMX library based on ffmpeg (I have built one successfully for Android: limoa) as a usable codec, and add it somewhere to the list of considered libraries (or better: replace the unusable hardware codec).
As an extension, I'd also like to make camcorder video encoding work through this, so a truly integrated solution would be very much wanted. The question is: how, where, and what? Searching the Android source tree gives numerous hits for "H264" and related terms in many different places. I need the lowest and simplest level possible, so I can simply wrap the hypothetical decode(buffer) function call to use ffmpeg (libavcodec).
It seems to me that this presentation ("Integrating a Hardware Video Codec into Android Stagefright using OpenMAX IL") is exactly what you'd like to do. Good luck with your project!
I have a requirement where I need to transcode small video clips, shot with the native camera app, to a lower bitrate/resolution MP4 that is shareable via email etc.
What is the best way to transcode/convert the video on the device itself? FFmpeg, or some other library?
P.S. I know this is overkill for the device, but the client leaves me no option. He doesn't care about battery or the time it takes. I'm targeting quad-cores, where CPU is not a problem.
Your best bet would be to use something like ffmpeg, which has been ported to Android (see this SO post: ffmpeg for android (using tutorial: "ffmpeg and Android.mk"), and the ffmpeg port for Android here: http://bambuser.com/opensource). You'll have to use JNI etc., but that will save you the hassle of dealing with the byte stream yourself.
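The Java side of such a JNI binding stays small; the library and method names below are made up for illustration, and the actual C glue is what the tutorial and port above provide:

    public class FfmpegTranscoder {
        static {
            // Name of your NDK-built shared library; "ffmpegbridge" is made up here.
            System.loadLibrary("ffmpegbridge");
        }

        // Implemented in C: open the input, re-encode to the target size/bitrate,
        // and write an MP4. The signature is illustrative, not from the port itself.
        public static native int transcode(String inputPath, String outputPath,
                                           int width, int height, int bitrate);
    }

You would call something like FfmpegTranscoder.transcode(in, out, 640, 480, 1000000) from a background thread or service, since transcoding will block for a long time.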
Haven't tried it on Android myself, so YMMV:
Is there a Java API for mp4 files?
http://code.google.com/p/mp4parser/
If you're recording on-device, why not set the expected format from your code? The API lets you set the video size, frame rate, etc. in the MediaRecorder class.
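For example (the values are illustrative; query CamcorderProfile for what the device actually supports, and note that MediaRecorder is picky about call order):

    import android.media.MediaRecorder;

    MediaRecorder recorder = new MediaRecorder();
    recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    recorder.setVideoSize(640, 480);             // shareable resolution
    recorder.setVideoFrameRate(24);
    recorder.setVideoEncodingBitRate(1000000);   // ~1 Mbps
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
    recorder.setOutputFile("/sdcard/clip.mp4");  // placeholder path
    recorder.prepare();                          // a preview surface is also needed on real devices
    recorder.start();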
I have developed an HLS player for Android 2.3. It works. However, I am finding that certain Android devices lack support for .ts files, and on those phones my player does not work. So my question is this: is there a way to include support for these files within my app (perhaps a codec or a library of some sort)? After exhaustive searching, I'm really not sure where to go.
Thanks.
Try porting FFmpeg to Android with the NDK; I think that's the best solution for your playback issue. I'm going down that path now and can report my progress.
You can take a look at Vitamio. I think your player will work if Vitamio is installed on the devices that don't already support .ts files.
I am developing a media player for learning purposes, and I want it to have a crossfading feature, but I don't have a clue where to start. I tried searching on the internet with no luck. I am using the Android MediaPlayer class for all media player operations. Does anyone know a workaround to achieve this?
Thanks for your support.
Try using AudioTrack instead of MediaPlayer. Generally, I'd suggest the following plan:
Study the sources of an app that uses AudioTrack. A good player can be found here; it is an AAC audio player that uses JNI for AAC audio decoding.
Find an MP3 decoding library. It should be either a Java library (look at this, for example; there may be others, though I haven't used such Java libraries) or a C/C++ library (in which case you would again use it through JNI).
Once you have a simple working MP3 player, add manual crossfading (this should be easy if you know the basics of digital audio); see the mixing sketch after this list.
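Here is a minimal mixing sketch, assuming 16-bit PCM and that MP3 decoding happens upstream via whichever library you pick: a linear gain ramps the incoming track up while the outgoing track ramps down, and the sum is clipped to the 16-bit range.

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    public class Crossfader {
        public static AudioTrack newStereoTrack(int sampleRate) {
            int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                    AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                    minBuf * 4, AudioTrack.MODE_STREAM);
            track.play(); // streaming mode: start the track, then keep write()-ing
            return track;
        }

        /** Mix fadeSamples of the outgoing and incoming PCM and write them out. */
        public static void crossfade(AudioTrack track, short[] outgoing, short[] incoming,
                                     int fadeSamples) {
            short[] mixed = new short[fadeSamples];
            for (int i = 0; i < fadeSamples; i++) {
                float gainIn = (float) i / fadeSamples; // ramps 0 -> 1
                float gainOut = 1f - gainIn;            // ramps 1 -> 0
                int s = (int) (outgoing[i] * gainOut + incoming[i] * gainIn);
                // clip to the 16-bit range to avoid wrap-around distortion
                mixed[i] = (short) Math.max(Short.MIN_VALUE, Math.min(Short.MAX_VALUE, s));
            }
            track.write(mixed, 0, fadeSamples);
        }
    }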
Try using two MediaPlayer objects, one after another, with a crossfade, as in this class: https://github.com/psaravan/JamsMusicPlayer/blob/f165057dd664727ed06b9fac2c27557e5fb7e7ee/jamsMusicPlayer/src/main/java/com/jams/music/player/Services/AudioPlaybackService.java
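A sketch of that approach, ramping the two players' volumes against each other with a Handler (the 50 ms step and the duration are arbitrary choices):

    import android.media.MediaPlayer;
    import android.os.Handler;

    public class MediaPlayerCrossfade {
        public static void crossfade(final MediaPlayer current, final MediaPlayer next,
                                     final int durationMs) {
            final Handler handler = new Handler();
            next.setVolume(0f, 0f);
            next.start();
            handler.post(new Runnable() {
                int elapsed = 0;
                @Override public void run() {
                    float t = Math.min(1f, (float) elapsed / durationMs);
                    current.setVolume(1f - t, 1f - t); // fade the old track out
                    next.setVolume(t, t);              // fade the new track in
                    if (t < 1f) {
                        elapsed += 50;
                        handler.postDelayed(this, 50);
                    } else {
                        current.stop();
                    }
                }
            });
        }
    }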
When the second MediaPlayer starts, the first pauses for a while (on some devices, mostly Samsung), so the transition is not smooth. MediaPlayer has this issue, and it has been reported to Google since long ago but is still not resolved (https://issuetracker.google.com/issues/36931073), so we can do nothing about it. So I used ExoPlayer for playing audio, and it works very smoothly without any pause.
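A minimal ExoPlayer setup for comparison (this matches the ExoPlayer 2.12+ API; class and method names differ in older releases):

    import com.google.android.exoplayer2.MediaItem;
    import com.google.android.exoplayer2.SimpleExoPlayer;

    SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
    player.setMediaItem(MediaItem.fromUri(trackUri));
    player.prepare();
    player.play();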