There are some posts about this, but so far I haven't seen a good answer. Is there a way I can stream audio from mms:// URIs on Android? MediaPlayer doesn't seem to like these streams, and replacing mms:// with http:// or rtsp:// doesn't work either. Has anyone found a workaround? Thanks!
Download the Android NDK.
Then download the modified libmms and libffmpeg at http://radiotime.com/apps/android.aspx
Basically you have to have two threads (see the sketch below):
1st thread: download the audio data from the MMS station, then write it to a WMA file.
2nd thread: decode that WMA to PCM data using libffmpeg.
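A rough sketch of that pipeline is below. Every native method in it is a hypothetical placeholder for your own JNI glue around libmms/libffmpeg (the real entry points are in the radiotime sources), so treat this as the shape of the solution, not a working implementation:

    import java.io.FileOutputStream;
    import java.io.IOException;
    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    public class MmsPlayer {
        // Placeholder JNI declarations -- these names are assumptions, not a real API.
        private static native long mmsOpen(String url);                // libmms
        private static native int mmsRead(long handle, byte[] buf);    // libmms
        private static native long wmaOpen(String path);               // libffmpeg
        private static native int wmaDecode(long handle, short[] pcm); // libffmpeg

        public void play(final String url, final String wmaPath) {
            // Thread 1: download audio data from the MMS station into a WMA file.
            new Thread(new Runnable() {
                public void run() {
                    try {
                        long mms = mmsOpen(url);
                        FileOutputStream out = new FileOutputStream(wmaPath);
                        byte[] buf = new byte[4096];
                        int n;
                        while ((n = mmsRead(mms, buf)) > 0) {
                            out.write(buf, 0, n);
                        }
                        out.close();
                    } catch (IOException e) {
                        // handle network/file errors
                    }
                }
            }).start();

            // Thread 2: decode the growing WMA file to PCM and play it.
            new Thread(new Runnable() {
                public void run() {
                    long dec = wmaOpen(wmaPath);
                    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,
                            44100, AudioFormat.CHANNEL_OUT_STEREO,
                            AudioFormat.ENCODING_PCM_16BIT, 65536,
                            AudioTrack.MODE_STREAM);
                    track.play();
                    short[] pcm = new short[4096];
                    int n;
                    while ((n = wmaDecode(dec, pcm)) > 0) {
                        track.write(pcm, 0, n);
                    }
                }
            }).start();
        }
    }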
To save some time you can use the following library for playing MMS streams: FFmpegMediaPlayer.
It has the same interface as the Android MediaPlayer, so it's easy to work with, and it comes with optional prebuilt binaries too.
I use it in my own MMS streaming application and I haven't had any problems.
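If the drop-in claim holds, usage looks just like the stock MediaPlayer. The package and class names below are from the FFmpegMediaPlayer project as I remember them, so verify them against its README:

    import java.io.IOException;
    import wseemann.media.FFmpegMediaPlayer;

    FFmpegMediaPlayer mp = new FFmpegMediaPlayer();
    mp.setOnPreparedListener(new FFmpegMediaPlayer.OnPreparedListener() {
        public void onPrepared(FFmpegMediaPlayer mp) {
            mp.start(); // playback begins once the stream is buffered
        }
    });
    try {
        mp.setDataSource("mms://example.com/station"); // your stream URL
        mp.prepareAsync(); // prepare off the UI thread, as with MediaPlayer
    } catch (IOException e) {
        // handle bad URL / connection errors
    }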
I am building an application in which I need to trim videos. It is possible to do this using FFmpeg, but I can't use it because it is under the GPL license.
I tried using MediaCodec but couldn't get the code samples I found to work.
How can I trim videos on Android?
I had to develop trim functionality in my app a few months back and found that FFmpeg is very heavy and wasn't as accurate as MediaCodec.
None of the examples helped me, but as I was developing in Kotlin I had to rewrite them anyway.
Here is the breakdown of how to use MediaCodec (a rough sketch follows the list):
Pass the file to your MediaCodec class
Extract the video from the file
Create your buffer size
Seek to where you want the file to be trimmed from or to
Mux your audio and video together
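Here's a minimal sketch of the extract/seek/mux steps using MediaExtractor and MediaMuxer (API 18+), which copies samples without re-encoding; the paths and buffer size are assumptions:

    import java.io.IOException;
    import java.nio.ByteBuffer;
    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaMuxer;

    public static void trim(String srcPath, String dstPath, long startUs, long endUs)
            throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(srcPath);
        MediaMuxer muxer = new MediaMuxer(dstPath,
                MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

        // Map every source track (video and audio) to a muxer track.
        int trackCount = extractor.getTrackCount();
        int[] trackMap = new int[trackCount];
        for (int i = 0; i < trackCount; i++) {
            extractor.selectTrack(i);
            trackMap[i] = muxer.addTrack(extractor.getTrackFormat(i));
        }

        // Jump to the closest sync frame at or before the requested start.
        extractor.seekTo(startUs, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);
        muxer.start();

        ByteBuffer buffer = ByteBuffer.allocate(1024 * 1024); // 1 MiB scratch buffer
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) {
            info.offset = 0;
            info.size = extractor.readSampleData(buffer, 0);
            if (info.size < 0 || extractor.getSampleTime() > endUs) {
                break; // end of stream, or past the trim window
            }
            info.presentationTimeUs = extractor.getSampleTime();
            info.flags = extractor.getSampleFlags();
            muxer.writeSampleData(trackMap[extractor.getSampleTrackIndex()],
                    buffer, info);
            extractor.advance();
        }

        muxer.stop();
        muxer.release();
        extractor.release();
    }

Note the accuracy caveat: SEEK_TO_PREVIOUS_SYNC snaps the start to the nearest earlier keyframe, which is why frame-accurate trimming needs a decode/re-encode pass through MediaCodec instead.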
We tried to find a way to do the start and finish times together, but we ended up just duplicating the clip first and passing both copies in, one with a start time and one with an end time.
You'll need to post your code and show where you're having the issue with MediaCodec for people to help you.
I have seen many questions related to this. Nevertheless, I don't think there is an answer for mine.
I would like to use an already written RTSP client on Android together with MediaCodec, in order to capture an RTSP stream in H264 and then decode and display it. I have used VideoView and MediaPlayer, which are well known to support RTSP streaming in their .setDataSource methods (file or rtsp/http path), unlike MediaExtractor, which only supports file or http, but the latency is too high for my purposes.
I would like to use MediaExtractor, but because of that limitation in setDataSource it doesn't seem to be an option. Given this, I am searching for help or examples (a tutorial?) that I could use as an RTSP client on Android, and if someone has used MediaExtractor in some way to capture an RTSP stream, their help is more than welcome as well.
Thank you so much guys!
You can try https://github.com/fyhertz/libstreaming
You should know, though, that it is LGPL. That does not make the rest of your project LGPL, but if you distribute the application you must provide the library's source (including any modifications you made) on request and allow users to relink against their own build of it.
I am trying to stream incoming AMR_NB. I can't use MediaPlayer directly because it requires a seekable file. I would like to use MediaCodec, but to use MediaCodec I need (I think... please correct me!) MediaExtractor to give me things like the presentationTime. Is that true? Can I use MediaCodec without MediaExtractor?
MediaExtractor seems to require seekable files. The documentation only says so explicitly for one of the setDataSource overloads, but when I tried to use any of the others it failed due to failed seek attempts.
So, what can I do to get my incoming AMR stream to play? I am aware of a scheme whereby you save incoming data to a file and periodically make a copy of that file to feed to MediaPlayer, but I'd really prefer to find a real, honest streaming solution.
Is it possible to use MediaCodec without using MediaExtractor? If so how do I find presentation time and the string to pass to MediaCodec.createDecoderByType? The documentation SAYS that "audio/3gpp" is what I want but when I attempt to use that I get the following error:
codec = MediaCodec.createDecoderByType("audio/3gpp");
01-02 03:59:36.980: E/OMXMaster(21605): A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
So I'm not sure how to get at MediaCodec either.
"I can't use MediaPlayer directly because it requires a seekable file" This is not generally true. I would like you to try it on your stream and report exactly what happens.
"Can I use MediaCodec without MediaExtractor?" I doubt it: I believe they are designed to be used together.
I have used these components to play streams. However, the MediaExtractor has limitations that are not documented (as far as I know), so use a little proxy server to feed it things it can digest. I have one thread to run the MediaExtractor and another to take output from the MediaCodec. Then I have to avoid deadlocks and cope with synchronization, but it is not that bad provided you just want to play forwards only. Then you have only the problem of how to stop!
I advise that you try MediaPlayer first. Otherwise, if you are keen enough to try MediaExtractor, we could share our discoveries about what it will and won't digest. Don't take anything for granted. For example, it seems it will play my MP3 files, but cannot discover their duration or seek within them! A rough sketch of the two-thread arrangement is below.
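This is roughly what the two threads look like for an audio stream; the proxy URL is an assumption, and error handling and the stop problem are left out:

    import java.nio.ByteBuffer;
    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;
    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;

    public static void playStream(String url) throws Exception {
        final MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(url); // e.g. "http://127.0.0.1:8080/stream" via your proxy
        extractor.selectTrack(0);
        final MediaFormat format = extractor.getTrackFormat(0);
        final MediaCodec codec =
                MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
        codec.configure(format, null, null, 0);
        codec.start();

        // Thread 1: feed compressed samples from the extractor into the codec.
        new Thread(new Runnable() {
            public void run() {
                ByteBuffer[] inputs = codec.getInputBuffers();
                boolean done = false;
                while (!done) {
                    int i = codec.dequeueInputBuffer(10000);
                    if (i < 0) continue;
                    int size = extractor.readSampleData(inputs[i], 0);
                    if (size < 0) {
                        codec.queueInputBuffer(i, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        done = true;
                    } else {
                        codec.queueInputBuffer(i, 0, size, extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
        }).start();

        // Thread 2: drain decoded PCM from the codec into an AudioTrack.
        new Thread(new Runnable() {
            public void run() {
                ByteBuffer[] outputs = codec.getOutputBuffers();
                MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
                AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,
                        format.getInteger(MediaFormat.KEY_SAMPLE_RATE),
                        AudioFormat.CHANNEL_OUT_STEREO,
                        AudioFormat.ENCODING_PCM_16BIT, 65536,
                        AudioTrack.MODE_STREAM);
                track.play();
                while (true) {
                    int i = codec.dequeueOutputBuffer(info, 10000);
                    if (i == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                        outputs = codec.getOutputBuffers();
                    } else if (i >= 0) {
                        byte[] pcm = new byte[info.size];
                        outputs[i].position(info.offset);
                        outputs[i].limit(info.offset + info.size);
                        outputs[i].get(pcm);
                        outputs[i].clear();
                        track.write(pcm, 0, pcm.length);
                        codec.releaseOutputBuffer(i, false);
                        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
                    }
                }
            }
        }).start();
    }

Stopping cleanly is the hard part, as noted above: you have to unblock both loops at once, which is where the deadlock care comes in.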
I'd like to write an app that merges multiple images into a movie on Android. JMF has a basic implementation (JpegImagesToMovie). But, JMF isn't supported on Dalvik.
Is there an alternative library that I can use for this? Or, if there is no library available, does anyone have pointers on what I would need to research to implement it myself?
Rgds, Kevin.
I'm not aware of any pure-Java video encoders, and the built-in video encoder in Android appears to be limited to capturing video from the camera alone, rather than a custom input source.
You could look at writing a multi-part JPEG writer (the format is quite rare but well supported), or even an MJPEG encoder (the format many digicams use); a sketch of the former follows.
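A multi-part JPEG stream is just JPEG frames separated by MIME boundaries (the multipart/x-mixed-replace format many IP cameras serve), so the writer is small; the boundary token here is an arbitrary choice:

    import java.io.IOException;
    import java.io.OutputStream;
    import java.util.List;

    public static void writeMultipartJpeg(List<byte[]> jpegFrames, OutputStream out)
            throws IOException {
        final String boundary = "frame"; // arbitrary boundary token
        for (byte[] jpeg : jpegFrames) {
            out.write(("--" + boundary + "\r\n").getBytes("US-ASCII"));
            out.write("Content-Type: image/jpeg\r\n".getBytes("US-ASCII"));
            out.write(("Content-Length: " + jpeg.length + "\r\n\r\n").getBytes("US-ASCII"));
            out.write(jpeg); // the raw JPEG bytes for this frame
            out.write("\r\n".getBytes("US-ASCII"));
        }
        out.write(("--" + boundary + "--\r\n").getBytes("US-ASCII"));
    }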
Short version: what is the best way to get data encoded in an MP3 (and ideally in AAC/Ogg/WMA) into a Java array or ByteBuffer that I can then manipulate?
I'm putting together a program that has slowing down and speeding up sound files as one of its features. This works fine for WAV files, which are a header plus the exact binary data that needs to be sent to the speaker, and now I need to implement it for MP3 (ideally this would also support AAC, Ogg, and WMA, but since those are less popular formats it is not required). Android does not expose an interface to decode an MP3 without playing it, so I need to create that interface.
Three options present themselves, though I'm open to others:
1) Write my own decoder. I already have a functional frame detector that I was hoping to use for option (3), and now should only need to implement the Huffman decoding tables.
2) Use JLayer, or an equivalent Java library, to handle the decoding. I'm not entirely clear on what the license ramifications are here.
3) Connect to the libmedia library/MediaPlayerService. This is what SoundPool does, and the amount of use of that service makes me believe that while it's officially unstable, that implementation isn't going anywhere. This means writing JNI code to connect to the service, but I'm finding that that's a deep rabbit hole; at the surface, I'm having trouble with the sp<> template.
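For reference, option (2) is only a few lines with JLayer; this is a minimal sketch (the file path is an assumption):

    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.util.ArrayList;
    import java.util.List;
    import javazoom.jl.decoder.Bitstream;
    import javazoom.jl.decoder.Decoder;
    import javazoom.jl.decoder.Header;
    import javazoom.jl.decoder.SampleBuffer;

    // Decode a whole MP3 into 16-bit PCM, one short[] per frame.
    public static List<short[]> decodeMp3(String path) throws Exception {
        InputStream in = new FileInputStream(path);
        Bitstream bitstream = new Bitstream(in);
        Decoder decoder = new Decoder();
        List<short[]> pcmFrames = new ArrayList<short[]>();
        Header header;
        while ((header = bitstream.readFrame()) != null) {
            // The default Decoder writes each frame into a SampleBuffer of shorts.
            SampleBuffer frame = (SampleBuffer) decoder.decodeFrame(header, bitstream);
            short[] pcm = new short[frame.getBufferLength()];
            System.arraycopy(frame.getBuffer(), 0, pcm, 0, pcm.length);
            pcmFrames.add(pcm);
            bitstream.closeFrame();
        }
        bitstream.close();
        return pcmFrames;
    }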
I did that with libmad and the NDK. JLayer is way too slow, and the media framework is a moving target. You can find info and source code at http://apistudios.com/hosted/marzec/badlogic/wordpress/?p=231
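The Java-side JNI surface of such a decoder boils down to something like the following; these method names are placeholders of mine, not the linked project's actual API:

    // Hypothetical Java declarations for a libmad-backed decoder; the native
    // implementations would live in a .c file compiled with the NDK.
    public class Mp3Decoder {
        static { System.loadLibrary("mp3decoder"); } // your NDK module name
        public static native long open(String path);
        public static native int readSamples(long handle, short[] pcm, int maxSamples);
        public static native void close(long handle);
    }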
I have not tried it, but mp3transform is LGPL.