I'm attempting to stream from a URL using Android's built-in MediaPlayer class. However, I also need to send a special header along with the URL. Is this possible without having to rewrite the whole streaming process?
If it's not possible to send a header, I would need to stream the file manually. However, it appears that the MediaPlayer class locks the file it is reading, which means you can't simply continue writing to the file while it is being read. I've seen the 'double buffer' method, but it results in choppy playback. Any suggestions?
I asked a question recently about alternatives to the double buffer method you mentioned:
is-there-a-better-way-to-save-streamed-files-with-mediaplayer
I guess you could act as a proxy in a thread: handle your header and forward the rest to the media player? Or, if you control the server, pass the extra data in a different request...
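A very rough sketch of the proxy idea (the port, remote URL, and header name are placeholders; real code would need to parse MediaPlayer's request and handle errors properly):

// uses java.net.ServerSocket, Socket, URL, HttpURLConnection and java.io streams
final ServerSocket server = new ServerSocket(8888);
new Thread(new Runnable() {
    public void run() {
        try {
            Socket client = server.accept();
            // (read and discard MediaPlayer's request line and headers here)
            HttpURLConnection conn = (HttpURLConnection)
                    new URL("http://example.com/stream.mp3").openConnection();
            conn.setRequestProperty("X-Special-Header", "secret"); // your header
            OutputStream out = client.getOutputStream();
            out.write("HTTP/1.0 200 OK\r\nContent-Type: audio/mpeg\r\n\r\n".getBytes());
            InputStream in = conn.getInputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n); // forward the body straight to MediaPlayer
            }
            client.close();
        } catch (IOException e) {
            // log and give up
        }
    }
}).start();

mediaPlayer.setDataSource("http://127.0.0.1:8888/");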
My app shows HLS streams; for playback I use an ffmpeg-based player. The problem is that the player doesn't change the stream URL according to the current bandwidth. I implemented logic to calculate the bandwidth, but I can't find where to implement the URL switching. I figured out that the read_data method is responsible for buffering data, so I would have to change the URL before read_data is called, but I can't find where it is called from.
So my question is:
Where does the player connect to the server and start buffering data?
Maybe someone has faced the same problem, or knows ffmpeg well enough to say where it is best to place the URL-changing logic. Please let me know; I'm open to all proposals.
read_data is passed as a callback to ffio_init_context, which is called multiple times in a for loop in hls_read_header (hls.c#L1619).
Use the top-level m3u8 playlist URL: http://yourdomain/playlist.m3u8
Your top-level playlist will look like this; it includes 3 stream URLs:
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=200000
/hls/stream1.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=500000
/hls/stream2.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=800000
/hls/stream3.m3u8
These are 3 variants of the stream for different bandwidths; just give the top-level URL to the player and it will switch between them according to the available bandwidth. Look here for more information.
I have a video file that is encoded; for example, the first bit of each byte is reversed. I want to read this video file, flip the first bits back, and send the decoded result to MediaPlayer.
How can I do that? How can I create such a stream and pass it to MediaPlayer without saving the decoded data to storage?
It is important that I do not save a decoded copy of my video and play that; I want to play the encoded video directly in MediaPlayer using streams or some other approach.
Short answer: NO, there is no way to do that (at least from my point of view).
You cannot play from a "custom" stream by manipulating the data just before passing it to the MediaPlayer.
Why?
The official MediaPlayer API that comes closest to what you need is the following:
MediaPlayer mp = new MediaPlayer();
FileInputStream fis = new FileInputStream(yourFile);
mp.setDataSource(fis.getFD()); // takes the underlying FileDescriptor, not the stream itself
//...
This snippet lets you play a file starting from a FileInputStream, or more precisely from its underlying FileDescriptor. FileDescriptor is a class marked as final (reasonably, since it has to deal with the underlying OS), so you cannot override anything.
Possible workarounds?
1. As you already pointed out, you can try to modify the real file "in-place" while playing it with the standard MediaPlayer (without creating a separate copy of it): it's very tricky but plausible.
2. Try another player object: ExoPlayer (a newer standard Android API) or Vitamio.
3. Try a pure native solution (NDK + the Android source), which I would not recommend ;)
UPDATE: details about the first workaround
Assuming that "the first bit of each byte is reversed", you can use a FileChannel to manipulate the whole file "in-place" while reading it. Use a FileChannel created from a RandomAccessFile opened in mode "rw" so that you can read and write simultaneously.
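A minimal sketch of that transformation (assuming "reversing the first bit" means flipping the most significant bit of each byte, i.e. XOR with 0x80; the method name is just illustrative):

import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;

void decodeInPlace(File videoFile) throws IOException {
    RandomAccessFile raf = new RandomAccessFile(videoFile, "rw"); // "rw": read AND write
    FileChannel channel = raf.getChannel();
    ByteBuffer buffer = ByteBuffer.allocate(8192);
    long position = 0;
    while (channel.read(buffer, position) > 0) {
        buffer.flip();
        for (int i = 0; i < buffer.limit(); i++) {
            buffer.put(i, (byte) (buffer.get(i) ^ 0x80)); // flip the first bit of each byte
        }
        channel.write(buffer, position); // write the decoded bytes back in place
        position += buffer.limit();
        buffer.clear();
    }
    raf.close();
}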
This pre-processing task can run on a separate thread (or inside an IntentService, which is cleaner and more reliable); you can wait a few seconds after the processing begins and then start playback by passing the File reference to the standard MediaPlayer (you need to tune this waiting period based on how fast the processing runs, much like buffering a stream, but easier because the throughput is fairly stable).
This way you don't need to wait for the whole pre-processing pass to finish before starting playback.
When playback stops or you close the app, you need to undo your work by running the same task on the played file, restoring it to its original state.
I hope this hint is useful.
Comments and clarifications about my answer are welcome; I will update my post if I find more information.
I am trying to stream incoming AMR_NB. I can't use MediaPlayer directly because it requires a seekable file. I would like to use MediaCodec, but to use MediaCodec I need (I think... please correct me!) MediaExtractor to give me things like the presentationTime. Is that true? Can I use MediaCodec without MediaExtractor?
MediaExtractor also seems to require seekable files. The documentation only says so explicitly for one of the setDataSource overloads, but when I tried to use any of the others they failed due to failed seek attempts.
So, what can I do to play my incoming AMR stream? I am aware of a scheme whereby you save incoming data to a file and periodically make a copy of that file to feed to MediaPlayer, but I'd really prefer to find an honest streaming solution.
Is it possible to use MediaCodec without MediaExtractor? If so, how do I find the presentation time and the string to pass to MediaCodec.createDecoderByType? The documentation SAYS that "audio/3gpp" is what I want, but when I attempt to use it I get the following error:
codec = MediaCodec.createDecoderByType("audio/3gpp");
01-02 03:59:36.980: E/OMXMaster(21605): A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
So I'm not sure how to get at MediaCodec either.
"I can't use MediaPlayer directly because it requires a seekable file" This is not generally true. I would like you to try it on your stream and report exactly what happens.
"Can I use MediaCodec without MediaExtractor?" I doubt it: I believe they are designed to be used together.
I have used these components to play streams. However, the MediaExtractor has limitations that are not documented (as far as I know), so use a little proxy server to feed it things it can digest. I have one thread to run the MediaExtractor and another to take output from the MediaCodec. Then I have to avoid deadlocks and cope with synchronization, but it is not that bad provided you only want to play forwards. Then your only problem is how to stop!
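For reference, my extractor/decoder loop looks roughly like this (shown single-threaded for brevity; in my setup the input and output halves run on separate threads, the URL is a placeholder for the little proxy, and error handling is omitted):

// uses android.media.MediaCodec, MediaExtractor, MediaFormat and java.nio.ByteBuffer
MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource("http://127.0.0.1:8080/"); // the local proxy mentioned above
MediaFormat format = extractor.getTrackFormat(0);
extractor.selectTrack(0);

MediaCodec codec = MediaCodec.createDecoderByType(
        format.getString(MediaFormat.KEY_MIME)); // e.g. "audio/3gpp" for AMR-NB
codec.configure(format, null, null, 0);
codec.start();

ByteBuffer[] inputBuffers = codec.getInputBuffers();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean inputDone = false;
while (true) {
    if (!inputDone) {
        int inIndex = codec.dequeueInputBuffer(10000);
        if (inIndex >= 0) {
            int size = extractor.readSampleData(inputBuffers[inIndex], 0);
            if (size < 0) {
                codec.queueInputBuffer(inIndex, 0, 0, 0,
                        MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                inputDone = true;
            } else {
                // the extractor supplies the presentation time for you
                codec.queueInputBuffer(inIndex, 0, size,
                        extractor.getSampleTime(), 0);
                extractor.advance();
            }
        }
    }
    int outIndex = codec.dequeueOutputBuffer(info, 10000);
    if (outIndex >= 0) {
        // copy info.size bytes of decoded PCM from the output buffer to an AudioTrack
        codec.releaseOutputBuffer(outIndex, false);
        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
    }
}
codec.stop();
codec.release();
extractor.release();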
I advise that you try MediaPlayer first. Otherwise, if you are keen enough to try MediaExtractor, we could share our discoveries about what it will and won't digest. Don't take anything for granted: for example, it seems it will play my MP3 files, but cannot discover their duration or seek on them!
Is it possible to get the track name while playing a radio stream via MediaPlayer?
I would say, pretty much with certainty: no, it isn't possible.
I can't see any MediaPlayer methods which suggest it's possible. Besides, the way metadata such as the track name is presented in streaming media depends on the source, e.g. Shoutcast or otherwise.
If it can be done I'd be interested to know, but I suspect you'd need to write something like a Shoutcast client (or another client, depending on the source). You'd still use MediaPlayer for the streaming but would need extra code to access the metadata.
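For a Shoutcast-style source, that extra code amounts to the ICY metadata protocol: make your own second connection (separate from MediaPlayer's) with the Icy-MetaData request header, then read the inline metadata blocks. A rough sketch, with a placeholder URL:

// uses java.io.DataInputStream, java.net.URL, java.net.URLConnection
URLConnection conn = new URL("http://example.com:8000/stream").openConnection();
conn.setRequestProperty("Icy-MetaData", "1"); // ask the server for inline metadata
conn.connect();

// the icy-metaint header says how many audio bytes sit between metadata blocks
int metaInt = Integer.parseInt(conn.getHeaderField("icy-metaint"));
DataInputStream in = new DataInputStream(conn.getInputStream());

byte[] audio = new byte[metaInt];
in.readFully(audio);          // one interval of audio data (discard it here)

int metaLen = in.read() * 16; // a single length byte, in units of 16 bytes
if (metaLen > 0) {
    byte[] meta = new byte[metaLen];
    in.readFully(meta);
    // the block looks like: StreamTitle='Artist - Track';StreamUrl='...';
    String title = new String(meta).trim();
}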
I'm looking for some way in Android to play in-memory audio in a manner analogous to the waveOutOpen family of methods in Windows programming.
The waveOut... methods essentially let an application create arrays of sample values (like in-memory WAV files without the headers) and dump them into a queue for sequential playback. Windows transitions seamlessly from one array to the next, so as long as the application keeps dumping arrays into the queue ahead of playback, the program can create and play continuous audio of any arbitrary length. The Windows API also incorporates a callback mechanism that the application can use to indicate progress and load additional buffers.
As far as I can tell, the Android audio API lets an application play a file from local storage or a URL, or from a memory stream. Is there any way to get Android to "queue up" MediaPlayer.start() calls so that one player transitions (without glitches) into the next upon play completion? It appears that Jet does something like this, but only with its own internal synthesis engine.
Is there any other way of accessing Android audio in a waveOutOpen way?
android.media.AudioTrack
... is the class you are probably looking for.
http://developer.android.com/reference/android/media/AudioTrack.html#AudioTrack%28int,%20int,%20int,%20int,%20int,%20int%29
After creating it, you simply feed it binary data in the format you specified, using the following method:
AudioTrack.write(...)
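For example, a streaming-mode track fed from a loop (fillNextBuffer is a placeholder for whatever produces your sample arrays):

// uses android.media.AudioFormat, AudioManager, AudioTrack
int sampleRate = 44100;
int minBuf = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        minBuf, AudioTrack.MODE_STREAM);
track.play();

short[] samples = new short[4096];
while (fillNextBuffer(samples)) { // placeholder: fill the array with PCM samples
    // blocks until there is room in the track's buffer, giving you the same
    // "keep the queue fed" behaviour as waveOutWrite
    track.write(samples, 0, samples.length);
}
track.stop();
track.release();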