I've been exploring the documentation and examples at http://bigflake.com/mediacodec/ by Fadden, and applied patch http://bigflake.com/mediacodec/0001-Record-game-into-.mp4.patch to the breakout game. Unfortunately, after compiling the code, I realized it doesn't work, producing video files that aren't streamable.
I see the following error:
"The mp4 file will not be streamable."
According to Fadden, this should be fixed by checking the mBufferInfo.flags (https://stackoverflow.com/questions/23934087/non-streamable-video-file-created-with-mediamuxer), which is already done in his code, so I'm at a complete loss. Did anyone else get the video recording patch to work?
The warning you're seeing is just a warning, nothing more. MP4 files aren't streamable anyway in most cases, in the sense that you could pass the written MP4 over a pipe and have the other end play it back (unless you resort to a lot of extra trickery, or use fragmented MP4, which the Android MP4 muxer doesn't normally write). What streamable means here is that once you have the final MP4 file, you can start playing it back without having to seek to the end of the file first (which playback over HTTP can do, e.g. with HTTP byte range requests).
To write a streamable MP4, the muxer tries to guess how large your file will be, and reserves a correspondingly large area at the start of the file to write the file index into. If the file turns out to be larger, so that the index doesn't fit into the reserved area, the index has to be written at the end of the file instead, and the file is not streamable. See lines 506-519 in https://android.googlesource.com/platform/frameworks/av/+/lollipop-release/media/libstagefright/MPEG4Writer.cpp for more info about this guess. Basically the guess boils down to: "The default MAX_MOOV_BOX_SIZE value is based on about 3 minute video recording with a bit rate about 3 Mbps, because statistics also show that most of the video captured are going to be less than 3 minutes."
If you want to turn such a non-streamable MP4 file into a streamable one, you can use the qt-faststart tool from libav/ffmpeg, which just reorders the blocks in the file.
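As a quick sanity check, "streamable" in this sense is visible in the file layout itself: the moov (index) box comes before the mdat (media data) box. A minimal sketch in Python that walks the top-level boxes of an MP4 (illustrative only; no error handling for truncated files):

```python
import struct

def top_level_boxes(data: bytes):
    """Yield the type of each top-level box in an MP4 (ISO-BMFF) byte string."""
    pos = 0
    while pos + 8 <= len(data):
        size, = struct.unpack(">I", data[pos:pos + 4])
        yield data[pos + 4:pos + 8].decode("ascii", "replace")
        if size == 0:      # size 0: box extends to the end of the file
            break
        if size == 1:      # size 1: a 64-bit "largesize" follows the type field
            size, = struct.unpack(">Q", data[pos + 8:pos + 16])
        pos += size

def is_fast_start(data: bytes) -> bool:
    """True when the moov index precedes the mdat payload, i.e. playback
    can begin before the whole file has been downloaded."""
    order = list(top_level_boxes(data))
    return ("moov" in order and "mdat" in order
            and order.index("moov") < order.index("mdat"))
```

Note that ffmpeg can also produce fast-start files directly with -movflags +faststart, which performs the same reordering as qt-faststart at mux time.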
You can check out Intel INDE Media for Mobile; it supports game capturing and streaming to the network:
https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials
simplest capturing:
https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials-video-capturing-for-opengl-applications
YouTube streaming:
https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials-video-streaming-from-device-to-youtube
Related
I'm currently building a streaming video player app, and for that I want to use DASH streaming. I have a normal URI for a video in my Firebase Storage, but for DASH streaming I think I need a file that ends with .mpd.
ExoPlayer player = new ExoPlayer.Builder(context).build();
player.setMediaItem(MediaItem.fromUri(dashUri));
player.prepare();
What do I have to do to convert a normal URI into one that ends with .mpd? How can I do that?
You actually have to convert the video file to a fragmented format, and typically you will want to make it available in multiple bit rates, which means transcoding it as well.
The reason for this is that DASH is an ABR (adaptive bit rate) protocol: it breaks multiple renditions of a video into chunks of equal duration, and the player can then request the video chunk by chunk, choosing the best bit rate version of each chunk depending on the current network conditions and the device type.
See here for more info: https://stackoverflow.com/a/42365034/334402
Open source tools exist to create DASH files from mp4 - see some examples here (links correct at time of writing):
https://github.com/gpac/gpac/wiki/DASH-Support-in-MP4Box
https://www.ffmpeg.org/ffmpeg-formats.html#dash-2
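For orientation, the .mpd file those tools produce is just an XML manifest describing the available renditions. Here is a toy sketch of its top-level shape in Python (purely illustrative: a real manifest generated by MP4Box or ffmpeg also carries segment addressing, codec strings, timescales, and more):

```python
import xml.etree.ElementTree as ET

def toy_mpd(duration_s: int, renditions):
    """Build a skeletal static MPD with one Representation per
    (id, bandwidth_bps, url) tuple. For illustration only."""
    mpd = ET.Element("MPD", {
        "xmlns": "urn:mpeg:dash:schema:mpd:2011",
        "type": "static",
        "mediaPresentationDuration": f"PT{duration_s}S",
        "profiles": "urn:mpeg:dash:profile:isoff-on-demand:2011",
    })
    period = ET.SubElement(mpd, "Period")
    aset = ET.SubElement(period, "AdaptationSet", {"mimeType": "video/mp4"})
    for rid, bandwidth, url in renditions:
        rep = ET.SubElement(aset, "Representation",
                            {"id": rid, "bandwidth": str(bandwidth)})
        ET.SubElement(rep, "BaseURL").text = url
    return ET.tostring(mpd, encoding="unicode")
```

The player fetches this manifest first, then picks a Representation per chunk based on measured throughput.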
My android app plays videos in Exoplayer 2, and now I'd like to play a video backwards.
I searched around a lot and found only the idea to convert it to a gif and this from WeiChungChang.
Is there any more straightforward solution? Another player or a library that implements this for me is probably too much to ask, but converting it to a reversed gif gave me a lot of memory problems, and I don't know what to do with the WeiChungChang idea. Playing only mp4 in reverse would be enough, though.
Videos are frequently encoded such that the encoding of a given frame depends on one or more frames before it, and sometimes also on one or more frames after it.
In other words, to reconstruct a frame correctly you may need to refer to one or more previous and one or more subsequent frames.
This allows a video encoder to reduce file or transmission size by fully encoding only the reference frames, sometimes called I-frames, and for the frames before and/or after the reference frames storing only the delta to those reference frames.
Playing a video backwards is not a common player function and the player would typically have to decode the video as usual (i.e. forwards) to get the frames and then play them in the reverse order.
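This dependency chain is why reverse playback is expensive. A toy model in Python (not a real codec): with delta-coded frames, reconstructing any frame still means decoding forward from the last I-frame, even when the frames are to be displayed in reverse order:

```python
# Toy GOP: an I-frame stores a full value, each following P-frame stores
# only the delta to the previous frame.
# Frames here decode to: 10, 12, 11, 14.
gop = {"I": 10, "deltas": [2, -1, 3]}

def decode_frame(gop, n):
    """Reconstruct frame n by decoding forward from the I-frame (frame 0)."""
    value = gop["I"]
    for delta in gop["deltas"][:n]:
        value += delta
    return value

def play_backwards(gop):
    """Return frames in reverse display order; note each one still
    requires a full forward decode from the I-frame."""
    count = len(gop["deltas"]) + 1
    return [decode_frame(gop, n) for n in reversed(range(count))]
```

Real decoders amortize this by decoding a whole group of pictures forward and buffering it, which is why memory use balloons when you try to reverse video on-device.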
You could extend ExoPlayer to do this yourself, but it may be easier to manipulate the video on the server side first, if that's possible: there are tools that will reverse a video, and your players will then be able to play it as normal, for example https://www.videoreverser.com or https://www.kapwing.com/tools/reverse-video
If you need to reverse it on the device for your use case, then you could use ffmpeg on the device to achieve this - see an example ffmpeg command to do this here:
https://video.stackexchange.com/a/17739
If you are using ffmpeg it is generally easiest to use via a wrapper on Android such as this one, which will also allow you test the command before you add it to your app:
https://github.com/WritingMinds/ffmpeg-android-java
Note that video manipulation is time and processor hungry so this may be slow and consume more battery than you want on your mobile device if the video is long.
I'm playing mp3 files streamed from the network in my application, and some mp3 files show weird behavior: mediaPlayer.getCurrentPosition() is larger than mediaPlayer.getDuration() at the end, for about 3 seconds.
The mp3 files are CBR encoded.
What might be the reason of this?
Finally solved the problem by converting the mp3 files; this is the command I'm using:
lame --mp3input -t -m s -b 128 --cbr input.mp3 output.mp3
(--mp3input treats the input as MP3, -t skips writing the LAME/Xing info tag, -m s forces stereo mode, and -b 128 --cbr produces constant 128 kbps output.)
There are a few reasons why you can get this behavior.
First, it appears that people have had better results using mp3 files at exactly 44100 Hz, because the MediaPlayer class apparently assumes this value and scales the time accordingly, producing strange position values for files that don't use this sample rate.
You also need to check the channel mode, and try using Joint Stereo or forced L/R Stereo. Joint should be the default, but your files might have been badly encoded previously, so it's worth trying. It's worth noting that forced L/R Stereo may lose quality compared to Joint at the same bitrate.
It would also be useful to check the output of soxi, which is part of the sox package (you can also do this with ffmpeg), to get the sample rate, bit rate, and number of channels.
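If you want to inspect those fields without extra tools, the sample rate and channel mode are encoded in every MPEG frame header. Here is a minimal parser covering only MPEG-1 Layer III headers (a sketch: real files also need any leading ID3 tag skipped and VBR headers handled):

```python
# Lookup tables for MPEG-1 Layer III frame headers.
MPEG1_L3_BITRATES = [0, 32, 40, 48, 56, 64, 80, 96, 112,
                     128, 160, 192, 224, 256, 320]  # kbps
MPEG1_SAMPLE_RATES = [44100, 48000, 32000]  # Hz
CHANNEL_MODES = ["stereo", "joint stereo", "dual channel", "mono"]

def parse_mp3_frame_header(header: bytes):
    """Decode bitrate, sample rate and channel mode from a 4-byte
    MPEG-1 Layer III frame header."""
    if header[0] != 0xFF or (header[1] & 0xE0) != 0xE0:
        raise ValueError("no frame sync at this offset")
    return {
        "bitrate_kbps": MPEG1_L3_BITRATES[header[2] >> 4],
        "sample_rate": MPEG1_SAMPLE_RATES[(header[2] >> 2) & 0x3],
        "mode": CHANNEL_MODES[header[3] >> 6],
    }
```

For a 44100 Hz, 128 kbps joint stereo file the first frame header is typically FF FB 90 40, so a quick hex dump of the first audio frame is enough to spot an off-spec sample rate.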
Also, if you did any processing on the mp3 files with some app, you might want to check their raw content for garbage XML that might have been inserted during export.
If you have the possibility to modify the mp3 files you're streaming (which it sounds like you do, since you can tell the bitrate), these are what I would try first. If it's more like user-uploaded content, maybe you should look at another solution instead, like ExoPlayer, which has a few thousand stars and active development. It wraps the MediaPlayer api still, but worth a try.
You also have to consider that it might be a threading problem, where the player stops playing but the timer keeps going, giving you a position greater than the actual duration of the song. 3 seconds seems a bit too long to be explained by that, but it's a thought.
I've set up Apache 2.0 with several .m3u8 files serving a set of mpeg2ts files over HLS. These ts files were produced with libavformat by transmuxing an MP4 I downloaded from youtube. When I play the resulting HLS on VLC or QT, everything works fine. But on Android (Stagefright 1.2) the video has several problems:
1. The option to go full-screen does not work
2. The video duration says 1:40 when it is actually 2:00
3. The video sometimes fails to start and you have to reload the page
4. The video reliably distorts (tears and pixelates) at transition points when switching between the underlying .ts streams
Some of this is ameliorated if I don't use HTML5's <video> tag. But problem #4 remains.
I can play other m3u8's on Stagefright without any of the above problems, so I am assuming my transmuxing code is wrong, but even forgoing it and using the (recently added) HLS segmenting features of ffmpeg I have the same problem. Recoding with libx264 changes nothing.
I am at wit's end debugging this.
Android's libstagefright (along with mediaserver's NuPlayer) is not as mature a product as VLC, and a lot of troubles which are not present when using VLC are present on Android; it is much more vulnerable to any broken, corrupted, or deviating content.
Such pixelation/macroblock artifacts are usually present when some frames were dropped (by the Android code, or lost) before decoding.
If those corruptions come along with some green fields, it might be a problem with synchronizing a format change to a key frame (which might be the result of a wrong implementation of the source, or of the part which notifies ANativeWindow about the format change).
In a corner case you might not get any green frames, but the crop/resolution would deviate and pixelation might be visible.
What I would do:
1) Check for frame drops
2) Check the frames at the borders of consecutive segments with an analyzer
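Before reaching for a TS analyzer, it can also be worth sanity-checking the playlist itself, e.g. whether the #EXTINF durations add up to the real length (a mismatch there would also explain the wrong duration the question mentions). A minimal media-playlist parser in Python (a sketch; it ignores every tag other than #EXTINF):

```python
def parse_m3u8(text: str):
    """Return (total_duration_seconds, [(duration, uri), ...]) from a
    simple HLS media playlist."""
    segments = []
    pending_duration = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXTINF:"):
            # "#EXTINF:10.0,optional title" -> 10.0
            pending_duration = float(line[len("#EXTINF:"):].split(",")[0])
        elif line and not line.startswith("#") and pending_duration is not None:
            segments.append((pending_duration, line))
            pending_duration = None
    return sum(d for d, _ in segments), segments
```

If the summed duration disagrees with what the source MP4 reports, the segmenter is the first suspect.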
I wonder what the best option is to store a single picture and a short voice memo in one file? It needs to be openable in mobile phone browsers (iOS, Android), and preferably shown as a single full-screen photo with the sound playing in the background.
Effectively I'm looking for the most size-efficient combination of something like MP3 + JPG.
If I do it in a single .mov, I guess I lose a lot of space due to compressing one and the same frame 24 times per second.
A rough list of options which comes to mind:
.mov
Mpeg4
H.264
QuickTime
HTML5 video format (Theora?)
store it in Flash (but this is not supported by iOS)
EDIT1:
Tried storing H.264 in an .mp4 container; the files are small enough (around 1 MB), but somehow it does not work on a friend's Android phone. Probably I need more testing, but it seems Android OS does not like proprietary codecs...
My most intuitive solution for this would be to store a JPEG and an MP3 separately on the server. To download one entity as a single unit, download a bit of JSON or XML data that contains pointers to the picture and the audio file.
If you are set on having one file do the job, you might try embedding the JPEG inside the ID3 metadata of an MP3 file (this type of metadata functionality exists to, e.g., store album art with a music file). You would want to make sure that the ID3 tag is near the start of the file. JavaScript within the mobile browser could fetch the file, a third party library could do the ID3 parsing (some Googling reveals that such libraries exist; don't know if they all support fetching a JPEG image). Then the JS would need to be able to feed the file into an audio tag for playback, which I'm not sure is possible.
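If you go the ID3 route, the tag layout is simple enough to build by hand for experimentation. A sketch in Python that prepends an ID3v2.3 tag containing a single APIC (attached picture) frame (minimal: no unsynchronisation, no padding, ISO-8859-1 text only):

```python
import struct

def syncsafe(n: int) -> bytes:
    """ID3v2 header sizes are 'syncsafe': 7 bits per byte, high bit zero."""
    return bytes([(n >> shift) & 0x7F for shift in (21, 14, 7, 0)])

def id3v2_with_picture(jpeg: bytes, mp3_audio: bytes) -> bytes:
    """Prepend an ID3v2.3 tag holding one APIC frame to MP3 audio data."""
    apic_body = (b"\x00"              # text encoding: ISO-8859-1
                 + b"image/jpeg\x00"  # MIME type, null-terminated
                 + b"\x03"            # picture type 3 = front cover
                 + b"\x00"            # empty description, null-terminated
                 + jpeg)
    # v2.3 frame: id, 4-byte big-endian size, 2 flag bytes, body
    frame = b"APIC" + struct.pack(">I", len(apic_body)) + b"\x00\x00" + apic_body
    # tag header: "ID3", version 2.3.0, no flags, syncsafe tag size
    header = b"ID3" + b"\x03\x00" + b"\x00" + syncsafe(len(frame))
    return header + frame + mp3_audio
```

Keeping the tag at the very start of the file means a client can extract the image after fetching only the first few kilobytes plus the JPEG's own size.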
Another thing to experiment with is an .MP4 which encodes the audio track along with a single video frame with a really, really long duration. You would have to experiment to determine whether the mobile browsers handle that gracefully while also allowing smooth audio seeking. If they don't, then perhaps repeat the frame every 1-5 seconds to keep the bitrate minimal.