Move "moov" atom to front of video natively in Android - android

I've noticed that the moov atom is placed at the end of many .mp4 videos. I'd like to relocate this atom to the front of the video so that I can enable progressive downloading. Is there a way to do this programmatically in Android without using any external libraries? I also don't want to bundle extra binaries in the app. Thanks!

I was working on the same problem. I found this library: https://github.com/ypresto/qtfaststart-java
It is very easy to use. Here is an example:
try {
QtFastStart.fastStart(fileIn, fileOut);
} catch (IOException e) {
// Handle
} catch (QtFastStart.MalformedFileException e) {
// Handle
} catch (QtFastStart.UnsupportedFileException e) {
// Handle
}
Add the Gradle dependency:
compile 'net.ypresto.qtfaststartjava:qtfaststart:0.1.0'
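For a bit more context, here is a slightly fuller sketch. The file paths are placeholders, and the import path and the meaning of the boolean return value are from memory, so double-check them against the library:
import java.io.File;
import java.io.IOException;
import net.ypresto.qtfaststart.QtFastStart;

File fileIn = new File(getFilesDir(), "recorded.mp4");   // placeholder input
File fileOut = new File(getFilesDir(), "faststart.mp4"); // placeholder output
try {
    // Should return false when the moov atom is already at the front,
    // in which case no output file is written.
    boolean moved = QtFastStart.fastStart(fileIn, fileOut);
    if (moved) {
        // fileOut is now ready for progressive download
    }
} catch (IOException e) {
    // I/O failure while reading or writing
} catch (QtFastStart.MalformedFileException e) {
    // input is not a valid MP4/MOV container
} catch (QtFastStart.UnsupportedFileException e) {
    // container layout the library cannot handle
}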

Related

video compression using silicompressor in android not working

I am trying to compress videos in my project, so I am using SiliCompressor, but when I pass it the destination path my application hangs and does nothing. It does create a folder in my storage and stores a video file, but when I try to play that file I get the error "Failed to play video", and the file is only 24 bytes in size. Please take a look and tell me what I have done wrong.
Here is my code.
File destinationPath = new File("/storage/emulated/0/DCIM/Camera/myvideo");
destinationPath.mkdir();
File file = new File(destinationPath.getAbsolutePath());
Toast.makeText(Post.this, "folder: " + file, Toast.LENGTH_SHORT).show();
try {
filePath = SiliCompressor.with(Post.this).compressVideo(videouri, file.toString());
video.setVideoURI(Uri.parse(filePath));
Toast.makeText(Post.this, "Completed", Toast.LENGTH_SHORT).show();
} catch (URISyntaxException e) {
Log.d("EXCEPTION", e.toString());
Toast.makeText(Post.this, e.getMessage(), Toast.LENGTH_SHORT).show();
e.printStackTrace();
}
Try running the compression code in an AsyncTask, as in the sketch below.
Here you can find demo app code for video compression.
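For illustration, here is a minimal AsyncTask sketch that reuses the compressVideo() call from the question (Post, video and videouri are the names from the question's code; the rest is an assumption, not the library's required usage):
private class CompressTask extends AsyncTask<String, Void, String> {
    @Override
    protected String doInBackground(String... params) {
        try {
            // params[0] = source video URI, params[1] = destination directory
            return SiliCompressor.with(getApplicationContext())
                    .compressVideo(params[0], params[1]);
        } catch (URISyntaxException e) {
            Log.e("Compress", "Video compression failed", e);
            return null;
        }
    }

    @Override
    protected void onPostExecute(String compressedPath) {
        if (compressedPath != null) {
            video.setVideoURI(Uri.parse(compressedPath));
            Toast.makeText(Post.this, "Completed", Toast.LENGTH_SHORT).show();
        }
    }
}
// Usage, e.g. from a click handler:
// new CompressTask().execute(videouri, destinationPath.getAbsolutePath());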
I tried SiliCompressor for video compression and it works very well; in this version both audio and video work, but the resolution is not maintained.
https://github.com/Tourenathan-G5organisation/SiliCompressor/tree/v2.2.2
Note: in the latest version (2.2.3) of SiliCompressor, audio no longer works after video compression.
OR
I made this one and it works well:
https://github.com/iamkdblue/CompressVideo
I hope it helps you.

Reading a video frame by frame with Android OpenCv

I have a set of videos stored in a folder on the android file system.
I would like to read each of them frame by frame so that I can perform some OpenCV operations on the frames and then display them in a Bitmap.
I'm not sure how to do this correctly; any help would be appreciated.
You can take a look at Javacv.
"JavaCV first provides wrappers to commonly used libraries by researchers in the field of computer vision: OpenCV, FFmpeg, libdc1394, PGR FlyCapture, OpenKinect, videoInput, and ARToolKitPlus"
To read a video frame by frame you'd have to do something like the code below:
FrameGrabber videoGrabber = new FFmpegFrameGrabber(videoFilePath);
try
{
videoGrabber.setFormat("video format goes here");//mp4 for example
videoGrabber.start();
} catch (com.googlecode.javacv.FrameGrabber.Exception e)
{
Log.e("javacv", "Failed to start grabber" + e);
return -1;
}
Frame vFrame = null;
do
{
try
{
vFrame = videoGrabber.grabFrame();
if (vFrame != null) {
    // do your magic with the frame here
}
} catch (com.googlecode.javacv.FrameGrabber.Exception e)
{
Log.e("javacv", "video grabFrame failed: "+ e);
}
}while(vFrame != null);
try
{
videoGrabber.stop();
}catch (com.googlecode.javacv.FrameGrabber.Exception e)
{
Log.e("javacv", "failed to stop video grabber", e);
return -1;
}
Hope that helps. Good luck!
I know it's late, but anyone who needs this can still use it: use #Pawan Kumar's code above, and add the read permission to your manifest file: <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
Then you will get it working.
I donĀ“t know about Android but generally you would have to use VideoCapture::open to open your video and then use VideoCapture::grab to get the next frame. See the Documentation of OpenCV for more information on this.
Update:
It seems like camera access is not officially supported for Android at the moment, see this issue on the OpenCV Github: https://github.com/opencv/opencv/issues/11952
You can either try the unofficial branch linked in the issue: https://github.com/komakai/opencv/tree/android-ndk-camera
or use another library to read in the frames and then create an OpenCV image from the data buffer like in this question.

Play m3u8 audio file using android media player

I am developing an Android application for Android OS 4.0 and above. I have a sample m3u8 file as follows:
#EXTM3U
#EXT-X-TARGETDURATION:56
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:28, no desc
ulr/audio/file.mp3
#EXTINF:28, no desc
ulr/audio/file.mp3
#EXT-X-ENDLIST
and I am trying to play that file using the following code:
mMediaPlayer = new MediaPlayer();
mMediaPlayer.setOnErrorListener(this);
mMediaPlayer.setOnPreparedListener(this);
mMediaPlayer.setOnCompletionListener(this);
mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
try
{
mMediaPlayer.setDataSource(uri);
} catch (IllegalArgumentException e)
{
e.printStackTrace();
} catch (SecurityException e)
{
e.printStackTrace();
} catch (IllegalStateException e)
{
e.printStackTrace();
} catch (IOException e)
{
e.printStackTrace();
}
mMediaPlayer.prepareAsync();
and my onPrepared() method is as follows:
public void onPrepared(MediaPlayer player)
{
player.start();
}
But the code first reaches onPrepared() and then immediately goes to onError(), with what=1 and extra=-1010.
I know this question has been asked various times (here, here and here for instance) and I also know about Vitamio, but I want to find out what is wrong with my implementation. Is there something wrong with the m3u8 file that I created? I went through its documentation and everything seems correct.
I would be really glad if someone could shed some light on this matter.
Error code -1010 matches up with MEDIA_ERROR_UNSUPPORTED which would imply that the device does not have the hardware or software codecs it needs to decode the MP3 files in your playlist.
Vitamio would work in this situation because it adds software decoding for the media. This is slower than hardware decoding and uses more battery. It can also increase your app size significantly.
This seems odd, though, since MP3 has been a supported media format for decoding in Android for a very long time.
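If it helps with debugging, you can check for that constant in your error listener. A small sketch (MEDIA_ERROR_UNSUPPORTED is only exposed as a named constant from API 17 on; older versions just report the raw -1010):
@Override
public boolean onError(MediaPlayer mp, int what, int extra) {
    if (extra == MediaPlayer.MEDIA_ERROR_UNSUPPORTED) { // -1010
        Log.e("Player", "Bitstream/container not supported by this device's codecs");
    }
    return true; // true = handled; onCompletion will not be called
}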

How to Load the KMZ file?

"I have to load a kmz file in to the android application that i am developing and that kmz file will be loaded from sdcard in to the application. So what i should do for that whether there is direct uri intent or i have to parse it by xml parsing if so then how to load coordinates in to the map to show that kmz file.
To use the native v3 KmlLayer, which supports KMZ files subject to the documented size and complexity restrictions, the KMZ file must be publicly available on the web (so Google's servers can get to it).
To use a local file (which it sounds like you want to do), your only option is to use a third-party parser like geoxml3 or geoxml-v3.
To open it in Android Studio, use the code below:
try {
KmlLayer layer = new KmlLayer(mGoogleMap,R.raw.filename,
getApplicationContext());
// creating the kml layer, put the file in res/raw
layer.addLayerToMap();
} catch (XmlPullParserException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
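For completeness: the KmlLayer class above comes from the Google Maps Android utility library, so if it does not resolve, add that library's dependency to build.gradle (the version number here is only illustrative):
implementation 'com.google.maps.android:android-maps-utils:2.3.0'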

Android Streaming Video using MediaPlayer

I'm back with another problem!
I'm trying to create an app that would list a selected Livestreams, from Own3d.TV, Justin.Tv etc...
If my research hasn't totally failed, I can use the MediaPlayer object to stream video; the only question is how do I use it?
So far my code looks like this, but it's giving me an Exception when trying to prepare the MediaPlayer.
public class Media extends Activity {
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
SurfaceView sw = new SurfaceView(this);
SurfaceHolder sh = sw.getHolder();
setContentView(sw);
Uri ur = Uri.parse("http://www.twitch.tv/widgets/live_embed_player.swf?channel=hashe");
MediaPlayer mp = new MediaPlayer();
mp.setDisplay(sh);
try {
mp.setDataSource("http://www.twitch.tv/widgets/live_embed_player.swf?channel=hashe");
mp.prepare();
} catch (IllegalStateException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
mp.start();
//mp.setDisplay(sw);
}
}
Is it even possible to Stream the video from these sites using the MediaPlayer?
If not, how should I approach this problem?
Thanks!
When implementing MediaPlayer, try something like this; it worked fine for me: http://android-er.blogspot.com/2010/11/play-3gp-video-file-using-mediaplayer.html
But I think you can't use a source like the one in your code. You have to use a URL from which the video is streamed directly with progressive streaming, not the URL of an HTML page with an embedded player. You can recognize a usable URL because when you enter it in your browser, the browser starts downloading the streamed video to your computer. And I think MediaPlayer only supports .3gp or .mp4 formats out of the box.
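To illustrate that point, a minimal sketch (the URL is a placeholder for a progressively downloadable .mp4, and prepareAsync() is used so a network source does not block the UI thread):
MediaPlayer mp = new MediaPlayer();
mp.setDisplay(sh); // the SurfaceHolder from your code
mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer player) {
        player.start(); // start only once the stream is ready
    }
});
try {
    mp.setDataSource("http://example.com/video.mp4"); // placeholder: a direct video URL, not an embed page
    mp.prepareAsync(); // asynchronous prepare for network sources
} catch (IOException e) {
    e.printStackTrace();
}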
Use ExoPlayer instead of MediaPlayer. See Android official documentation:
ExoPlayer supports features like Dynamic adaptive streaming over HTTP (DASH), SmoothStreaming and Common Encryption, which are not supported by MediaPlayer. It's designed to be easy to customize and extend.
ExoPlayer - Android Developers
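A minimal ExoPlayer sketch for comparison (this assumes a recent ExoPlayer 2.x dependency; the class and method names follow the 2.12+ MediaItem API, so check them against the version you actually use):
SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
playerView.setPlayer(player); // a PlayerView from your layout
MediaItem item = MediaItem.fromUri("https://example.com/stream.m3u8"); // placeholder stream URL
player.setMediaItem(item);
player.prepare();
player.play();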
