Unity play video on Android

How do you make Unity play video on Android?
As far as I can tell, Unity video does not run on Android.
Or, if there is an asset package that makes Unity video work on Android, what is its name? Where can I find it? Free is preferred. And how do I use it?
Thanks

On most platforms you can use a MovieTexture, but according to the docs,
Movie Textures are not supported on Android. Instead, full-screen streaming playback is provided using Handheld.PlayFullScreenMovie.
Here is the documentation for this method.
If you have a budget, you can try Easy Movie Texture, currently priced at $65. I haven't used it myself, but it shows a video demo running on an Android device and the reviews all seem positive.
Hope that helps!

You can use Handheld.PlayFullScreenMovie to play video:
string url; // path to your video; on Android, place the file in Assets/StreamingAssets and reference it by file name
Handheld.PlayFullScreenMovie(url, Color.black, FullScreenMovieControlMode.CancelOnInput, FullScreenMovieScalingMode.Fill);
For more info, check this

Related

Build a lightweight FFmpeg lib for Android

I need to make a movie (MP4) from a sequence of bitmaps. I am using JCodec and it works fine, but if I share a video built with JCodec, it is not supported by WhatsApp. So I planned to make the movie with ffmpeg (prebuilt libraries: https://github.com/tanersener/mobile-ffmpeg), which also works fine, but it increased my universal APK size by up to 50 MB. I only need to make a movie from a sequence of images, so I don't need any external libraries or audio features. Please help me build ffmpeg so that it only converts an image sequence to video.
Thank you.
You can remove some packages, such as opus; you probably don't need all of the features.
You should change this file.
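As an illustration only (this is not the exact mobile-ffmpeg build script, and the component names are assumptions you should adapt), a stripped-down ffmpeg ./configure for "image sequence in, MP4 out" might look roughly like this; the Android cross-compilation flags are omitted:

# Sketch of a minimal ffmpeg configure: only file I/O, image demuxing/decoding,
# one video encoder and the MP4 muxer. Adjust decoder/encoder names to your input/output.
./configure \
    --disable-everything \
    --disable-doc \
    --disable-network \
    --disable-avdevice \
    --enable-protocol=file \
    --enable-demuxer=image2 \
    --enable-decoder=png \
    --enable-decoder=mjpeg \
    --enable-encoder=mpeg4 \
    --enable-muxer=mp4 \
    --enable-small

Trimming components this way is what keeps the resulting library small; everything not explicitly enabled stays out of the build.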

cordova-plugin-media: Parse ".amr" Audio File on nodejs server

I'm using the cordova-plugin-media plugin to record audio-files from android and ios devices.
However, Android only allows recording the file with an ".amr" extension, while iOS on the other hand only supports ".wav".
Playing the ".wav" from the iOS device on Android works; however, iOS doesn't support ".amr" files. That's why I have to convert them somehow.
Since I couldn't find any Cordova plugin that converts the ".amr" file on the client side besides this one (which is based on an external API, is extremely slow, and doesn't fully work; besides, I'm not a fan of doing file conversions on the client side), I'm looking for a solution on the server side:
Is there any JavaScript library (ideally "nodejs-friendly") that lets me easily convert an ".amr" file to a ".wav" or ".mp3" (or something similar that is playable on iOS)?
Apart from ffmpeg (which I couldn't manage to install properly), I couldn't find any solutions... :(
(Setting the mime-type to 'audio/wav' in cordova-plugin-media creates a "corrupt" .wav file that is still AMR-encoded when analyzed further with a tool...)
I really appreciate your help!
I came up with a "solution", which I'll share in case someone else runs into the same problems as I did:
www.cloudconvert.com offers a very simple API for converting video/audio/image files on the fly.
For Node.js there is a package for it that I can recommend: https://github.com/cloudconvert/cloudconvert-node
I decided to convert the .amr to .mp3 rather than .wav (the iOS "standard"), since .mp3 is smaller. To be able to play it on an iOS device, though, you have to adjust the bitrate a little from the (manual) example described on GitHub.
Make sure to pass the following options to your conversion process:
ccprocess.start({
    outputformat: 'mp3',
    input: 'download',
    file: 'path-to-your-file',
    converteroptions: {
        audio_bitrate: "721",
        audio_frequency: "44100",
        audio_qscale: -1
    }
}, function (err, ccprocess) { ...

How to remultiplex video in Android

I have a .mov file URL which I have to play using VideoView, but Android does not support that format, as per http://developer.android.com/guide/appendix/media-formats.html
So is there any way to play the .mov URL video using VideoView, or to remultiplex it (or re-encode, depending on the source) into something that Android plays nicely with, e.g. an MP4 container?
You can use ffmpeg to convert the file on the device if necessary, but integrating ffmpeg is not trivial, and converting a video on the device is compute-intensive, so it will take time and use up your battery. If it is possible to change the format server-side, that is generally much easier.
If you do want to use ffmpeg, a wrapper approach may be useful; this project provides an example: https://github.com/jhotovy/android-ffmpeg. I have used a similar approach and it works fine.
Make sure in particular you note the comments about calling ffmpeg twice.
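As a hedged illustration of the wrapper idea (this is not the jhotovy project's actual API), the approach usually boils down to shelling out to a bundled ffmpeg binary. The sketch below assumes you have an executable ffmpeg at ffmpegPath (a hypothetical path, e.g. extracted from your APK's assets), and remuxing without re-encoding only works if the streams inside the .mov are codecs Android can play, such as H.264/AAC:

import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;

public class MovRemuxer {
    // Remux (no re-encode) a .mov into an .mp4 container and move the moov atom
    // to the front so the result can be progressively streamed.
    public static void remux(String ffmpegPath, File input, File output)
            throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                ffmpegPath,
                "-y",                      // overwrite the output if it exists
                "-i", input.getAbsolutePath(),
                "-c", "copy",              // copy the streams, no re-encoding
                "-movflags", "+faststart", // put the moov atom at the start of the file
                output.getAbsolutePath());
        pb.redirectErrorStream(true);      // merge stderr into stdout
        Process process = pb.start();
        // Drain ffmpeg's output so the process cannot block on a full pipe.
        BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        while (reader.readLine() != null) {
            // optionally log each line for debugging
        }
        int exitCode = process.waitFor();
        if (exitCode != 0) {
            throw new IOException("ffmpeg exited with code " + exitCode);
        }
    }
}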
There are players on Google Play that will play .mov files, but not everyone has had a good experience with them (for example: https://stackoverflow.com/a/27006587/334402). VLC in particular is a common choice, and projects exist to integrate it into Android apps (although maybe not via web views), e.g.: https://github.com/mrmaffen/vlc-android-sdk

How to create a stagefright plugin

I have a task which involves integration of a video decoder into Stagefright(Android's multimedia framework). I searched and found the following about creating a new plugin for Stagefright:
To add support for a new format, you need to:
Develop a new Extractor class, if the container is not supported yet.
Develop a new Decoder class, that implements the interface needed by the StageFright core to read the data.
Associate the mime-type of the files to be read with your new Decoder in the OMXCodec.cpp file, in the kDecoderInfo array.
static const CodecInfo kDecoderInfo[] = {
    {MEDIA_MIMETYPE_AUDIO_AAC, "OMX.TI.AAC.decode"},
    {MEDIA_MIMETYPE_AUDIO_AAC, "AACDecoder"},
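    // Hypothetical example of the entry you would add for your own video decoder
    // (the component name is a placeholder):
    // {MEDIA_MIMETYPE_VIDEO_AVC, "OMX.ACME.MYVIDEO.DECODER"},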
};
The above is all I could find on the net. Right now I have a simple app that takes a file as input and renders it on the screen using native APIs in Android. Can anyone please tell me how to proceed further? Where do OMXCodec.cpp and the other files come into the picture, and in which directory of my project should I put them? Please provide solutions regarding the same. Thanks in advance.
From your question, it appears that you are looking at a recommendation which is specific to Ice-Cream Sandwich or earlier versions of Android. The first thing you should be clear about is the version of Android, i.e. Ice-Cream Sandwich or earlier versus JellyBean and later. The integration of codecs is different across different releases of Android.
I have already commented on your other question which is specific for JellyBean and later (Reference: Android: How to integrate a decoder to multimedia framework)
If you would like to integrate your codec into Ice-Cream Sandwich or earlier, the steps are already available in your question. In addition to adding the decoder to the kDecoderInfo list, you may want to set up certain quirks as shown here.
For the question on OMXCodec.cpp, you can find this file at
frameworks/base/media/libstagefright/ in case of Ice-Cream Sandwich and frameworks/av/media/libstagefright/ in case of JellyBean.
If you have followed all the steps to integrate the video decoder into the Stagefright framework, then the easiest test would be to perform the following:
1. Copy a media file onto the SD card.
2. In OMXCodec.cpp, enable verbose logging by uncommenting //#define LOG_NDEBUG 0 (remove the leading //), then run mm in that directory and copy the rebuilt libstagefright.so to /system/lib on your device (a command sketch follows this list).
3. Enable logcat and start capturing logs.
4. Go to the gallery, select your file, and allow the standard player to play it.
5. Check the log to see whether the player has selected your OMX component by searching for your component name. If found, your integration of the codec into Stagefright is successful; otherwise, you will have to debug and find out what the problem is.
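As a rough command sketch of the rebuild and log-capture steps above, assuming an AOSP build environment (envsetup.sh sourced and lunch run) and a rooted device; the component name is a placeholder:

# rebuild libstagefright with logging enabled, then push it to the device
cd frameworks/base/media/libstagefright   # frameworks/av/media/libstagefright on JellyBean
mm
adb root && adb remount
adb push $OUT/system/lib/libstagefright.so /system/lib/
adb reboot

# capture logs while the gallery plays the test file
adb logcat | grep -i "OMX.ACME.MYVIDEO.DECODER"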
Postscript:
Based on your queries, I presume you aren't familiar with the Android sources. Please refer to the androidxref site to become familiar with AOSP distributions.
Unless you are planning to support a new media file format, you will not need to implement an Extractor class. MediaExtractor abstracts a file-format parser and helps to de-multiplex the different tracks in a media file.
I hope with this information, you should be able to get your codec integrated and functional in Android.

Making Video files Android ready for playback over HTTP

I am working on an Android application which is supposed to play videos over HTTP on Android devices. Before we setup a server to host the video files just wanted a few things clarified:
As per the developer documentation, Android supports the .mp4 and .3gp container formats for video. If we use H.263 (video) and AAC LC (audio) as the codecs for our media files, will we be able to play the video by passing the URL to the MediaPlayer class?
I did a little experiment and passed the URL of one of the video files (.mp4) to the MediaPlayer class and got the following error:
Command PLAYER_INIT completed with an error or info PVMFErrContentInvalidForProgressivePlayback
From the docs, I came to know that for progressive playback, the video's index (e.g. the moov atom) should be at the start of the file.
Questions:
1. How do we make our videos Android-ready?
2. What other considerations do we need to keep in mind?
Please help.
Thanks.
You can actually achieve this using JCodec ( http://jcodec.org ), a pure Java implementation of the ISO BMF (MP4) container. For this, use the following code:
MovieBox movie = MP4Util.createRefMovie(new File("bad.mp4"));
new Flattern().flattern(movie, new File("good.mp4"));
The side effect of 'Flattern' is that it creates a web-optimized movie file that has its header BEFORE the data.
You can also use similar functionality from command line:
java -cp jcodec-0.1.3-uberjar.jar org.jcodec.movtool.WebOptimize <movie>
The JCodec library can be downloaded from the project website.
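For completeness (this is an addition, not part of the answer above): once the moov atom is at the front of the file, progressive playback over HTTP with the framework MediaPlayer is straightforward. A minimal sketch, with a placeholder URL and the video surface setup omitted for brevity:

import android.media.MediaPlayer;
import java.io.IOException;

public class StreamingPlayerExample {
    // Minimal sketch: progressive playback of a web-optimized MP4 over HTTP.
    // For video you would also attach a display surface via setDisplay()/setSurface().
    public static MediaPlayer play(String url) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(url); // e.g. "http://example.com/good.mp4" (placeholder)
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start(); // start once enough data has been buffered
            }
        });
        player.prepareAsync(); // prepare asynchronously so the UI thread is not blocked
        return player;
    }
}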
I cross-posted this question on the android-developers Google group, and Mark answered it there. Thanks Mark!
See this thread
