I know there is a list of supported image, video and audio formats that an Android device can play or show.
Now I am creating an application for Android, iOS, Windows and BlackBerry. Each platform has its own list of formats it can play (for audio and video) or show (for images). The media will be synced to the cloud. My requirement is that every media file, whatever its extension, should be viewable or playable on all of my platforms. For that I need the following information.
Is there any single API available (open source or paid, whatever it is) that can convert these formats into a format supported on my device? I googled and found some APIs, but I am not sure how effective they are, so if any of you have tried one and know which is best, please point me to it.
Also, I don't want to replicate every video and store a supported version of it in my cloud. I have paid for that space and don't want to waste it on copies of the same content. So whenever an unsupported format comes in, it should be converted to a supported version for that platform on the fly.
Please share your suggestions.
If you create an Adobe AIR app (using either Adobe Flash Builder or Adobe Flash), it will run on Android, iOS and BlackBerry 10 (and on the desktop: Windows and Mac) and will support the same audio and video formats on all of them.
Here is a nice tutorial for all platforms.
I'm new to Flash, so I will try to write this down as clearly as I can. I hope you can help me with this; thanks in advance.
I have a project that functions like a brochure. It's just a compilation of images, text and several videos with some very simple go-to-page functions.
The images and text are already included in the FLA file, while the videos are in my project directory and are streamed from my local hard drive. I'm using the earlier version of the Flash video component, and I'm working in Flash CS6.
My AS3 script to load the videos:
sp.source = SPContent;   // sp is the container component that loads the SPContent symbol
sp.setSize(340, 335);

// The buttons live inside the loaded content
var VidBtn01:MovieClip = MovieClip(sp.content).VidBtn01;
var VidBtn02:MovieClip = MovieClip(sp.content).VidBtn02;

VidBtn01.addEventListener(MouseEvent.CLICK, onClick01);
VidBtn02.addEventListener(MouseEvent.CLICK, onClick02);

// UNOVid is the Flash video component instance; stop playback when it leaves the stage
UNOVid.addEventListener(Event.REMOVED_FROM_STAGE, stopMovie);

function stopMovie(e:Event = null):void
{
    UNOVid.stop();
}

function onClick01(evt:MouseEvent):void {
    UNOVid.source = "Video1.flv";
}

function onClick02(evt:MouseEvent):void {
    UNOVid.source = "Video2.flv";
}
When I test-publish it in Flash Professional, the project works really well. When publishing for Android and iOS, Flash does not report any errors. The problems only arise when testing the published .apk and .ipa files.
Problem 1, Android: all buttons and navigation work and the Flash video component is present, but some videos cannot be played.
Problem 2, iOS: all buttons and navigation work, but the videos do not play; I can't really tell why, because the Flash video component is not displayed at all.
All videos are 320x240 FLV files of different durations, encoded with the Flash Video Encoder using the same settings. The files included during packaging are the project's SWF, the Flash video component's SWF, the app.xml and all the videos used in the project. The resulting .apk and .ipa are 1.02 GB each. I also tried publishing without the videos just to see the file size, and both the .apk and the .ipa are only 3.9 MB. I also checked the video folder's size in Windows file properties: it is over 1 GB.
I checked the installed app on my iOS device using an app called iFile from Cydia; all the videos are accounted for, along with the Flash video component.
On Android, checking the installed app only revealed an .apk file and some .so files, so you can't really see whether all the files are in there. I did a little research and learned that if you change the .apk extension to .rar, you can extract the file in Windows and inspect what's inside your published .apk. After doing so, I was able to see that all the files are accounted for.
For device testing I'm using a generic Android tablet running 4.0.1 Ice Cream Sandwich and a jailbroken iPad Air on iOS 7.
My questions for Android:
How does Android process the .apk file? Why were the packaged files not extracted when the app was installed on my device?
What might be the reason some of the videos cannot be played? A hardware limitation, perhaps? Any ideas?
Will there be a conflict with newer versions of Android (Jelly Bean/KitKat), since Adobe Flash has already stopped supporting Android?
My questions for iOS:
Why is the Flash video component missing? Is this a compatibility issue with iOS, since the video component is an SWF file?
I assumed that packaging would eliminate that issue, considering that publishing AIR for iOS bundles Adobe AIR into the package. If that's the problem, is there another way to deliver the video onto the stage on iOS?
Is FLV video compatible with iOS? If not, what video compression should I use?
Lastly, how can I publish to Android and iOS without packaging the videos into the app, so that the final .apk and .ipa file size stays manageable? Perhaps by placing the videos in a directory on the device's storage and using the resolvePath() method? Can anyone share how I can do this?
Again, thanks in advance to all.
I don't have time for a complete answer right now. Here's a partial answer:
You can't use .flv on iOS. You have to encode using the H.264 codec in an .mp4 file. Apple has more-or-less particular specs on the encoding. Look 'em up!
Also, on iOS you'll want to use the StageVideo class combined with the NetStream and NetConnection classes. StageVideo is a requirement for GPU video processing on iOS devices. BUT when you're running your video code on a desktop machine, or when you're running video in AIR on the desktop, you want to use the regular Video class, because you'll usually be rendering with the CPU. So, usually, your code will have to test for the availability of GPU processing and then invoke one function to implement THAT, or another function if only CPU processing is available.
Video (without components -- which you should forget!) sounds complicated, but it's not THAT complicated. It just takes a little time to learn. Suggestion: study here - http://gotoandlearn.com/play.php?id=46 for the basics of the Video class. Then read the Adobe AS3 reference on StageVideo. Then come back here and ask more questions. You'll beat your head against a wall for a day or so, but then you'll really know how to handle video on at least some mobile devices.
When you're testing on iOS don't forget to publish with GPU rendering.
I'm just learning AIR for Android myself so can't help you there.
I am developing a web app based on PhoneGap, with a feature for recording and uploading images and videos. Since videos tend to be quite large at standard resolutions, this feature is only useful if I manage to scale down the file size (either at recording time or afterwards).
On iOS 6 I have managed to do this by using the newly introduced <input type="file">. The recorded video is automatically compressed, so a 30-second video ends up at about 3 MB, which is quite okay.
On Android this does not happen (in fact, I currently cannot even get the file input working on Android 2.3). Since the size limit in the PhoneGap capture API is not supported on Android, and I have not found a plugin capable of doing this, I have no idea how to solve this task.
Are there any ideas out there on how to do that?
<input type="file"/> is not supported in the Android WebView. We are going to check in some preliminary support for this feature in 2.4.0. It won't do the compression that iOS does, though.
I believe you'd need to write a plugin to compress your video.
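A bare-bones, hypothetical skeleton of such a plugin might look like the sketch below. It uses the current CordovaPlugin API (the PhoneGap 2.x-era Plugin base class has a slightly different execute() signature), the class name and the "compress" action are made up, and the actual transcoding is left as a TODO, since pre-4.1 Android has no MediaCodec API and in practice people bundle something like ffmpeg or compress server-side.

import org.apache.cordova.CallbackContext;
import org.apache.cordova.CordovaPlugin;
import org.json.JSONArray;
import org.json.JSONException;

// Hypothetical skeleton only: class name and the "compress" action are made up.
// The real transcoding step is left as a TODO (see the note above about ffmpeg
// or server-side conversion on older Android versions).
public class VideoCompressor extends CordovaPlugin {

    @Override
    public boolean execute(String action, JSONArray args, final CallbackContext callbackContext)
            throws JSONException {
        if (!"compress".equals(action)) {
            return false; // unknown action
        }
        final String inputPath = args.getString(0);

        // Run off the WebView thread so the UI stays responsive.
        cordova.getThreadPool().execute(new Runnable() {
            @Override
            public void run() {
                // TODO: transcode inputPath here, then report the new file's path.
                // For this sketch we just hand the untouched file back to JavaScript.
                callbackContext.success(inputPath);
            }
        });
        return true;
    }
}

On the JavaScript side you would then invoke it with cordova.exec(onSuccess, onError, "VideoCompressor", "compress", [fileUri]).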
Is it possible to perform adaptive (multi-bitrate) streaming on an Android device? If yes, how can it be done?
If you are on Android 4.0 or 3.2, you just access the adaptive stream as you would any other video. Literally.
It's plain HTTP access.
So where you would use //mywebsite/video1.mp4 as a data source, you would instead use the equivalent //mywebsite/video1.m3u8. I'm not going to discuss how you create the streaming playlist here, only how you would access it.
All the magic happens within the client (e.g. MediaPlayer or VideoView), which supports this on 4.0 and 3.2. For the record, you may be able to open and play streaming playlists (.m3u8 files) on earlier versions of Android because manufacturers have sometimes played around with the code, but I haven't found any that actually adapt; they usually stick with the first variant they start on, or default to the lowest-bitrate variant in the playlist and stay there regardless of available bandwidth.
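To make that concrete, here is a minimal sketch (using the hypothetical //mywebsite URL from above): on 3.2/4.0+ the playlist is handed to VideoView exactly like a plain MP4, and the platform player takes care of the switching.

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

// Minimal sketch: on Android 3.2 / 4.0+ an HLS playlist is handed to VideoView
// exactly like a plain MP4 URL; the platform player does the adaptive switching.
public class HlsPlayerActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        VideoView videoView = new VideoView(this);
        setContentView(videoView);
        videoView.setMediaController(new MediaController(this));

        // The only change from a progressive-download MP4 is the URL you hand over.
        videoView.setVideoURI(Uri.parse("http://mywebsite/video1.m3u8"));
        videoView.start();
    }
}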
I'm writing an app for a multimedia website I frequent and I've run into a problem.
After grabbing a video stream URL (H.264 wrapped in an MP4 container) and attempting to play it with the native video player, playback fails.
This is on my Moto Droid running 2.2 (BB) and 2.3 (CM7). I've tested the app on my Xoom (3.1 stock) and it works great.
I've also had a friend test it on her Xperia Arc (2.3 stock, as far as I know) and it worked for her. This makes me think it's a hardware decoder issue, since I can play the stream fine using RockPlayer's software decoder but not with the hardware one.
So I have three things here I want to find out:
Does the native Android player support software decoding? If so, how do I tell whether it's using hardware or software decoding, and is it possible to toggle between them?
Are there any third-party media players with readily available (free) SDKs?
How can I just open the video in another app like RockPlayer, since I know that works? When I download a video using the browser, it asks me which video player I want to use. How can I get that chooser to pop up within my app and then hand the video off to it?
Yes, Android provides a software H.264 decoder, but it may not be available on 2.2. You can ask for the software codec to be preferred; see the AOSP source code for the stagefright command-line tool:
usage: stagefright [options] [input_filename]
…
-s(oftware) prefer software codec
…
ffmpeg has many derivatives and wrappers on Android, which are available with a variety of license restrictions.
It's pretty easy to launch an Intent that targets a specific app. You can use setComponent() to match exactly the Activity you need. The better and more flexible way to deal with the problem is to create a custom chooser (see e.g. Custom filtering of intent chooser based on installed Android package name), to let the user decide which player she/he prefers. With a custom chooser, you can decide to hide some of the handlers that are registered for the action (e.g. not use the system player on Android versions below 3.0).
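For example, a rough sketch of both approaches (the MIME type and the component name are placeholders you'd adjust for your stream and for the player you actually want to target):

import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.net.Uri;

// Sketch only: ACTION_VIEW with a video MIME type offers the URL to every installed
// player; the component name in the second method is a placeholder, not RockPlayer's
// real package, so look up the actual package/activity before using setComponent().
public class ExternalPlayerLauncher {

    // Let the user pick from all installed video handlers. A custom chooser (see the
    // link above) would instead query PackageManager and filter this list itself.
    public static void openWithChooser(Context context, String url) {
        Intent intent = new Intent(Intent.ACTION_VIEW);
        intent.setDataAndType(Uri.parse(url), "video/mp4");
        context.startActivity(Intent.createChooser(intent, "Play video with"));
    }

    // Target one specific player directly with setComponent().
    public static void openInSpecificPlayer(Context context, String url) {
        Intent intent = new Intent(Intent.ACTION_VIEW);
        intent.setDataAndType(Uri.parse(url), "video/mp4");
        intent.setComponent(new ComponentName(
                "com.example.videoplayer", "com.example.videoplayer.PlayerActivity"));
        context.startActivity(intent);
    }
}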
I need to produce an audio file on an Android device that can be played on any other Android, iOS, or Windows Phone device. I need to do this without any third-party apps. I am using .NET on the server, so I could convert there, but I'd like to avoid that. I can see that you can record AAC on 2.3.3, but I need to target 2.2. Any ideas?
I believe the old .wav format should do the trick. It seems to be supported by both Android and iOS. I did not check Windows Phone 7, but it is supported in Windows Mobile 6.5, so my guess is that it will be supported as well:
http://developer.android.com/guide/appendix/media-formats.html
http://msdn.microsoft.com/en-us/library/cc907934.aspx
http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/MultimediaPG/UsingAudio/UsingAudio.html
MP3 and WAV are both supported on practically every system, including iOS, Android, and Windows Phone 7. MP3 is generally used for music, but for just about anything else it is standard to use WAV.
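For the recording side on 2.2: MediaRecorder has no WAV output format, so you capture raw PCM with AudioRecord and write the RIFF/WAV header yourself. A minimal sketch (the class name, the 8 kHz mono settings and the blocking loop are my own simplifications; it needs the RECORD_AUDIO permission):

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;

// Hypothetical sketch: capture raw PCM with AudioRecord (available on Android 2.2)
// and wrap it in a standard 44-byte RIFF/WAV header. Requires the RECORD_AUDIO
// permission; 8 kHz mono 16-bit is an arbitrary choice to keep files small.
public class WavRecorder {

    private static final int SAMPLE_RATE = 8000;
    private static final int CHANNEL = AudioFormat.CHANNEL_IN_MONO;
    private static final int ENCODING = AudioFormat.ENCODING_PCM_16BIT;

    // Blocks the calling thread for 'seconds'; in a real app run this off the UI thread.
    public void record(File out, int seconds) throws IOException {
        int bufSize = AudioRecord.getMinBufferSize(SAMPLE_RATE, CHANNEL, ENCODING);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, CHANNEL, ENCODING, bufSize);

        RandomAccessFile raf = new RandomAccessFile(out, "rw");
        raf.setLength(0);
        raf.seek(44); // leave room for the 44-byte header, written once the length is known

        byte[] buffer = new byte[bufSize];
        int dataLen = 0;
        long end = System.currentTimeMillis() + seconds * 1000L;

        recorder.startRecording();
        while (System.currentTimeMillis() < end) {
            int read = recorder.read(buffer, 0, buffer.length);
            if (read > 0) {
                raf.write(buffer, 0, read);
                dataLen += read;
            }
        }
        recorder.stop();
        recorder.release();

        writeWavHeader(raf, dataLen);
        raf.close();
    }

    // Standard 44-byte PCM WAV header, all multi-byte fields little-endian.
    private void writeWavHeader(RandomAccessFile raf, int dataLen) throws IOException {
        raf.seek(0);
        raf.writeBytes("RIFF");
        raf.write(intLE(36 + dataLen));           // total chunk size
        raf.writeBytes("WAVE");
        raf.writeBytes("fmt ");
        raf.write(intLE(16));                     // size of the fmt sub-chunk
        raf.write(shortLE((short) 1));            // audio format 1 = PCM
        raf.write(shortLE((short) 1));            // channels (mono)
        raf.write(intLE(SAMPLE_RATE));
        raf.write(intLE(SAMPLE_RATE * 2));        // byte rate = rate * channels * 16/8
        raf.write(shortLE((short) 2));            // block align = channels * 16/8
        raf.write(shortLE((short) 16));           // bits per sample
        raf.writeBytes("data");
        raf.write(intLE(dataLen));                // size of the PCM data
    }

    private static byte[] intLE(int v) {
        return new byte[] { (byte) v, (byte) (v >> 8), (byte) (v >> 16), (byte) (v >> 24) };
    }

    private static byte[] shortLE(short v) {
        return new byte[] { (byte) v, (byte) (v >> 8) };
    }
}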