In my app I need to play videos from the SD card. It works fine on the Galaxy S and Galaxy Tab 2, but on some Chinese tablets like the "Giada" it does not work at all.
I have 4 different activities that play videos. The first activity plays a menu video with navigation links to the other activities. The problems I am facing:
The first video plays properly, but looping fails and the app closes.
If I navigate to another activity to play another video, it says "Can't play video" and closes. Sometimes it plays the same video but not to the end, and the app closes partway through.
Video extension: MP4
Resolution: 1024x600
Playing from: SD card
Target tablet specification:
Resolution: 1024x600
Android: 4.1
I have tried both VideoView and SurfaceView.
Please help me out; any help will be appreciated.
The answer to this question will never be consistent across all devices or across all videos.
Whether a given video file will play in a given player depends on three things:
The video container format (file type).
The codecs the video (and potentially audio) streams are encoded with
Your player's support for that combination of container format and codec
The codec, and the player/device support for it, is almost certainly the cause of the inconsistent results you've seen. (A codec, if you didn't know, is basically a repeatable mathematical formula that tells your system how to turn the bits and bytes packed into a file into moving pictures, and back again, for that matter.)
There is a large variety of video codecs in the video files floating around out there. Support for these codecs is wildly inconsistent, largely due to the history of video distribution. Many devices won't support streams encoded with certain codecs. There are a variety of reasons for this, but the most common are obscurity and licensing costs.
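If you want to see which codecs a particular device's built-in libraries can actually decode, you can enumerate them at runtime with MediaCodecList. A minimal sketch (using the pre-API-21 methods so it also runs on Android 4.1; the class name is just for illustration):

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

public class CodecDump {
    private static final String TAG = "CodecDump";

    // Logs every decoder on the device and the MIME types it claims to handle.
    public static void logDecoders() {
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (info.isEncoder()) {
                continue; // only decoders matter for playback
            }
            for (String mime : info.getSupportedTypes()) {
                Log.i(TAG, info.getName() + " decodes " + mime);
            }
        }
    }
}
```

Running this on the devices that fail and comparing the output against the streams in your files (see MediaInfo below) should tell you quickly whether a codec is simply missing.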
For example, up until a few years ago, almost everything was encoded in an .FLV container with an On2 VP6/VP7/VP8 codec. This is causing headaches today because while On2 owned these codecs, they kept a tight rein on the licenses. That didn't relax until .FLV had already begun to lose relevance, and so there is not a whole lot of (legitimate) software out there that can work with On2-encoded content.
What all of this means is that there is no silver bullet. All video will never run on all devices, at least not without the aid of video players that install and use their own codecs to decode the streams.
Needless to say, the factory-installed Android libraries available to you and your end users do not fall into that category.
So, what do you do? Well, short of producing a video player that carries its own codecs, you can most effectively address the problem with a two-step approach:
Target specific devices that you want your application to work on
Encode your content with a video codec that works on all the devices you want to target. You may need to produce two copies of your video if you find that there is no codec that works across all devices you plan to support.
Today, the widest support is available with an MP4 container and a video stream encoded with the H.264 (AVC) codec. As I said, there is no silver bullet, and H.264 support is not universal by any means, but this one format will be playable by more potential users than any other single choice you could make, due to its popularity and wide support in modern desktop and mobile environments.
Some tools you may find helpful:
MediaInfo will let you peek inside MPEG-flavored video containers to see what codecs are in use. This will be helpful in determining which devices are having trouble with which codecs.
FFmpeg is an encoding application that can convert your content to MP4/H.264 (a rough example of such a conversion is sketched after this list)
Android Supported media formats
List of supported media audio/video formats.
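For illustration, here is roughly what such a conversion could look like. This is only a sketch: it shells out from Java to an ffmpeg binary assumed to be installed on whatever machine prepares your content (not on the device), and Baseline profile with yuv420p is a conservative choice for older hardware decoders; the helper name is mine.

```java
import java.io.IOException;

public class Transcode {
    // Hypothetical helper: converts an arbitrary input file to MP4/H.264 + AAC.
    public static void toMp4H264(String input, String output)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder(
                "ffmpeg", "-i", input,
                "-c:v", "libx264", "-profile:v", "baseline",
                "-pix_fmt", "yuv420p",          // widest hardware decoder compatibility
                "-c:a", "aac", "-b:a", "128k",
                "-movflags", "+faststart",      // moov atom up front for progressive playback
                output)
                .inheritIO()
                .start();
        if (p.waitFor() != 0) {
            throw new IOException("ffmpeg exited with an error for " + input);
        }
    }
}
```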
Good luck!
I enabled the "Stats for nerds" option in the YouTube Android app and played the same video on a "Vivo V9" and a "Nexus 5" device.
Vivo V9: It played the video in WebM format, which is basically the "VP8 or VP9" codec.
Nexus 5: It played the video in MP4 format, which is basically the "H.264 or H.265" codec.
So the YouTube app selects the video codec based on the device.
Question: How does it do this? I know it internally uses ExoPlayer for playing video, but ExoPlayer doesn't provide this functionality by default.
Different codecs may require different licences, which can cost a lot. Moreover, codecs can be implemented in both software and hardware. Shipping many SW codecs is not a problem, but as the name states, HW codecs require a specific chip, which also increases cost and occupies space. That's why there is a big variety between device manufacturers, and even between two devices from the same manufacturer: they simply want to cut costs.
It's very common for a device to have only a subset of the most popular codecs, and even if it has, say, both VP8 and H.264, one of them can be a HW codec while the other is a SW codec, in which case the HW codec will usually be preferred. YouTube (here I mean youtube.com, not the YouTube app) serves videos in different formats, so the device can choose the optimal codec for its capabilities.
Now, as to choosing the right codec, the YouTube app can use the MediaCodec API from Android (please check e.g. this), and/or it can even provide its own SW codecs, so I would say the behavior is platform dependent.
The last things are corner cases, e.g. something being played/recorded in the background, say the camera is turned on and screen recording is happening while the YouTube app is in use. Again it depends on the device, but HW codecs have limits on the number of codec instances, and in such a corner case it's possible that even if the device has a HW codec, the YouTube app may be forced to choose a SW codec.
Edit:
Starting from Android Q there is also a new API which lets you easily differentiate between SW and HW codecs. Take a look at this.
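As a rough illustration of how an app could combine those APIs (this is only a sketch of the idea, not a claim about how YouTube actually decides), it could prefer VP9 only when the device exposes a hardware VP9 decoder and otherwise fall back to H.264:

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.os.Build;

public class StreamPicker {
    // Prefer VP9 only when a hardware-accelerated VP9 decoder exists.
    // isHardwareAccelerated() is only available from Android Q (API 29).
    public static String pickMimeType() {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
            MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
            for (MediaCodecInfo info : list.getCodecInfos()) {
                if (info.isEncoder() || !info.isHardwareAccelerated()) {
                    continue;
                }
                for (String mime : info.getSupportedTypes()) {
                    if ("video/x-vnd.on2.vp9".equals(mime)) {
                        return mime; // request the WebM/VP9 variant
                    }
                }
            }
        }
        return "video/avc"; // safe default: MP4/H.264
    }
}
```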
I am building an app that allows users to upload videos from different devices, including Android and iOS, and then streams them back from the server, using VideoView for playback. I end up with different video formats and get the inevitable "Cannot play this video" error on several occasions.
Some research says that videos in .mp4 format with the H.264 codec can work on all devices, so I have a few options I am considering:
Convert all videos on the server side to the above-mentioned format and codec
Use ffmpeg to convert the videos in-app during playback
Use the VLC SDK, which supports a wide range of video formats
I am not sure which of these is the best solution. I have not worked much with video on Android in the past, and I am not sure what the pros and cons may be, whether these are actually viable solutions, or whether this problem already has a known solution.
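For reference, one low-effort diagnostic while deciding is to log the underlying MediaPlayer error from the VideoView, so the failures are at least visible instead of just the generic dialog. A minimal sketch (the class and tag names here are only placeholders):

```java
import android.media.MediaPlayer;
import android.util.Log;
import android.widget.VideoView;

public class PlaybackErrors {
    // Logs the real MediaPlayer error codes; the "extra" value often maps to
    // constants such as MediaPlayer.MEDIA_ERROR_UNSUPPORTED.
    public static void logErrors(VideoView videoView) {
        videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
            @Override
            public boolean onError(MediaPlayer mp, int what, int extra) {
                Log.e("PlaybackErrors", "playback failed: what=" + what + " extra=" + extra);
                return false; // returning false lets VideoView show its usual dialog
            }
        });
    }
}
```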
I've set up Apache 2.0 with several .m3u8 files serving a set of MPEG-2 TS files over HLS. These .ts files were produced with libavformat by transmuxing an MP4 I downloaded from YouTube. When I play the resulting HLS on VLC or QuickTime, everything works fine. But on Android (Stagefright 1.2) the video has several problems:
The option to go full-screen does not work
The video duration says 1:40 when it is actually 2:00
The video sometimes fails to start and you have to reload page
The video reliably distorts (tears and pixelates) at transition points when switching the underlying .ts streams.
Some of this is ameliorated if I don't use HTML5's <video> tag, but problem #4 remains.
I can play other m3u8's on Stagefright without any of the above problems, so I am assuming my transmuxing code is wrong, but even forgoing it and using the (recently added) HLS segmenting features of ffmpeg, I have the same problem. Re-encoding with libx264 changes nothing.
I am at my wit's end debugging this.
Android's libstagefright (along with the media service's NuPlayer) is not as mature a product as VLC, and a lot of problems which are not present when using VLC do show up on Android; it is much more vulnerable to broken, corrupted, or deviating content.
Such pixelation/macroblock artifacts usually appear when some frames were dropped (by Android code, or otherwise lost) before decoding.
If those corruptions appear along with some green fields, it might be a problem with synchronizing a format change with a key frame (which might be the result of a wrong implementation of the source, or of the part which notifies ANativeWindow about the format change).
In a corner case you might not get any green frames, but the crop/resolution would be off and pixelation might be visible.
What I would do:
1) Check for frame drops
2) Check the frames at the borders of consecutive segments with some analyzer (see the sketch below)
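One possible way to do step 2 without a dedicated stream analyzer is to dump the frame types of each .ts segment with ffprobe: if a segment does not begin with a key frame (key_frame=1, pict_type=I), that alone would explain the artifacts at the switch points. A rough sketch, shelling out to ffprobe from Java (ffprobe is assumed to be on the PATH of the machine running this):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class SegmentCheck {
    // Prints key_frame and pict_type for every video frame in one .ts segment.
    public static void dumpFrameTypes(String segmentPath)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder(
                "ffprobe", "-v", "error",
                "-select_streams", "v:0",
                "-show_entries", "frame=key_frame,pict_type",
                "-of", "csv",
                segmentPath)
                .redirectErrorStream(true)
                .start();
        BufferedReader reader =
                new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line); // e.g. "frame,1,I" for a key frame
        }
        p.waitFor();
    }
}
```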
I wonder what the best option is to store a single picture and a short voice memo in one file? It needs to be openable by mobile phones in the browser (iOS, Android) and should preferably be shown as a single full-screen photo with the sound playing in the background.
Effectively I'm looking for the most size-efficient combination of something like MP3 + JPG.
If I do it in a single .mov, I guess I lose a lot of space due to compressing one and the same frame 24 frames/second.
A rough list of options that come to mind:
.mov
MPEG-4
H.264
QuickTime
HTML5 video format (Theora?)
Store it in Flash (but this is not supported by iOS)
EDIT1:
I tried storing H.264 in an .mp4 container; the files are small enough (around 1 MB), but somehow it does not work on my friend's Android phone. I probably need more testing, but it seems the Android OS does not like proprietary codecs...
My most intuitive solution for this would be to store a JPEG and an MP3 separately on the server. To download one entity as a single unit, download a bit of JSON or XML data that contains pointers to the picture and the audio file.
If you are set on having one file do the job, you might try embedding the JPEG inside the ID3 metadata of an MP3 file (this type of metadata functionality exists to, e.g., store album art with a music file). You would want to make sure that the ID3 tag is near the start of the file. JavaScript within the mobile browser could fetch the file, a third party library could do the ID3 parsing (some Googling reveals that such libraries exist; don't know if they all support fetching a JPEG image). Then the JS would need to be able to feed the file into an audio tag for playback, which I'm not sure is possible.
Another thing to experiment with is an .mp4 which encodes the audio track along with a single video frame with a really, reeeaaallly long duration. You would have to experiment to determine whether the mobile browsers handle that gracefully while also allowing smooth audio seeking. If they don't, then perhaps re-encode the frame every 1-5 seconds to keep the bitrate minimal.
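If you want to try that last idea, ffmpeg can mux a single JPEG and an MP3 into an MP4. A rough sketch (shelling out from Java; the file names are placeholders, -loop 1 repeats the single image, and -shortest ends the video when the audio ends):

```java
import java.io.IOException;

public class PhotoMemo {
    // Muxes one still image plus an audio track into an MP4.
    // Assumes an ffmpeg binary is on the PATH.
    public static void makeMp4(String jpeg, String mp3, String out)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder(
                "ffmpeg",
                "-loop", "1", "-i", jpeg,   // repeat the single picture
                "-i", mp3,
                "-c:v", "libx264", "-tune", "stillimage",
                "-pix_fmt", "yuv420p",       // widest decoder compatibility
                "-c:a", "aac",
                "-shortest",                 // stop when the audio ends
                out)
                .inheritIO()
                .start();
        if (p.waitFor() != 0) {
            throw new IOException("ffmpeg failed for " + jpeg + " + " + mp3);
        }
    }
}
```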
I'm researching the development of an Android (2.2) app/service that will enable users to record short (I do emphasize short, < 30 seconds) video on their phones and then upload that video (HTTP) to a server that will transcode the video to other formats. That same user can download videos from other Android users and play them.
Now, I get a bit lost with everyone's recommended approaches to all the issues in doing something like this, because I haven't seen anyone ask this in a cohesive context. Ideally I would like a non-commercial solution (as in no vendor/service being needed for the video hosting/transcoding), but feel free to include those as a recommendation (I've marked this as a wiki), as I know many like to use YouTube and Vimeo for the middle layer in all this.
The questions are:
What server technologies do you recommend for hosting and transcoding?
What technology do you recommend for streaming the video? (It would be nice to offer a high- and a low-quality encoding depending on the user's network connection.)
What video format and software do you recommend for converting the uploaded video on the server so it is viewable later by other Android owners?
I'm assuming it's bad to do any transcoding on the phone prior to upload (battery/processor issues), but if I'm wrong about that assumption, what do you recommend?
Some things that may help you...
The video will only need to render on an Android device, and in the future in a WebKit HTML5 browser.
Bandwidth isn't cheap (even with numerous 30-second videos), so a good mix of video quality and video file size is important (streaming if needed to ensure quality vs. download).
This is for Android 2.2 devices with a video camera, of course, and a medium- to high-density screen of 800x400 minimum.
Open source solutions (server to receive the uploads, code to do the transcoding, server to do the streaming) are preferred, but not required.
CDNs are an option, but I don't think that really figures into the picture right now.
Check out this page to see all the video formats that Android supports for encoding and decoding.
http://developer.android.com/guide/appendix/media-formats.html
For encoding, use FFmpeg or a service like encoding.com.
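Since you mentioned wanting a high- and a low-quality version, one straightforward server-side approach is to run FFmpeg twice per upload with different scale and bitrate settings. This is only a rough sketch: the sizes and bitrates are placeholder values to tune, and an ffmpeg binary is assumed to be installed on the server.

```java
import java.io.IOException;

public class Renditions {
    // Produces a low- and a high-quality H.264/AAC MP4 from one uploaded clip.
    public static void encodeBoth(String input) throws IOException, InterruptedException {
        encode(input, "480", "500k", input + ".low.mp4");
        encode(input, "720", "1500k", input + ".high.mp4");
    }

    private static void encode(String input, String height, String videoBitrate, String output)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder(
                "ffmpeg", "-i", input,
                "-vf", "scale=-2:" + height,            // keep aspect ratio, even width
                "-c:v", "libx264", "-profile:v", "baseline",
                "-b:v", videoBitrate,
                "-c:a", "aac", "-b:a", "96k",
                "-movflags", "+faststart",              // allow playback before full download
                output)
                .inheritIO()
                .start();
        if (p.waitFor() != 0) {
            throw new IOException("ffmpeg failed for " + output);
        }
    }
}
```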