https://ymcasports.org/index.cfm/action/ynational_highlights/content_action/ynational_highlights/league/3636/album/4314#15083
The videos on this page play fine on Android, but they do not play in Chrome for Android, and Firefox for Android plays only the audio.
Your video uses 4:2:2 chroma subsampling, but most hardware decoders currently only support the 4:2:0 profile (this may change in a few years).
Anyway, I see the video was already created using ffmpeg. All you need to do is add -pix_fmt yuv420p to the ffmpeg command.
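For example, assuming your original command looked something like the line below (input and output names are placeholders), just add the flag before the output file:
ffmpeg -i input.mov -c:v libx264 -pix_fmt yuv420p output.mp4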
[EDIT] For video-format-related questions, please also post a MediaInfo report in the future: https://mediaarea.net/MediaInfoOnline
Related
We provide a facility to upload and stream video lectures to different schools. I recommend that uploaded videos use the H.264 video codec. We are facing an issue on Android, where H.264 is only supported from Android 6.0 onwards. Which video format should we use to stream videos to both the Android app and the web app?
You are correct, the H.264 Main Profile is only supported on Android 6+.
However, if you use the H.264 Baseline Profile, you can go back as far as Android 3.
Source: Android Developers. You can also check out the other supported formats there.
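A minimal ffmpeg sketch for producing a Baseline Profile MP4 (filenames, level, and audio bitrate are illustrative, not part of the original answer):
ffmpeg -i lecture_in.mp4 -c:v libx264 -profile:v baseline -level 3.0 -pix_fmt yuv420p -c:a aac -b:a 128k lecture_out.mp4
The same file should also play through a plain HTML5 video tag in browsers that support H.264, so one encode can serve both apps.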
I am searching for a library that offers the ability to stream video from an Android device (5.1+) and record it at the same time.
I tried MediaRecorder, the usual way to record video on Android, but with it I am not able to stream over WebRTC or RTSP because the camera is busy.
Currently I am using libstreaming. With a small modification, the app can record and stream over RTSP concurrently, but this library lacks support for the hardware codecs in MTK and SPRG chipsets.
I wonder if you can recommend a solution or another library that can do this.
At the moment the library works only on the Nexus 4 with a Qualcomm chipset.
After several days of research, I decided to use a combination of FFmpeg and MediaCodec.
It seems that the only way to get frames from the camera at a high rate is to use the Android MediaCodec API. But MediaCodec, together with the platform muxer, can only write MP4 files, which is not an option for me (I need TS), while FFmpeg can read and create practically any known video format.
Currently I am trying to make them work together (read ByteBuffers from MediaCodec and feed them to the FFmpeg recorder).
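As a side note on the container issue (filenames are hypothetical, not from my setup), FFmpeg can also remux an existing MP4 recording into MPEG-TS without re-encoding:
ffmpeg -i recording.mp4 -c copy -f mpegts recording.ts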
Useful links:
Grafika project: https://github.com/google/grafika
ContinuousCapture and Show + record are the most interesting parts to check
javacpp (specifically the FFmpeg wrapper): https://github.com/bytedeco/javacpp
Has an example with recording and streaming.
kickflip sdk: https://github.com/Kickflip/kickflip-android-sdk
A library which makes the two tools mentioned above work together, and it is also open source. Sadly it doesn't solve my problem fully; the feature I need has been requested but is not yet implemented: https://github.com/bytedeco/javacv/issues/95
From within my app I create a video from images that the user has taken, using FFmpeg. I then play this using a MediaController and a VideoView. When I run the app using the Genymotion emulator for a Google Nexus 4, the video file plays without issue. When I use the Genymotion emulator for a Samsung Galaxy S4, I get an error from the VideoView's onError listener saying "Can't play video".
Thanks for your help.
Converting the video using ffmpeg worked fine for my application. I hope this helps:
ffmpeg -i old.mp4 -c:v libx264 -profile:v baseline -level 1 -strict -2 new.mp4
I found this command somewhere else on Stack Overflow but unfortunately could not retrace where, so in case someone comes across it, please link it here. The original did not have -strict -2 in the command; that flag is presumably needed here because the default AAC encoder was still marked experimental in older FFmpeg builds.
Simply because the extension is .mp4 does not guarantee it will be played by MediaPlayer. If the encoding is not supported, it won't play. Please have a look at the supported media formats listed in the Android documentation.
So inspect your code and find out which encoding your video uses. Also, if you are looking for an even more powerful way to play video, you can try Google's ExoPlayer.
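A quick way to check the encoding (a sketch assuming ffprobe is installed and input.mp4 is the file in question):
ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,profile,pix_fmt -of default=noprint_wrappers=1 input.mp4
This prints the video codec, profile, and pixel format, which you can compare against the supported formats list.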
I am building an app which allows users to upload videos from different devices, including Android and iOS, and then stream them from the server with a VideoView during playback. I end up with different video formats and get the inevitable "Cannot play this video" error on several occasions.
Some research suggests that .mp4 videos with the H.264 codec work on all devices, so I have a few options I am considering:
Convert all videos on the server side to the above-mentioned format and codec (a conversion command is sketched after this question)
Use FFmpeg to convert the videos in-app during playback
Use the VLC SDK, which supports a wide range of video formats
I am not sure which of these is the best solution. I have not worked much with video on Android in the past, so I am not sure what the pros and cons may be, whether these are viable approaches, or whether this problem already has a known solution.
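For the first option, a server-side conversion command might look roughly like the following sketch (filenames, level, and bitrates are placeholders):
ffmpeg -i upload_original.mov -c:v libx264 -profile:v baseline -level 3.1 -pix_fmt yuv420p -c:a aac -b:a 128k -movflags +faststart playback.mp4
The -movflags +faststart option moves the MP4 index to the front of the file so playback can start before the whole file has been downloaded.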
What formats of video file are supported in the Android emulator?
I understand that it probably won't play in real time, but which ones will play at all?
The secret is that the emulator will only play H.264 Baseline Profile MP4s, while real devices will also play MP4s with higher profiles.
In order to get a video file that plays properly in the emulator, try these settings:
ffmpeg -i inputvideo.wmv -c:v libx264 -profile:v baseline outputvideo.mp4
It supports H.263 encoding and decoding, while H.264 AVC and MPEG-4 SP are supported for decoding only.
On an emulator, playback may be slow or laggy.
Check out the chart of all supported media formats for more information.