I would like to build a live streaming viewer.
The stream is raw H.264/MPEG-4 data.
How can I decode this raw data on Android?
I cannot find a usable API in the Android SDK to do it.
Any suggestion is welcome.
Thanks in advance.
Caxton
I use ffmpeg to decode the live stream.
Use the MediaCodec API (available since Android 4.1). There is sample code on David Marques's blog.
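To give an idea of what that looks like, here is a minimal sketch of decoding raw H.264 with MediaCodec onto a Surface. The SPS/PPS buffers and the access units are placeholders you would obtain from your own stream; they are not part of any real API, and error handling is omitted.

```java
import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

public class H264StreamDecoder {
    private MediaCodec decoder;

    // sps/pps: codec-specific data parsed from your stream (placeholders).
    public void start(Surface surface, byte[] sps, byte[] pps, int width, int height) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
        format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));
        decoder = MediaCodec.createDecoderByType("video/avc");
        decoder.configure(format, surface, null, 0);
        decoder.start();
    }

    // Call this for every H.264 access unit received from the network.
    public void decode(byte[] accessUnit, long presentationTimeUs) {
        int inIndex = decoder.dequeueInputBuffer(10000);
        if (inIndex >= 0) {
            // getInputBuffer() requires API 21; on older devices use getInputBuffers().
            ByteBuffer in = decoder.getInputBuffer(inIndex);
            in.clear();
            in.put(accessUnit);
            decoder.queueInputBuffer(inIndex, 0, accessUnit.length, presentationTimeUs, 0);
        }
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = decoder.dequeueOutputBuffer(info, 10000);
        if (outIndex >= 0) {
            // true = render the decoded frame to the Surface passed in configure().
            decoder.releaseOutputBuffer(outIndex, true);
        }
    }

    public void stop() {
        decoder.stop();
        decoder.release();
    }
}
```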
I want to compress video using the LZ4 library in my Android application. I am using this library to compress my video file. Please help me implement it correctly, or suggest a better alternative so that compression is fast. I have also tried ffmpeg, but it takes too long to compress the video. Thanks in advance.
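For what it's worth, here is a minimal sketch of compressing a file with the lz4-java library's LZ4BlockOutputStream (the file paths are placeholders). Keep in mind that LZ4 is a general-purpose compressor and will barely shrink already-encoded video, so a video codec is usually the better tool for making video files smaller.

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import net.jpountz.lz4.LZ4BlockOutputStream;

public class Lz4FileCompressor {
    // Compresses inPath into outPath using LZ4 block framing (lz4-java).
    public static void compress(String inPath, String outPath) throws Exception {
        byte[] buffer = new byte[64 * 1024];
        try (FileInputStream in = new FileInputStream(inPath);
             LZ4BlockOutputStream out = new LZ4BlockOutputStream(new FileOutputStream(outPath))) {
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }
}
```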
Hello guys, is there any way (or code) to speed up a video streaming application, for example by compressing the video when it is uploaded?
Thanks a bunch for any help.
I suggest you check the WebRTC standard and protocols for this venture.
https://webrtc.org/
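If you go that route, the Android WebRTC library (org.webrtc) is initialized roughly as below. This is only the entry point; the signalling and PeerConnection setup that follow are left out.

```java
import android.content.Context;
import org.webrtc.PeerConnectionFactory;

public class WebRtcBootstrap {
    // Creates the factory from which PeerConnections, audio/video sources
    // and tracks are built. Signalling (SDP/ICE exchange) is up to you.
    public static PeerConnectionFactory create(Context appContext) {
        PeerConnectionFactory.initialize(
                PeerConnectionFactory.InitializationOptions.builder(appContext)
                        .createInitializationOptions());
        return PeerConnectionFactory.builder().createPeerConnectionFactory();
    }
}
```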
I have to play a single channel from an Ogg file under Android. After a lot of searching, I think I have found a strategy.
Using OpenSL, I decode the file to PCM using something like this.
Then I should copy the selected channel from the PCM buffer into another buffer linked to the OutputMix, using something like this.
Is this the best option? Is there something already available to look at?
Thank you.
I solved it using non-native code based on OpenMXPlayer.
Update: the general idea is to use MediaCodec to decode the file to memory, then modify the data in memory, then send it to an AudioTrack.
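To flesh that update out, here is a rough sketch of that pipeline (MediaExtractor + MediaCodec to get PCM, keep one channel, play it through an AudioTrack). It assumes the first track is audio and that the decoder outputs interleaved 16-bit PCM, and it skips error handling, so treat it as a starting point rather than a drop-in implementation.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;

public class SingleChannelPlayer {

    // Decodes the first audio track of 'path' and plays only channel 'channelIndex'.
    public void play(String path, int channelIndex) throws Exception {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);
        MediaFormat format = extractor.getTrackFormat(0);   // assume track 0 is the audio track
        String mime = format.getString(MediaFormat.KEY_MIME);
        int sampleRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE);
        int channelCount = format.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
        extractor.selectTrack(0);

        MediaCodec codec = MediaCodec.createDecoderByType(mime);
        codec.configure(format, null, null, 0);
        codec.start();

        int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf, AudioTrack.MODE_STREAM);
        track.play();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false;
        boolean outputDone = false;
        while (!outputDone) {
            if (!inputDone) {
                int inIndex = codec.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    ByteBuffer in = codec.getInputBuffer(inIndex);
                    int size = extractor.readSampleData(in, 0);
                    if (size < 0) {
                        codec.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        codec.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int outIndex = codec.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                ByteBuffer out = codec.getOutputBuffer(outIndex);
                // Interleaved 16-bit PCM: keep only the samples of the wanted channel.
                short[] all = new short[info.size / 2];
                out.order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(all);
                short[] mono = new short[all.length / channelCount];
                for (int i = 0; i < mono.length; i++) {
                    mono[i] = all[i * channelCount + channelIndex];
                }
                track.write(mono, 0, mono.length);
                codec.releaseOutputBuffer(outIndex, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    outputDone = true;
                }
            }
        }
        codec.stop(); codec.release();
        extractor.release();
        track.stop(); track.release();
    }
}
```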
I have seen many questions related to this, but I don't think any of them answers mine.
I would like to use an already written RTSP client on Android together with MediaCodec, in order to capture an RTSP stream in H.264 and then decode and display it. I have used VideoView and MediaPlayer, which are well known to support RTSP streaming in their setDataSource method (file or rtsp/http path), unlike MediaExtractor, which only supports file or http, but the latency is too high for my purposes.
I would like to use MediaExtractor, but because of that limitation of setDataSource it does not seem to be an option. Given this, I am looking for some help or examples (a tutorial?) of an RTSP client I could use on Android, or, if someone has used MediaExtractor in some way to capture an RTSP stream, their help is more than welcome as well.
Thank you so much guys!
rojiark
You can try https://github.com/fyhertz/libstreaming
You should know, though, that it is under the LGPL: if you distribute the application, you must make the library's source (including any modifications you make to it) available on request and allow users to replace the library with their own version.
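For completeness, this is roughly how libstreaming is driven according to its README (the class and method names are taken from there, so double-check them against the version you pull in). Note that the library is mainly designed for sending a stream from the device to an RTSP endpoint rather than for playing a remote RTSP stream, so it may not replace your own client + MediaCodec path. The server address and port below are placeholders.

```java
import android.content.Context;
import net.majorkernelpanic.streaming.Session;
import net.majorkernelpanic.streaming.SessionBuilder;
import net.majorkernelpanic.streaming.audio.AudioQuality;
import net.majorkernelpanic.streaming.gl.SurfaceView;
import net.majorkernelpanic.streaming.rtsp.RtspClient;
import net.majorkernelpanic.streaming.video.VideoQuality;

public class StreamPublisher {
    // Captures camera + mic and publishes the stream to an RTSP server.
    public static RtspClient publish(Context context, SurfaceView surfaceView) {
        Session session = SessionBuilder.getInstance()
                .setContext(context)
                .setSurfaceView(surfaceView)              // libstreaming's own gl.SurfaceView
                .setAudioEncoder(SessionBuilder.AUDIO_AAC)
                .setAudioQuality(new AudioQuality(16000, 32000))
                .setVideoEncoder(SessionBuilder.VIDEO_H264)
                .setVideoQuality(new VideoQuality(320, 240, 20, 500000))
                .build();

        RtspClient client = new RtspClient();
        client.setSession(session);
        client.setServerAddress("rtsp.example.com", 1935);  // placeholder address/port
        client.startStream();
        return client;
    }
}
```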
I have a small application that streams pictures from a device using Android. I'm able to take pictures and save them to the SD card, and now I would like to do some video recording with them. I have already done this on iOS: I save the pictures into an array and then create an MJPEG video from the combined pictures.
I have thought of using pure Java to do so, but most of the required libraries are not supported on Android.
I have tried to use this: https://github.com/lessthanoptimal/BoofCV/blob/master/main/io/src/boofcv/io/video/CreateMJpeg.java
But the resulting video cannot be played, for some unknown reason.
I'm out of ideas. Please help me.
Thank you
You can use MJPEGGenerator.java to create an AVI from a set of JPEG images.
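For reference, the commonly circulated MJPEGGenerator.java is typically driven as below. The constructor and method signatures are from memory, so treat them as assumptions and check them against the copy you use; also note that it relies on java.awt/ImageIO, so as-is it is desktop Java rather than something you can drop straight into an Android app. The frames directory and dimensions are placeholders.

```java
import java.io.File;
import javax.imageio.ImageIO;

public class MjpegAviDemo {
    // Turns a directory of JPEG frames into an MJPEG-in-AVI file.
    public static void main(String[] args) throws Exception {
        File[] frames = new File("frames").listFiles();   // placeholder directory of .jpg files
        int width = 640, height = 480;
        double framerate = 15.0;

        MJPEGGenerator generator =
                new MJPEGGenerator(new File("out.avi"), width, height, framerate, frames.length);
        for (File frame : frames) {
            generator.addImage(ImageIO.read(frame));      // one java.awt.Image per frame
        }
        generator.finishAVI();
    }
}
```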
Here is an example of using MJPEGGenerator, in Russian. Use translate.google.com to understand it.