Are there any available components or demos that support Red5 RTMP streaming? I am preparing to start secondary development on Android for my graduation project.
You have to use an RTMP library or something like JavaCV. If you have the coding skills, you can actually use the built-in MediaCodec to decode H.264/AAC; the tricky part is that Android doesn't include an RTMP decoder/demuxer, which is where JavaCV or an alternative RTMP library comes in. If you want to dig into the red5-client code, you could also try that to decode the RTMP, but that may be a good deal of work.
https://code.google.com/p/javacv/
http://www.aftek.com/afteklab/aftek-RTMP-library.shtml
Lastly, you could just use Adobe AIR.
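To give a sense of the demuxing work such a library handles for you, here is a minimal sketch of parsing the RTMP chunk basic header as laid out in Adobe's RTMP specification. The `RtmpBasicHeader` class name is my own illustration, not part of any of the libraries above:

```java
// Sketch of RTMP chunk basic-header parsing, following the layout in the
// Adobe RTMP specification: a 2-bit fmt field, then a 6-bit chunk stream id
// with 2- and 3-byte escape forms. Class name is illustrative only.
public class RtmpBasicHeader {
    public final int fmt;   // chunk message header format (0-3)
    public final int csid;  // chunk stream id
    public final int size;  // header bytes consumed (1-3)

    private RtmpBasicHeader(int fmt, int csid, int size) {
        this.fmt = fmt;
        this.csid = csid;
        this.size = size;
    }

    public static RtmpBasicHeader parse(byte[] buf, int off) {
        int b0 = buf[off] & 0xFF;
        int fmt = b0 >>> 6;
        int csid = b0 & 0x3F;
        if (csid == 0) {    // 2-byte form encodes ids 64-319
            return new RtmpBasicHeader(fmt, 64 + (buf[off + 1] & 0xFF), 2);
        }
        if (csid == 1) {    // 3-byte form encodes ids 64-65599
            return new RtmpBasicHeader(fmt,
                    64 + (buf[off + 1] & 0xFF) + 256 * (buf[off + 2] & 0xFF), 3);
        }
        return new RtmpBasicHeader(fmt, csid, 1);  // 1-byte form, ids 2-63
    }
}
```

This is only the first layer of the protocol; a real client also has to handle the handshake, the message headers, and FLV-style tag payloads, which is why a ready-made library saves so much effort.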
I am trying to build a video system on Android. I am using the sample provided by Qualcomm, which allows me to use OpenMAX and do hardware acceleration on Qualcomm devices.
However, this sample only generates a raw .h264 file, so I am looking for a good way to do the muxing. I've used MediaMuxer before, but it requires Android 4.3 or later, so it doesn't work with this sample (the Qualcomm sample only supports Android 4.2 and earlier).
Does anyone have any ideas? Thank you!
You can use FFmpeg: build FFmpeg for Android, create a JNI wrapper, and expose the muxing functionality to the Java level.
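Whichever muxer you end up wrapping, it has to split the raw .h264 elementary stream into NAL units at Annex-B start codes (00 00 01 or 00 00 00 01). A minimal sketch of that scan in Java (the `AnnexB` class name is my own, for illustration):

```java
import java.util.ArrayList;
import java.util.List;

// Finds the offsets of Annex-B start codes in a raw H.264 elementary
// stream; a muxer consumes the NAL unit lying between consecutive offsets.
public class AnnexB {
    public static List<Integer> startCodeOffsets(byte[] es) {
        List<Integer> offsets = new ArrayList<>();
        for (int i = 0; i + 3 <= es.length; i++) {
            if (es[i] == 0 && es[i + 1] == 0) {
                if (es[i + 2] == 1) {
                    offsets.add(i);  // 3-byte start code 00 00 01
                } else if (i + 4 <= es.length && es[i + 2] == 0 && es[i + 3] == 1) {
                    offsets.add(i);  // 4-byte start code 00 00 00 01
                    i++;             // skip ahead so the tail isn't re-matched
                }
            }
        }
        return offsets;
    }
}
```

The JNI wrapper would then hand each NAL unit between consecutive offsets to the native muxing code, along with a timestamp derived from the encoder's frame rate.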
Scenario:
I am working on an Android project where, on one particular OpenGL page, I display videos.
FFmpeg is used to obtain frames from the videos (as OpenGL does not support video as a texture), and I use the frames to achieve a video effect.
I am using pre-compiled FFmpeg binaries in the project.
I was not aware of the legal implications of using the FFmpeg library.
My superior brought this FFmpeg legal reference to my notice.
Problem:
I am no legal expert, so the only thing I understood is that using FFmpeg in a commercial free app (free to install, but the service must be paid for) is going to get me and my company into trouble :(
The source, or any part of it, cannot be released in any way. (The client is very strict about this.)
Questions:
1) Is there any alternative to FFmpeg (under an Apache or MIT license) that I can use to obtain video frames?
2) Is extracting the video frames and looping through them the only way to play a video in OpenGL? Is there an alternative way to achieve this functionality?
IANAL, but the LGPL means that if you compile and use FFmpeg as a shared library (.so file) or as a standalone executable, then you are fine, even in a closed-source application that you sell for money.
I am writing an app which needs to decode an H.264 (AVC) bitstream. I found that AVC codec sources exist in /frameworks/base/media/libstagefright/codecs/avc; does anyone know how one can access those codecs from an Android app? I guess it's through JNI, but I'm not clear on how this can be done.
After some investigation, I think one approach is to create my own classes and JNI interfaces in the Android source tree to enable using the codecs in an Android app.
Another way, which does not require any changes to the Android source, is to include the codecs as a shared library in my application using the NDK. Any thoughts on these? Which way is better (if feasible)?
I didn't find much information about Stagefright; it would be great if anyone could point some out. I am developing on Android 2.3.3.
Any comments are highly appreciated. Thanks!
Stagefright does not support elementary H.264 decoding, but it does have an H.264 decoder component. In theory, this library could be used; in practice, it will be tough to use as a standalone library because of its dependencies.
The best approach would be to use a JNI-wrapped independent H.264 decoder (like the one available with FFmpeg).
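Whatever decoder you wrap over JNI, the Java side typically has to classify NAL units first, since SPS/PPS configuration units must reach the decoder before any coded slices. A small sketch under that assumption (the `NalUnit` class is my own illustration, not an FFmpeg API):

```java
// Extracts the nal_unit_type from the first byte after an H.264 start
// code; a decoder wrapper usually routes SPS/PPS (configuration) and IDR
// (sync-point) NAL units differently from ordinary slices.
public class NalUnit {
    public static final int SLICE = 1, IDR = 5, SPS = 7, PPS = 8;

    public static int type(byte firstByteAfterStartCode) {
        return firstByteAfterStartCode & 0x1F;  // low 5 bits hold the type
    }

    public static boolean isConfig(int nalType) {
        return nalType == SPS || nalType == PPS;
    }
}
```

For example, the common SPS header byte 0x67 yields type 7, so it would be buffered as configuration data rather than submitted as a frame.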
I'm trying to play an RTSP video stream in my Android app using the built-in VideoView/MediaPlayer, but there are always various problems on different ROMs or under different network conditions (UDP packets blocked). It's really annoying, so I want to implement my own RTSP client with the live555 sources, GLES, and FFmpeg. I can figure out how to use FFmpeg and GLES to show a video, but I'm not familiar with live555.
Are there any compiled versions of live555 for Android? Or how could I do that myself?
Thanks.
I think I found sample code on GitHub; it works for me.
Bad news: I think you won't find any precompiled versions of live555, only a config/makefile structure for several platforms, and Android is not among them.
Since live555 is a pure C++ library, you will most likely have problems using it directly in Android.
jens.
Is there a way to access the Stagefright APIs to decode a JPEG image from the application layer on Android 2.3?
No, the Stagefright APIs are not exposed at the Android application framework level. The Android MediaPlayer class abstracts the internal player frameworks such as Stagefright and OpenCore.
If you have the source code for Android, then you can use the JPEG decoder present in Stagefright (probably wrapped as an OMX component) through JNI.
No, it is not possible to use the Stagefright APIs directly from the app layer; you will have to go through the Java APIs. But yes, if you are ready to write the JNI layers and hack around a LOT of code, it is possible in principle.