I'm trying to make an Android app that sends live video from the phone camera to a Wowza media server, using Eclipse and the Android SDK. I tried to use the spydroid-ipcamera project on Google Code (https://code.google.com/p/spydroid-ipcamera/), but I couldn't figure out exactly what to change in that app to make it stream to my localhost Wowza server. The tutorial that comes with spydroid is not clear either (https://code.google.com/p/spydroid-ipcamera/issues/detail?id=2). Can you help me, please?
For all of you who are still looking for an answer: try the fyhertz libstreaming library. I think it's easy to apply and stable enough.
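For reference, a minimal sketch of how libstreaming is typically pointed at a Wowza server over RTSP; the host, port, application and stream name below are placeholders, and the exact API may differ slightly between library versions:

import android.content.Context;

import net.majorkernelpanic.streaming.Session;
import net.majorkernelpanic.streaming.SessionBuilder;
import net.majorkernelpanic.streaming.gl.SurfaceView;
import net.majorkernelpanic.streaming.rtsp.RtspClient;

public class WowzaPublisher {
    // surfaceView is libstreaming's own SurfaceView class, declared in the activity layout
    public static RtspClient publish(Context context, SurfaceView surfaceView) {
        Session session = SessionBuilder.getInstance()
                .setContext(context)
                .setSurfaceView(surfaceView)
                .setAudioEncoder(SessionBuilder.AUDIO_AAC)   // AAC audio
                .setVideoEncoder(SessionBuilder.VIDEO_H264)  // H.264 video
                .build();

        RtspClient client = new RtspClient();
        client.setSession(session);
        client.setServerAddress("192.168.1.10", 1935);  // Wowza host and RTSP port (placeholders)
        client.setStreamPath("/live/myStream");         // /application/streamName as set up in Wowza
        client.startStream();                           // ANNOUNCE + RECORD the camera stream to Wowza
        return client;
    }
}

Once publishing starts, the stream should appear in Wowza under the chosen application and stream name.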
Folks, I have an Android app that streams video to a Wowza server. Right now I am using libstreaming (https://github.com/fyhertz/libstreaming) in the Android app to livestream audio and video to Wowza.
It works fine, but I am building an open-source solution and I would like to stop using Wowza (since it is a paid product) and start using nginx-rtmp-module (https://github.com/arut/nginx-rtmp-module). The problem is that libstreaming does not support the RTMP protocol, and, as far as I have researched, I still couldn't find a good solution on the Android side for livestreaming to nginx.
Does anybody know a solution for this? Has anybody already implemented one? Thanks in advance!
You can probably use ffmpeg to convert RTP into RTMP on the server side.
e.g. Pipe UDP input to FFMPEG
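A rough sketch of such a conversion, assuming the incoming RTP session is described by an SDP file and already carries H.264/AAC so the codecs can simply be copied (the file name, host, application and stream name are placeholders):

ffmpeg -protocol_whitelist file,udp,rtp -i incoming.sdp -c copy -f flv rtmp://localhost/live/myStream

On the nginx side, the /live path corresponds to an application block defined in the rtmp{} section of nginx.conf.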
I want to integrate video broadcasting and streaming in my Android application through a Wowza server. I have tried a few different things, like this demo from JavaCV and this one from AndroidHive.
But the only problem with the first one is that it uses the FLV format to broadcast to the Wowza server. It uses FFmpegFrameRecorder to broadcast live video to the Wowza server (not VOD). To set the format of the video broadcast to the server, it uses the following method:
recorder.setFormat("flv");
The main problem with this method is that when I use other formats, it doesn't work. For example:
recorder.setFormat("mpeg"); //or something like mov, etc
The second one was appropriate and had exactly what I needed, but I am facing a weird scenario with it. It works perfectly fine (it can broadcast video with audio) for local links like rtsp://192.168.1.58:1935/live/myStream, but it fails to broadcast to remote links like rtsp://54.208.***.***:1935/live/myStream. The stream shows as playing in the Wowza server, but I cannot see video or hear audio from that link.
Please suggest a way to overcome this problem so that I can get video and audio on my end while broadcasting.
I have used this code to communicate between Wowza and Android for the video broadcasting part, and I completed it successfully. The link gives quite a good explanation of the topic and of the Wowza server configuration that needs to be done in order to make broadcasting from Android work.
Try the android-ffmpeg library. This will definitely help you!
I want to send a live video stream from my Android device to the Wowza Streaming Engine. I am using the sample in this blog, but I cannot see the result on the Test Players page.
Do I need to have a web server serving a page with a video player pointed at this video/app on Wowza?
I found this little (but very useful) library, which comes with three examples: libstreaming.
It works like a charm! It's easy to install and develop with.
The main point is to look at the Wowza logs to understand whether the stream was successfully published or not.
Then, according to the logs, you will know which application, application instance and stream name are used for publishing.
With those values you can set up any player (VLC, for example) and check whether the stream is viewable.
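For example, if the logs show the stream published under the application live with the stream name myStream (both placeholders here), the RTSP playback URL to open in VLC would look like:

rtsp://your-wowza-host:1935/live/myStream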
The accepted answer is OK. libstreaming works (kind of), but it did not meet my expectations well enough to be pushed into a production app. Since this question is quite old, I will share my up-to-date solution (Android Studio 2.1.2, Marshmallow), which uses JavaCV. I've built a boilerplate for Android, so it can be used in no time.
Here is the URL:
How to stream live video from android to Wowza via RTMP
I'm developing an Android application to stream a live TV channel in Tunisia.
They have their own server for web streaming; they use Flash Media Server and the RTMP protocol.
The problem is that some devices don't support Flash Media Player.
Can you please help me find a solution, such as converting RTMP to RTSP, or any other approach?
Thanks for the help.
Your question is very vague. If you need to install Flash on devices on which it isn't installed, then you have to do it manually via:
adb install your_path/flash.apk
As you know, Flash isn't on the app store any more. The link to download the APK is this; go to that page and download the APK.
Secondly, if you want to know more about RTMP (Real Time Messaging Protocol) and RTSP (Real Time Streaming Protocol), then this link will help you.
I need to make an application that can retrieve and send video using video streaming, but I just don't have any clue how to build it.
As I understand it, video streaming is when the user wants to watch a video from a server, like YouTube, etc. I also want the application to be able to send video to the server from the Android device.
I am using Android 2.2. If anyone is willing to give me any keywords I should search for to find such a tutorial, or any tips I should know, that would be a great help.
Thank you.
When I tried this a few months ago, Android versions below 4.0 did not support live video streaming from the phone to a server :(
You can pipeline capturing and sending chunks of video using threads, so that it feels like live streaming with very little delay :)
For stored media content, look into RTSP and related streaming protocols; I hope 2.2 supports them.
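A minimal sketch of that capture-and-upload pipeline, assuming the chunks are short files produced by MediaRecorder and posted to a hypothetical HTTP endpoint (the URL, content type and class names here are placeholders, not part of the original answer):

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class ChunkUploadPipeline {
    // Recorded chunks go into this queue; a background thread drains and uploads them,
    // so the next chunk can be recorded while the previous one is being sent.
    private final BlockingQueue<File> chunks = new LinkedBlockingQueue<File>();
    private final String uploadUrl; // placeholder, e.g. "http://your-server/upload"
    private volatile boolean running = true;

    public ChunkUploadPipeline(String uploadUrl) {
        this.uploadUrl = uploadUrl;
    }

    // Call this each time MediaRecorder finishes writing a short (e.g. 2-5 second) chunk file.
    public void enqueueChunk(File chunk) {
        chunks.offer(chunk);
    }

    public void startUploader() {
        new Thread(new Runnable() {
            @Override public void run() {
                while (running) {
                    try {
                        File chunk = chunks.take(); // blocks until a chunk is available
                        upload(chunk);
                        chunk.delete();             // free storage once the chunk is sent
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                        return;
                    } catch (IOException e) {
                        e.printStackTrace();        // in a real app: retry or re-queue the chunk
                    }
                }
            }
        }).start();
    }

    public void stop() {
        running = false;
    }

    private void upload(File chunk) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(uploadUrl).openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "video/mp4");
        conn.setFixedLengthStreamingMode((int) chunk.length());
        InputStream in = new FileInputStream(chunk);
        OutputStream out = conn.getOutputStream();
        try {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            out.flush();
        } finally {
            in.close();
            out.close();
        }
        conn.getResponseCode(); // read the response so the request actually completes
        conn.disconnect();
    }
}

The server side would then have to stitch or re-stream those chunks; on newer Android versions, libstreaming or JavaCV (mentioned in the other answers) are the more direct route to true live streaming.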