I have a plan to develop an instrument app: when you shake the Android phone, it produces an "angklung" (Google it) sound.
THE PROBLEM:
How can one Android phone share the sound it produces (via the shake gesture) with other Android phones running my application?
The connections I want to use are mobile data and Wi-Fi.
I think this person has the same problem, but I don't know how to contact him: Stream android to android. Unfortunately there is no answer there.
I need a solution/example/suggestion for this problem. So far I have succeeded in producing the "angklung" sound when the phone is shaken, but I have no idea how to start on the networking part. I've searched the internet but found no help :(
Thanks for your help.
I would suggest streaming the audio data to a server and beaming it from there to the other Android devices (those registered to your app). Since the question/issue you are asking about is way bigger than a couple of lines of code, I am pointing you to some good resources; dig into them deeply, and good luck:
Live-stream video from one android phone to another over WiFi
Stream Live Android Audio to Server
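To make the first hop concrete, here is a minimal sketch of the phone-to-server leg, assuming a hypothetical relay server that simply fans the received bytes out to the other registered phones (host, port and the relay behaviour are all made up; the server itself and the receiving side are left to you):

    // Minimal sketch: push the raw PCM bytes of the angklung sample to a
    // relay server over TCP. SERVER_HOST, SERVER_PORT and the "relay"
    // behaviour are assumptions, not an existing service.
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.Socket;

    public class AudioUploader {
        private static final String SERVER_HOST = "relay.example.com"; // hypothetical
        private static final int SERVER_PORT = 5000;                   // hypothetical

        /** Streams the PCM bytes of the sound to the relay server. */
        public void streamSample(InputStream pcmSample) throws Exception {
            try (Socket socket = new Socket(SERVER_HOST, SERVER_PORT);
                 OutputStream out = socket.getOutputStream()) {
                byte[] buffer = new byte[4096];
                int read;
                while ((read = pcmSample.read(buffer)) != -1) {
                    out.write(buffer, 0, read); // server fans these out to listeners
                }
            }
        }
    }

On Android, run this on a background thread, since network I/O on the main thread is not allowed.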
I want to stream a video file (any supported format) from an Android phone and display it directly in a desktop app built with WPF. I also want to do the same thing with live video from the camera.
For the camera, I found some solutions. One of them is the library https://github.com/Teaonly/android-eye, but I have a problem with it: there is no direct URL to stream. It serves an address like http://192.168.238.102:8080/, which opens a web page with settings and buttons, and the video display URL is http://192.168.238.102:8080/stream/live.jpg?id=58, which means I would have to read images one by one. I don't know whether this is a good streaming mechanism.
I also found this article: http://www.androidhive.info/2014/06/android-streaming-live-camera-video-to-web-page/ but it requires a server implementation on the .NET side, and we would need to buy a license.
I still have not found anything for playing the video file, and I'm also looking for a simpler way to do the job.
We have to capture real-time video with the Android camera and send it to a server, and other users would then watch it through a browser or something else.
I have Googled and searched on SO, and there are some examples of video stream apps, like:
1. Android-eye: https://github.com/Teaonly/android-eye
2. Spydroid-ipcamera: https://code.google.com/p/spydroid-ipcamera/
However, they seem to target a different setup: most of these apps start an HTTP server on the phone for stream requests, and the client then visits that page over the local network to watch the video.
So the video stream source and the server are both the same device, like this:
[first diagram: the phone serves the stream directly to clients on the local network]
But we need Internet support, like this:
[second diagram: Android streamer device → streaming server on the Internet → clients]
So I wonder if there are any alternative ideas.
I can see you have designed the three stages correctly in your second diagram.
So what you need is to determine how to choose among these protocols and how to interface them.
No one can give you a complete solution, but having completed an enterprise project on Android video streaming, I will try to point you in the right direction.
There are three parts in your picture; I'll elaborate from left to right:
1. Android Streamer Device
Based on my experience, I can say Android does well sending camera streams over RTP, thanks to native support, while converting your video to FLV gives you headaches (in many cases, e.g. if you later want to deliver the stream to other Android devices).
So I would suggest building on something like spyDroid, as sketched below.
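If you want a feel for how spyDroid gets the encoded stream out of the Android APIs, its core trick (sketched here with heavy simplification; camera setup, RTP packetization and error handling are omitted) is to point a MediaRecorder at a local socket instead of a file and read the encoded bytes from the other end:

    // Sketch of the spyDroid-style trick: MediaRecorder writes into one end
    // of a local socket pair, and your own code reads the encoded stream
    // from the other end to packetize it (e.g. into RTP).
    import android.media.MediaRecorder;
    import android.net.LocalServerSocket;
    import android.net.LocalSocket;
    import android.net.LocalSocketAddress;
    import java.io.InputStream;

    public class CameraStreamer {
        public InputStream startStreaming() throws Exception {
            // Loopback pair: the recorder writes one end, we read the other.
            LocalServerSocket server = new LocalServerSocket("camera-stream");
            LocalSocket sender = new LocalSocket();
            sender.connect(new LocalSocketAddress("camera-stream"));
            LocalSocket receiver = server.accept();

            MediaRecorder recorder = new MediaRecorder();
            recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            recorder.setOutputFile(sender.getFileDescriptor()); // socket, not a file
            recorder.prepare();
            recorder.start();

            // Note: the MPEG-4 container written this way is not directly
            // playable; spyDroid parses the raw H.264 out of it first.
            return receiver.getInputStream();
        }
    }

spyDroid does exactly this and then handles the hard part, the RTP packetization, for you.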
2. Streaming Server
There are tools like Wowza Server which can take a source stream and put it on the server's output for other clients. I guess VLC can do this too, via the File --> Stream menu, entering the RTSP video stream address from your spyDroid-based app; but I have not tried it personally.
It is also not hard to implement your own streaming server.
I'll give you an example:
For the implementation of an HLS server, you just need three things:
Video files, segmented into 10-second MPEG-2 TS chunks (i.e. .ts files).
An M3U8 playlist of the chunks.
A web server with a simple web service that delivers the playlist to the clients (PC, Android, iPhone, almost every device) over HTTP. The clients then read the playlist file and request the appropriate chunks at the appropriate times, because nearly all players have built-in HLS support. A minimal sketch of this piece follows.
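As a sketch of that third piece only, assuming the segments and the playlist were produced beforehand (e.g. with a segmenter tool) into a local hls/ directory, a bare-bones HTTP server could look like this (directory name, port and file names are made up):

    // Bare-bones HTTP server for an HLS playlist and its .ts segments,
    // using the JDK's built-in com.sun.net.httpserver. No path
    // sanitization or caching: sketch only.
    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class HlsServer {
        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/", exchange -> {
                Path file = Paths.get("hls", exchange.getRequestURI().getPath().substring(1));
                if (Files.isRegularFile(file)) {
                    String type = file.toString().endsWith(".m3u8")
                            ? "application/vnd.apple.mpegurl" : "video/MP2T";
                    exchange.getResponseHeaders().set("Content-Type", type);
                    byte[] body = Files.readAllBytes(file);
                    exchange.sendResponseHeaders(200, body.length);
                    try (OutputStream out = exchange.getResponseBody()) { out.write(body); }
                } else {
                    exchange.sendResponseHeaders(404, -1);
                }
            });
            server.start(); // players then open http://<host>:8080/playlist.m3u8
        }
    }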
3. The Client-Side
Based on our comments, I suggest you might want to dig deeper into Android Video Streaming.
To complete a project this big, you need much more research. For example, you should be able to distinguish RTP from RTSP and understand how they relate to each other.
Read my answer here to get a sense of state-of-the-art Video Streaming and please feel free to ask for more.
Hope you got the big picture of the journey ahead,
Good Luck and Have Fun
Quite a general question, but I will try to give you a direction for research:
First of all, you will need to answer several questions:
1) What is the nature and purpose of the video stream? Is it a security application, where detail in stills is vital (then you will have to use something like the MJPEG codec), or will it be viewed only in motion?
2) Are the stream source, server and clients all on the same network, so that RTSP might be used for more exact timing, or will a WAN be involved, calling for something more robust like HTTP?
3) What is the number of simultaneous output connections? In other words, is it worth paying for something like Wowza with the transcoding add-on (and maybe nDVR too) or Flussonic, or will a simple solution like ffserver suffice?
To cut a long story short: for a cheap and dirty solution for a couple of viewers, you may use something like IP Webcam -> ffserver -> VLC for Android and avoid writing your own software.
You can handle it this way:
Prepare the camera preview in the way described here. The Camera object has a setPreviewCallback method with which you register the preview callback. This callback provides a data buffer (a byte array) in YUV format that you can stream to your server, as sketched below.
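Here is a minimal sketch of that idea; the server address is hypothetical, the frames go out as raw uncompressed NV21 (a real app would compress them first), and the socket work is kept off the main thread, since Android forbids network I/O there:

    // Sketch: stream each YUV preview frame to a TCP server.
    import android.hardware.Camera;
    import java.io.OutputStream;
    import java.net.Socket;
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class PreviewStreamer implements Camera.PreviewCallback {
        // Hand frames off to a worker thread: Android forbids network I/O
        // on the main thread, where onPreviewFrame() is usually invoked.
        private final BlockingQueue<byte[]> frames = new ArrayBlockingQueue<>(4);

        public PreviewStreamer(final String host, final int port) {
            new Thread(new Runnable() {
                @Override public void run() {
                    try (Socket socket = new Socket(host, port); // hypothetical server
                         OutputStream out = socket.getOutputStream()) {
                        while (true) {
                            out.write(frames.take()); // raw NV21 frame, uncompressed
                        }
                    } catch (Exception e) {
                        // connection lost; a real app would reconnect or report
                    }
                }
            }).start();
        }

        public void start(Camera camera) {
            camera.setPreviewCallback(this);
            camera.startPreview(); // frames now arrive in onPreviewFrame()
        }

        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            frames.offer(data); // drop the frame if the network can't keep up
        }
    }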
I have a requirement where I need to detect human presence in a live video feed coming from a phone's camera.
I was wondering whether this is possible on the phone itself (on recent Android models such as the HTC Desire HD and higher, perhaps)? If so, can anyone guide me to a place (with links or such) to get an idea of how to proceed?
However, if it is not possible on the phone itself, is it possible to take a live video stream from the phone and transmit it to a server, which would, for example, process the feed using OpenCV and send an output back to the phone? Can anyone tell me whether transmitting the live video feed to a server is possible? Any guidance is appreciated as well.
Any suggestions?
Check this out
https://github.com/billmccord/OpenCV-Android
It may not be the most convenient thing to set up, but it seems like it would work.
This might be helpful too.
http://www.slideshare.net/pickerweng/opencv-220-for-android
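To give an idea of what the detection step itself could look like once OpenCV is set up, here is a hedged sketch using OpenCV's Java bindings and the pre-trained HOG pedestrian detector (the same classes exist in the Android port; decoding frames from the phone's stream into a Mat is assumed to happen elsewhere):

    // Hedged sketch of the detection step with OpenCV's built-in HOG
    // people detector. Assumes the OpenCV native library is available.
    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.core.MatOfDouble;
    import org.opencv.core.MatOfRect;
    import org.opencv.objdetect.HOGDescriptor;

    public class PresenceDetector {
        static { System.loadLibrary(Core.NATIVE_LIBRARY_NAME); }

        private final HOGDescriptor hog = new HOGDescriptor();

        public PresenceDetector() {
            // Pre-trained pedestrian detector shipped with OpenCV.
            hog.setSVMDetector(HOGDescriptor.getDefaultPeopleDetector());
        }

        /** Returns true if at least one person is found in the frame. */
        public boolean humanPresent(Mat frame) {
            MatOfRect found = new MatOfRect();
            MatOfDouble weights = new MatOfDouble();
            hog.detectMultiScale(frame, found, weights);
            return !found.empty();
        }
    }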
I can find plenty of examples of streaming video down from a server to an Android device, but I actually want to stream live images up from my droid to a server.
I know that Qik claims to do this. But as I am now reading the Wikipedia article more closely, it says that it doesn't really do live streaming, either for the iPhone or for Android.
For the iPhone, it starts uploading after the recording has finished, and for Android there is a 15-20 second delay (according to Wikipedia).
So it seems that if not even the Qik guys, who appear to have experience with live streaming, can do it, it's a very hard problem.
On the other hand, I have not tried Qik. Maybe you can install and test it, and do some traffic sniffing with Wireshark to see how they do it at the network level.