Screencast from Google Glass - android

I'm looking for an efficient way to stream what Google Glass's camera is seeing in real time. I know there are several alternatives, including MyGlass, but I'd like to build my own app since it will have several extra features.
My first approach was to send the feed to the server frame by frame and let the server side compose the video, but I'm guessing there must be a more efficient approach. Can anyone point me in the direction of a better one?

MyGlass does not deliver live video casting, nothing even close to it. I easily built an RTSP streaming app around the wonderful libstreaming library, which has recently been updated to support Google Glass. It worked smoothly (but with some lag) over WiFi at 320x240 resolution with the H.264 codec.
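For reference, here is a minimal sketch of a libstreaming setup, based on its SessionBuilder API; the resolution/bitrate values and the surfaceView variable are placeholders to adapt:

    // Minimal libstreaming (net.majorkernelpanic.streaming) session; values are illustrative.
    Session session = SessionBuilder.getInstance()
            .setContext(getApplicationContext())
            .setSurfaceView(surfaceView)                 // preview SurfaceView from your layout
            .setAudioEncoder(SessionBuilder.AUDIO_NONE)
            .setVideoEncoder(SessionBuilder.VIDEO_H264)
            .setVideoQuality(new VideoQuality(320, 240, 20, 500000)) // width, height, fps, bitrate
            .build();

    // Either start libstreaming's built-in RTSP server as a service...
    startService(new Intent(this, RtspServer.class));
    // ...or use RtspClient to push the session to a remote RTSP server instead.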

See this example of how to do this generically on Android: http://www.mattakis.com/blog/kisg/20090708/broadcasting-video-with-android-without-writing-to-the-file-system

Related

Most instant way to stream live video to iOS and Android

I'm making an app that needs to send a video feed from a single source to a server where it can be accessed by desktop browsers and mobile apps.
So far, I've been using Adobe Media Server 5 with a live RTMP stream. This gives me about a 2.5-second delay on desktop browsers, but no native support for iOS; it does leave me the option of using Air to export the app for iOS, which produces a minimum 5-6 second delay.
The iOS docs strongly recommend HTTP Live Streaming, which segments the stream into chunks and serves them using a dynamic playlist in an .m3u8 file. Doing this produces a 15+ second delay in desktop browsers and on mobile devices, and a Google search suggests that this is to be expected from HLS.
I need a maximum of 2-4 seconds of delay across all devices, if possible. I've gotten poor results with Wowza, but am open to revisiting it. FFmpeg seems inefficient, but I'm open to that as well if someone has had good results with it. Anybody have any suggestions? Thanks in advance.
I haven't even begun to find the most efficient way to stream to Android, so any help in that department would be much appreciated.
EDIT: Just to be clear, my plan is to make an iOS app, whether it's written natively or in Air. Same goes for Android, but I've yet to start on that.
In the iOS browser, HLS is the only way to serve live video. The absolute lowest latency would be to use 2-second segments with a two-segment window in the manifest. This will give you 4 seconds of latency on the client, plus another 2 to 4 on the server. There is no way to do better without writing an app.
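To illustrate, a live playlist for that configuration would look roughly like this (segment names and the sequence number are invented); the server rewrites the file every 2 seconds, dropping the oldest segment and appending the newest:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:2
    #EXT-X-MEDIA-SEQUENCE:120
    #EXTINF:2.0,
    segment-120.ts
    #EXTINF:2.0,
    segment-121.ts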
A 15-second delay for HLS streams is pretty good; to get lower latency you need to use a different streaming protocol.
RTP/RTSP will give you the lowest latency and is typically used for VoIP and video conferencing, but you will find it very difficult to use over multiple mobile and WiFi networks (some of them unintentionally block RTP).
If you can write an iOS app that supports RTMP, that is the easiest way to go, and it should work on Android too (though only old Android versions support Flash/RTMP natively, and decoding in software will result in poor battery life). There are other iOS apps that don't use HLS for streaming, but I think you need to limit yours to your own service (not a generic video player).
Also, please remember that higher latency means higher video quality, less buffering, and a better user experience, so don't reduce latency unnecessarily.

Transfer real-time video stream to server using Android

We have to capture real-time video using the Android camera and send it to a server, where other users can watch it through a browser or something else.
I have Googled and searched on SO, and there are some examples of video streaming apps, like:
1. Android-eye: https://github.com/Teaonly/android-eye
2. Spydroid-ipcamera: https://code.google.com/p/spydroid-ipcamera/
However, they seem to target a different environment: most of these apps start an HTTP server for stream requests, and clients on the local network visit a page to see the video.
In that design the video stream source and the server are the same device. But we need the stream to go over the Internet, through a separate server that relays it to remote viewers.
So I wonder if there are any alternative ideas.
I can see you have designed the three stages (streamer device, server, clients) correctly in your desired setup.
So what you need is to determine how to choose among these protocols and how to interface them.
No one can give you a complete solution, but having completed an enterprise project on Android video streaming, I will try to point you in the right direction.
There are three parts to this pipeline; I'll elaborate from left to right:
1. Android Streamer Device
Based on my experience, I can say Android does well sending camera streams over RTP, thanks to native support, while converting your video to FLV gives you headaches (in many cases, e.g. if you later want to deliver the stream to other Android devices).
So I would suggest building on something like spyDroid.
2. Streaming Server
There are tools like Wowza Server which can take a source stream and serve it out to other clients. I guess VLC can do this too, via the File --> Stream menu, by entering the RTSP address of your spyDroid-based app, but I have not tried it personally.
It is also not hard to implement your own streaming server.
I'll give you an example:
For implementation of an HLS server, you just need three things (a minimal sketch follows the list):
1. Video files segmented into 10-second MPEG-TS chunks (i.e. .ts files).
2. An M3U8 playlist of the chunks.
3. A web server with a simple web service that delivers the playlist to the clients (PC, Android, iPhone, almost any device) over HTTP. Nearly all players have built-in HLS support, so the clients simply look up the playlist file and request the appropriate chunks at the right times.
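As a minimal sketch of the third item, here is a bare-bones playlist/segment server built on the JDK's com.sun.net.httpserver; the port, directory, and file names are assumptions, and a production setup would use a real web server:

    // Serves live.m3u8 and the .ts segments from a local "segments" directory over HTTP.
    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class HlsServer {
        public static void main(String[] args) throws Exception {
            Path root = Paths.get("segments");  // holds the .m3u8 playlist and .ts chunks
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/", exchange -> {
                String name = exchange.getRequestURI().getPath().substring(1);
                Path file = root.resolve(name).normalize();
                if (!file.startsWith(root) || !Files.exists(file)) {
                    exchange.sendResponseHeaders(404, -1);  // unknown or escaped path
                    return;
                }
                String type = name.endsWith(".m3u8")
                        ? "application/vnd.apple.mpegurl" : "video/mp2t";
                byte[] body = Files.readAllBytes(file);
                exchange.getResponseHeaders().set("Content-Type", type);
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
            });
            server.start();  // players then open http://<host>:8080/live.m3u8
        }
    }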
3. The Client-Side
Based on our comments, I suggest you might want to dig deeper into Android Video Streaming.
To complete a project this big, you need to do much more research. For example, you should be able to distinguish RTP from RTSP and understand how they relate to each other.
Read my answer here to get a sense of state-of-the-art Video Streaming and please feel free to ask for more.
Hope you got the big picture of the journey ahead,
Good Luck and Have Fun
Quite a general question, but I will try to give you a direction for research:
First of all, you will need to answer several questions:
1) What is the nature and purpose of the video stream? Is it a security application, where detail in still frames is vital (then you will have to use something like an MJPEG codec), or will it be viewed only in motion?
2) Are the stream source, server, and clients all on the same network, so that RTSP might be used for tighter timing, or will a WAN be involved and something more stable like HTTP be needed?
3) How many simultaneous output connections are there? In other words, is it worth paying for something like Wowza with the transcoding add-on (and maybe nDVR too) or Flussonic, or will a simple solution like ffserver suffice?
To cut a long story short: for a cheap and dirty solution for a couple of viewers, you may use something like IP Webcam -> ffserver -> VLC for Android and avoid writing your own software.
You can handle it this way:
Prepare the camera preview as described here. The Camera object has a setPreviewCallback method with which you register the preview callback. The callback delivers each preview frame as a data buffer (byte array) in YUV format, which you can stream to your server.
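A sketch of that callback wiring with the old android.hardware.Camera API this answer refers to (sendFrameToServer is a hypothetical stand-in for your encode-and-send logic):

    // Grab NV21 preview frames for streaming; surfaceHolder comes from your preview view.
    Camera camera = Camera.open();
    Camera.Parameters params = camera.getParameters();
    params.setPreviewFormat(ImageFormat.NV21);  // NV21 is the guaranteed preview format
    camera.setParameters(params);
    try {
        camera.setPreviewDisplay(surfaceHolder); // most devices require a live preview surface
    } catch (IOException e) {
        camera.release();
        throw new RuntimeException(e);
    }
    camera.setPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera cam) {
            // 'data' holds one YUV (NV21) frame; hand it off to a worker thread.
            // Don't block here, or preview frame delivery will stall.
            sendFrameToServer(data);  // hypothetical: compress, packetize, and transmit
        }
    });
    camera.startPreview();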

Viewing MJPEG Video Streams

I know Android doesn't support MJPEG natively but are there any jar files/drivers available that can be added to a project to make it possible?
There is a View available to display MJPEG streams:
Android and MJPEG Topic
Hardly, unless it's your Android platform (i.e. you are the integrator of special-purpose devices running Android).
A good place to start looking on how the Android framework handles video streams is here:
http://opencore.net/files/opencore_framework_capabilities.pdf
If you want to cook up something entirely incompatible, I guess you could do it with the NDK: jam FFmpeg in there and, with a bit of luck (and a nightmare of supporting different Android devices), you could get it working.
What is the root problem you are trying to solve, perhaps we could work something out.
You can of course write or port software to handle any documented video format; the problem is that you won't have the same degree of hardware-optimized code as the built-in video codecs, nor as efficient low-level access to the framebuffer, so your code will likely not be able to play back at full speed. Sometimes that might be okay, if you just want to get a sense of something. Also, MJPEG compresses frames individually, so it should be trivial to write something that skips a lot of frames and only decodes whatever fraction of them it can keep up with.
I think that some people have managed to build ffmpeg or mplayer using the optional features of the cpus in some phones and get to full frame rate for some videos, but it's tricky and device-specific.
I'm probably stating the obvious here, but MJPEG consists simply of multiple JPEGs. If you can cut the individual frames out of the data, you can probably display them like any other image.
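A naive sketch of that idea, scanning the stream for the JPEG start/end markers (the URL is invented; a proper client would use the multipart boundary from the Content-Type header, and this must run off the UI thread):

    // Extract JPEG frames from an MJPEG stream by scanning for SOI (FF D8) / EOI (FF D9).
    InputStream in = new URL("http://192.168.1.10:8080/video").openStream(); // hypothetical URL
    ByteArrayOutputStream frame = new ByteArrayOutputStream();
    boolean inFrame = false;
    int prev = -1, cur;
    while ((cur = in.read()) != -1) {
        if (!inFrame && prev == 0xFF && cur == 0xD8) {  // SOI: a new JPEG begins
            inFrame = true;
            frame.reset();
            frame.write(0xFF);
        }
        if (inFrame) {
            frame.write(cur);
            if (prev == 0xFF && cur == 0xD9) {          // EOI: the JPEG is complete
                byte[] jpeg = frame.toByteArray();
                Bitmap bmp = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
                // post 'bmp' to an ImageView via a Handler or runOnUiThread
                inFrame = false;
            }
        }
        prev = cur;
    }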
I couldn't find any information on when exactly this was implemented, but as of now (tested on Android 8) you can view an MJPEG stream just fine using a WebView.
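For example, something along these lines works under that assumption (the stream URL is invented):

    // Point a WebView directly at the MJPEG URL; assumes a WebView in the layout.
    WebView webView = (WebView) findViewById(R.id.webview);
    webView.loadUrl("http://192.168.1.10:8080/video");  // hypothetical MJPEG stream URL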

Android how to video record, upload, transcode, download, play

I'm researching the development of an Android (2.2) app/service that will let users record short (I do emphasize short, < 30 seconds) videos on their phones and then upload them (over HTTP) to a server that will transcode the video to other formats. The same user can download videos from other Android users and play them.
Now, I get a bit lost with everyone's recommended approaches to all the issues involved, because I haven't seen anyone ask this in a cohesive context. Ideally I would like a non-commercial solution (as in no vendor/service being needed for the video hosting/transcoding), but feel free to include those as recommendations (I've marked this as a wiki), as I know many like to use YouTube and Vimeo for the middle layer in all this.
The questions are:
1. What server technologies do you recommend for hosting and transcoding?
2. What technology do you recommend for streaming the video? (It would be nice to offer high- and low-quality encodings depending on the user's network connection.)
3. What video format and software do you recommend for converting the uploaded video on the server so it is viewable later by other Android owners?
I'm assuming it's bad to do any transcoding on the phone prior to upload (battery/processor issues), but if I'm wrong in that assumption, what do you recommend?
Some things that may help you...
- The video will only need to render on an Android device, and in the future in a WebKit HTML5 browser.
- Bandwidth isn't cheap (even with numerous 30-second videos), so a good mix of video quality and video file size is important (streaming if needed to ensure quality vs. download).
- This is for Android 2.2 devices with a video camera, of course, and a medium- to high-density screen of 800x400 minimum.
- Open-source solutions (a server to receive the uploads, code to do the transcoding, a server to do the streaming) are preferred, but not required.
- CDNs are an option, but I don't think they really figure into the picture right now.
Check out this page to see all the video formats that Android supports for encoding and decoding.
http://developer.android.com/guide/appendix/media-formats.html
For encoding, use FFmpeg or a service like encoding.com.
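As a rough sketch of the FFmpeg route on the server side, shelling out from Java (file names and flag values are assumptions; H.264 Baseline in an MP4 container is a safe target for Android 2.2 per the media-formats page above, though older FFmpeg builds may need -strict experimental for the aac encoder):

    // Shell out to ffmpeg from a Java server process to transcode an upload.
    // Assumes ffmpeg is on the PATH and was built with libx264 and AAC support.
    ProcessBuilder pb = new ProcessBuilder(
            "ffmpeg", "-y", "-i", "upload.3gp",
            "-c:v", "libx264", "-profile:v", "baseline", "-b:v", "500k",
            "-c:a", "aac", "-b:a", "96k",
            "output.mp4");
    pb.redirectErrorStream(true);  // merge ffmpeg's stderr into stdout
    Process p = pb.start();
    try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
        String line;
        while ((line = r.readLine()) != null) {
            System.out.println(line);  // log ffmpeg progress
        }
    }
    int exit = p.waitFor();  // 0 means the transcode succeeded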

Does anyone have an example of how to stream the camera to a server on android?

I can find plenty of examples of streaming video down from a server to an Android device, but I actually want to stream live images from my Droid up to a server.
I know that Qik claims to do this. But reading the Wikipedia article more closely, it says that Qik doesn't really do live streaming on either the iPhone or Android.
On the iPhone, uploading starts only after the recording has finished, and on Android there is a 15-20 second delay (according to Wikipedia).
So it seems that if even the Qik folks, who have experience with live streaming, can't do it, it's a very hard problem.
On the other hand, I have not tried Qik. Maybe you can install and test it, and do some traffic sniffing with Wireshark to see how they do it at the network level.
