Android - Viewing video stream with ADPCM encoded audio track

I am attempting to interface with a Motorola Blink1 camera (baby monitor) which hosts an unencrypted video+audio stream; the video is MJPEG, but of particular interest is the ADPCM-encoded audio stream. The video+audio feed is made available via a public URL on the local network.
Does anyone know how one might connect to and decode such a video stream, including the audio, within an Android app (I know OpenCV etc. can do this without audio)? Or, failing that, is there any open source Java library that can do this?
As an aside, the desktop/web interface on the device uses the Java-applet-based, GNU GPL v2 Cambozola viewer here:
http://charliemouse.com/code/cambozola/index.html
which Motorola has modified to add ADPCM support, but they do not appear to have released the modified source anywhere :-/ However, it does indicate that this can be done...

Full credit to Motorola: after I wrote to their tech support requesting their modified GPL source for Cambozola, they did (after a couple of months) get their dev team to prepare the source code for their web viewer app and publish it here:
https://github.com/nikhilvs/cambozola-bms
It includes the ADPCM decoder routines integrated into it. For future reference, they prepend the JPEG stream with a short binary section containing the frame count, temperature info, and the audio packets.
Kudos to your dev team, Motorola, for releasing your code in response to a tech support query; I really wasn't expecting such a positive response.

I'm working on this right now, and I've identified most of the things needed to decode the ADPCM stream. Documenting my progress here: http://www.surfrock66.com/improving-the-motorola-blink-baby-monitorcamera/
It's using an open-source streamer (GPLv3) which they've modified; code here: https://code.google.com/p/mjpg-streamer/. I'm actually contacting Motorola to get the source now, as they are under an obligation to publish their modifications.

Related

Android: Recording and Streaming at the same time

This is not really a question so much as a presentation of all my attempts to implement one of the most challenging pieces of functionality I have faced.
I use the libstreaming library to stream real-time video to a Wowza server, and I need to record it to the SD card at the same time. I am presenting all my attempts below in order to collect new ideas from the community.
Copy bytes from the libstreaming stream to an MP4 file
Development
We created an interception point in the libstreaming library to copy all the sent bytes to an MP4 file. Libstreaming sends the bytes to the Wowza server through a LocalSocket. It uses MediaRecorder to access the device's camera and mic, and sets the recorder's output to the LocalSocket, whose input stream carries the resulting data. What we do is create a wrapper around this input stream, extending InputStream, and open a file output stream inside it. So, every time libstreaming performs a read on the LocalSocket's input stream, we copy all of the data to the file output stream, trying to build a valid MP4 file.
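Roughly, the wrapper looks like this (a simplified sketch of the idea; the class and variable names are illustrative, not the actual ones from our code):

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Wraps the LocalSocket's InputStream and tees every byte that libstreaming
// reads into a local file, in the hope of ending up with a valid MP4.
public class TeeInputStream extends InputStream {
    private final InputStream source;      // the LocalSocket's input stream
    private final FileOutputStream copy;   // the local file on the SD card

    public TeeInputStream(InputStream source, File target) throws IOException {
        this.source = source;
        this.copy = new FileOutputStream(target);
    }

    @Override
    public int read() throws IOException {
        int b = source.read();
        if (b != -1) {
            copy.write(b);
        }
        return b;
    }

    @Override
    public int read(byte[] buffer, int offset, int length) throws IOException {
        int n = source.read(buffer, offset, length);
        if (n > 0) {
            copy.write(buffer, offset, n); // copy exactly what libstreaming consumed
        }
        return n;
    }

    @Override
    public void close() throws IOException {
        copy.close();
        source.close();
    }
}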
Impediment
When we tried to read the file, it was corrupted. We realized that metadata was missing from the MP4 file, specifically the moov atom. We tried delaying the closing of the stream in order to give it time to send this header (this was still guesswork), but it didn't work. To test the coherence of the data, we used a paid piece of software to try to recover the video, including the header. It became playable, but it was mostly a green screen, so this was not a trustworthy solution. We also tried using "untrunc", a free open-source command-line program, and it couldn't even start the recovery, since there was no moov atom.
Use FFmpeg compiled for Android to access the camera
Development
FFmpeg has a Gradle plugin with a Java interface for using it inside Android apps. We thought we could access the camera via the command line (it is probably at "/dev/video0") and send it to the media server.
Impediment
We got the error "Permission Denied" when trying to access the camera. The workaround would be to root the device to gain access, but that would make the phones lose their warranty and could brick them.
Use FFmpeg compiled for Android combined with MediaRecorder
Development
We tried to make FFmpeg stream an MP4 file while it was still being recorded on the phone via MediaRecorder.
Impediment
FFmpeg cannot stream MP4 files whose recording has not yet finished.
Use FFmpeg compiled for Android with libstreaming
Development
Libstreaming uses a LocalServerSocket as the connection between the app and the server, so we thought that we could point ffmpeg at the LocalServerSocket's local address to copy the stream directly to a local file on the SD card. Right after the streaming started, we also ran the ffmpeg command to start recording the data to a file. We believed ffmpeg would create the MP4 file properly, that is, with the moov atom header included.
Impediment
The "address" created is not readable via command line, as a local address inside the phone. So the copy is not possible.
Use OpenCV
Development
OpenCV is an open-source, cross-platform library that provides building blocks for computer vision experiments and applications. It offers high-level interfaces for capturing, processing, and presenting image data. It has its own APIs to connect to the device camera, so we started studying it to see if it had the necessary functionality to stream and record at the same time.
Impediment
We found out that the library is not really designed for this, but rather for mathematical image manipulation. We were even given the recommendation to use libstreaming (which we already do).
Use Kickflip SDK
Development
Kickflip is a media streaming service that provides its own SDK for development on Android and iOS. It also uses HLS instead of RTMP, which is a newer protocol.
Impediment
Their SDK requires that we create an Activity with a camera view that occupies the entire screen of the device, breaking the usability of our app.
Use Adobe Air
Development
We started consulting other developers of apps already available on the Play Store that already stream to servers.
Impediment
When we got in touch with those developers, they confirmed that it would not be possible to record and stream at the same time using this technology. What's more, we would have to redo the entire app from scratch using Adobe AIR.
UPDATE
Webrtc
Development
We started using WebRTC, following this great project. We included the signaling server in our Node.js server and started doing the standard handshake via sockets. We were still toggling between local recording and streaming via WebRTC.
Impediment
WebRTC does not work in every network configuration. Other than that, the camera acquisition is all native code, which makes it a lot harder to try to copy or intercept the bytes.
If you are willing to part with libstreaming, there is a library which can easily stream and record to a local file at the same time.
https://github.com/pedroSG94/rtmp-rtsp-stream-client-java
Clone the project and run the sample app. For example, tap "Default RTSP," type in your endpoint, tap "Start stream," then tap "Start record." Then tap "Stop stream" and "Stop record." I've tested this with Wowza Server and it works well. The project can also be used as a library rather than a standalone app.
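The core of the sample boils down to something like the following. This is a sketch only: RtspCamera1 and the method names below are my recollection of the library's API and may differ between versions, so treat them as assumptions and check the project's README. The surface view, connection checker, endpoint URL, and file path are placeholders.

// surfaceView and connectCheckerRtsp come from your Activity/layout.
RtspCamera1 rtspCamera = new RtspCamera1(surfaceView, connectCheckerRtsp);

try {
    if (rtspCamera.prepareAudio() && rtspCamera.prepareVideo()) {
        // Stream to the server and record to a local MP4 at the same time.
        rtspCamera.startStream("rtsp://your-wowza-host:1935/live/myStream"); // placeholder URL
        rtspCamera.startRecord("/sdcard/recording.mp4");                     // placeholder path
    }
} catch (Exception e) {
    e.printStackTrace(); // startRecord may throw IOException in some versions
}

// ...when finished...
rtspCamera.stopRecord();
rtspCamera.stopStream();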
As soon as OpenCV 3.0 is available (the RC1 can be downloaded here), we could add another option to this list:
Using OpenCV's built-in Motion-JPEG encoder

How does Skype technically process video streams on older Android devices?

Whereas on Android 4.2 and up camera2 allows me to grab the raw data stream from the camera, older devices only allow me to write the encoded stream to a file.
I ran Skype on some older devices (e.g. SDK 10) and was able to make a video call, which means that Skype must somehow be able to grab the unencoded stream before it is encoded and written to file.
I found some interesting articles on the web,
http://www.hnwatcher.com/r/1170899/Video-recording-and-processing-in-Android/
http://code.google.com/p/ipcamera-for-android/
https://github.com/NanoHttpd/nanohttpd/
but I don't see how this would work reliably across all devices.
I was able to read out the encoded file while Android was still writing the video, but the problem here was that Android writes the MOOV box at the end of the file, and only once recording has stopped. So the information in the MDAT box is useless before the file is closed.
Does anybody know of a library that I could use to grab the data stream from the camera and immediately use it as live stream? Has anybody tried to find out how Skype does this technically?

Transfer real-time video stream to server using Android

We have to capture real-time video using the Android camera and send it to a server, and then other users would view it through a browser or something else.
I have Googled and searched on SO, and there are some examples of video streaming apps, like:
1. Android-eye: https://github.com/Teaonly/android-eye
2. Spydroid-ipcamera: https://code.google.com/p/spydroid-ipcamera/
However, it seems that they target a different environment: most of the apps start an HTTP server for stream requests, and then the client visits the page over the local network and sees the video.
The video stream source and the server are then both on the device, like this:
But we need Internet support, like this:
So I wonder if there are any alternative ideas.
I can see you have designed the three stages correctly, in your second diagram.
So what you need is to determine how to choose among these protocols and how to interface them.
No one can give you a complete solution, but having completed an enterprise project on Android video streaming, I will try to point you in the right direction toward your goal.
There are three parts in your picture; I'll elaborate from left to right:
1. Android Streamer Device
Based on my experience, I can say Android does well sending camera streams over RTP, due to native support, while converting your video to FLV gives you a headache (in many cases, e.g. if you later want to deliver the stream to Android devices as well).
So I would suggest building on something like spyDroid.
2. Streaming Server
There are tools like Wowza Server which can take a source stream and put it on the server's output for other clients. I guess VLC can do this too, via the File --> Stream menu, then putting in the RTSP video stream address from your spyDroid-based app. But I have not tried it personally.
Also, it is not hard to implement your own streaming server.
I'll give you an example:
For Implementation of an HLS server, you just need three things:
Video files, segmented into 10-second MPEG-2 TS chunks (i.e. .ts files).
An M3U8 playlist of the chunks.
A web server with a simple web service that delivers the playlist to the clients (PC, Android, iPhone, almost any device) over HTTP. The clients then look up the playlist file and request the appropriate chunks according to their timing, because nearly all players have built-in HLS support. A minimal sketch of such a server follows below.
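The sketch below is a toy illustration only: the directory name, port, and file names are made up, and a real deployment would sit behind a proper web server instead of the JDK's built-in HttpServer.

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Serves playlist.m3u8 and the segmented .ts files from a local "segments" directory.
public class TinyHlsServer {
    public static void main(String[] args) throws Exception {
        Path segmentDir = Paths.get("segments");
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        server.createContext("/", exchange -> {
            String name = exchange.getRequestURI().getPath().substring(1);
            if (name.isEmpty()) {
                name = "playlist.m3u8";               // default to the playlist
            }
            Path file = segmentDir.resolve(name);
            if (!Files.exists(file)) {
                exchange.sendResponseHeaders(404, -1); // no such playlist or segment
                return;
            }
            String contentType = name.endsWith(".m3u8")
                    ? "application/vnd.apple.mpegurl"  // playlist
                    : "video/mp2t";                    // MPEG-2 TS segment
            byte[] body = Files.readAllBytes(file);
            exchange.getResponseHeaders().add("Content-Type", contentType);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
    }
}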
3. The Client-Side
Based on our comments, I suggest you might want to dig deeper into Android Video Streaming.
To complete a project this big, you need much more research. For example, you should be able to distinguish RTP from RTSP and understand how they relate to each other.
Read my answer here to get a sense of state-of-the-art Video Streaming and please feel free to ask for more.
Hope you got the big picture of the journey ahead,
Good Luck and Have Fun
Quite a general question, but I will try to give you a direction for research:
First of all, you will need to answer several questions:
1) What is the nature and purpose of the video stream? Is it a security application, where detail in still frames is vital (then you will have to use something like the MJPEG codec), or will it only be viewed in motion?
2) Are the stream source, server, and clients on the same network, so that RTSP might be used for more exact timing, or will a WAN be involved, meaning something more stable like HTTP should be used?
3) How many simultaneous output connections are there? In other words, is it worth paying for something like Wowza with the transcoding add-on (and maybe nDVR too) or Flussonic, or will a simple solution like ffserver suffice?
To cut a long story short, for a cheap and dirty solution for a couple of viewers, you may use something like IP Webcam -> ffserver -> VLC for Android and avoid writing your own software.
You can handle it this way:
Prepare the camera preview in the way described here. The Camera object has a setPreviewCallback method with which you register the preview callback. This callback provides a data buffer (a byte array) in YUV format that you can stream to your server.
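A minimal sketch of that callback registration with the old android.hardware.Camera API (sendToServer and surfaceHolder are placeholders for your own transport and preview surface):

// The preview callback delivers each frame as a byte[] in NV21 (YUV) format by default.
Camera camera = Camera.open();
Camera.Parameters params = camera.getParameters();
params.setPreviewSize(640, 480);             // pick a size the device actually supports
camera.setParameters(params);
try {
    camera.setPreviewDisplay(surfaceHolder); // preview surface from your SurfaceView
} catch (IOException e) {
    e.printStackTrace();
}
camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        // "data" holds one raw NV21 frame; encode and/or stream it from here.
        sendToServer(data);                  // placeholder for your own transport
    }
});
camera.startPreview();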

Convert video Input Stream to RTMP

I want to stream video recording from my android phone to network media server.
The first problem is that when setting the MediaRecorder output to a socket, the stream is missing some mdat size headers. This can be fixed by preprocessing the stream locally and adding the missing data in order to produce a valid output stream.
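Roughly, the MediaRecorder-to-socket setup looks like this (a simplified sketch using android.net.LocalServerSocket / LocalSocket / LocalSocketAddress and android.media.MediaRecorder; the socket name is a placeholder and error handling is omitted):

// Pipe MediaRecorder's output into a LocalSocket instead of a file.
LocalServerSocket serverSocket = new LocalServerSocket("media_stream_socket");

LocalSocket sender = new LocalSocket();
sender.connect(new LocalSocketAddress("media_stream_socket"));
LocalSocket receiver = serverSocket.accept();

MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setOutputFile(sender.getFileDescriptor()); // write into the socket
recorder.prepare();
recorder.start();

// The raw MPEG-4 bytes (with the missing mdat sizes described above) can now be
// read from receiver.getInputStream(), patched, and forwarded.
InputStream raw = receiver.getInputStream();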
The question is how to proceed from there.
How can I go about outputting that stream as an RTMP stream?
First, let's unwind your question. As you've surmised, RTMP isn't currently supported by Android. You can use a few side libraries to add support, but these may not be full implementations or have other undesirable side effects and bugs that cause them to fail to meet your needs.
The common alternative in this case is to use RTSP. It provides a comparable session format that has its own RFC, and its packet structure when combined with RTP is very similar (sans some details) to your desired protocol. You could perform the necessary fixups here to transmute RTP/RTSP into RTMP, but as mentioned, such effort is currently outside the development scope of your application.
So, let's assume you would like to use RTMP (invalidating this thread) and that the above-linked library does not meet your needs.
You could, for example, follow this tutorial for recording and playback using Livu, Wowza, and Adobe Flash Player, talking with the Livu developer(s) about licensing their client. Or, you could use this client library and its full Android recorder example to build your client.
To summarize:
RTSP
This thread, using Darwin Media Server, Windows Media Services, or VLC
RTMP
This library
This thread and this tutorial, using Livu, Wowza, and Adobe Flash Player
This client library and this example recorder
Best of luck with your application. I admit that I have a less than comprehensive understanding of all of these libraries, but these appear to be the standard solutions in this space at the time of this writing.
Edit:
According to the OP, walking the RTMP library set:
This library: He couldn't make the library demos work. More importantly, RTMP functionality is incomplete.
This thread and this tutorial, using Livu, Wowza, and Adobe Flash Player: This has a long tutorial on how to consume video, but its tutorial on publication is potentially terse and insufficient.
This client library and this example recorder: The given example only covers audio publication. More work is needed to make this complete.
In short: more work is needed. Other answers, and improvements upon these examples, are what's needed here.
If you are using a web browser on the Android device, you can use WebRTC for video capture and server-side recording, e.g. with Web Call Server 4.
Thus the full path would be:
Android Chrome [WebRTC] > WCS4 > recording
So you don't need RTMP protocol here.
If you are using a standalone RTMP app, you can use any RTMP server for video recording. As far as I know, Wowza supports H.264+Speex recording.

Accessing an mpeg-4 encoded RTSP live stream (Axis server) in Android with the .amp extension

I have a Samsung Galaxy Tab (Android 2.2 Froyo, etc.) that I am developing on. I need the application to access a stream via IP, and the stream originates from an Axis 241Q. The Axis is set to use MPEG-4 encoding, and I see that Android supports .mp4 natively. The Axis server also provides an RTSP URI to access the stream from media players on the local network.
Let me lead in to this by saying that I know less than nothing about video encoding standards and containers, so I apologize if this is a "no duh" issue.
My question is, how do I get to this stream using an Android VideoView? The Cliff's Notes version of the code I would use to start up the view in my Activity's onCreate():
VideoView v = (VideoView) findViewById(R.id.feed);
Uri video = Uri.parse("rtsp://local/path/to/feed.amp");
v.setVideoURI(video);
v.start();
I've used this with some test .3gp stream URIs that I've found on the internet and it seems to work fine, but all of the test streams that I found were served over HTTP and not RTSP, so maybe I have to do a little more magic to get RTSP going; I can't imagine why that'd be, though. I do know that Android supports RTSP in URI string resources for its MediaPlayers. Then again, I know nothing about streaming video, so I may be wrong in assuming that it works the exact same way.
Regardless, when I attempt to access the Axis feed locally, the feed will not load; my assumption is that this results from use of the .amp extension instead of the ones listed in the Android docs but I have absolutely no idea. I can pass this URI to QuickTime and other such media players with positive results so I'm also assuming that the .amp file extension isn't THAT bizarre. I've had a hard time really finding out because Googling .amp with anything else, even using quotes and whatnot, yields a tedious set of results because of "amp" showing up in HTML escape characters.
The first question is, am I missing something obvious? I'm thinking not but there's a good chance that it's so.
The second question: is there a simple way to access this RTSP stream without having to brew up an insane solution on my own? Any existing and FREE libraries that are already in the wild and could make this easier on me would be a huge help. I was initially going to try out the GStreamer Java bindings, but after looking at the project page I saw that GStreamer relies on Swing, and I don't believe Swing is included in the Android Java JARs.
Can you provide the MPEG4 configuration of the stream?
The .amp extension can be replaced with .3gp on all Axis products, so try Uri.parse("rtsp://local/path/to/feed.3gp");. But the extension shouldn't make any difference in RTSP, because the media stream is determined by the SDP, not by its "extension". So the path could be media.jpg and the server would actually stream H.264 video, not a JPEG image.
If that doesn't work, try configuring your MPEG-4 stream: be sure that you check "ISMA compliant" and set the video object type to SIMPLE (not Advanced Simple). That stream can then be played on all media players that decode MPEG-4.
If you have difficulties, comment here, and I will update my answer to add new stuff.
