How does Skype technically process video streams on older Android devices?

Whereas the camera2 API on newer Android versions (5.0 and up) lets me grab the raw data stream from the camera, older devices only let me write the encoded stream to a file.
I ran Skype on some older devices (e.g. SDK 10) and was able to make a video call, which means that Skype must somehow be able to grab the unencoded stream before it is encoded and written to a file.
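One raw-frame hook that does exist even on those old API levels is the preview callback of the legacy android.hardware.Camera API (typically at lower resolution and frame rate than full recording). Whether this is what Skype actually uses is an assumption; a minimal sketch of that path, assuming a SurfaceHolder is already set up, looks roughly like this:

```java
import java.io.IOException;

import android.hardware.Camera;
import android.view.SurfaceHolder;

// Sketch only: grab raw (unencoded) NV21 preview frames via the legacy Camera API
// on pre-camera2 devices. Resolution and frame rate vary per device.
void startRawPreview(SurfaceHolder previewSurface) throws IOException {
    Camera camera = Camera.open();
    Camera.Parameters params = camera.getParameters();
    params.setPreviewSize(320, 240);           // must be a size the device supports
    camera.setParameters(params);
    camera.setPreviewDisplay(previewSurface);  // callbacks only fire while a preview runs
    camera.setPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera cam) {
            // "data" is one raw NV21 frame; an app could feed it to its own
            // encoder / network stack here instead of MediaRecorder's file output.
        }
    });
    camera.startPreview();
}
```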
I found some interesting articles on the web,
http://www.hnwatcher.com/r/1170899/Video-recording-and-processing-in-Android/
http://code.google.com/p/ipcamera-for-android/
https://github.com/NanoHttpd/nanohttpd/
but I don't see how this would work reliably across all devices.
I was able to read the encoded file while Android was still writing the video, but the problem is that Android writes the MOOV box only at the end of the file, once recording has stopped. So the information in the MDAT box is worthless before the file is closed.
Does anybody know of a library that I could use to grab the data stream from the camera and immediately use it as a live stream? Has anybody tried to find out how Skype does this technically?
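To illustrate the moov/mdat problem, here is a rough sketch that walks the top-level boxes of the partially written MP4 (the file path is a placeholder). While recording is still in progress it typically reports only ftyp and a growing mdat, with no moov to make the samples decodable:

```java
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

// Rough sketch: list the top-level boxes of an MP4 file to check whether the moov
// box (needed for playback) has been written yet. The file path is a placeholder.
public class Mp4BoxScanner {
    public static void main(String[] args) throws IOException {
        try (DataInputStream in = new DataInputStream(
                new FileInputStream("/sdcard/recording.mp4"))) {
            while (true) {
                long size = in.readInt() & 0xFFFFFFFFL;   // 32-bit box size
                byte[] type = new byte[4];
                in.readFully(type);                       // 4-character box type
                String name = new String(type, StandardCharsets.US_ASCII);
                long headerLen = 8;
                if (size == 1) {                          // 64-bit "largesize" variant
                    size = in.readLong();
                    headerLen = 16;
                }
                System.out.println(name + (size == 0 ? " (runs to end of file)" : " (" + size + " bytes)"));
                if (size == 0) break;                     // common for an in-progress mdat
                skipFully(in, size - headerLen);
            }
        } catch (EOFException eof) {
            // Reached the end of what has been written so far; if no "moov" was
            // printed above, the mdat payload is not yet decodable on its own.
        }
    }

    private static void skipFully(DataInputStream in, long n) throws IOException {
        while (n > 0) {
            long skipped = in.skip(n);
            if (skipped <= 0) throw new EOFException();
            n -= skipped;
        }
    }
}
```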

Related

Problem in using Android 10 API to capture audio playback

We are writing to ask for some help with the Android 10 API and its new functionality. We are trying to build an application that captures only the important parts of videos and audio without saving the whole thing.
To do that, our first objective is to capture playback audio on Android 10. We have tried the following two methods:
1st Method:
We tried to capture audio from YouTube using the playback capture API introduced in Android 10, but the resulting audio file contained only silence. The documentation says that when one wants to capture audio from a third-party application, one can set "allowAudioPlaybackCapture = true" in the manifest. We have already used this in our manifest, but it did not work.
We tried this method on YouTube first. It is very likely that YouTube prevents audio playback capture, so we then tried a local audio file. The result was the same as before.
2nd Method:
We tried to record internal audio with media projection, which is allowed from Android 5.0 to Android 10 (it works fine). The problem is that it captures internal audio along with external audio, i.e. microphone data. When we tried to mute the external audio capture, it also muted the internal capture.
Please see the code at the link below to get a better idea:
https://pastebin.pl/view/b2f4ec78
We would be grateful if you could give us some pointers. We have gone through all the documentation that is available, but we could not find a solution.
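For reference, a minimal sketch of how the Android 10 playback-capture API is wired up, assuming you already hold a MediaProjection from the consent dialog, a foreground service, and the RECORD_AUDIO permission. Note that android:allowAudioPlaybackCapture is declared by the app whose audio is being captured, not by the capturing app, so setting it in your own manifest cannot unlock audio from YouTube or any other third-party app that opts out:

```java
import android.media.AudioAttributes;
import android.media.AudioFormat;
import android.media.AudioPlaybackCaptureConfiguration;
import android.media.AudioRecord;
import android.media.projection.MediaProjection;

// Sketch: capture other apps' playback (not the microphone) on Android 10+.
// "mediaProjection" is assumed to come from the MediaProjection consent flow.
AudioRecord buildPlaybackCapture(MediaProjection mediaProjection) {
    AudioPlaybackCaptureConfiguration config =
            new AudioPlaybackCaptureConfiguration.Builder(mediaProjection)
                    .addMatchingUsage(AudioAttributes.USAGE_MEDIA)   // only "media" playback
                    .build();

    AudioFormat format = new AudioFormat.Builder()
            .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
            .setSampleRate(44100)
            .setChannelMask(AudioFormat.CHANNEL_IN_STEREO)
            .build();

    // setAudioPlaybackCaptureConfig() replaces setAudioSource(); the resulting capture
    // contains no microphone data, so muting the mic should not silence it.
    return new AudioRecord.Builder()
            .setAudioFormat(format)
            .setBufferSizeInBytes(8 * 1024)
            .setAudioPlaybackCaptureConfig(config)
            .build();
}
```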

How can I capture the raw audio stream on an Android device?

I am trying to investigate whether it is possible to get hold of the audio stream on an Android device (for example an Android TV) and use it for some purpose from inside my app (my app can be a system app). Let's say something is playing on the device (it could be anything that uses the audio output of the device). From my application, I would like to intercept that audio stream to analyze it, record it, etc.
Any advice is appreciated.
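Since the app can be a system app, one option worth noting is recording the device's output mix via the REMOTE_SUBMIX audio source. A rough sketch, assuming the app holds the signature/privileged CAPTURE_AUDIO_OUTPUT permission (and keeping in mind that on many devices capturing the submix reroutes audio away from the speaker):

```java
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

// Sketch for a system/privileged app: record the device's mixed audio output.
// Requires android.permission.CAPTURE_AUDIO_OUTPUT, which normal apps cannot hold.
AudioRecord buildSubmixRecorder() {
    int sampleRate = 48000;
    int channelConfig = AudioFormat.CHANNEL_IN_STEREO;
    int encoding = AudioFormat.ENCODING_PCM_16BIT;
    int bufferSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, encoding);

    // REMOTE_SUBMIX taps the output mix instead of the microphone; read() then
    // delivers PCM that can be analyzed or written to a file.
    return new AudioRecord(MediaRecorder.AudioSource.REMOTE_SUBMIX,
            sampleRate, channelConfig, encoding, bufferSize);
}
```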

Android: Recording and Streaming at the same time

This is not really a question so much as a presentation of all my attempts to solve one of the most challenging pieces of functionality I have faced.
I use the libstreaming library to stream real-time video to a Wowza server, and I need to record it to the SD card at the same time. I am presenting all my attempts below in order to collect new ideas from the community.
Copy bytes from the libstreaming stream to an MP4 file
Development
We created an interception point in the libstreaming library to copy all the sent bytes to an MP4 file. libstreaming sends the bytes to the Wowza server through a LocalSocket. It uses MediaRecorder to access the device's camera and mic and points the recorder's output at the LocalSocket, whose input stream the library then reads. What we do is create a wrapper around this input stream, extending InputStream, with a file output stream inside it. So, every time libstreaming reads from the LocalSocket's input stream, we copy all the data to the output stream, trying to create a valid MP4 file.
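A sketch of that wrapper idea (class and field names here are just illustrative): a FilterInputStream that mirrors every byte libstreaming reads from the LocalSocket into a file output stream. This reproduces the raw payload but, as described under Impediment below, not a playable MP4:

```java
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Sketch: copy every byte libstreaming reads from the LocalSocket into a second stream.
public class TeeInputStream extends FilterInputStream {
    private final OutputStream copy;

    public TeeInputStream(InputStream source, OutputStream copy) {
        super(source);
        this.copy = copy;
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b != -1) copy.write(b);
        return b;
    }

    @Override
    public int read(byte[] buffer, int offset, int length) throws IOException {
        int n = super.read(buffer, offset, length);
        if (n > 0) copy.write(buffer, offset, n);   // mirror exactly what was read
        return n;
    }

    @Override
    public void close() throws IOException {
        try {
            copy.flush();
            copy.close();
        } finally {
            super.close();
        }
    }
}
```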
Impediment
When we tried to read the file, it was corrupted. We realized that metadata was missing from the MP4 file, specifically the moov atom. We tried to delay closing the stream in order to give it time to write this header (this was still a guess), but it didn't work. To test the coherence of the data, we used a paid tool to try to recover the video, including the header. It became playable, but it was mostly a green screen, so this was not a trustworthy solution. We also tried "untrunc", a free open-source command-line program, and it couldn't even start the recovery, since there was no moov atom.
Use FFmpeg compiled for Android to access the camera
Development
FFmpeg has a Gradle plugin with a Java interface for using it inside Android apps. We thought we could access the camera from the command line (it is probably at "/dev/video0") and send it to the media server.
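For context, the attempt looked roughly like this (a sketch: the bundled ffmpeg binary path, encoder flags, and URL are placeholders, and the ffmpeg build would need v4l2 support):

```java
import java.io.IOException;

// Sketch of the attempted approach: drive an ffmpeg binary bundled with the app to
// read the camera device node directly and push the result to the media server.
void streamCameraNodeWithFfmpeg() throws IOException {
    ProcessBuilder pb = new ProcessBuilder(
            "/data/data/com.example.app/files/ffmpeg",   // path to bundled binary (assumed)
            "-f", "v4l2",                                // video4linux2 capture
            "-i", "/dev/video0",                         // camera device node
            "-c:v", "libx264", "-preset", "ultrafast",
            "-f", "flv", "rtmp://your.wowza.server/live/streamName");
    pb.redirectErrorStream(true);
    Process ffmpeg = pb.start();
    // Without root, ffmpeg's log ends with the "Permission denied" described below.
    // ffmpeg.getInputStream() can be read for that log; ffmpeg.waitFor() blocks until exit.
}
```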
Impediment
We got a "Permission denied" error when trying to access the camera. The workaround would be to root the device to gain access, but that makes the phones lose their warranty and could brick them.
Use FFmpeg compiled for Android combined with MediaRecorder
Development
We tried to make FFmpeg stream an MP4 file while it was still being recorded on the phone via MediaRecorder.
Impediment
FFmpeg cannot stream MP4 files whose recording has not yet finished.
Use FFmpeg compiled for Android with libstreaming
Development
libstreaming uses a LocalServerSocket as the connection between the app and the server, so we thought we could point FFmpeg at the LocalServerSocket's local address to copy the stream directly to a local file on the SD card. Right after the streaming started, we also ran the FFmpeg command to start recording the data to a file. We believed FFmpeg would create the MP4 file in the proper way, which means with the moov atom header included.
Impediment
The "address" created is not readable via command line, as a local address inside the phone. So the copy is not possible.
Use OpenCV
Development
OpenCV is an open-source, cross-platform library that provides building blocks for computer vision experiments and applications. It offers high-level interfaces for capturing, processing, and presenting image data. It has its own APIs to connect with the device camera, so we started studying it to see if it had the necessary functionality to stream and record at the same time.
Impediment
We found out that the library is not really designed for this, but rather for mathematical image manipulation. We even got the recommendation to use libstreaming (which we already do).
Use Kickflip SDK
Development
Kickflip is a media streaming service that provides its own SDK for Android and iOS development. It also uses HLS instead of RTMP, which is a newer protocol.
Impediment
Their SDK requires us to create an Activity with a camera view that occupies the entire screen of the device, breaking the usability of our app.
Use Adobe Air
Development
We started consulting other developers of apps already available in the Play Store that already stream to servers.
Impediment
When we got in touch with those developers, they assured us that it would not be possible to record and stream at the same time using this technology. What's more, we would have to redo the entire app from scratch using Adobe AIR.
UPDATE
Webrtc
Development
We started using WebRTC following this great project. We included the signaling server in our Node.js server and started doing the standard handshake via sockets. We were still toggling between local recording and streaming via WebRTC.
Impediment
WebRTC does not work in every network configuration. Other than that, the camera acquisition is all native code, which makes it a lot harder to copy or intercept the bytes.
If you are willing to part with libstreaming, there is a library which can easily stream and record to a local file at the same time.
https://github.com/pedroSG94/rtmp-rtsp-stream-client-java
Clone the project and run the sample app. For example, tap "Default RTSP." Type in your endpoint. Tap "Start stream", then tap "Start record." Then tap "Stop stream" and "Stop record." I've tested this with Wowza Server and it works well. The project can also be used as a library rather than a standalone app.
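To make that concrete, simultaneous streaming and recording with that library looks roughly like the following (RTMP flavor shown; the package and the ConnectCheckerRtmp callback interface have moved between library versions, so check the project's README for the exact names):

```java
import java.io.IOException;
import android.view.SurfaceView;
import com.pedro.rtplibrary.rtmp.RtmpCamera1;   // package name varies by library version

// Sketch: stream to an RTMP endpoint and record to a local MP4 at the same time.
// The URL and file path are placeholders; "checker" is the library's connection callback.
void streamAndRecord(SurfaceView surfaceView, ConnectCheckerRtmp checker) throws IOException {
    RtmpCamera1 rtmpCamera = new RtmpCamera1(surfaceView, checker);

    // Negotiate the encoders first; both calls return false if the device can't cope.
    if (rtmpCamera.prepareAudio() && rtmpCamera.prepareVideo()) {
        rtmpCamera.startStream("rtmp://your.wowza.server/live/streamName");
        rtmpCamera.startRecord("/sdcard/local-copy.mp4");
    }

    // Later, in either order:
    // rtmpCamera.stopRecord();
    // rtmpCamera.stopStream();
}
```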
As soon as OpenCV 3.0 is available (the RC1 can be downloaded here), we could add another option to this list:
Using OpenCV's built-in Motion-JPEG encoder
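If that option pans out, the relevant piece of OpenCV's Java bindings is VideoWriter with an MJPG fourcc. A rough sketch (the output path is a placeholder; frame grabbing is shown via VideoCapture for brevity, though on Android you would more likely pull Mat frames from OpenCV's camera view callback):

```java
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.videoio.VideoCapture;
import org.opencv.videoio.VideoWriter;

// Sketch: push frames through OpenCV's built-in Motion-JPEG encoder.
// Assumes the OpenCV native library has already been loaded.
void recordMjpeg() {
    VideoCapture capture = new VideoCapture(0);               // camera index 0
    VideoWriter writer = new VideoWriter("/sdcard/out.avi",
            VideoWriter.fourcc('M', 'J', 'P', 'G'),           // Motion-JPEG codec
            30.0, new Size(640, 480), true);

    Mat frame = new Mat();
    for (int i = 0; i < 300 && capture.read(frame); i++) {    // roughly 10 s at 30 fps
        writer.write(frame);
        // the same Mat could be handed to the streaming path here
    }
    writer.release();
    capture.release();
}
```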

Android - Viewing video stream with ADPCM encoded audio track

I am attempting to interface with a Motorola Blink1 camera (a baby monitor) which hosts an unencrypted video+audio stream; the video is MJPEG, but of particular interest is the ADPCM-encoded audio stream. The video+audio feed is made available via a public URL on the local network.
Does anyone know how one might connect to and decode such a video stream along with the audio (I know OpenCV etc. can do this without audio) within an Android app? Or, failing that, any open-source Java lib that can do this?
As an aside, the desktop/web interface on the device uses the Java-applet-based GNU GPL v2 Cambozola viewer here:
http://charliemouse.com/code/cambozola/index.html
which Motorola have modified to add ADPCM support but do not appear to have released the modified source anywhere :-/ However, it does indicate that this can be done...
Full credit to Motorola: after I wrote to their tech support requesting their modified GPL source for Cambozola, they did, after a couple of months, get their dev team to prepare the source code for their web viewer app and publish it here:
https://github.com/nikhilvs/cambozola-bms
It includes the ADPCM decoder routines. For future reference, they prepend the JPEG stream with a short binary header containing the frame count, temperature info, and the audio packets.
Kudos to your dev team, Motorola, for releasing your code in response to a tech support query; I really wasn't expecting such a positive response.
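For anyone who wants to understand what those routines do before digging through that repository: standard IMA ADPCM decoding (which may or may not be exactly the variant Motorola uses, and which ignores any per-block predictor headers) expands each 4-bit code back into a 16-bit PCM sample roughly like this:

```java
// Sketch: plain IMA ADPCM nibble decoder; Motorola's variant may differ in framing.
public class ImaAdpcmDecoder {
    private static final int[] INDEX_TABLE = {
        -1, -1, -1, -1, 2, 4, 6, 8,
        -1, -1, -1, -1, 2, 4, 6, 8
    };
    private static final int[] STEP_TABLE = {
        7, 8, 9, 10, 11, 12, 13, 14, 16, 17, 19, 21, 23, 25, 28, 31,
        34, 37, 41, 45, 50, 55, 60, 66, 73, 80, 88, 97, 107, 118, 130, 143,
        157, 173, 190, 209, 230, 253, 279, 307, 337, 371, 408, 449, 494, 544, 598, 658,
        724, 796, 876, 963, 1060, 1166, 1282, 1411, 1552, 1707, 1878, 2066, 2272, 2499,
        2749, 3024, 3327, 3660, 4026, 4428, 4871, 5358, 5894, 6484, 7132, 7845, 8630,
        9493, 10442, 11487, 12635, 13899, 15289, 16818, 18500, 20350, 22385, 24623,
        27086, 29794, 32767
    };

    private int predictor = 0;   // last decoded sample
    private int stepIndex = 0;   // index into STEP_TABLE

    /** Decodes one 4-bit ADPCM code into a 16-bit PCM sample. */
    public short decodeNibble(int nibble) {
        int step = STEP_TABLE[stepIndex];
        int diff = step >> 3;
        if ((nibble & 1) != 0) diff += step >> 2;
        if ((nibble & 2) != 0) diff += step >> 1;
        if ((nibble & 4) != 0) diff += step;
        if ((nibble & 8) != 0) diff = -diff;

        predictor = Math.max(-32768, Math.min(32767, predictor + diff));
        stepIndex = Math.max(0, Math.min(88, stepIndex + INDEX_TABLE[nibble & 0x0F]));
        return (short) predictor;
    }

    /** Decodes a packed buffer, low nibble first (nibble order varies between encoders). */
    public short[] decode(byte[] adpcm) {
        short[] pcm = new short[adpcm.length * 2];
        for (int i = 0; i < adpcm.length; i++) {
            pcm[2 * i]     = decodeNibble(adpcm[i] & 0x0F);
            pcm[2 * i + 1] = decodeNibble((adpcm[i] >> 4) & 0x0F);
        }
        return pcm;
    }
}
```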
I'm working on this right now, and I've identified most of the things needed to decode the ADPCM stream. Documenting my progress here: http://www.surfrock66.com/improving-the-motorola-blink-baby-monitorcamera/
It's using an open-source streamer (GPLv3) which they've modified; the code is here: https://code.google.com/p/mjpg-streamer/ . I'm actually contacting Motorola to get the source now, as they are obligated to publish their modifications.

StageVideo/NetStream on Android Downloads Entire Video Before Playing

I am (at long last) at the very end of a VOD project. It works perfectly, except on Android. Basically, on Android the video will not play until the entire file has downloaded. A media server was well out of scope, so we are just serving the videos up from AWS S3. It works fantastically on iOS: both streaming and downloading the video work exactly as you would expect. On Android, it just doesn't seem to want to play before the download finishes. It works well when using a server on the local network (I even see the occasional buffer, so I know it's not just downloading quickly), but not with anything remote.
My only guess is that it has to do with the differences in the way iOS and Android stream video. On iOS, video streams via byte-range requests: every few seconds, it times out and requests another range of bytes of the file. On Android, it only sends a single request for the entire file. Not sure how that could be fixed, however.
Does anyone have any tips or pointers? Any help would be greatly appreciated.
Happens on Android 4.4 and 4.3.
Using both a remote prod server we own and AWS S3.
AIR 3.9 with Flex 4.11
Utilizing StageVideo and NetStream
Test devices are a Nexus 5 and a Nexus 4
The issue was with the videos themselves. AIR for Android uses the standard approach to streaming, where the entire file is requested and read bit by bit (as opposed to iOS, which requests specific byte ranges repeatedly).
The problem here is that the player cannot begin playback until the video's metadata has been read. A standard H.264 encode places the metadata (the moov atom) at the very end of the file, so the video does not begin until the entire file has been downloaded.
The easiest way I have found to fix this is to re-encode the videos through HandBrake with the "Web Optimized" option selected. This ensures the metadata is located at the very beginning of the file (byte 24, I believe), so the video should begin playing almost instantly.
Explanation from Adobe
Thread that gave me the idea to use the "Web Optimized" option
