Android: Stream Camera Data and Write it to Server

I stream camera data from my Android phone to my server.
I can see the data arriving by listening on('data'). However, when I write it to a file I am not able to view it; it is probably garbage data or missing some headers, because VLC cannot play it.
My next step is to make it streamable to a browser in real time.
What am I doing wrong?
const net = require('net');
const fs = require('fs');

// Start a TCP server and dump everything the client sends into a file
net.createServer(function (socket) {
    console.log("client connected");
    const file = fs.createWriteStream("temp.mp4");
    socket.pipe(file, { end: false });
    socket.on('end', function () {
        console.log("ended");
    });
}).listen(5000);
I tested to see whether it really captured video output:
$ mediainfo temp.mp4
General
Complete name : temp.mp4
Format : H.263
Format version : H.263
File size : 126 KiB
Video
Format : H.263
Width : 0 pixels
Height : 0 pixels
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Compression mode : Lossy
And here is the Android code that sets up the MediaRecorder (assume the socket is connected; that part works fine):
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
mediaRecorder.setVideoSize(320, 240);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
// Send the recorder's output straight into the connected socket
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
mediaRecorder.setOutputFile(pfd.getFileDescriptor());
mediaRecorder.setMaxDuration(5000);    // 5 seconds
mediaRecorder.setMaxFileSize(5000000); // ~5 MB

There are a few open source projects that solve this problem, such as Spydroid (browser/VLC streaming) and Android IP Camera (browser streaming). Your implementation seems similar to Spydroid, so maybe you can adapt some of its code.
The central problem is that MediaRecorder is writing raw video frames into the socket. It needs to wait until the video is finished to write the headers, but they need to appear at the beginning of the file. Since the socket is not seekable, the headers can't be written at the correct location. The projects linked above deal with this problem by packetizing the stream into RTSP (Spydroid) or "streaming" a series of still images to the browser (Android IP Camera).
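For illustration, here is a rough sketch of the header-skipping approach Spydroid takes, assuming the recorder is set to H.264 and writes its output to the file descriptor: skip everything up to the "mdat" box, then read the NAL units that follow (on the devices Spydroid targets they are prefixed with a 4-byte length rather than Annex-B start codes). This is a simplified sketch, not Spydroid's actual code:
// in: the InputStream coming from the MediaRecorder's file descriptor
// 1. Skip the MP4 wrapper: advance until the four bytes 'm','d','a','t' pass by.
byte[] window = new byte[4];
while (!(window[0] == 'm' && window[1] == 'd' && window[2] == 'a' && window[3] == 't')) {
    System.arraycopy(window, 1, window, 0, 3);
    window[3] = (byte) in.read(); // NOTE: sketch ignores EOF (-1) handling
}
// 2. Read length-prefixed NAL units and hand each one to an RTP packetizer.
DataInputStream data = new DataInputStream(in);
while (true) {
    int naluLength = data.readInt();    // 4-byte big-endian length prefix
    byte[] nalu = new byte[naluLength];
    data.readFully(nalu);
    // packetize(nalu); // wrap in RTP and send, as Spydroid's packetizers do
}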

Related

How to decode and play MPEG-TS stream on Android (188 byte TS packet)

I am making an application similar to a DVB broadcast TV player on Android. The data I receive is a series of MPEG-TS packets; each packet may be 188 or 204 bytes. This is not HLS, and there are no m3u8 files. How can I decode these MPEG-TS input streams on Android and play them?
The received packets conform to the MPEG-TS encapsulation standard.
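For reference, a minimal sketch of parsing one TS packet header, based only on the MPEG-TS spec (not on any code from the question); for 204-byte packets the trailing 16 bytes are Reed-Solomon parity and can be dropped:
byte[] pkt = new byte[188]; // one TS packet, filled from your input stream
if (pkt[0] != 0x47) throw new IllegalStateException("lost TS sync byte");

boolean payloadUnitStart = (pkt[1] & 0x40) != 0;    // start of a PES packet/section
int pid = ((pkt[1] & 0x1F) << 8) | (pkt[2] & 0xFF); // which elementary stream this belongs to
int adaptation = (pkt[3] & 0x30) >> 4;              // 1 = payload only, 2 = adaptation only, 3 = both
int payloadOffset = 4;
if (adaptation >= 2) payloadOffset += 1 + (pkt[4] & 0xFF); // skip the adaptation field

// Demux by PID (PAT on PID 0 points to the PMT, which lists the audio/video
// PIDs), reassemble the PES packets, and feed the elementary streams to a
// decoder such as MediaCodec.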

RTSP 1080p live-streaming android client gets error (100,0)

My new surveillance camera just arrived, so I'm trying to write an app to live stream the video from it.
Since it came with basically no documentation, I installed the 'onvifer' android app which allows you to browse the camera's capabilities. This app works fine - gets the video and allows PTZ controls, etc. It reports the streaming url as:
rtsp://192.1.0.193:554/mpeg4
I tested the stream in the VLC windows client, and it's able to stream video from that URL as well. This makes me comfortable that the network is working OK.
The camera states the feed will be 1920x1080; VLC confirms this.
The basic code in my activity:
VideoView videoView = (VideoView)this.findViewById(R.id.VideoView);
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4"));
videoView.requestFocus();
videoView.start();
I've also given the app INTERNET permissions in AndroidManifest.xml, disabled authentication on the camera, and am running on a real device (not the emulator).
When I run the app, LogCat shows this immediately:
setDataSource IOException happend :
java.io.FileNotFoundException: No content provider: rtsp://192.1.0.193:554/mpeg4
at android.content.ContentResolver.openTypedAssetFileDescriptor (ContentResolver.java).
About 15 seconds later, the app shows a "Can't play this video" modal dialog box and this is added to LogCat:
MediaPlayer error (100, 0)
AudioSystem AudioFlinger server died!
MediaPlayer error (100, 0)
VideoView Error: 100,0
I've googled everything I can think of, but haven't found anything useful.
Any thoughts?
A wild guess based on your logcat and the RC=100: there is no SDP file, or no RTSP equivalent of the 'moov atom' block, required to negotiate the details of the stream / container / codec / format. You can get the AOSP code for MediaPlayer/VideoView and grep for the RC value in the source.
RTSP is gnarly to debug (note the tool links below) and is not assured to run inside a NAT'd network due to UDP issues. So, to get a better result, you may have to look into forcing your configuration to use TCP rather than UDP for the data channel. Or it could be one of many other issues, of which there are many.
If you really want to investigate, some possible tools are below:
Use a command-line cURL client to request your stream
Android - Java RTSP Session Mgmt package on Git
Protocol dumps for CLI RTSP sessions to YouTube RTSP/SDP streams
To pursue the issue, you may need to get into the weeds with debug tools that track the details of the protocol negotiation that precedes MediaPlayer actually starting to play the stream. That would include learning the RFC and the protocol details. A minimal probe is sketched below.
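As one such check, here is a minimal sketch (my own illustration, using the host and path from the question) that sends a raw RTSP DESCRIBE over a plain TCP socket and prints the response, so you can see the SDP the camera returns before MediaPlayer gets involved:
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

public class RtspProbe {
    public static void main(String[] args) throws Exception {
        try (Socket s = new Socket("192.1.0.193", 554)) {
            PrintWriter out = new PrintWriter(s.getOutputStream());
            out.print("DESCRIBE rtsp://192.1.0.193:554/mpeg4 RTSP/1.0\r\n"
                    + "CSeq: 1\r\n"
                    + "Accept: application/sdp\r\n\r\n");
            out.flush();
            BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // status line, headers, then the SDP body
            }
        }
    }
}
If the response is anything other than "RTSP/1.0 200 OK" followed by an SDP body, that points to the negotiation problem guessed at above.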
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4"));
Try your app on another phone.
You may find the problem is with the mobile device itself.
Also try this path:
rtsp://218.204.223.237:554/mobile/1/4C024DFE77DC717D/onnuvesj43xj7t26.sdp
and see whether the code has something wrong.

How to decode the H.264 video stream received from parcelfiledescriptor

I'm creating an Android application for live video streaming between two Android phones. I've already established a socket connection between the devices. I'm capturing video on one device and sending the stream to the other, but for now I just want to save it on the receiver-side device. I'm recording with MediaRecorder on one device, and to stream to the receiver I'm using a ParcelFileDescriptor object as the output.
Client side code
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
// pfd wraps the connected socket (ParcelFileDescriptor.fromSocket)
mediaRecorder.setOutputFile(pfd.getFileDescriptor());
Receiver side code
pfd = ParcelFileDescriptor.fromSocket(s);
InputStream in = new FileInputStream(pfd.getFileDescriptor());
OutputStream newDatabase = new FileOutputStream(file);

// Don't size the buffer with in.available(): on a socket stream it is often 0,
// so read() on a zero-length buffer returns 0 and the loop exits immediately.
byte[] buffer = new byte[8192];
int length;
while ((length = in.read(buffer)) > 0) {
    newDatabase.write(buffer, 0, length);
}
newDatabase.close();
The video file is created on the receiver-side device, but it does not receive any bytes. So do I have to decode the incoming stream on the receiver side, since the video stream sent was encoded while recording? And how can I decode the stream that is received? I found some solutions like MediaExtractor and MediaCodec, but will these work with live video capture? Moreover, I'm testing on Android 2.3.6 (Gingerbread).
Is it possible to decode the video stream with MediaCodec on 2.3.6, or is some other method available?
The video file is created on the receiver-side device, but it does not receive any bytes.
If I understand you right, you are getting no data from the socket. That is a separate problem which has nothing to do with the video format, decoding, or encoding.
To debug your sockets, it may be helpful to use a separate application which just dumps the received data. Once the data looks fine, you can go to the next step: decoding the video.
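A minimal sketch of such a dump tool (the port number is just an example, not from the question):
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class SocketDump {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(5000);
             Socket client = server.accept();
             InputStream in = client.getInputStream()) {
            byte[] buf = new byte[8192];
            long total = 0;
            int n;
            while ((n = in.read(buf)) > 0) {
                total += n;
                System.out.println("received " + total + " bytes so far");
            }
        }
    }
}
If this tool also receives nothing, the problem is in the sender or the connection, not in the video handling.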
The second part of the problem is the video format. You are using MP4, which is not suitable for streaming. Here is more info about the format structure. You can use MP4 to record a video into a local file and then transfer the whole file over a socket somewhere, but true realtime streaming cannot be done, because of the non-seekable nature of the socket (as described in the linked article). There is a block of metadata at the beginning of the file which acts as a "table of contents", and without it the preceding data is just junk. The problem is that you can assemble the "table of contents" only after you have all the contents; but at that moment the data has already been sent through the socket, and you cannot insert anything at its beginning.
There are a few workarounds, but these are just for your future research; I haven't used them myself.
The most intuitive way would be to switch from MP4 to MPEG-TS, a container designed for streaming. Take a look at the hidden constant in MediaRecorder.OutputFormat with value 8 (sketched below).
Another option is to pack the raw H.264 data into RTP/RTCP packets, which is again a protocol designed for streaming. Your application would then be able to stream to any device that supports the protocol (for example a PC running VLC). For further research, take a look at the Spydroid IP camera, which does exactly this.
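A sketch of the first option, passing the hidden constant's raw value; this is unofficial API, so treat it as an experiment that may throw on devices or versions that don't support it:
// Hidden MediaRecorder.OutputFormat value for MPEG-2 TS; hardcoded here
// because the constant is @hide on this Android version.
// (Audio/video sources are set beforehand, as in the question's code.)
final int OUTPUT_FORMAT_MPEG2TS = 8;
mediaRecorder.setOutputFormat(OUTPUT_FORMAT_MPEG2TS);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
mediaRecorder.setOutputFile(pfd.getFileDescriptor()); // TS needs no seekable output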

Android VideoView Buffer size

I have an Android application that plays HLS (HTTP Live Streaming) videos using VideoView.
I am using a local HTTP proxy to forward HTTP requests from the VideoView to the main HLS server, because my stream (the transport segments) is encrypted.
The flow of my application:
1. Prepare the VideoView with the local proxy URL, e.g. "http://localhost:9878/index.m3u8".
2. The VideoView sends requests for the M3U8 and TS segments to my proxy.
3. The proxy forwards the M3U8 and TS requests from the VideoView to the HLS server.
4. The proxy checks for transport-stream requests and, before responding to the VideoView, decrypts the transport stream and sends it on (a decryption sketch follows this list).
5. The VideoView plays the video stream.
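A minimal sketch of the decrypt step in item 4, assuming the segments use standard HLS AES-128-CBC encryption (key and IV taken from the playlist's #EXT-X-KEY line); the class and method names are illustrative, not from the question:
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class SegmentDecryptor {
    // Decrypt one whole TS segment before handing it to the VideoView.
    public static byte[] decryptSegment(byte[] encryptedTs, byte[] key, byte[] iv)
            throws Exception {
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE,
                new SecretKeySpec(key, "AES"),
                new IvParameterSpec(iv));
        return cipher.doFinal(encryptedTs);
    }
}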
This works properly, but sometimes I get the following error:
output buffer is smaller than decoded data size Out Length
When I get this error in logcat, my video turns to garbage (a green picture).
I usually see this issue when the video stream's bitrate increases. Is there any workaround for this?

Broadcasting Android Camera Video

What I want is to broadcast Android camera video to remote locations, so that anyone can watch the video on their mobile or on a website.
I've been successful in unicasting it to the VLC player on my PC.
I tried the red5 server, Adobe Media Server, and an ffmpeg server, but all in vain:
each of them was only able to broadcast video from a prerecorded file, not from a live stream.
Can anyone suggest what I should do?
I read (I think it was even on Stack Overflow) that you can provide the MediaRecorder with a file handle of a TCP connection. Then you can listen on that connection, read the data, packetize it, and resend it as an RTSP/RTP stream.
If I happen to find the original post, I'll reference it here.
EDIT:
The original post was: Streaming Video From Android
And the part about the file descriptor is from: http://www.mattakis.com/blog/kisg/20090708/broadcasting-video-with-android-without-writing-to-the-file-system
Just in case, I cite the corresponding example from the blog:
String hostname = "your.host.name";
int port = 1234;
Socket socket = new Socket(InetAddress.getByName(hostname), port);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
MediaRecorder recorder = new MediaRecorder(); // Additional MediaRecorder setup (output format ... etc.) omitted
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.prepare();
recorder.start();
However, this only sends the video file data over the wire. You can save it and then play it back, but as mentioned, it is not a stream yet.
UPDATE:
You do not even have to use a TCP socket for the first step. I just tripped over LocalSocket (1), which also gets you a file handle to feed the MediaRecorder. Those local sockets are "AF_LOCAL/UNIX domain stream sockets". See http://developer.android.com/reference/android/net/LocalSocket.html
I have not tried all of the above myself as of today, but will pretty soon. So maybe I can be of more help in the near future :)
(1) LocalSocket is not usable on newer Android versions for security reasons! See the update from 2015-11-25.
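On versions where LocalSocket still works, the wiring would look roughly like this (a sketch assuming a socket name of "camera-stream"; error handling omitted):
// Create a local (UNIX domain) socket pair to capture the recorder's output.
LocalServerSocket server = new LocalServerSocket("camera-stream");
LocalSocket sender = new LocalSocket();
sender.connect(new LocalSocketAddress("camera-stream"));
LocalSocket receiver = server.accept();

// MediaRecorder writes into one end ...
recorder.setOutputFile(sender.getFileDescriptor());
// ... and your packetizing code reads the bytes back from the other end.
InputStream video = receiver.getInputStream();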
UPDATE 2:
Just saw in the Android sources the OUTPUT_FORMAT_RTP_AVP. But it is hidden :( So I guess it will be available in future API versions of Android.
https://github.com/android/platform_frameworks_base/blob/master/media/java/android/media/MediaRecorder.java Line 219:
public static final int OUTPUT_FORMAT_RTP_AVP = 7;
I have not tried just tricking the @hide by providing a hardcoded 7... If anybody does, please leave a comment here!
UPDATE 2015-11-25
I just ran into libstreaming: https://github.com/fyhertz/libstreaming
I did not look into it too deeply, but it seems there is a lot to be learned about streaming from Android from this project (if not simply by using it). I read there that the LocalSocket solution is invalid for newer Android versions :( But they present an alternative: ParcelFileDescriptor (sketched below).
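A minimal sketch of that alternative, assuming the pipe approach libstreaming describes (variable names are illustrative):
// ParcelFileDescriptor.createPipe() returns {read side, write side}.
ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();

// MediaRecorder writes into the write side ...
recorder.setOutputFile(pipe[1].getFileDescriptor());

// ... and the packetizer reads the recorded bytes back from the read side.
InputStream video = new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);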
