Convert video Input Stream to RTMP - android

I want to stream video recording from my android phone to network media server.
The first problem is that when setting the MediaRecorder output to a socket, the stream is missing some mdat size headers. This can be fixed by preprocessing the stream locally and adding the missing data in order to produce a valid output stream.
The question is how to proceed from there.
How can I go about output that stream as an RTMP stream?
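For reference, the MediaRecorder-to-socket setup described in the question is usually done by handing MediaRecorder the write end of a pipe and reading the raw bytes back for local preprocessing. A minimal sketch, with the repair/forwarding step only stubbed out and all names illustrative:

import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;
import java.io.FileInputStream;
import java.io.IOException;

// Sketch: give MediaRecorder the write end of a pipe and read the produced
// bytes back, so the missing mdat/moov size fields can be patched before the
// data is forwarded anywhere. Assumes the recorder's sources, output format
// and encoders are already configured in the usual order.
public class RecorderPipe {
    public static FileInputStream startIntoPipe(MediaRecorder recorder) throws IOException {
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
        recorder.setOutputFile(pipe[1].getFileDescriptor()); // write end goes to MediaRecorder
        recorder.prepare();
        recorder.start();
        // Read end: this is the stream that needs the local preprocessing step
        // before it can be remuxed and pushed to an RTMP endpoint.
        return new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);
    }
}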

First, let's unwind your question. As you've surmised, RTMP isn't currently supported by Android. You can use a few side libraries to add support, but these may not be full implementations or have other undesirable side effects and bugs that cause them to fail to meet your needs.
The common alternative in this case is to use RTSP. It provides a comparable session format that has its own RFC, and its packet structure when combined with RTP is very similar (sans some details) to your desired protocol. You could perform the necessary fixups here to transmute RTP/RTSP into RTMP, but as mentioned, such effort is currently outside the development scope of your application.
So, let's assume you would like to use RTMP (invalidating this thread) and that the above-linked library does not meet your needs.
You could, for example, follow this tutorial for recording and playback using Livu, Wowza, and Adobe Flash Player, talking with the Livu developer(s) about licensing their client. Or, you could use this client library and its full Android recorder example to build your client.
To summarize:
RTSP
This thread, using Darwin Media Server, Windows Media Services, or VLC
RTMP
This library,
This thread and this tutorial, using Livu, Wowza, and Adobe Flash Player
This client library and this example recorder
Best of luck with your application. I admit that I have a less than comprehensive understanding of all of these libraries, but these appear to be the standard solutions in this space at the time of this writing.
Edit:
According to the OP, walking the RTMP library set:
This library: He couldn't make the library demos work. More importantly, RTMP functionality is incomplete.
This thread and this tutorial, using Livu, Wowza, and Adobe Flash Player: This has a long tutorial on how to consume video, but its tutorial on publication is potentially terse and insufficient.
This client library and this example recorder: The given example only covers audio publication. More work is needed to make this complete.
In short: more work is needed. Other answers, and improvements upon these examples, are what's needed here.

If you are using a web browser on the Android device, you can use WebRTC for video capture and server-side recording, e.g. with Web Call Server 4
Thus the full path would be:
Android Chrome [WebRTC] > WCS4 > recording
So you don't need the RTMP protocol here.
If you are using a standalone RTMP app, you can use any RTMP server for video recording. As far as I know, Wowza supports H.264+Speex recording.

Related

Android WebRTC client with pre-encoded H.264 video stream

I have a video stream source that sends bytes of H.264-encoded video. I'd like to build an application with Android's WebRTC classes, to send this video stream to a WebRTC peer. These built-in classes seem to only support raw video sources... not video already processed by a codec.
I simply need to create an offer with only one video codec/bitrate configuration. For my use case, I don't need to autoscale the bandwidth usage, nor offer any codecs other than the original H.264 stream of bytes.
Is there a way to utilize the built-in Android WebRTC classes for this? If not, is there another set of WebRTC classes? Or, must I re-implement something to create that SDP offer and do all the peer connectivity and what not?
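One common workaround for the single-codec part of this (regardless of how the frames are sourced) is to munge the SDP before sending the offer so that the m=video line lists only the H.264 payload types. A hedged, hypothetical sketch; a fuller version would also drop the a=rtpmap/a=fmtp lines of the removed codecs and handle SDPs with bare newline endings:

import java.util.ArrayList;
import java.util.List;

// Hypothetical helper (not part of any library): keep only the H.264 payload
// types on the m=video line of an SDP offer.
public final class SdpFilter {
    public static String keepOnlyH264(String sdp) {
        String[] lines = sdp.split("\r\n");
        // Collect payload types that a=rtpmap maps to H264, e.g. "a=rtpmap:96 H264/90000".
        List<String> h264Pts = new ArrayList<String>();
        for (String line : lines) {
            if (line.startsWith("a=rtpmap:") && line.contains(" H264/")) {
                h264Pts.add(line.substring("a=rtpmap:".length(), line.indexOf(' ')));
            }
        }
        StringBuilder out = new StringBuilder();
        for (String line : lines) {
            if (line.startsWith("m=video")) {
                // m=video <port> <proto> <pt> <pt> ... -> keep only the H.264 payload types.
                String[] parts = line.split(" ");
                StringBuilder m = new StringBuilder();
                m.append(parts[0]).append(' ').append(parts[1]).append(' ').append(parts[2]);
                for (int i = 3; i < parts.length; i++) {
                    if (h264Pts.contains(parts[i])) m.append(' ').append(parts[i]);
                }
                line = m.toString();
            }
            out.append(line).append("\r\n");
        }
        return out.toString();
    }
}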

Streaming and recording video on android at the same time

I am searching for a library which offers the ability to stream video from an Android device (5.1+) and record it at the same time.
I tried MediaRecorder, the usual way to record videos on Android, but with it I am not able to stream over WebRTC or RTSP because the camera is busy.
Currently I am using libstreaming. With a little modification the app can record and stream over RTSP concurrently, but this lib lacks support for the hardware codecs in MTK and SPRG chipsets.
I wonder if you can recommend a solution or another lib.
At the moment the lib works only on the Nexus 4 with a Qualcomm chipset.
After several days of research, I decided to use a combination of FFmpeg and MediaCodec.
It seems that the only way to get frames from the camera at a high rate is to use the Android MediaCodec API. But MediaCodec supports only the MP4 file format, which is not an option for me (I need TS), while FFmpeg can process/create practically any known video format.
Currently I am trying to make them work together (read ByteBuffers from MediaCodec and feed the FFmpeg recorder with them).
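For reference, a minimal sketch of the MediaCodec half of that plan: draining encoded H.264 buffers from an already-configured, running encoder. The point where each buffer would be handed to the FFmpeg/javacv recorder is only marked with a comment, since that hand-off is exactly the missing piece mentioned below:

import android.media.MediaCodec;
import java.nio.ByteBuffer;

// Sketch: pull encoded H.264 access units out of a running MediaCodec encoder.
// The encoder is assumed to be configured, started, and fed camera frames elsewhere.
public final class EncoderDrain {
    public static void drain(MediaCodec encoder) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) {
            int index = encoder.dequeueOutputBuffer(info, 10_000 /* us */);
            if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
                break; // no output available right now
            } else if (index >= 0) {
                ByteBuffer encoded = encoder.getOutputBuffer(index);
                byte[] accessUnit = new byte[info.size];
                encoded.position(info.offset);
                encoded.limit(info.offset + info.size);
                encoded.get(accessUnit);
                // TODO: hand accessUnit (plus info.presentationTimeUs) to the
                // FFmpeg/javacv recorder; that is the part tracked in javacv issue #95.
                encoder.releaseOutputBuffer(index, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
            }
        }
    }
}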
Useful links:
Grafika project: https://github.com/google/grafika
ContinuousCapture and Show + record are the most interesting parts to check
javacpp (specifically FFMpeg wrapper): https://github.com/bytedeco/javacpp
It has an example of recording and streaming.
kickflip sdk: https://github.com/Kickflip/kickflip-android-sdk
A library which makes the two tools mentioned above work together, and it is also open source. Sadly, it doesn't solve my problem fully; the feature I need has been requested but not yet implemented: https://github.com/bytedeco/javacv/issues/95

How to do time-shifting of live audio stream on Android?

I'm working on a radio Android app in which I'd like to have options to rewind/fast-forward/back to live the audio stream.
It seems that it's not possible with MediaPlayer (I can't find any method to do that), so how can I do it?
The developer of the iOS version of the app is using the RadioKit SDK. Is there anything similar for Android?
I found this link that goes over some of the reasons why HTTP streaming isn't well supported on Android. You can write your own HTTP streaming client and insert it as a proxy between the MediaPlayer and the media source, but as far as I am aware that is the only way. As for trick modes, there is no real fast-forward or rewind protocol built into HTTP streaming; you simply have to request the correct byte from the server (see here for a little more info). The good news is that it should be much easier to estimate the byte to request for a given time position with audio than with video (I've seen some pretty ridiculous algorithms for video).
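As a rough illustration of the "request the correct byte" idea, here is a hedged sketch that estimates a byte offset from a target time and an assumed constant bitrate, then opens the stream at that offset with an HTTP Range request (server support for Range requests is assumed):

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Sketch: seek within a constant-bitrate HTTP audio stream by requesting a byte range.
public final class HttpSeek {
    public static InputStream openAt(String url, long seekSeconds, long bitrateBitsPerSec) throws Exception {
        long byteOffset = seekSeconds * bitrateBitsPerSec / 8; // crude CBR estimate
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestProperty("Range", "bytes=" + byteOffset + "-");
        conn.connect();
        int code = conn.getResponseCode();
        // 206 Partial Content means the server honoured the range; 200 means it ignored it.
        if (code != HttpURLConnection.HTTP_PARTIAL && code != HttpURLConnection.HTTP_OK) {
            throw new IllegalStateException("Unexpected HTTP status: " + code);
        }
        return conn.getInputStream();
    }
}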

Get frame from live video stream

I am streaming live video from my camera on my android phone to my computer using the MediaRecorder class.
recorder.setCamera(mCamera);                                      // camera already opened and unlocked for MediaRecorder
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);        // capture from the camera
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);   // 3GP (MP4-family) container
recorder.setOutputFile(uav_UDP_Client.pfd.getFileDescriptor());   // write to a socket's file descriptor instead of a file
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);        // encode video as H.264
That's the basic idea. So I would like to show this stream in real time. My plan is to use FFmpeg to turn the latest frame into a .bmp and show the .bmp in my C# program every time there is a new frame.
The problem is that there is no header until I stop the recording, so I cannot use FFmpeg until the header exists.
Any ideas on how I can do this easily?
You can consider streaming an MPEG-2 TS and playing it back on your screen, or you can stream H.264 data over RTP and use a client to decode and display it.
In Android, there is a sample executable which performs RTP packetization of an H.264 stream and sends it over the network. You can find more details about MyTransmitter in this file, which could serve as a good reference for your solution.
Additional Information
From the Android 4.2.0 release onwards, the framework supports a similar feature called Miracast (Wi-Fi Display), standardized by the Wi-Fi Alliance, although that is a slightly more complex use case.

Streaming video from Android camera to server [closed]

I've seen plenty of info about how to stream video from a server to an Android device, but not much about the other way around, à la Qik. Could someone point me in the right direction here, or give me some advice on how to approach this?
I have hosted an open-source project that turns an Android phone into an IP camera:
http://code.google.com/p/ipcamera-for-android
Raw video data is fetched from a LocalSocket, and the MDAT/MOOV boxes of the MP4 are checked first before streaming. The live video is packed into FLV format and can be played in a Flash video player via the built-in web server :)
It took me some time, but I finally managed to make an app that does just that. Check out the Google Code page if you're interested: http://code.google.com/p/spydroid-ipcamera/
I added loads of comments in my code (mainly, look at CameraStreamer.java), so it should be pretty self-explanatory.
The hard part was actually understanding RFC 3984 and implementing a proper algorithm for the packetization process. (This algorithm turns the MPEG-4/H.264 stream produced by the MediaRecorder into a proper RTP stream, according to the RFC.)
Bye
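To give a flavour of what that packetization involves, here is a hedged sketch of wrapping one small H.264 NAL unit into a single-NAL-unit RTP packet (RFC 3550 header, RFC 3984 payload). Fragmentation (FU-A) for NAL units larger than the MTU is omitted, and the payload type and SSRC are arbitrary placeholders that would have to match the session description:

import java.nio.ByteBuffer;

// Sketch: one RTP packet carrying a single H.264 NAL unit (no FU-A fragmentation).
public final class RtpH264 {
    private static final int PAYLOAD_TYPE = 96;   // dynamic PT, must match the SDP
    private static final int SSRC = 0x12345678;   // arbitrary for this sketch

    public static byte[] singleNalPacket(byte[] nal, int seq, long timestamp90kHz, boolean lastOfFrame) {
        ByteBuffer buf = ByteBuffer.allocate(12 + nal.length);
        buf.put((byte) 0x80);                                          // V=2, P=0, X=0, CC=0
        buf.put((byte) ((lastOfFrame ? 0x80 : 0x00) | PAYLOAD_TYPE));  // marker bit + payload type
        buf.putShort((short) seq);                                     // sequence number
        buf.putInt((int) timestamp90kHz);                              // 90 kHz media timestamp
        buf.putInt(SSRC);                                              // synchronization source
        buf.put(nal);                                                  // NAL unit, without the Annex B start code
        return buf.array();
    }
}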
I'm looking into this as well, and while I don't have a good solution for you I did manage to dig up SIPDroid's video code:
http://code.google.com/p/sipdroid/source/browse/trunk/src/org/sipdroid/sipua/ui/VideoCamera.java
I've built an open-source SDK called Kickflip to make streaming video from Android a painless experience.
The SDK demonstrates use of Android 4.3's MediaCodec API to direct the device hardware encoder's packets directly to FFmpeg for RTMP (with librtmp) or HLS streaming of H.264 / AAC. It also demonstrates realtime OpenGL Effects (titling, chroma key, fades) and background recording.
Thanks SO, and especially, fadden.
Here is a complete article about streaming Android camera video to a web page.
Android Streaming Live Camera Video to Web Page
libstreaming is used in the Android app
On the server side, Wowza Media Engine is used to decode the video stream
Finally, JW Player is used to play the video on a web page.
I am able to send live camera video from the mobile device to my server using this link.
Refer to the link above; there is a sample application there. You just need to set your service URL in RecordActivity.class.
Example:
ffmpeg_link = "rtmp://yourserveripaddress:1935/live/venkat";
We can send H.263 and H.264 video using that link.
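For orientation, a hedged sketch of how a javacv FFmpegFrameRecorder is typically pointed at an RTMP URL like the one above; the resolution, frame rate and codec choices are assumptions, and the import paths differ between javacv versions:

import org.bytedeco.javacpp.avcodec;              // package path varies across javacv versions
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;

// Sketch: push frames to an RTMP endpoint using javacv's FFmpegFrameRecorder.
public class RtmpRecorder {
    private final FFmpegFrameRecorder recorder;

    public RtmpRecorder(String ffmpegLink) {
        recorder = new FFmpegFrameRecorder(ffmpegLink, 640, 480, 1); // width, height, audio channels
        recorder.setFormat("flv");                   // RTMP carries an FLV stream
        recorder.setFrameRate(30);
        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
    }

    public void start() throws FFmpegFrameRecorder.Exception {
        recorder.start();
    }

    public void push(Frame frame, long timestampMicros) throws FFmpegFrameRecorder.Exception {
        recorder.setTimestamp(timestampMicros);      // keep timestamps monotonic for live streaming
        recorder.record(frame);
    }

    public void stop() throws FFmpegFrameRecorder.Exception {
        recorder.stop();
        recorder.release();
    }
}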
Check out the Yasea library.
Yasea is an Android streaming client. It encodes YUV and PCM data from the camera and microphone to H.264/AAC, encapsulates it in FLV, and transmits it over RTMP.
Features:
Android min API 16.
H.264/AAC hard encoding.
H.264 soft encoding.
RTMP streaming with state callback handler.
Portrait and landscape dynamic orientation.
Front and back cameras hot switch.
Recording to MP4 while streaming.
Mux (my company) has an open-source Android app that streams RTMP to a server, including setting up the camera and user interactions. It's built to stream to Mux's live streaming API but can easily stream to any RTMP entry point.
Depending on your budget, you can use a Raspberry Pi camera that can send images to a server. I am adding two tutorials here where you can find many more details:
This tutorial shows you how to use a Raspberry Pi camera and display images on an Android device
This is the second tutorial, where you can find a series of tutorials about real-time video streaming between the camera and an Android device
