How to receive a video stream from a source in my Android app - android

I have a source that transmits video in H264 format (in this case it is Colasoft Packet Player, which transmits a video stream through IP:PORT), and I want to be able to listen for and receive the video stream in my Android app.
I have read a lot across the internet about Socket, DatagramSocket and AudioManager, but I'm totally confused about what exactly I need and how to implement it.
What I need is to be able to capture the video frame by frame.
I would love to get some help.

You can use the VLC Android library.
And here is an explanation of how to embed it into your app.

You can let ffmpeg do this job.
Have a look at:
https://github.com/alphons/SegmentationStreaming
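
If the packet player is simply replaying raw UDP datagrams at IP:PORT, the receiving side can be as small as a DatagramSocket loop. A minimal sketch (the port number and buffer size are assumptions, and the received bytes still need an H.264 decoder such as MediaCodec or the libraries above):

import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class UdpH264Receiver {
    // Port 5000 is an assumption -- use whatever the packet player targets.
    public static void receiveLoop() throws Exception {
        DatagramSocket socket = new DatagramSocket(5000);
        byte[] buffer = new byte[65536]; // large enough for any UDP datagram
        while (true) {
            DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
            socket.receive(packet); // blocks until a datagram arrives
            // On Android, run this on a background thread, never the UI thread.
            // If the stream is RTP-wrapped, strip the 12-byte RTP header first.
            handlePayload(packet.getData(), packet.getLength());
        }
    }

    private static void handlePayload(byte[] data, int length) {
        // Hand the H.264 payload to a decoder (e.g. MediaCodec) here.
    }
}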

Related

add text message to wowza live stream video broadcasting in android

I am using the Wowza GoCoder SDK and Wowza Streaming Engine to broadcast live video. Now I want to enable users to comment on the broadcaster's video. I don't understand the procedure or where I should start. How can I synchronize the text messages with the video? Do I need to implement it separately, or does Wowza provide something to live-stream the text as well? I would be grateful for any guidance, because I have searched everywhere possible but no one gives a precise answer or solution.

display video from h264 stream of an ip camera in android

In my Android application, I need to display H264 streams from a GrandStream IP camera. I saw some topics about decoding H264 frames with MediaCodec in Android, but I really don't know where to start.
Before researching this topic, I thought there would be plenty of open source libraries for this purpose, but it seems there are not!
Can you show me where to start? Should I use Android's MediaCodec, or is there an open source Java library for that?
You can refer to this site; it has a very thorough discussion and sample of Android's MediaCodec.
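
For reference, a minimal sketch of the MediaCodec decode loop such discussions revolve around; the resolution is an assumption, nextAccessUnit() is a placeholder for your own network read, and real streams also need SPS/PPS handling:

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.nio.ByteBuffer;

public class H264SurfaceDecoder {
    // Decodes H.264 access units onto a Surface (uses the API 21+ buffer calls).
    public void decodeLoop(Surface surface) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
        decoder.configure(format, surface, null, 0);
        decoder.start();
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (!Thread.interrupted()) {
            int inIndex = decoder.dequeueInputBuffer(10_000); // 10 ms timeout
            if (inIndex >= 0) {
                byte[] au = nextAccessUnit(); // one NAL unit / frame from the camera
                ByteBuffer inBuf = decoder.getInputBuffer(inIndex);
                inBuf.clear();
                inBuf.put(au);
                decoder.queueInputBuffer(inIndex, 0, au.length, 0, 0);
            }
            int outIndex = decoder.dequeueOutputBuffer(info, 10_000);
            if (outIndex >= 0) {
                decoder.releaseOutputBuffer(outIndex, true); // true = render to the surface
            }
        }
        decoder.stop();
        decoder.release();
    }

    // Placeholder: read one complete H.264 access unit from the network.
    private byte[] nextAccessUnit() {
        throw new UnsupportedOperationException("network read not shown");
    }
}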

Get frame from live video stream

I am streaming live video from my camera on my android phone to my computer using the MediaRecorder class.
recorder.setCamera(mCamera); // android.hardware.Camera instance
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); // MP4-style container: MOOV header written only at stop
recorder.setOutputFile(uav_UDP_Client.pfd.getFileDescriptor()); // write into a socket's file descriptor
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
That's the basic idea. So I would like to show this stream in real time. My plan is to use FFmpeg to turn the latest frame into a .bmp and show the .bmp in my C# program every time there is a new frame.
The problem is that there is no header until I stop the recording, so I cannot use FFmpeg until there is a header. I've looked at spydroid and at using RTP, but I do not want to use those methods for various reasons.
Any ideas on how I can do this easily?
You can consider streaming an MPEG-2 TS and playing it back on your screen, or you can stream the H.264 data over RTP and use a client to decode and display it.
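As a side note, if you can target API 26 or newer, MediaRecorder has since gained a streamable MPEG-2 TS output format, which sidesteps the missing-header problem entirely. A minimal sketch of the change, reusing the question's own setup code:

// MPEG-2 TS (API 26+) has no file-level MOOV header, so the output is
// parsable from any point in the stream, unlike 3GP/MP4.
recorder.setCamera(mCamera);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_2_TS);
recorder.setOutputFile(uav_UDP_Client.pfd.getFileDescriptor());
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);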
In Android, there is a sample executable which performs RTP packetization of an H.264 stream and streams it over the network. You can find more details about MyTransmitter in this file, which could serve as a good reference for your solution.
Additional information:
From Android 4.2 onwards, the framework supports a similar feature called Miracast or Wi-Fi Display, standardized by the Wi-Fi Alliance, though that is a slightly more complex use case.

Best way to stream audio and video from an Android phone to a RTMP server

I'm looking for the best way to stream video and audio from an Android phone to a server using RTMP. I've spent some time on this problem, and so far I've only been able to stream video, using FFmpeg. There are many ways to build the lib for Android, and I guess that with some more work I should be able to stream audio too.
The thing is that I need to encode the video in H264 and the audio in AAC, and it would be very easy to do with Android's MediaRecorder. So I started looking for an RTMP library for Android and found this Red5 port that seems to work pretty well. Using it I can stream a video file stored on the phone very easily, and the audio works too.
So my question is the following: is there a way to connect Android's MediaRecorder output to the RTMP library? I guess the way would be to 'fake' a file in setOutputFile() and then send the data to the RTMP encoding methods, but I can't figure out a way to do it.
Any clue is welcome, really. Thanks in advance.
You are able to write the data to a FileDescriptor instead of a file with setOutputFile():
FileDescriptor fd = mLocalSender.getFileDescriptor();
mRecorder.setOutputFile(fd);
So you can stream the data to a socket. Getting the data back out of that socket looks like this:
// A pair of connected local sockets: MediaRecorder writes into
// mLocalSender's file descriptor, and mLocalReceiver's InputStream
// yields the recorded bytes.
mLocalServer = new LocalServerSocket(LOCAL_SOCKET_ADDRESS);
mLocalReceiver = new LocalSocket();
mLocalReceiver.connect(new LocalSocketAddress(LOCAL_SOCKET_ADDRESS));
mLocalSender = mLocalServer.accept();
if (mLocalReceiver != null)
{
    mInputStream = mLocalReceiver.getInputStream();
}
Note that mLocalSender is only available after accept() returns, so the socket setup above has to run before the setOutputFile() call shown first.
I don't know how your RTMP library works, but I think it should be possible to pass the input stream to some of the library's methods.
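For instance, a minimal pump loop; publishVideoData() here is a hypothetical stand-in for whatever write/publish call your RTMP library actually provides:

// Drain MediaRecorder's output from the local socket and hand it to
// the RTMP library chunk by chunk (run this on a background thread).
byte[] buffer = new byte[4096];
int read;
while ((read = mInputStream.read(buffer)) != -1) {
    rtmpClient.publishVideoData(buffer, read); // hypothetical library call
}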
For further information, you can also check out the spydroid application. It includes a lot of useful stuff for video streaming over RTP without any special streaming libraries.
Maybe using the Red5 Media Server can help you. There are working examples that come with it which you can use for streaming purposes.

Streaming video from Android camera to server [closed]

I've seen plenty of info about how to stream video from a server to an Android device, but not much about the other way, à la Qik. Could someone point me in the right direction here, or give me some advice on how to approach this?
I have hosted an open-source project that turns an Android phone into an IP camera:
http://code.google.com/p/ipcamera-for-android
Raw video data is fetched from a LocalSocket, and the MDAT and MOOV atoms of the MP4 are checked first before streaming. The live video is packed in FLV format and can be played in a Flash video player via the built-in web server :)
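For the curious, checking for the MDAT/MOOV atoms amounts to walking the MP4 box structure: each top-level box starts with a 4-byte big-endian size followed by a 4-byte type. A rough sketch (64-bit "largesize" boxes and robust skipping are ignored for brevity):

import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

public class Mp4BoxScanner {
    // Prints the top-level MP4 boxes in the order they appear in the stream.
    public static void scanBoxes(InputStream in) throws IOException {
        DataInputStream dis = new DataInputStream(in);
        try {
            while (true) {
                long size = dis.readInt() & 0xFFFFFFFFL; // box size, incl. 8-byte header
                byte[] type = new byte[4];
                dis.readFully(type); // box type, e.g. "ftyp", "moov", "mdat"
                System.out.println("box: " + new String(type, "US-ASCII") + " size: " + size);
                // A streamer can only start once "moov" has been seen (or after
                // rewriting the file so MOOV precedes MDAT).
                dis.skipBytes((int) (size - 8)); // jump to the next top-level box
            }
        } catch (EOFException endOfStream) {
            // done
        }
    }
}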
It took me some time, but I finally managed to make an app that does just that. Check out the Google Code page if you're interested: http://code.google.com/p/spydroid-ipcamera/
I added loads of comments in my code (mainly, look at CameraStreamer.java), so it should be pretty self-explanatory.
The hard part was actually understanding RFC 3984 and implementing a proper algorithm for the packetization process. (This algorithm turns the MPEG-4/H.264 stream produced by the MediaRecorder into a nice RTP stream, according to the RFC.)
Bye
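
To give an idea of what that RFC 3984 packetization involves: a NAL unit larger than the network MTU is split into FU-A fragments, each prefixed with an FU indicator and an FU header derived from the original NAL header. A simplified sketch (RTP headers, sequence numbers and timestamps are omitted):

import java.util.ArrayList;
import java.util.List;

public class FuaPacketizer {
    // Splits one H.264 NAL unit into FU-A payloads (RFC 3984, section 5.8).
    public static List<byte[]> fragmentNal(byte[] nal, int maxPayload) {
        List<byte[]> fragments = new ArrayList<>();
        int nri = nal[0] & 0x60;              // keep the NRI bits of the NAL header
        int type = nal[0] & 0x1F;             // original NAL unit type
        byte fuIndicator = (byte) (nri | 28); // 28 = FU-A
        int offset = 1;                       // the original NAL header byte is not repeated
        while (offset < nal.length) {
            int chunk = Math.min(maxPayload - 2, nal.length - offset);
            byte fuHeader = (byte) type;
            if (offset == 1) fuHeader |= 0x80;                  // S (start) bit
            if (offset + chunk == nal.length) fuHeader |= 0x40; // E (end) bit
            byte[] fragment = new byte[chunk + 2];
            fragment[0] = fuIndicator;
            fragment[1] = fuHeader;
            System.arraycopy(nal, offset, fragment, 2, chunk);
            fragments.add(fragment);
            offset += chunk;
        }
        return fragments;
    }
}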
I'm looking into this as well, and while I don't have a good solution for you, I did manage to dig up SIPDroid's video code:
http://code.google.com/p/sipdroid/source/browse/trunk/src/org/sipdroid/sipua/ui/VideoCamera.java
I've built an open-source SDK called Kickflip to make streaming video from Android a painless experience.
The SDK demonstrates use of Android 4.3's MediaCodec API to direct the device hardware encoder's packets directly to FFmpeg for RTMP (with librtmp) or HLS streaming of H.264 / AAC. It also demonstrates real-time OpenGL effects (titling, chroma key, fades) and background recording.
Thanks SO, and especially, fadden.
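
As a rough illustration of the MediaCodec side of such a pipeline (API 18+; the resolution and bitrate are assumptions, and the output-draining loop is omitted):

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

public class HardwareH264Encoder {
    private MediaCodec encoder;
    private Surface inputSurface;

    // Configure the hardware H.264 encoder with a Surface input. Frames
    // rendered onto inputSurface (from the camera or OpenGL) come out as
    // encoded H.264 buffers, ready for a muxer or FFmpeg/librtmp.
    public void start() throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000); // assumed bitrate
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1); // keyframe every second
        encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        inputSurface = encoder.createInputSurface(); // must come between configure() and start()
        encoder.start();
        // ...then drain the output buffers on another thread and pass them on.
    }
}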
Here is a complete article about streaming Android camera video to a web page:
Android Streaming Live Camera Video to Web Page
libstreaming is used in the Android app.
On the server side, Wowza Media Engine is used to decode the video stream.
Finally, JW Player is used to play the video on the web page.
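
For orientation, the libstreaming piece of that pipeline typically looks like the following, assuming the SessionBuilder API from the library's README (method names may vary between versions, and the Wowza host is a placeholder):

import net.majorkernelpanic.streaming.Session;
import net.majorkernelpanic.streaming.SessionBuilder;

// Build a session that captures the camera, encodes H.264/AAC, and
// streams it to the Wowza server over RTSP/RTP.
Session session = SessionBuilder.getInstance()
        .setContext(getApplicationContext())
        .setSurfaceView(mSurfaceView)          // preview SurfaceView from your layout
        .setAudioEncoder(SessionBuilder.AUDIO_AAC)
        .setVideoEncoder(SessionBuilder.VIDEO_H264)
        .build();
session.setDestination("wowza.example.com");   // placeholder host
session.start();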
I am able to send live camera video from mobile to my server using this link.
There is a sample application at that link; you just need to set your service URL in RecordActivity.class. For example:
ffmpeg_link = "rtmp://yourserveripaddress:1935/live/venkat";
We can send H263 and H264 video using that link.
Check out the Yasea library.
Yasea is an Android streaming client. It encodes YUV and PCM data from the camera and microphone to H.264/AAC, encapsulates it in FLV, and transmits it over RTMP.
Features:
Android min API 16.
H.264/AAC hard encoding.
H.264 soft encoding.
RTMP streaming with state callback handler.
Portrait and landscape dynamic orientation.
Front and back cameras hot switch.
Recording to MP4 while streaming.
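Typical usage follows the pattern in Yasea's README (exact class and method names may differ between releases, and the RTMP URL is a placeholder):

import net.ossrs.yasea.SrsCameraView;
import net.ossrs.yasea.SrsPublisher;

// Bind the publisher to the camera preview view, then start streaming.
SrsPublisher publisher = new SrsPublisher((SrsCameraView) findViewById(R.id.camera_view));
publisher.setPreviewResolution(640, 360);
publisher.setOutputResolution(360, 640);  // portrait output
publisher.startPublish("rtmp://yourserver/live/stream"); // placeholder entry point
publisher.startCamera();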
Mux (my company) has an open-source Android app that streams RTMP to a server, including setting up the camera and handling user interactions. It's built to stream to Mux's live streaming API, but it can easily stream to any RTMP entry point.
Depending on your budget, you can use a Raspberry Pi Camera that sends images to a server. I add two tutorials here where you can find many more details:
This tutorial shows you how to use a Raspberry Pi Camera and display images on an Android device
This is the second tutorial, where you can find a series of tutorials about real-time video streaming between a camera and an Android device
