How to use live555 to stream a local video in Android

I am a rookie at Android and have been working on making an Android device act as a streaming server. What I know so far is that live555 can act as a media server and can be ported to Android. Following various resources, I learned how to build liblive555.so with ndk-build, and I have done that.
But the question is that I don't know how to use it. I want to stream some local video files, just as I can easily stream video files on my Windows 7 PC by running live555mediaserver.exe (and accessing them with "rtsp://ip:80/mp4_file_name"). I am looking for the same (or a similar) approach on Android.
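Since live555 is a C++ library, there is no ready-made Java API once liblive555.so is built; using it from an app means writing a JNI bridge yourself. A minimal sketch of the Java side of such a bridge (the class name and native method are hypothetical, not part of live555):

    public class Live555Server {
        static {
            // assumes the shared library built with ndk-build is liblive555.so
            System.loadLibrary("live555");
        }

        // Hypothetical native entry point. The C++ side would create a
        // TaskScheduler, a UsageEnvironment, and an RTSPServer, and add a
        // ServerMediaSession per media file, mirroring what
        // live555MediaServer does on the desktop.
        public static native void startServer(String mediaDirectory, int port);
    }

Clients could then open rtsp://<device-ip>:<port>/<file-name>, just as with live555MediaServer on Windows.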

Related

streaming rtsp server (android)

I am making an Android application for streaming local video files to other Android phones.
For this purpose, I found the libstreaming library, but it is designed to stream video from the phone's camera.
How can I stream local video files using this library?
Are there any other libraries with which you can implement streaming?
I tried to use headless VLC, but headless VLC does not exist on Android.
I tried to use GStreamer (gst-launch), but MediaPlayer requires an SDP file.
ffserver is not supported on Android.

Android: Recording and Streaming at the same time

This is not really a question so much as a presentation of all my attempts to solve one of the most challenging functionalities I have faced.
I use the libstreaming library to stream real-time video to a Wowza server, and I need to record it at the same time to the SD card. I am presenting all my attempts below in order to collect new ideas from the community.
Copy bytes from the libstreaming stream to an MP4 file
Development
We created an interception in the libstreaming library to copy all the sent bytes to an MP4 file. libstreaming sends the bytes to the Wowza server through a LocalSocket. It uses MediaRecorder to access the device's camera and microphone, and sets the output file to the LocalSocket's input stream. What we do is wrap this input stream in a class extending InputStream that also holds a file output stream. So, every time libstreaming reads from the LocalSocket's input stream, we copy all the data to the output stream, trying to create a valid MP4 file.
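A minimal sketch of the wrapper described above (the class name and the tee-to-file behavior are ours; only InputStream and FileOutputStream come from the JDK):

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.InputStream;

    // Wraps the LocalSocket's input stream so that every byte libstreaming
    // reads on its way to Wowza is also copied to a local file.
    public class TeeInputStream extends InputStream {
        private final InputStream source;    // the LocalSocket's input stream
        private final FileOutputStream copy; // e.g. /sdcard/capture.mp4

        public TeeInputStream(InputStream source, FileOutputStream copy) {
            this.source = source;
            this.copy = copy;
        }

        @Override
        public int read() throws IOException {
            int b = source.read();
            if (b != -1) copy.write(b);
            return b;
        }

        @Override
        public int read(byte[] buf, int off, int len) throws IOException {
            int n = source.read(buf, off, len);
            if (n > 0) copy.write(buf, off, n);
            return n;
        }

        @Override
        public void close() throws IOException {
            copy.close();
            source.close();
        }
    }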
Impediment
When we tried to read the file, it was corrupted. We realized that metadata was missing from the MP4 file, specifically the moov atom. We tried delaying the closing of the stream in order to give it time to send this header (this was still guesswork), but it didn't work. To test the coherence of this data, we used a paid piece of software to try to recover the video, including the header. It became playable, but it was mostly a green screen, so this was not a trustworthy solution. We also tried "untrunc", a free open-source command-line program, but it couldn't even start the recovery, since there was no moov atom to work from.
Use ffmpeg compiled to android to access the camera
Development
FFMPEG has a Gradle plugin with a Java interface for use inside Android apps. We thought we could access the camera via the command line (it is probably at "/dev/video0") and send the stream to the media server.
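Roughly what we had in mind, expressed as Java driving a bundled ffmpeg binary (the binary path, device node, and endpoint are all illustrative):

    import java.io.IOException;

    public class CameraFfmpeg {
        public static Process start() throws IOException {
            return Runtime.getRuntime().exec(new String[] {
                "/data/data/com.example.app/files/ffmpeg", // bundled binary (illustrative path)
                "-f", "v4l2", "-i", "/dev/video0",         // read frames from the camera node
                "-codec:v", "libx264", "-f", "flv",
                "rtmp://wowza.example.com/live/stream"     // illustrative endpoint
            });
        }
    }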
Impediment
We got a "Permission Denied" error when trying to access the camera device. The workaround would be to root the device, but that makes phones lose their warranty and could brick them.
Use ffmpeg compiled to android combined with MediaRecorder
Development
We tried to make FFMPEG stream an MP4 file while it was still being recorded on the phone by MediaRecorder.
Impediment
FFMPEG cannot stream MP4 files whose recording has not yet finished: MediaRecorder only writes the moov atom when it finalizes the file, so until then there is nothing FFMPEG can parse.
Use ffmpeg compiled to android with libstreaming
Development
Libstreaming uses a LocalServerSocket as the connection between the app and the server, so we thought we could point ffmpeg at the LocalServerSocket's local address and copy the stream directly to a file on the SD card. Right after the streaming started, we ran the ffmpeg command to start recording the data to a file. We believed ffmpeg would create the MP4 file properly, that is, with the moov atom header included.
Impediment
The "address" created is not readable via command line, as a local address inside the phone. So the copy is not possible.
Use OpenCV
Development
OpenCV is an open-source, cross-platform library that provides building blocks for computer vision experiments and applications. It offers high-level interfaces for capturing, processing, and presenting image data, and it has its own APIs for connecting to the device camera, so we started studying it to see whether it had the necessary functionality to stream and record at the same time.
Impediment
We found out that the library is not really designed for this; it is geared toward mathematical image manipulation. We were even given the recommendation to use libstreaming (which we already do).
Use Kickflip SDK
Development
Kickflip is a media streaming service that provides its own SDK for development on Android and iOS. It also uses HLS instead of RTMP, which is a newer protocol.
Impediment
Their SDK requires that we create an Activity with a camera view occupying the entire screen of the device, which breaks the usability of our app.
Use Adobe Air
Development
We started consulting other developers of apps already available in the Play Store that stream to servers.
Impediment
Getting in touch with those developers, they assured us that it would not be possible to record and stream at the same time using this technology. What's more, we would have to redo the entire app from scratch in Adobe Air.
UPDATE
WebRTC
Development
We started using WebRTC, following this great project. We included the signaling server in our Node.js server and started doing the standard handshake via sockets. We were still toggling between local recording and streaming via WebRTC.
Impediment
WebRTC does not work in every network configuration. Beyond that, the camera acquisition is all native code, which makes it much harder to copy or intercept the bytes.
If you are willing to part with libstreaming, there is a library which can easily stream and record to a local file at the same time.
https://github.com/pedroSG94/rtmp-rtsp-stream-client-java
Clone the project and run the sample app. For example, tap "Default RTSP", type in your endpoint, tap "Start stream", then tap "Start record". Afterwards, tap "Stop stream" and "Stop record". I've tested this with Wowza Server and it works well. The project can also be used as a library rather than as a standalone app.
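In code, the simultaneous stream and record looks roughly like this (based on the library's sample app; package and method names follow older releases and may have moved since, and the URL and path are placeholders):

    import android.view.SurfaceView;
    import com.pedro.rtplibrary.rtsp.RtspCamera1;
    import com.pedro.rtsp.utils.ConnectCheckerRtsp;
    import java.io.IOException;

    public class StreamAndRecord {
        // Streams to an RTSP endpoint and records to the SD card at the same time.
        public static void start(SurfaceView view, ConnectCheckerRtsp checker)
                throws IOException {
            RtspCamera1 camera = new RtspCamera1(view, checker);
            if (camera.prepareAudio() && camera.prepareVideo()) {
                camera.startStream("rtsp://your.server:554/live/stream"); // placeholder
                camera.startRecord("/sdcard/recording.mp4");              // placeholder
            }
        }
    }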
As soon as OpenCV 3.0 is available (the RC1 can be downloaded here), we could add another option to this list:
Using OpenCV's built-in Motion-JPEG encoder

how to create ts (transport stream) video file from mp4 in android

In my project I need to implement HLS (HTTP Live Streaming) for an Android device, streaming to an iOS device: the Android device records the video and sends it to a server, and the iOS device plays the stream from the server using an m3u8 file. In the link below,
Click Here
they mention: "Currently, the supported delivery format is MPEG-2 Transport Streams for audio-video".
Now the problem is that on Android you can record only to MP4 by default (correct me if I am wrong), so I need a third-party API or library like ffmpeg, GStreamer, Xuggler, or JCodec to transcode the recorded MP4 into TS files.
ffmpeg, jffmpeg, and GStreamer have a learning curve and setup time, and they also need the NDK. I don't have enough time to try each of them, so please point me to a library that is easy to use without a complex learning curve or setup, like JCodec, which is pure Java and plug-and-play. But I don't think JCodec can do this for me: their documentation mentions they only support the H.262 codec so far, while I need H.264 and AAC for audio.
FYI:
JJPMEG
It is a Java binding to FFmpeg, and it has an Android version too. Maybe you can give it a try.
https://code.google.com/p/jjmpeg/
Or:
Maybe you can just record the video with a supported encoding and transcode it on the server side?
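If you do transcode on the server, the conversion itself is a single ffmpeg invocation; a hedged sketch of driving it from Java (file names and flags are illustrative):

    import java.io.IOException;

    public class Mp4ToTs {
        // Transcodes an uploaded MP4 into an MPEG-2 Transport Stream with
        // H.264 video and AAC audio, the combination HLS expects.
        public static void transcode(String mp4, String ts)
                throws IOException, InterruptedException {
            new ProcessBuilder(
                    "ffmpeg", "-i", mp4,
                    "-c:v", "libx264", "-c:a", "aac",
                    "-f", "mpegts", ts)
                .inheritIO()
                .start()
                .waitFor();
        }
    }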

Livestreaming video from android device to server using openCV in android

My basic requirement is to stream live video from an Android device to a server. While researching this, I came across OpenCV 2.4.6. My question: is it possible to stream video from the Android camera to a server using OpenCV 2.4.6? If so, can you please suggest how to go about it?
I don't know if you can use OpenCV for streaming video from an Android device to a PC. To my knowledge, OpenCV is an image-processing library for processing images and videos, used for face recognition, image compression, augmented reality, and the like. My idea would be to create a socket between the client device and the server PC and pass the video through that socket, just like the spydroid application does.
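A bare-bones sketch of that socket idea (the host, port, and length-prefixed framing are all assumptions):

    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.net.Socket;

    // Pushes encoded frames (e.g. JPEGs from onPreviewFrame) to a server
    // over plain TCP. The length prefix lets the server split the byte
    // stream back into individual frames.
    public class FrameSender {
        private final DataOutputStream out;

        public FrameSender(String host, int port) throws IOException {
            out = new DataOutputStream(new Socket(host, port).getOutputStream());
        }

        public void send(byte[] encodedFrame) throws IOException {
            out.writeInt(encodedFrame.length); // frame length header
            out.write(encodedFrame);
            out.flush();
        }
    }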

How to stream over RTMP on Android?

I'm trying to play a video file on a remote server. The video format is FLV and the server is Flash Media Server 3.5.
I want to connect to the server over RTMP and implement playback of the video file using the Android MediaPlayer.
Is that really possible? Any help is appreciated.
http://www.aftek.com/afteklab/aftek-RTMP-library.shtml
I found this one, but haven't had much luck: there are very few docs, and after tinkering with it to try to support video (no examples, as far as I can see), I found that the core method RtmpStreamFactory.getRtmpStream() failed.
This one has also cropped up, but I haven't looked at it yet.
http://code.google.com/p/android-rtmp-client/
It looks like, for me, the answer is getting the media server to deliver RTSP instead, which is supported by Android. You may also find that later versions of Android (i.e. 3.x and up) support RTMP.
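For the RTSP route, Android's stock MediaPlayer can open an RTSP URL directly; a minimal sketch (the URL is a placeholder):

    import android.media.MediaPlayer;
    import java.io.IOException;

    public class RtspPlayback {
        public static MediaPlayer play() throws IOException {
            MediaPlayer player = new MediaPlayer();
            player.setDataSource("rtsp://example.com/vod/sample.mp4"); // placeholder URL
            player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer mp) {
                    mp.start(); // begin playback once RTSP setup completes
                }
            });
            player.prepareAsync(); // do network setup off the UI thread
            return player;
        }
    }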
