I'm trying to take a stream from a webcam and stream it to an Android device. I use GStreamer to grab the video and stream it out through a TCP server. That part works fine. The trouble I'm running into is that I need to make a custom app to receive the stream on the Android device, and I can't get gst-android to compile (for reasons unknown to me, adb is not runnable, so I can't set up the flingersinks). Any suggestions? Is there something other than gst-android that I can use for this?
Which Android version are you targeting? As far as I know, the NDK version of GStreamer will still have problems rendering video, as no one has contributed a working video sink. The surfaceflinger API is not available to NDK apps :/
Related
I'm working on an Android app which can stream video to Facebook via a compiled VLC library. After recent changes in Facebook policy https://developers.facebook.com/blog/post/v2/2019/04/16/live-video-uploads-rtmps/ VLC stopped streaming video. There is a message in the log:
standard stream out: no suitable sout access module for
'rtmp/flv://rtmps://live-api-s.facebook.com:443/rtmp/xxxxxxxxx.....'
Can anyone help me understand what should be done to re-enable streaming? My guess was to compile VLC with the --enable-gnutls flag, but I'm not sure how to do this with the current VLC sources.
Direct use of a Network Stream is one option you can try from here:
Overview of the VideoLAN streaming solution - Documentation
Related to your doubt about streaming video to Facebook: Preset with rtmp
Which version of VLC on Android are you using?
Could you provide a longer version of the logs?
According to this issue: https://code.videolan.org/videolan/vlc-android/issues/158
setting the flag --enable-sout in compile-libvlc.sh should maybe get it working.
I am working on an Android video conference application. For that we are using AudioRecord for recording, and we get the buffer from the AudioRecord using
    read(buffer, readBufferSize, size - readBufferSize);
For playing audio we are using AudioTrack. While playing and recording we are getting our own echo.
How can I remove this echo programmatically?
I think you should be setting the audio source to VOICE_COMMUNICATION. This setting should enable Android to use its internal AEC (acoustic echo canceller). If you need to support multiple devices and Android versions, test the AEC results across them, since the results might not be the same across the board. You can also take a look at this blog post.
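As a minimal sketch (assuming API 16+ for the explicit AcousticEchoCanceler effect; the sample rate and buffer sizes are placeholder choices), the recording side could look like this:

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.media.audiofx.AcousticEchoCanceler;

    // Record from VOICE_COMMUNICATION so the platform's echo cancellation path
    // is used, and attach the explicit AEC effect where the device supports it.
    AudioRecord createEchoCancelledRecorder() {
        int sampleRate = 16000; // assumed rate; use one your target devices support
        int channelConfig = AudioFormat.CHANNEL_IN_MONO;
        int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
        int bufferSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);

        AudioRecord recorder = new AudioRecord(
                MediaRecorder.AudioSource.VOICE_COMMUNICATION, // instead of MIC
                sampleRate, channelConfig, audioFormat, bufferSize);

        if (AcousticEchoCanceler.isAvailable()) {
            AcousticEchoCanceler aec = AcousticEchoCanceler.create(recorder.getAudioSessionId());
            if (aec != null) {
                aec.setEnabled(true); // effectiveness varies per device and OS version
            }
        }
        return recorder;
    }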
This is not really a question as much as it is a presentation of all my attempts to solve one of the most challenging functionalities I was faced with.
I use the libstreaming library to stream real-time video to a Wowza server, and I need to record it at the same time to the SD card. I am presenting below all my attempts in order to collect new ideas from the community.
Copy bytes from the libstreaming stream to an MP4 file
Development
We created an interception in the libstreaming library to copy all the sent bytes to an MP4 file. Libstreaming sends the bytes to the Wowza server through a LocalSocket. It uses MediaRecorder to access the camera and the mic of the device and sets the output file to the LocalSocket's input stream. What we do is create a wrapper around this input stream, extending InputStream, and create a file output stream inside it. So every time libstreaming executes a read over the LocalSocket's input stream, we copy all the data to the output stream, trying to create a valid MP4 file.
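For reference, a minimal sketch of the kind of wrapper described above (the class name and the handling of the copy path are ours, not part of libstreaming):

    import java.io.FileOutputStream;
    import java.io.FilterInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    // Every byte read from the LocalSocket's input stream is also written
    // to a file on the SD card.
    public class TeeInputStream extends FilterInputStream {
        private final FileOutputStream copy;

        public TeeInputStream(InputStream source, String copyPath) throws IOException {
            super(source);
            this.copy = new FileOutputStream(copyPath);
        }

        @Override
        public int read() throws IOException {
            int b = super.read();
            if (b >= 0) {
                copy.write(b);
            }
            return b;
        }

        @Override
        public int read(byte[] buffer, int offset, int length) throws IOException {
            int n = super.read(buffer, offset, length);
            if (n > 0) {
                copy.write(buffer, offset, n); // duplicate the streamed bytes
            }
            return n;
        }

        @Override
        public void close() throws IOException {
            copy.close();
            super.close();
        }
    }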
Impediment
When we tried to read the file, it was corrupted. We realized that there was meta information missing from the MP4 file, specifically the moov atom. We tried to delay the closing of the stream in order to give it time to send this header (this was still a guess), but it didn't work. To test the coherence of this data, we used a paid software tool to try to recover the video, including the header. It became playable, but it was mostly a green screen, so this did not turn out to be a trustworthy solution. We also tried "untrunc", a free open-source command-line program, and it couldn't even start the recovery, since there was no moov atom.
Use ffmpeg compiled for Android to access the camera
Development
FFmpeg has a Gradle plugin with a Java interface for use inside Android apps. We thought we could access the camera via the command line (it is probably at "/dev/video0") and send it to the media server.
Impediment
We got the error "Permission denied" when trying to access the camera. The workaround would be to root the device to gain access to it, but that would make the phones lose their warranty and could brick them.
Use ffmpeg compiled for Android combined with MediaRecorder
Development
We tried to make FFmpeg stream an MP4 file being recorded inside the phone via MediaRecorder.
Impediment
FFmpeg cannot stream MP4 files whose recording has not yet finished.
Use ffmpeg compiled for Android with libstreaming
Development
Libstreaming uses a LocalServerSocket as the connection between the app and the server, so we thought we could point ffmpeg at the LocalServerSocket's local address to copy the stream directly to a local file on the SD card. Right after the streaming started, we also ran the ffmpeg command to start recording the data to a file. Using ffmpeg, we believed it would create an MP4 file in the proper way, meaning with the moov atom header included.
Impediment
The "address" created is not readable via command line, as a local address inside the phone. So the copy is not possible.
Use OpenCV
Development
OpenCV is an open-source, cross-platform library that provides building blocks for computer vision experiments and applications. It offers high-level interfaces for capturing, processing, and presenting image data. It has its own APIs to connect with the device camera, so we started studying it to see if it had the necessary functionality to stream and record at the same time.
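For context, the camera hook we looked at is OpenCV's Java camera view. A minimal sketch, where the layout resource names are placeholders and OpenCV is loaded with the static initializer for brevity:

    import org.opencv.android.CameraBridgeViewBase;
    import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
    import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;
    import org.opencv.android.OpenCVLoader;
    import org.opencv.core.Mat;

    import android.app.Activity;
    import android.os.Bundle;

    // R.layout.camera_layout and R.id.camera_view are placeholder resource names
    // for a layout containing an org.opencv.android.JavaCameraView.
    public class CameraActivity extends Activity implements CvCameraViewListener2 {
        private CameraBridgeViewBase cameraView;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            OpenCVLoader.initDebug(); // static init for brevity; production code uses the async loader
            setContentView(R.layout.camera_layout);
            cameraView = (CameraBridgeViewBase) findViewById(R.id.camera_view);
            cameraView.setCvCameraViewListener(this);
            cameraView.enableView();
        }

        @Override
        public void onCameraViewStarted(int width, int height) { }

        @Override
        public void onCameraViewStopped() { }

        @Override
        public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
            // Each frame arrives here as a Mat; this is where we would have had to
            // both encode for streaming and write to disk ourselves.
            return inputFrame.rgba();
        }
    }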
Impediment
We found out that the library is not really designed for this, but rather for mathematical image manipulation. We even got the recommendation to use libstreaming (which we already do).
Use Kickflip SDK
Development
Kickflip is a media streaming service that provides its own SDK for Android and iOS development. It also uses HLS instead of RTMP, which is a newer protocol.
Impediment
Their SDK requires that we create an Activity with a camera view that occupies the entire screen of the device, breaking the usability of our app.
Use Adobe Air
Development
We started consulting other developers of apps already available in the Play Store that already stream to servers.
Impediment
Getting in touch with those developers, they confirmed that it would not be possible to record and stream at the same time using this technology. What's more, we would have to redo the entire app from scratch using Adobe Air.
UPDATE
WebRTC
Development
We started using WebRTC following this great project. We included the signaling server in our Node.js server and started doing the standard handshake via socket. We were still toggling between local recording and streaming via WebRTC.
Impediment
WebRTC does not work in every network configuration. Other than that, the camera acquisition is all native code, which makes it a lot harder to copy the bytes or intercept them.
If you are willing to part with libstreaming, there is a library which can easily stream and record to a local file at the same time.
https://github.com/pedroSG94/rtmp-rtsp-stream-client-java
Clone the project and run the sample app. For example, tap "Default RTSP." Type in your endpoint. Tap "Start stream" then tap "Start record." Then tap "Stop Stream" and "Stop record." I've tested this with Wowza Server and it works well. The project can also be used as a library rather than a standalone app.
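In code, the stream-and-record flow is roughly the following. This is a rough sketch based on the library's sample app; package names, constructors, and method signatures vary between releases, so check them against the version you actually pull in (surfaceView and connectChecker are assumed to be set up elsewhere, the latter implementing the library's connection callback interface):

    import com.pedro.rtplibrary.rtsp.RtspCamera1;

    RtspCamera1 rtspCamera = new RtspCamera1(surfaceView, connectChecker);

    // Prepare the encoders once, then stream and record the same encoded frames.
    if (rtspCamera.prepareAudio() && rtspCamera.prepareVideo()) {
        rtspCamera.startStream("rtsp://your-wowza-host:1935/live/streamName"); // placeholder URL
        rtspCamera.startRecord("/sdcard/local-copy.mp4");                      // local file written in parallel
    }

    // Later, when finished:
    rtspCamera.stopRecord();
    rtspCamera.stopStream();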
As soon as OpenCV 3.0 is available (the RC1 can be downloaded here), we could add another option to this list:
Using OpenCV's built in Motion-JPEG encoder
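A sketch of what that could look like with OpenCV 3.x's Java bindings (the output path, frame rate, and frame size are placeholders and must match the frames you feed in):

    import org.opencv.core.Mat;
    import org.opencv.core.Size;
    import org.opencv.videoio.VideoWriter;

    // Open a Motion-JPEG writer once.
    VideoWriter openMjpegWriter() {
        return new VideoWriter(
                "/sdcard/capture.avi",                  // placeholder output path
                VideoWriter.fourcc('M', 'J', 'P', 'G'), // built-in Motion-JPEG encoder
                30.0,                                   // frames per second
                new Size(640, 480),                     // must match your frames
                true);                                  // color frames
    }

    // In the camera callback, write each frame while it also goes to the streaming path.
    void onFrame(VideoWriter writer, Mat frame) {
        if (writer.isOpened()) {
            writer.write(frame);
        }
        // call writer.release() when recording is finished
    }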
I'm trying to decode a raw h264 stream on "older" Android versions.
I've tried the MediaPlayer class and it does not seem to support the stream format.
I can see the stream in other cam viewer apps from the market, so I figure there must be a way to do it, probably using the NDK.
I've read about OpenMAX and Stagefright, but couldn't find a working example about streaming.
Could someone please point me in the right direction?
Also, I'm reading in several places about "frameworks/av/include/media/stagefright/MediaSource.h" and other sources, but they don't seem to be in either the regular SDK or the NDK.
Where is this source located? Is there another SDK?
Thanks in advance.
Update: I'm receiving an RTSP connection.
If you wish to perform only a simple experiment to verify certain functionality, you can consider employing the command-line stagefright utility. Please do consider that your streaming input may not be supported.
If you wish to build a more comprehensive player pipeline, you can consider the handling for RTSP as in here or HTTP as in here. Please note that the NuCachedSource2 implementation is essential for streaming input, as it provides a page cache implementation which acts as a jitter buffer for the streaming data.
Please do note one critical point: the command-line stagefright utility doesn't render to the screen. Hence, if you wish to render, you will have to implement the complete playback pipeline with rendering.
On a related note, if your input is a streaming input, the standard player implementation does have support for streaming inputs, as can be observed here. Did you face any issues with the same?
As fadden has already pointed out, your work is made far simpler with the introduction of MediaCodec in Android 4.x.
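For completeness, a minimal MediaCodec sketch (API 16+) that decodes raw H.264 into a Surface; readAccessUnit() and the width/height/Surface parameters stand in for your own RTSP/stream handling:

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;

    import java.io.IOException;
    import java.nio.ByteBuffer;

    // Decode raw H.264 access units and render them to a Surface.
    void decodeToSurface(Surface surface, int width, int height) throws IOException {
        MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        decoder.configure(format, surface, null, 0);
        decoder.start();

        ByteBuffer[] inputBuffers = decoder.getInputBuffers(); // pre-API-21 buffer access
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        long presentationTimeUs = 0;

        while (!Thread.interrupted()) {
            int inIndex = decoder.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                ByteBuffer buf = inputBuffers[inIndex];
                buf.clear();
                int size = readAccessUnit(buf); // placeholder: fill one H.264 access unit
                decoder.queueInputBuffer(inIndex, 0, size, presentationTimeUs, 0);
                presentationTimeUs += 33333;    // assumes ~30 fps
            }
            int outIndex = decoder.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                decoder.releaseOutputBuffer(outIndex, true); // true = render to the Surface
            }
        }
        decoder.stop();
        decoder.release();
    }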
You could use third-party libs like android-h264-decoder, which uses JNI to increase performance. Also look at this lib from Intel.
Update: MediaCodec wasn't exposed until API 16 (Android 4.1), so that won't work for a 2.3.3 device.
Stagefright and OpenMAX IL were (and still are) internal components of Android. You can find the code (including MediaSource.h) at https://android.googlesource.com/platform/frameworks/av/+/master
Note that the media framework moved to the separate tree frameworks/av only recently. Before that it was part of frameworks/base, e.g. https://android.googlesource.com/platform/frameworks/base/+/gingerbread/media/
I would like some advice about the best way to stream a video-only live stream from a server to:
Android (>4.0 is ok)
PC with web-browser
iOS
I would like to keep latency as low as 1/2 second.
I can use:
Flash: works on PC, but not on iOS and not on Android (works only on some tablets)
HLS: not good because of latency
proprietary library: it should work, but I have to implement it everywhere
RTSP: works only on Android
Any other way? Is a proprietary library the way to go?
I'm working on Linux, but I'm mainly interested in "use this technology" and not "use this code".
Not sure, but you can try HTTP streaming of MP4/3GP formats using a web server. Both Android and iOS support HTTP streaming. But you need to implement progressive download.
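On the Android side, playing such a file back is just the stock MediaPlayer pointed at the HTTP URL. A minimal sketch, where the URL and the SurfaceHolder are placeholders and the MP4 is assumed to have its moov atom at the front (see the qt-faststart note below):

    import android.media.MediaPlayer;
    import android.view.SurfaceHolder;

    import java.io.IOException;

    // Progressive HTTP playback with the stock MediaPlayer.
    void playHttpMp4(SurfaceHolder holder) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource("http://your-server/video-faststart.mp4"); // placeholder URL
        player.setDisplay(holder);
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start(); // starts once enough data is buffered
            }
        });
        player.prepareAsync(); // non-blocking prepare over the network
    }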
Please specify on which OS you want to implement your server.
For Windows, you can use the following binary to relocate your moov atoms to the beginning of the media file to enable progressive download:
http://notboring.org/devblog/2009/07/qt-faststartexe-binary-for-windows/
Let us know your progress.
You can set up ffserver (FFmpeg's streaming server) for live broadcast. It gives you various options; enable or disable them in its configuration file located at /etc/ffserver.conf.
You can find detailed documentation at
http://ffmpeg.org/ffserver.html
RTSP might be the way to go, but that 1/2 second latency might be hard to get.
I guess for video only, and if you don't buffer at all, this may work for iOS anyway:
https://github.com/mooncatventures-group/FFPlayer-tests
Android supports RTSP, but it's not very good.
You can compile ffmpeg for Android and write a simple player using OpenGL. I can't share the code because we did it for a client, but it's not too difficult.