Working on Android 4.0 and above.
I am in the process of analyzing ways to live-stream my camera video to a Windows PC using RTP, with MPEG-2 encoding.
Q1: Is there a readily available "rtp-server" in Android 4.0+?
Q2: Is the following true: "The Android platform lacks support for
streaming protocol, which makes it difficult to stream live audio /
video to Android enabled devices." (extracted from a website)?
Q3: So far I have tried ffserver from the ffmpeg
libraries, but the FPS is below 5, which is far too slow. Has anyone
explored another solution with a higher FPS?
Q4: Has anybody tried using Stagefright for the same purpose? That is,
capturing raw data from the camera, sending it to the Stagefright
framework for encoding, and then streaming it using RTP?
Many Thanks.
The answers to your questions are below. Though the links relate to Android 4.2.2, the same holds for Android 4.0.
1. Yes, there is an RTP transmitter available. You could look at the example in MyTransmitter as a starting point, or you can consider using the standard recorder as in startRTPRecording (a rough sketch follows after this list).
2. You can stream data via RTP from an Android device to an external sink, or you could have a different use case as in Miracast, a.k.a. Wi-Fi Display. However, streaming from one Android device to another through Wi-Fi Direct is still not completely enabled; the latter statement mainly stems from the Miracast scenario.
3. You can use the standard Android software stack, which is capable of high-resolution recording and transmission. Performance mainly depends on the underlying hardware, as the overhead from the software stack is not very high.
4. Yes; this is already answered in Q1 above.
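For illustration only, here is a minimal sketch of the "standard recorder" route: pointing MediaRecorder at a socket file descriptor. This is not the AOSP RTP sender from the linked example; the host and port are placeholders, H.264 stands in for MPEG-2 (MediaRecorder has no MPEG-2 encoder), and a real transmitter would packetize the stream into RTP rather than push raw container bytes.

```java
import java.net.Socket;
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;

// Crude sketch: record H.264 from the camera and push the bytes to a
// TCP socket. Camera permissions and the preview surface are omitted
// for brevity; host/port are assumptions, not a real endpoint.
public final class SocketRecorder {
    public static MediaRecorder start(String host, int port) throws Exception {
        Socket socket = new Socket(host, port);
        ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile(pfd.getFileDescriptor()); // socket as output
        recorder.prepare();
        recorder.start();
        return recorder;
    }
}
```

Note that a 3GP/MP4 container written this way is not directly playable on the fly (the moov atom is written last), which is another reason real implementations packetize the elementary stream instead.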
Related
I am using the OpenTok SDK for video calling on iOS and Android devices with a Node.js server.
It is a group-call scenario with at most 4 people; when we stream for more than 10 minutes, both devices get too hot.
Does anyone have a solution for this?
We can't degrade the video quality.
This is likely because you are using the default video codec, VP8, which is not hardware accelerated. You can change the codec per publisher to either H.264 or VP8, but there are trade-offs to this approach (a quick hardware-support check is sketched after the quoted documentation below).
Their lack of H.264 SVC support is disappointing, but might be okay depending on your use case. If you read this whole post and still want more guidance, I'd recommend reaching out to their developer support team, and/or posting more about your use case here.
Here's some more context from the OpenTok Documentation, but I recommend you read the whole page to understand where you need to make compromises:
The VP8 real-time video codec is a software codec. It can work well at lower bitrates and is a mature video codec in the context of WebRTC. As a software codec it can be instantiated as many times as is needed by the application within the limits of memory and CPU. The VP8 codec supports the OpenTok Scalable Video feature, which means it works well in large sessions with supported browsers and devices.
The H.264 real-time video codec is available in both hardware and software forms depending on the device. It is a relatively new codec in the context of WebRTC although it has a long history for streaming movies and video clips over the internet. Hardware codec support means that the core CPU of the device doesn’t have to work as hard to process the video, resulting in reduced CPU load. The number of hardware instances is device-dependent with iOS having the best support. Given that H.264 is a new codec for WebRTC and each device may have a different implementation, the quality can vary. As such, H.264 may not perform as well at lower bit-rates when compared to VP8. H.264 is not well suited to large sessions since it does not support the OpenTok Scalable Video feature.
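If you do switch a publisher to H.264, it may be worth first verifying that the device actually exposes a hardware H.264 encoder, since otherwise you just trade one software codec for another. Here is a sketch using the standard MediaCodecList API; the "OMX.google." name check is a common heuristic rather than an official contract, while API 29+ reports hardware acceleration directly.

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.os.Build;

// Heuristic check for a hardware H.264 (AVC) encoder.
public final class H264HardwareCheck {
    public static boolean hasHardwareAvcEncoder() {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (!info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if (!type.equalsIgnoreCase("video/avc")) continue;
                if (Build.VERSION.SDK_INT >= 29) {
                    if (info.isHardwareAccelerated()) return true;
                } else if (!info.getName().startsWith("OMX.google.")) {
                    // Pre-API-29 convention: software codecs are "OMX.google.*".
                    return true;
                }
            }
        }
        return false;
    }
}
```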
We are developing an application which involves audio/video decoding and encoding. In some cases we need multiple decoders to be open at the same time.
Problem:
Some devices don't support multiple (2 or more) decoders being open at the same time. This happens mostly with high-resolution (1080p) videos.
Assumptions
We think this is happening because of hardware limitations of the devices.
We need to know whether there is any API that exposes the media codec capabilities on Android, such as the maximum number of codec instances that can be open at the same time on a given device. We are fine even if the API is at the native level.
How about MediaCodecInfo.CodecCapabilities.getMaxSupportedInstances()?
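A sketch of querying it (getMaxSupportedInstances() was added in API 23, and the value is documented as a conservative hint rather than a hard guarantee; the "video/avc" MIME type here is just an example):

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

// Report the advertised max concurrent instances for the first decoder
// found for the given MIME type, or -1 if no such decoder exists.
public final class CodecInstanceProbe {
    public static int maxInstances(String mime) {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if (type.equalsIgnoreCase(mime)) {
                    return info.getCapabilitiesForType(type).getMaxSupportedInstances();
                }
            }
        }
        return -1; // e.g. maxInstances("video/avc")
    }
}
```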
What is the best (performance-wise) way to get and stream video from an Android device's camera to a PC?
I have seen this question asked here before, and there are a few open-source programs that do just that, but there are multiple approaches and I don't know which one is best!
For example:
Should the Android part be written in C++ or Java (performance/API-wise)?
Which API should I use to get the video from the camera?
What is the best way to stream the video?
I don't intend to support old Android versions (<4.x), so if the best way/API is relatively new, that's fine by me.
I'm not familiar with Android development but I'll try to answer.
I suppose that the actual encoding of the raw image data is probably done on a hardware chip (otherwise software encoding would probably kill your battery), and it looks like the MediaCodec class is exactly what you need. I suppose you want to implement some kind of live-streaming service where latency matters. If so, you should stick to UDP-based transmission methods; using the RTP protocol or the MPEG-TS container format would be the best choice for this purpose. Of course you can also use TCP-based methods for streaming, like HLS or DASH (both of them use HTTP).
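To make that concrete, here is a minimal sketch of setting up a hardware H.264 encoder with MediaCodec (API 18+). The resolution, bitrate, and frame rate are illustrative assumptions, and the RTP/MPEG-TS packetization of the output is left out:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

// Minimal hardware H.264 encode setup; packetization/transport omitted.
public final class CameraEncoder {
    public static MediaCodec createEncoder() throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // The camera renders into this surface; drain the encoder's output
        // buffers and packetize them into RTP or MPEG-TS over UDP.
        Surface cameraTarget = encoder.createInputSurface();
        encoder.start();
        return encoder;
    }
}
```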
You should also take a look at Table 1, Core media format and codec support, in the Android supported media formats documentation:
It tells us, for example, that the H.264 AVC encoder supports the MPEG-TS container, and that HLS version 3 is also supported on Android 4.0 and above.
Hi gang,
I'm an embedded software engineer starting to work with Linux streaming solutions,
and I have no experience with networking or smartphone OSes.
When we stream video to a PC (receiving via VLC), we found there is a latency of 1.2 seconds for H.264. This includes:
sensor grabs data
data is sent over the network
VLC buffers and plays
We found out after a while that there is buffering control in VLC. For H.264 streaming, however, the minimum we can set is 400-500 ms. On Android phones, moreover, we were NOT able to find any software with a very short (minimal) delay/buffer.
Can anyone suggest:
How is latency generally measured/profiled for video streaming to smartphones?
Do you have any network-sniffing software for Android/iOS to recommend?
I saw in Apple's documentation that HTTP Live Streaming recommends a 10 s "file size" (segment duration). Is there any way to overcome this? (Is jailbreaking required to install a sniffing tool on iOS?)
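Regarding the first question: a common low-tech approach is to point the streaming camera at a running millisecond clock and photograph the clock next to the player showing it, giving glass-to-glass latency. To isolate just the network leg, a crude UDP round-trip probe like the sketch below can help (the host address is a placeholder, and it assumes something on the PC echoes UDP packets back, e.g. the classic echo service on port 7):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

// Crude network probe: send a timestamped UDP packet to an echo host
// and measure the round trip. Host/port are assumptions.
public final class UdpRttProbe {
    public static void main(String[] args) throws Exception {
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.setSoTimeout(2000); // give up after 2 s
            byte[] payload = ByteBuffer.allocate(8).putLong(System.nanoTime()).array();
            InetAddress host = InetAddress.getByName("192.168.1.10"); // placeholder
            socket.send(new DatagramPacket(payload, payload.length, host, 7));
            DatagramPacket reply = new DatagramPacket(new byte[8], 8);
            socket.receive(reply);
            long sentAt = ByteBuffer.wrap(reply.getData()).getLong();
            System.out.printf("UDP round trip: %.1f ms%n", (System.nanoTime() - sentAt) / 1e6);
        }
    }
}
```

Whatever remains of the 1.2 s after subtracting the network round trip is encoder latency plus player buffering.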
Is it possible to perform adaptive (multi-bitrate) streaming on an Android device? If yes, how can it be done?
If you have 4.0 or 3.2, you just access the adaptive stream as you would any other video. Literally.
It's HTTP access.
So where you would use //mywebsite/video1.mp4 as a data source, you would instead use the equivalent //mywebsite/video1.m3u8. Now, I'm not including any discussion of how you create your streaming file, only of how you would access it.
All the magic happens within the client (e.g. MediaPlayer, VideoView), supported on 4.0 and 3.2. For the record, you may be able to access and run streaming segments (.m3u8 files) on earlier versions of Android, because manufacturers have sometimes played around with the code. But I haven't found any that actually adapt: they usually stick to the first segment they run, or default to the lowest-bitrate segment in the bunch and stay there regardless of bitrate.
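As a minimal sketch of that access pattern (the URL is the placeholder from the answer above, and the VideoView is assumed to come from the hosting Activity's layout):

```java
import android.media.MediaPlayer;
import android.net.Uri;
import android.widget.VideoView;

// HLS playback sketch: the .m3u8 playlist is set exactly like an .mp4
// URL; variant selection and switching happen inside the framework.
public final class HlsPlayback {
    public static void play(VideoView videoView) {
        videoView.setVideoURI(Uri.parse("http://mywebsite/video1.m3u8"));
        videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override public void onPrepared(MediaPlayer mp) {
                mp.start();
            }
        });
    }
}
```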