Hi gang,
I'm an embedded software engineer just starting to work with Linux streaming solutions, and I have no experience with networking or smartphone OSes.
When we stream H.264 video to a PC (received via VLC), we see about 1.2 seconds of latency on the PC side. That includes:
sensor grabs data
data is sent over the network
VLC buffers and plays
After a while we found the buffering control in VLC, but for H.264 streaming the minimum we can set is about 400 -- 500 ms. On Android phones, however, we were NOT able to find any player with a comparably short (minimal) delay/buffer.
Can anyone help with the following?
How is latency generally measured/profiled for video streaming to smartphones?
Can you recommend any network sniffing software for Android/iOS?
I saw in Apple's documentation that HTTP Live Streaming recommends 10-second segments ("file size"). Is there any way to overcome this? (Is jailbreaking required for installing a sniffing tool on iOS?)
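On the first question: latency is usually measured "glass to glass", by tagging a frame at capture time and checking the clock again when that frame is actually displayed (a low-tech variant is to film a running millisecond clock and photograph it next to the receiver's screen). Here is a minimal sketch of the software approach in Java; it assumes the sender's and receiver's clocks are NTP-synchronized and that your pipeline already has some way to carry the tag with the frame (side channel, SEI message, or burned-in timestamp), which is not shown.

```java
public class LatencyProbe {
    // Called on the sender when a frame leaves the capture stage.
    public static long stampFrame() {
        return System.currentTimeMillis();
    }

    // Called on the receiver when that same frame is actually rendered.
    // With NTP-synchronized clocks this is the glass-to-glass latency
    // (capture + encode + network + receiver buffering + decode + display).
    public static long glassToGlassMillis(long senderStampMillis) {
        return System.currentTimeMillis() - senderStampMillis;
    }
}
```

Averaging this over many frames, and logging it per stage where you can, tells you whether the 1.2 s is dominated by the sender, the network, or the player's buffer.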
Related
I am using the OpenTok SDK for video calling on iOS and Android devices with a Node.js server.
It is a group call scenario with at most 4 people. When we stream for more than 10 minutes, both devices get too hot.
Does anyone have a solution for this?
We can't degrade the video quality.
This is likely because you are using the default video codec, VP8, which is not hardware accelerated. You can change the codec per publisher to either H.264 or VP8, but there are some trade-offs to this approach.
Their lack of H.264 SVC support is disappointing, but it might be okay depending on your use case. If you read this whole post and still want more guidance, I'd recommend reaching out to their developer support team and/or posting more about your use case here.
Here's some more context from the OpenTok Documentation, but I recommend you read the whole page to understand where you need to make compromises:
The VP8 real-time video codec is a software codec. It can work well at lower bitrates and is a mature video codec in the context of WebRTC. As a software codec it can be instantiated as many times as is needed by the application within the limits of memory and CPU. The VP8 codec supports the OpenTok Scalable Video feature, which means it works well in large sessions with supported browsers and devices.
The H.264 real-time video codec is available in both hardware and software forms depending on the device. It is a relatively new codec in the context of WebRTC although it has a long history for streaming movies and video clips over the internet. Hardware codec support means that the core CPU of the device doesn’t have to work as hard to process the video, resulting in reduced CPU load. The number of hardware instances is device-dependent with iOS having the best support. Given that H.264 is a new codec for WebRTC and each device may have a different implementation, the quality can vary. As such, H.264 may not perform as well at lower bit-rates when compared to VP8. H.264 is not well suited to large sessions since it does not support the OpenTok Scalable Video feature.
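Not an OpenTok-specific API, but to see why H.264 behaves so differently across devices you can check what the device itself exposes. A minimal Android sketch in Java (assuming API 21+; classifying codecs as software by their name prefix is a common heuristic, not a guarantee):

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

public class AvcEncoderCheck {
    private static final String TAG = "AvcEncoderCheck";

    // Lists the H.264 ("video/avc") encoders present on this device and
    // flags those that look like software implementations. Long calls run
    // cooler when a hardware-backed encoder is doing the work.
    public static void logAvcEncoders() {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (!info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if ("video/avc".equalsIgnoreCase(type)) {
                    // Heuristic: Google's software codecs are named
                    // "OMX.google.*" or "c2.android.*".
                    boolean software = info.getName().startsWith("OMX.google.")
                            || info.getName().startsWith("c2.android.");
                    Log.i(TAG, info.getName() + (software ? " (software)" : " (likely hardware)"));
                }
            }
        }
    }
}
```

If no hardware "video/avc" encoder shows up, switching the publisher to H.264 will not reduce CPU load or heat on that particular device.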
My goal is to stream audio/video content in real time from an Android phone to a Wowza server.
1- As I understand it, the protocols for Wowza's incoming and outgoing (broadcast) streams can be different. True?
2- If that is so, then I can upload my video data through either the HLS protocol or the RTMP protocol.
3- I'm a little familiar with these protocols after searching here and there, but I don't have enough knowledge to decide which protocol would be best for streaming recorded audio and video to the Wowza server in real time, so that it can be broadcast and watched via HLS. Help selecting the best protocol would be appreciated.
It should be noted that the video will be recorded and streamed in real time, which means I will have to get the encoded buffers from MediaCodec and send them to the Wowza server.
Again, if point 1 above is true, then my only concern should be choosing the best protocol for Wowza's inbound stream.
Any advice regarding Android OS versions, SDK vs. NDK etc. will be highly appreciated.
Thanks.
1 - True.
2 - RTMP. HLS is a pull-based protocol over HTTP: fine for server to client, but not great for client to server.
3 - This is not a question.
The protocol is completely independent of the OS choice.
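Since you already plan to pull encoded buffers out of MediaCodec, the encoder side might look roughly like the sketch below (Java, assuming API 18+ for a Surface-input encoder; the resolution, bitrate, and frame-rate values are placeholders, and the actual RTMP packaging is left to whatever RTMP client library you choose):

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

public class AvcEncoderSetup {
    // Configures an H.264 encoder. The encoded output buffers it produces
    // are what you would hand to an RTMP client/muxer for publishing to
    // Wowza; that packaging step is not shown here.
    public static MediaCodec createEncoder(int width, int height) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 1_500_000);  // ~1.5 Mbps, placeholder
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);  // keyframe every 2 s

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // encoder.createInputSurface() can then be wired to the camera, and
        // dequeueOutputBuffer() yields the encoded frames you send upstream.
        return encoder;
    }
}
```

On the SDK vs. NDK question: MediaCodec itself is a Java SDK API, so a pure-Java pipeline is feasible; the NDK mainly becomes relevant if you bring in a native encoder or RTMP library.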
I am working on Android 4.0 and above.
I am in the process of analyzing ways to live stream my camera video to a Windows PC using RTP, encoding with MPEG-2.
Is there a readily available "rtp-server" in Android 4.0+?
Is the following true: "The Android platform lacks support for streaming protocol, which makes it difficult to stream live audio/video to Android enabled devices." (Extracted from a website.)
Currently I have analyzed and used ffserver from the FFmpeg libraries, but the FPS is < 5, which is far too slow. Has anyone explored another solution with a higher FPS?
Has anybody tried using Stagefright for the same? That is, capturing raw data from the camera, sending it to the Stagefright framework for encoding, and then streaming it over RTP?
Many Thanks.
The answers to your questions are below. Though the links relate to Android 4.2.2, the same holds for Android 4.0 as well.
Yes, there is an RTP transmitter available. You could look at this example in MyTransmitter as a starting point, or you could consider using the standard recorder as in startRTPRecording.
You can stream data via RTP from an Android device to an external sink, or you could have a different use case such as Miracast, a.k.a. Wi-Fi Display. However, streaming from one Android device to another through Wi-Fi Direct is still not completely enabled; that last statement mainly refers to the Miracast scenario.
You can use the standard Android software stack, which is capable of high-resolution recording and transmission. This depends mainly on the underlying hardware, as the overhead from the software stack is not very high.
Yes. This is already answered in Q1 above.
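For a sense of scale, the RTP framing itself is small; most of the real work is in capture, encoding, and payload-specific packetization. A minimal, illustrative Java sender (the destination host/port and the dynamic payload type 96 are placeholders; real H.264-over-RTP additionally needs RFC 6184 packetization, e.g. FU-A fragmentation for NAL units larger than the MTU, plus RTCP, all omitted here):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class SimpleRtpSender {
    private final DatagramSocket socket = new DatagramSocket();
    private final InetAddress dest;
    private final int port;
    private int seq = 0;
    private final int ssrc = (int) (Math.random() * Integer.MAX_VALUE);

    public SimpleRtpSender(String host, int port) throws Exception {
        this.dest = InetAddress.getByName(host);
        this.port = port;
    }

    // Wraps one encoded payload (e.g. a single small NAL unit) in a 12-byte
    // RTP header and sends it over UDP.
    public void send(byte[] payload, long timestamp90kHz, boolean marker) throws Exception {
        ByteBuffer buf = ByteBuffer.allocate(12 + payload.length);
        buf.put((byte) 0x80);                           // V=2, no padding/extension/CSRC
        buf.put((byte) ((marker ? 0x80 : 0x00) | 96));  // marker bit + dynamic payload type 96
        buf.putShort((short) seq++);                    // sequence number
        buf.putInt((int) timestamp90kHz);               // 90 kHz media timestamp
        buf.putInt(ssrc);                               // stream identifier
        buf.put(payload);
        socket.send(new DatagramPacket(buf.array(), buf.position(), dest, port));
    }
}
```

Frames would come from a MediaCodec/Stagefright encoder on the capture side; the low FPS seen with ffserver is more likely a software-encoding bottleneck than an RTP one.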
I want to know: is it mandatory to use a streaming server like Darwin, Wowza, or VLC to stream RTSP live video? I am receiving an RTSP link from my client, and it tends to change every time. I can play it successfully in the VLC player, but on the phone I can't see anything. I tried playing a sample link with a .3gp extension and it worked fine, but my links don't have an extension; they look like this: rtsp://122.166.229.151:1950/1346a0cf0ef7c2. Please help me. If it's compulsory to use an extension or a server, I will continue working in that direction.
A streaming server (as you describe) isn't strictly necessary - as long as you can pull RTSP from whatever your source is, you should be able to see it. Most IP cameras have onboard RTSP servers (although I wouldn't put too many connections on them). If you can see it in VLC, the phone should be able to consume it as well, provided the codec used to encode it is one supported by the Android device (in most cases, if you're doing H.264 Baseline 3.0 with AAC, you should be good to go).
A streaming server like Wowza can make that stream available to a wider audience than pulling directly from the source device, but if you're not intending to broadcast to a wide audience, it's not required for streaming to Android devices.
Newer versions of Android (Gingerbread and later) are also able to consume Apple HTTP Live Streaming.
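To illustrate the "no server required" point: the stock media player can open an RTSP URL directly, with or without a file extension. A minimal Java sketch using the RTSP URL from the question (it assumes the stream is H.264 Baseline + AAC, as noted above):

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class RtspPlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // VideoView is backed by the platform MediaPlayer, which understands
        // rtsp:// URIs; no extension is needed, only codecs the device supports.
        VideoView videoView = new VideoView(this);
        setContentView(videoView);
        videoView.setMediaController(new MediaController(this));
        videoView.setVideoURI(Uri.parse("rtsp://122.166.229.151:1950/1346a0cf0ef7c2"));
        videoView.start();
    }
}
```

If this shows nothing while VLC plays the same link, the usual suspects are an unsupported codec/profile (e.g. H.264 High profile) or blocked UDP ports between the phone and the source.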
I am new to video streaming and am working on a project to broadcast video to Android phones over the internet, where the number of users viewing at the same time may reach 100.
After looking around for a while, I think RTSP streaming may be the convenient choice for the phone client (am I right?), so I have to choose a server. My current candidates are:
VLC
Darwin Streaming Server
Are they suitable, or is there a better choice?
How do these two servers perform with 100 users accessing them at the same time?
Thanks in advance
Regards
Bolton
RTSP streaming in H.264/AAC would be the most convenient way to reach Android devices. End-users will not need to install an app or open the stream in one - the native media player will seamlessly open the stream.
If you intend to use VLC for the encoding portion, you may want to reconsider, as I'm not sure it supports H.264/AAC compression, which is required to reach Android devices. You may want to consider commercial software like Wirecast or the free Flash Media Encoder with the AAC plugin.
Darwin Streaming Server is stable enough to handle that load (100 concurrent viewers); however, the throughput you have available and the bit-rate you will be broadcasting at are more important factors to consider when delivering video. In other words, your upload speed has to be sufficient. If it's not intended strictly as a DIY project, I would suggest tapping into a commercial CDN's network (I would recommend NetroMedia).
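As a rough illustration of the throughput point, here is the arithmetic in a small sketch; the 100 concurrent viewers come from the question, while the 700 kbps per-viewer bitrate and the 20% headroom are assumptions for the example:

```java
public class BandwidthEstimate {
    public static void main(String[] args) {
        int viewers = 100;        // concurrent viewers, from the question
        int streamKbps = 700;     // assumed per-viewer bitrate (video + audio)
        double headroom = 1.2;    // ~20% margin for protocol overhead and bursts

        double requiredMbps = viewers * streamKbps / 1000.0 * headroom;
        // 100 viewers * 700 kbps * 1.2 = 84 Mbps of sustained upload,
        // which is why handing delivery off to a CDN is usually preferred.
        System.out.printf("Required upstream: %.1f Mbps%n", requiredMbps);
    }
}
```

If your upload link cannot sustain that figure, either lower the bitrate or let a CDN fan the stream out for you.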