Power-efficient video streaming from an Android device

I'm doing some experiments with video streaming from the front camera of an Android device to a local server. Currently I plan to use WiFi; I may move to Bluetooth 4.0 in the future.
I'm looking for insights, experience, DOs and DON'Ts, and other ideas I should consider in relation to protocol options (TCP, UDP, ...?) and video codec. The image quality should be good enough to run computer vision algorithms such as face and object detection, recognition, and tracking on the server side. The biggest concern is power: I want the streaming to be as power efficient as possible. I understand that more power efficiency may mean a lower frame rate.
Also, I need a way to just send the video frames without displaying them directly on the screen.
Thanks.

You didn't mention whether you will be doing encoding or decoding on the device.
Some tips:
UDP will generally be less power hungry, especially under deteriorating network conditions:
See http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.134.5517&rep=rep1&type=pdf
You can find more papers on this with a Google search.
In terms of codecs, you can say the general order is H.264 > MPEG-4 > H.263 in power needed for both encoding and decoding.
The higher the bitrate, the more power decoding needs, but the codec difference is bigger than the bitrate difference. I say this because to get the same quality as an H.264 stream with H.263 you need a higher bitrate, yet H.263 at that higher bitrate should still consume less power than H.264 at the lower bitrate. So do not compare bitrates across codecs; within the codec you choose, use the lowest bitrate/framerate you can.
In encoding, though, very low bitrates can make the encoder work harder and so increase power consumption. Encoding bitrates should be low, but not so low that the encoder is stretched. This means choosing a reasonable bitrate that does not produce a continuously blocky stream but gives decent output.
Within each codec, if you can control the encoding then you can also control the decoding power. The following applies to both: deblocking and B-pictures add to power requirements. Keeping to lower profiles (Baseline for H.264, Simple Profile for MPEG-4, Baseline for H.263) will result in lower power requirements in both encoding and decoding. In MPEG-4, switch off 4MV support if you can; it makes streams even simpler to decode. Remember each of these also has a quality impact, so you have to find what is acceptable quality.
Also, unless you can actually measure the power consumption, I am not sure you need very fine tweaking of the toolsets. Just sticking to the lower profiles should suffice.
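On Android specifically, requesting a low profile from the encoder might look like the following MediaCodec format sketch. This is a configuration fragment, not a runnable program; the size, bitrate, and framerate values are placeholders, and encoders are free to ignore `KEY_PROFILE` (it is a hint, honored on newer API levels at best).

```java
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

class LowPowerFormat {
    // Hypothetical helper: build an H.264 encoder format that requests
    // Baseline profile with a modest bitrate and framerate.
    static MediaFormat baselineAvc(int width, int height) {
        MediaFormat f = MediaFormat.createVideoFormat("video/avc", width, height);
        f.setInteger(MediaFormat.KEY_BIT_RATE, 500_000);   // placeholder bitrate
        f.setInteger(MediaFormat.KEY_FRAME_RATE, 15);      // low fps to save power
        f.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5); // seconds between I-frames
        f.setInteger(MediaFormat.KEY_PROFILE,
                MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline);
        return f;
    }
}
```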
The worse the video quality during capture, the more power is needed during encoding. Brightly lit video needs less effort to encode; low-light video needs more power.
There is no need to send the video to a screen. You receive video over a socket and do whatever you want with that data; that is up to you. You do not have to decode and display it.
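For example, the server side can read the raw stream straight off a socket and hand it to a vision pipeline without any decode-and-display step. A minimal sketch in plain Java; the 4-byte length-prefix framing and the loopback sender are assumptions for the demo, not part of any particular streaming protocol:

```java
import java.io.*;
import java.net.*;
import java.util.concurrent.*;

public class FrameReceiver {
    interface FrameHandler { void onFrame(byte[] frame); }

    // Read length-prefixed "frames" until EOF and pass the raw bytes on.
    static void receive(InputStream in, FrameHandler handler) throws IOException {
        DataInputStream din = new DataInputStream(in);
        while (true) {
            int len;
            try { len = din.readInt(); } catch (EOFException eof) { break; }
            byte[] frame = new byte[len];
            din.readFully(frame);
            handler.onFrame(frame); // e.g. forward to face/object detection
        }
    }

    public static void main(String[] args) throws Exception {
        // Loopback demo: a fake sender pushes three frames of growing size.
        try (ServerSocket server = new ServerSocket(0)) {
            int port = server.getLocalPort();
            ExecutorService pool = Executors.newSingleThreadExecutor();
            pool.submit(() -> {
                try (Socket s = new Socket("127.0.0.1", port);
                     DataOutputStream out = new DataOutputStream(s.getOutputStream())) {
                    for (int i = 1; i <= 3; i++) {
                        byte[] fake = new byte[i * 10];
                        out.writeInt(fake.length);
                        out.write(fake);
                    }
                }
                return null;
            });
            try (Socket client = server.accept()) {
                receive(client.getInputStream(),
                        f -> System.out.println("got frame of " + f.length + " bytes"));
            }
            pool.shutdown();
        }
    }
}
```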
EDIT: Adding a few more things I could think of.
In general the choice of codec and its profile will be the biggest thing affecting a video encoding/decoding system's power consumption.
The biggest difference may come from the device configuration. If the device has a hardware accelerator for a particular codec, it may be cheaper to use that than a software codec for another one. So although H.264 may require more power than MPEG-4 when both run in software, if the device has H.264 in hardware then it may be cheaper than MPEG-4 in software. So check your device's hardware capability.
Also, video resolution matters. Smaller videos are cheaper to encode, and you can clock your device at lower speeds when running smaller resolutions.

Related

Android ExoPlayer video with high resolution not playing and buffering well - why?

I have an Android TV application playing 2 different UDP video streams using ExoPlayer. The images below (image 1 and image 2) show the specifications of each video. The first one, with low resolution, plays well without any buffering problem; the second one has bad buffering and streaming.
image 1: the first video, with lower resolution (1280x720), which plays well without any freezing or problems
image 2: the second video, with high resolution (1920x1080), which plays with freezing and intermittent buffering
Below is my ExoPlayer initialisation:
ExoPlayer.Builder playerbuilder = new ExoPlayer.Builder(WelcomeActivity.this);
LoadControl loadControl = new DefaultLoadControl.Builder()
        .setBufferDurationsMs(15000, 50000, 2000, 5000)
        .setTargetBufferBytes(DefaultLoadControl.DEFAULT_TARGET_BUFFER_BYTES)
        .setAllocator(new DefaultAllocator(true, C.DEFAULT_BUFFER_SEGMENT_SIZE))
        .setPrioritizeTimeOverSizeThresholds(DefaultLoadControl.DEFAULT_PRIORITIZE_TIME_OVER_SIZE_THRESHOLDS)
        .build();
player = playerbuilder.setLoadControl(loadControl)
        .setRenderersFactory(buildRenderersFactory(WelcomeActivity.this))
        .build();
player.setTrackSelectionParameters(
        player.getTrackSelectionParameters()
                .buildUpon()
                .setMaxVideoSize(1280, 720)
                .setPreferredTextLanguage("ar")
                .build());
As you can see, I am setting the maximum video size to the low resolution with .setMaxVideoSize(1280, 720), but this doesn't change anything regarding the bad buffering of the second, high-resolution video.
The size and bit rate of an encoded video is determined by how it is encoded - e.g. what codec is used and what parameters and configuration the encoding uses.
Nearly all encoders try to reduce the size of the encoded video, to make it easier to store and to transmit, and there is a balance between the size reduction and the visual quality of the video when it is subsequently decoded and played.
Generally, smaller size/bit rate means less quality, but it does depend on the content itself - for example some cartoons are very tolerant of lower bit rates. More advanced commercial encoders can do 'context aware' encoding and adjust the quality and bit rate depending on the scenes.
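One rough way to reason about that size/quality balance is "bits per pixel": bitrate divided by width × height × framerate. A quick sketch; the 0.1-0.2 bpp ballpark for decent H.264 quality is a common rule of thumb, not a guarantee, and the resolutions/bitrates below are illustrative:

```java
import java.util.Locale;

public class BitsPerPixel {
    // bitrate (bits/s) spread over every pixel encoded per second
    static double bpp(int bitrateBps, int width, int height, int fps) {
        return (double) bitrateBps / ((double) width * height * fps);
    }

    public static void main(String[] args) {
        System.out.printf(Locale.ROOT, "720x480 @24fps, 1500 kbps: %.3f bpp%n",
                bpp(1_500_000, 720, 480, 24));
        System.out.printf(Locale.ROOT, "1920x1080 @30fps, 1500 kbps: %.3f bpp%n",
                bpp(1_500_000, 1920, 1080, 30));
    }
}
```

The same bitrate that is comfortable at 720x480 is spread very thin at 1080p, which is one way to see why a high-resolution stream at an unchanged bitrate looks worse or needs more data.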
The most common codec at this time is the h.264 codec and ffmpeg is probably the most common open source tool for encoding.
ffmpeg provides a guide for encoding videos, or re-encoding / transcoding videos with h.264 and this includes notes and examples on quality vs bit rate trade off:
https://trac.ffmpeg.org/wiki/Encode/H.264
The two key sections in the above are for Constant Rate Factor encoding and two pass encoding.
CRF
allows the encoder to attempt to achieve a certain output quality for the whole file when output file size is of less importance. This provides maximum compression efficiency with a single pass
Two Pass
Use this rate control mode if you are targeting a specific output file size, and if output quality from frame to frame is of less importance
There is also a note on Constrained encoding (VBV / maximum bit rate) which it would be good to look at too.
I think you might be best to start with CRF encoding and experiment with the CRF value to see if you can find a quality/bit rate combination you are happy with.
Finally, it would be worth checking out ABR streaming too, especially if you plan to host many videos and want your users to have a consistent experience. For ABR you create multiple copies of the video, each encoded with a different resolution/bit rate.
The client device or player downloads the video in chunks, e.g 10 second chunks, and selects the next chunk from the bit rate most appropriate to the current network conditions. See some more info in this answer also: https://stackoverflow.com/a/42365034/334402
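The per-chunk decision an ABR player makes can be sketched very simply: pick the highest rung of the bitrate ladder that fits within a safety fraction of the measured bandwidth. The ladder values and the 0.8 safety factor below are illustrative assumptions, not what any particular player uses:

```java
public class AbrPicker {
    // Pick the highest bitrate <= 80% of measured bandwidth;
    // fall back to the lowest rung if nothing fits.
    static int pick(int[] ladderBpsAscending, double measuredBps) {
        int chosen = ladderBpsAscending[0];
        for (int bps : ladderBpsAscending) {
            if (bps <= measuredBps * 0.8) chosen = bps;
        }
        return chosen;
    }

    public static void main(String[] args) {
        int[] ladder = {400_000, 800_000, 1_500_000, 3_000_000};
        System.out.println(pick(ladder, 1_000_000)); // mid bandwidth
        System.out.println(pick(ladder, 5_000_000)); // plenty of bandwidth
        System.out.println(pick(ladder, 300_000));   // fallback to lowest rung
    }
}
```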

Opentok SDK making Android and iOS devices too hot

I am using the OpenTok SDK for video calling on iOS and Android devices with a Node.js server.
It is a group call scenario with a max of 4 people; when we stream for more than 10 minutes, both devices get too hot.
Does anyone have solution for this?
We can't degrade the video quality.
This is likely because you are using the default video codec, VP8, which is not hardware accelerated. You can change the codec per publisher to either H.264 or VP8, but there are some trade-offs to this approach.
Their lack of H.264 SVC support is disappointing, but might be okay depending on your use case. If you read this whole post and still want more guidance, I'd recommend reaching out to their developer support team, and/or post more about your use case here.
Here's some more context from the OpenTok Documentation, but I recommend you read the whole page to understand where you need to make compromises:
The VP8 real-time video codec is a software codec. It can work well at lower bitrates and is a mature video codec in the context of WebRTC. As a software codec it can be instantiated as many times as is needed by the application within the limits of memory and CPU. The VP8 codec supports the OpenTok Scalable Video feature, which means it works well in large sessions with supported browsers and devices.
The H.264 real-time video codec is available in both hardware and software forms depending on the device. It is a relatively new codec in the context of WebRTC although it has a long history for streaming movies and video clips over the internet. Hardware codec support means that the core CPU of the device doesn’t have to work as hard to process the video, resulting in reduced CPU load. The number of hardware instances is device-dependent with iOS having the best support. Given that H.264 is a new codec for WebRTC and each device may have a different implementation, the quality can vary. As such, H.264 may not perform as well at lower bit-rates when compared to VP8. H.264 is not well suited to large sessions since it does not support the OpenTok Scalable Video feature.

How Can Compress Audio In Call Recorder Android

I used this code to capture audio in Android Studio, but the audio size is big (1 min = 1 MB).
How can I compress the audio without quality loss?
AudioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
AudioRecorder.setOutputFormat(MediaRecorder.OutputFormat.AAC_ADTS);
AudioRecorder.setAudioEncoder(MediaRecorder.OutputFormat.AMR_NB);
AudioRecorder.setAudioChannels(1);
AudioRecorder.setAudioEncodingBitRate(44100);
AudioRecorder.setAudioSamplingRate(128000);
You're setting the encoder to AMR_NB. About 1 MB a minute sounds about right for that setup. If you want to compress it you can choose another encoding, such as Vorbis.
Please note that the different encoders have different purposes. None of them will compress without quality loss: we're converting analog sound to digital values, and there is always quality loss with that. Different encoders are optimized for different uses. AMR is optimized for human voice; it filters out frequencies outside vocal range. That's a quality loss, but it may be one you want (as in a call). You can't have everything; you're going to have to sacrifice size or quality. I suggest you study the different encodings at https://developer.android.com/reference/android/media/MediaRecorder.AudioEncoder.html and figure out what's best for you.
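To see how much the encoder choice alone changes size, here is a quick back-of-envelope in Java. The bitrates are typical published values for each codec (AMR-NB's 12.2 kbps voice mode, a common 128 kbps AAC music rate), not what every device's MediaRecorder will actually produce:

```java
import java.util.Locale;

public class AudioSizes {
    // bits per second -> megabytes per minute
    static double mbPerMinute(int bitrateBps) {
        return bitrateBps / 8.0 * 60 / 1_000_000;
    }

    public static void main(String[] args) {
        System.out.printf(Locale.ROOT, "AMR-NB 12.2 kbps: %.2f MB/min%n",
                mbPerMinute(12_200));
        System.out.printf(Locale.ROOT, "AAC 128 kbps:     %.2f MB/min%n",
                mbPerMinute(128_000));
    }
}
```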

How to reduce mp3 streaming traffic consumption for mobile use?

I want to be able to send an audio stream to Android/IOS devices.
The current encoding for the stream is MP3 at 128 kbps. If I send this over the network it will consume a huge amount of mobile data.
I was thinking of compressing the data with gzip, but I think that would make no difference, as MP3 is already compressed.
Is there any way to reduce the size of the stream and play it on the mobile device?
Thanks,
Dan
First off, your math is ignoring a key unit. Your MP3 stream is 128 kilobits (note: bits, not bytes) per second. This comes out to a little under 60 megabytes per hour after you factor in a bit of overhead and metadata.
Now, as Mark said you can use a different bitrate and/or codec. For most mobile streams, I choose either a 64kbit or 96kbit stream, and then either MP3 or AAC depending on compatibility. AAC does compress a bit better, providing a better sounding stream at those low bitrates, but you will still need an MP3 stream for some devices.
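The arithmetic behind those numbers is simple: bytes per hour is bitrate divided by 8, times 3600 seconds. A quick check (before protocol overhead and metadata):

```java
public class StreamUsage {
    // bits per second -> megabytes per hour (decimal MB)
    static double mbPerHour(int bitrateBps) {
        return bitrateBps / 8.0 * 3600 / 1_000_000;
    }

    public static void main(String[] args) {
        System.out.println(mbPerHour(128_000)); // the original 128 kbps stream
        System.out.println(mbPerHour(96_000));  // a lower mobile rate
        System.out.println(mbPerHour(64_000));  // the lowest suggested rate
    }
}
```

So dropping from 128 kbps to 64 kbps halves the data usage, which is why the lower-rate mobile streams are worth offering.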
Also note that you should not assume your users are using the mobile network on their mobile devices. Give your users a choice of which stream to use. Some have unlimited data and great coverage. Others use WiFi all the time.
All you can do is re-compress to a lower bit rate and/or use a different compression method, e.g. AAC. AAC should sound better than MP3 at the same bit rate.

How to get Android to play a high quality RTSP stream smoothly?

Has anybody had any luck streaming high quality video (over 1000 kbps) to Android through RTSP?
We currently have low quality video streams (around 200kbps) that work wonderfully over 3G. Now we are trying to serve a high-quality stream for when the user has a faster connection. The high quality videos play smoothly in VLC, but the Android playback seems to drop frames and get blocky, even on a 4 megabit connection.
It seems like the YouTube app uses a plain HTTP download for their high quality videos. This works well and plays smoothly, but will not work for streaming live videos. Has anybody had luck streaming high quality videos to Android through RTSP?
The videos are encoded using H.264, 1500kbps, 24fps, and a 720x480 resolution. In the app, we are using a VideoView to play the videos. We are using Darwin Streaming Server, but we are open to other options if necessary.
Update 6/23/2011
Looking through Darwin some more today. So far, I am just logging the request and session information in a Darwin module.
The original Droid tries to use these settings: 3GPP-Adaptation:...size=131072;target-time=4000. That means it wants 4 seconds of buffer, but 131 KB only holds about a second of playback at 1200 kbps. I understand that 1200 kbps is large, but it is necessary for a high quality video (minimal compression at 720x480).
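The buffer arithmetic, as a quick check: a byte buffer holds bufferBytes × 8 / bitrate seconds of playback.

```java
import java.util.Locale;

public class BufferMath {
    static double secondsBuffered(int bufferBytes, int bitrateBps) {
        return bufferBytes * 8.0 / bitrateBps;
    }

    public static void main(String[] args) {
        // The Droid advertises size=131072 bytes; the stream is ~1200 kbps.
        System.out.printf(Locale.ROOT, "%.2f s%n",
                secondsBuffered(131072, 1_200_000));
    }
}
```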
I am trying to force the client to buffer more, but I haven't figured out how to do that yet. I'm just looking through the Darwin Streaming Server source and trying to figure out how they do things. Any Darwin experts out there?
Update 6/24/2011
As it turns out, using plain old HTTP for viewing videos on demand works well with no loss of quality. When we get to live streaming, we will have to look more into RTSP.
Well, even if the network is able to transmit at that rate, you still need to decode it. What are you using for decoding? You will probably need a NEON-accelerated video decoder to get a proper framerate, and a decent-size buffer; the graphics processor is only as good as the bus it sits on... Also, what are your encoding settings and resolution?
Edit: You are encoding those at much too high a bitrate; half of that will do fine. Also, you need to pin down where the issue lies. Is the MediaPlayer getting the data but failing to play at a decent framerate? In that case you have to replace the MediaPlayer code with your own player. If it's a network issue, then the only solution is to lower the bitrate: 600 kbps would be just fine (or 500 kbps video, 128 kbps audio). That's 3x your 200k stream, and on a screen this small the difference is not noticeable.
