How can I compress audio in a call recorder on Android?

I used this code to capture audio in Android Studio, but the resulting files are large (1 min ≈ 1 MB).
How can I compress the audio without quality loss?
AudioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
AudioRecorder.setOutputFormat(MediaRecorder.OutputFormat.AAC_ADTS);
AudioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); // AudioEncoder constant, not OutputFormat
AudioRecorder.setAudioChannels(1);
AudioRecorder.setAudioEncodingBitRate(128000); // bit rate in bits/s (these two values were swapped)
AudioRecorder.setAudioSamplingRate(44100);     // sampling rate in Hz

You're setting the encoder to AMR_NB. About 1 MB per minute sounds about right for that configuration (a 128 kbit/s bit rate works out to 128,000 × 60 / 8 ≈ 0.96 MB per minute). If you want to compress it further you can choose another encoding, such as Vorbis.
Please note that the different encoders have different purposes. None of them will compress without quality loss: we're converting analog sound to digital values, and there is always quality loss in that. Different encoders are optimized for different uses. AMR is optimized for human voice; it filters out frequencies outside the vocal range. That's a quality loss, but it may be one you want (as in a call recording). You can't have everything: you're going to have to sacrifice size or quality. I suggest you study the different encodings listed at https://developer.android.com/reference/android/media/MediaRecorder.AudioEncoder.html and figure out what's best for you.
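For reference, here is a minimal sketch of a voice-oriented configuration; the AAC codec choice, 32 kbit/s bit rate, and 16 kHz sampling rate are illustrative starting points rather than tuned values, and outputPath is a placeholder:
// Sketch: a compact voice recording setup (~0.24 MB per minute at 32 kbit/s).
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.AAC_ADTS);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setAudioChannels(1);                 // mono is enough for voice
recorder.setAudioEncodingBitRate(32000);      // 32 kbit/s, illustrative
recorder.setAudioSamplingRate(16000);         // wideband voice
recorder.setOutputFile(outputPath);           // outputPath: placeholder file path
recorder.prepare();                           // throws IOException
recorder.start();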

Related

Android ExoPlayer video with high resolution not playing and buffering well, why?

I have an Android TV application playing two different UDP video streams using ExoPlayer. The images below (image 1 and image 2) show the specifications of each video. The first one, with lower resolution, plays well without any buffering problem; the second one buffers and streams badly.
image 1: the first video, with lower resolution (1280x720), which plays well without any freezing or problems
image 2: the second video, with higher resolution (1920x1080), which plays with freezing and stuttering buffering
Below is my ExoPlayer initialization:
ExoPlayer.Builder playerBuilder = new ExoPlayer.Builder(WelcomeActivity.this);
LoadControl loadControl = new DefaultLoadControl.Builder()
        // min buffer, max buffer, buffer before playback, buffer after rebuffer (ms)
        .setBufferDurationsMs(15000, 50000, 2000, 5000)
        .setTargetBufferBytes(DefaultLoadControl.DEFAULT_TARGET_BUFFER_BYTES)
        .setAllocator(new DefaultAllocator(true, C.DEFAULT_BUFFER_SEGMENT_SIZE))
        .setPrioritizeTimeOverSizeThresholds(DefaultLoadControl.DEFAULT_PRIORITIZE_TIME_OVER_SIZE_THRESHOLDS)
        .build();
player = playerBuilder
        .setLoadControl(loadControl)
        .setRenderersFactory(buildRenderersFactory(WelcomeActivity.this))
        .build();
player.setTrackSelectionParameters(
        player.getTrackSelectionParameters()
                .buildUpon()
                .setMaxVideoSize(1280, 720)
                .setPreferredTextLanguage("ar")
                .build());
As you can see, I am capping the video size at the lower resolution with .setMaxVideoSize(1280, 720), but this changes nothing with regard to the bad buffering of the second, high-resolution video.
The size and bit rate of an encoded video are determined by how it is encoded - e.g. what codec is used and what parameters and configuration the encoding uses.
Nearly all encoders try to reduce the size of the encoded video, to make it easier to store and to transmit, and there is a balance between the size reduction and the visual quality of the video when it is subsequently decoded and played.
Generally, smaller size/bit rate means less quality, but it does depend on the content itself - for example some cartoons are very tolerant of lower bit rates. More advanced commercial encoders can do 'context aware' encoding and adjust the quality and bit rate depending on the scenes.
The most common codec at this time is H.264, and ffmpeg is probably the most common open source tool for encoding.
ffmpeg provides a guide for encoding, re-encoding and transcoding videos with H.264, and it includes notes and examples on the quality vs. bit rate trade-off:
https://trac.ffmpeg.org/wiki/Encode/H.264
The two key sections in the above are for Constant Rate Factor (CRF) encoding and two-pass encoding.
CRF
allows the encoder to attempt to achieve a certain output quality for the whole file when output file size is of less importance. This provides maximum compression efficiency with a single pass.
Two Pass
Use this rate control mode if you are targeting a specific output file size, and if output quality from frame to frame is of less importance.
There is also a note on constrained encoding (VBV / maximum bit rate) which is worth a look too.
I think you might be best to start with CRF encoding and experiment with the CRF value to see if you can find a quality/bit rate combination you are happy with.
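For example, a typical single-pass CRF encode with ffmpeg might look like this; the CRF value, preset, and file names are illustrative (lower CRF means higher quality and a larger file):
ffmpeg -i input.mp4 -c:v libx264 -preset medium -crf 23 -c:a copy output.mp4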
Finally, it would be worth checking out ABR streaming too, especially if you plan to host many videos and want your users to have a consistent experience. For ABR you create multiple copies of the video, each encoded with a different resolution/bit rate.
The client device or player downloads the video in chunks, e.g. 10-second chunks, and selects the next chunk from the bit rate most appropriate to the current network conditions. See some more info in this answer too: https://stackoverflow.com/a/42365034/334402
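As a sketch of the idea, an ABR ladder starts from a set of encodes like the following; the resolutions and bit rates are placeholder values, and a real ladder would be tuned to the content and then packaged for HLS or DASH:
ffmpeg -i input.mp4 -vf scale=1920:1080 -c:v libx264 -b:v 5000k -c:a aac output_1080p.mp4
ffmpeg -i input.mp4 -vf scale=1280:720 -c:v libx264 -b:v 2800k -c:a aac output_720p.mp4
ffmpeg -i input.mp4 -vf scale=854:480 -c:v libx264 -b:v 1400k -c:a aac output_480p.mp4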

WebM with VP9 vs. MP4 with H.264 AVC: which one is best overall?

I have used VideoView to load an MP4 file one minute in length.
The problem is that it starts after a delay.
I want it to start immediately, so which codec and bit rate should I choose?
Please share any experience you have with these video codecs.
I want to see a comparison between these two formats:
loading speed, length, file size, and quality ratio.
MP4 usually has all its index tables at the end of the file, so the player may need to scan the whole file on disk before playback can start.
You can convert to an MP4 file optimized for streaming, so that the tables are at the beginning.
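With ffmpeg this can be done without re-encoding by moving the index (the moov atom) to the front via -movflags +faststart; the file names here are placeholders:
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4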
MPEG-TS (transport stream) also loads quickly.
WebM will probably load faster than a "standard" MP4, but I am not so familiar with the WebM format.
All PCs and smartphones have a hardware AVC (H.264) video decoder, while VP9 is mostly decoded in software. So, presumably, AVC will be easier for your device to decode.
The quality or size of VP9 can be better than AVC only at HD resolutions; on smaller videos the quality should be more or less equal.
There are many useful tools for encoding AVC, and not so many for VP9. Using ffmpeg and proper settings, such as two-pass encoding, you can compress AVC harder than VP9.
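For reference, a two-pass AVC encode with ffmpeg looks roughly like this; the 2 Mbit/s target and file names are illustrative (on Windows, use NUL instead of /dev/null):
ffmpeg -y -i input.mp4 -c:v libx264 -b:v 2000k -pass 1 -an -f null /dev/null
ffmpeg -i input.mp4 -c:v libx264 -b:v 2000k -pass 2 -c:a aac output.mp4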
So I recommend using AVC in a streaming-optimized MP4.

Creating small, short audio files with sox for Android

I create around 1000 audio files via sox for my Android application, each containing a recording of a word. To save space I want to keep the file size as small as possible.
Should I use .mp3 or .ogg? Which settings should I use?
Have you checked this question on SO: Smallest audio file: MP3, Ogg, or Wav?
@keyboardP said:
Of those three, Ogg would usually be smaller than MP3. Both would be much smaller than the uncompressed WAV. Of course, there may be other factors that come into play for your site such as quality (not too much of a noticeable difference for most purposes) and browser support for each type.
The file size will only affect the time it takes to download the file to the user's machine. It won't necessarily determine JavaScript execution speed. There may be other things in your code causing the performance drops (unless you've narrowed it down to the file size of the audio files).
If a little loss in quality doesn't affect your application, then using audio-grabber to decrease the bit rate of the .ogg files will give you amazingly small audio files.
Since Android supports both of these codecs natively, I would definitely choose the Vorbis codec. At low bitrates the Vorbis codec produces much higher clarity than MP3, and the file size is even smaller.
In general I would recommend encoding the sound files with the aoTuV encoder (a third-party development of the official Vorbis encoder that further improves low bit-rate quality) at quality level 1 (approximately 80 kbps).
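If you drive the conversion from sox itself, something like the following should work, assuming your sox build includes Ogg Vorbis support (for Ogg output, sox's -C option sets the Vorbis quality level). Note that sox normally links against the stock libvorbis, so to use aoTuV specifically you would write a WAV from sox and feed it to an aoTuV-built oggenc. File names, rate, and channel settings are illustrative:
sox input.wav -C 1 output.ogg rate 16000 channels 1
# or, with an aoTuV-built oggenc:
oggenc -q 1 input.wav -o output.ogg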

Power efficient video streaming from an Android device

I'm doing some experiments with video streaming from and the front camera of the android device to a local server. Currently I plan to use WiFi. I may move to Bluetooth 4.0 in the future.
I'm looking for insights, experience and DOs and DON'Ts and other ideas that I should consider in relation to protocol options (TCP, UDP, ...? ) and video codec. The image quality should be good enough to run computer vision algorithms such as face and object detection, recognition and tracking on the server side. The biggest concern is power. I want to make sure that the streaming is as power efficient as possible. I understand more power efficiency means a lower frame rate.
Also, I need to way to just send the video frames without displaying them directly on the screen.
Thanks.
You didn't mention whether you will be doing encoding or decoding on the device.
Some tips:
UDP will generally be less power-hungry, especially under deteriorating network conditions; see http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.134.5517&rep=rep1&type=pdf (you can find more papers on this via Google).
In terms of codecs, the general ordering of power needed for both encoding and decoding is H.264 > MPEG-4 > H.263.
The higher the bitrate, the more power is needed for decoding, but the codec difference matters more than the bitrate difference. I say this because, to get the same quality as an H.264 stream, H.263 needs a higher bitrate - yet H.263 at that higher bitrate should consume less power than H.264 at the lower one. So do not compare bitrates across codecs; within the codec you have chosen, use the lowest bitrate/framerate you can.
When encoding, though, very low bitrates can make the encoder work harder and so increase power consumption. Encoding bitrates should be low, but not so low that the encoder is stretched. This means choosing a reasonable bitrate that does not produce a continuously blocky stream but gives decent output.
Within each codec, if you can control the encoding then you can also control the decoding power. The following applies to both:
deblocking and B-pictures will add to power requirements. Keeping to the lower profiles (Baseline for H.264, Simple Profile for MPEG-4, and Baseline for H.263) will result in lower power requirements for both encoding and decoding, as sketched below. In MPEG-4, switch off 4MV support if you can; that makes streams even simpler to decode. Remember that each of these also has a quality impact, so you have to find what is acceptable quality.
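On Android, these knobs map onto the format you pass when configuring a MediaCodec encoder. A minimal sketch follows; the resolution, bit rate, and frame rate are illustrative values, and KEY_PROFILE is only honored from API 21 on:
// Sketch: request a lightweight Baseline H.264 encode from MediaCodec.
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
format.setInteger(MediaFormat.KEY_BIT_RATE, 1000000);   // 1 Mbit/s, illustrative
format.setInteger(MediaFormat.KEY_FRAME_RATE, 15);      // a low frame rate saves power
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2); // keyframe every 2 seconds
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface); // Surface input
format.setInteger(MediaFormat.KEY_PROFILE,
        MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline);  // no B-frames, no CABAC
MediaCodec encoder = MediaCodec.createEncoderByType("video/avc"); // throws IOException
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);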
Also, unless you can really measure the power consumption, I am not sure you need very fine tweaking of the toolsets; just sticking to the lower profiles should suffice.
The worse the video quality during capture, the more power is needed during encoding. So brightly lit videos need less effort to encode; low-light videos need more power.
There is no need to send the video to a screen. You receive the video over a socket and do whatever you want with that data - that is up to you. You do not have to decode and display it.
EDIT: Adding a few more things I could think of.
In general the choice of codec and its profile will be the biggest thing affecting a video encoding/decoding system's power consumption.
The biggest difference may come from the device configuration. If you have a hardware accelerator for a particular codec in the device, it may be cheaper to use that than a software codec for another one. So although H.264 may require more power than MPEG-4 when both run in software, if the device has H.264 in hardware then it may be cheaper than MPEG-4 in software. So check your device's hardware capability.
Also, video resolution matters. Smaller videos are cheaper to encode, and you can clock your device at lower speeds when running smaller resolutions.
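To check what the device offers, you can enumerate the available codecs. A sketch, assuming API 21+ for the MediaCodecList constructor (the name check in the comment is just a rule of thumb):
// Sketch: list the device's H.264 encoders.
MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
for (MediaCodecInfo info : list.getCodecInfos()) {
    if (!info.isEncoder()) continue;
    for (String type : info.getSupportedTypes()) {
        if (type.equalsIgnoreCase("video/avc")) {
            // Names like OMX.google.* are typically software implementations.
            Log.d("Codecs", "H.264 encoder: " + info.getName());
        }
    }
}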

For Android media player: MP3 vs. WAV

I want to know if it is faster to load and play a small wav than a small mp3 file on android media player. The wavs are about 30 KB and the same files as mp3s are about 20 kb. The mp3s have the advantage to save resource space. The sound files have to be played with split second timing.
For such small sounds, you will get best results with SoundPool.
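A minimal SoundPool setup looks like this (a sketch; context and R.raw.click are placeholders):
// Sketch: low-latency playback of short clips with SoundPool.
SoundPool soundPool = new SoundPool.Builder()
        .setMaxStreams(4)
        .setAudioAttributes(new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_GAME)
                .setContentType(AudioAttributes.CONTENT_TYPE_SONIFICATION)
                .build())
        .build();
final int clickId = soundPool.load(context, R.raw.click, 1); // placeholder resource
soundPool.setOnLoadCompleteListener((pool, sampleId, status) -> {
    if (status == 0) { // 0 means the sample loaded successfully
        pool.play(sampleId, 1f, 1f, 1, 0, 1f); // volumes, priority, no loop, normal rate
    }
});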
Even the weakest Android devices have ample computing power to play an MP3, and probably have hardware acceleration for it as well. The real question is the setup overhead for playing a WAV vs. playing an MP3, which should be fairly easy to measure programmatically.
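For instance, a rough measurement sketch; context and R.raw.sound are placeholders, and MediaPlayer.create() prepares the player synchronously:
// Sketch: time how long it takes to set up a sound for playback.
long t0 = SystemClock.elapsedRealtime();
MediaPlayer mp = MediaPlayer.create(context, R.raw.sound); // create() also calls prepare()
long setupMs = SystemClock.elapsedRealtime() - t0;
Log.d("AudioSetup", "Setup took " + setupMs + " ms");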
I'm a little surprised you're getting such a poor compression ratio with MP3. Even lossless compression algorithms tend to get a 2:1 compression ratio with WAV. Given that an Android device probably isn't hooked up to audiophile-quality speakers, you should be able to get away with 64 kbit/s mono MP3 compression, or even lower. If you can get the file size under 4 KB, it'll fit in a single memory page, which is about as low as you can get for OS overhead.
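To put numbers on that (illustrative arithmetic): at 64 kbit/s, a 4 KB file holds 4000 × 8 / 64000 = 0.5 seconds of audio, so only very short clips fit in a single page.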
If for whatever reason you're stuck with a 1.5:1 compression ratio, it's probably not worth the extra work.
WAV files use more space because they have a higher sample rate: many more points that the sound wave traces out, so in theory it takes more processing power to play a WAV. Also, WAV is uncompressed, meaning it has all of the information from the source it was taken from. When you take a CD and convert it to WAV, you more or less have a copy of the original. When you convert to MP3, fewer reference points are used and detail is lost. Secondly, most MP3 encoders normalize the music, which is a fancy way of saying they make the quiet parts louder and the loud parts quieter. All this being said, some people can't hear the difference, and it mostly depends on what type of headphones/speakers you are listening on. All that being said, there shouldn't be a delay with either format; the only difference should be the sample rate, or "resolution", of the sound file.
I have no technical "stuff" to back me up here, but since no one else has taken a crack at this, I will.
I know that MP3s have "better" compression than WAVs, so the file is smaller. This would imply, however, that it takes more CPU to decompress the files. (This may be done on dedicated hardware, so it could be a moot point.) Additionally, since the files will be inflated, the MP3 file's smaller size may be deceiving when judging whether it would be quicker to load and play.
Considering the WAV format's history, and that it serves as a "lowest common denominator" when it comes to exchanging sound files between different programs (per Wikipedia), I would make an educated guess that it is faster to load and play a small WAV file. This is very dependent on Android's software implementation of the audio libraries as well as the hardware, so if anyone knows more, it would be great to hear their take.
