I want to capture the audio waveform from the audio buffer. I found that android.media.audiofx.Visualizer can do this, but it only returns partial, low-quality audio content.
I found that android.media.audiofx.Visualizer ends up calling the function Visualizer_command(VISUALIZER_CMD_CAPTURE) in android4.0\frameworks\base\media\libeffects\visualizer.
I found that the function Visualizer_process is what reduces the audio content to low quality. I want to rewrite Visualizer_process, and to do that I want to find out who calls it, but I cannot find the caller in the Android source code. Can anyone help me?
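For reference, here is roughly how I'm capturing at the moment with the public API (a minimal sketch; audioSessionId comes from my player). Even at the maximum capture size and rate, the waveform callback only delivers 8-bit mono data:

Visualizer visualizer = new Visualizer(audioSessionId);
visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]); // largest supported capture size
visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
    @Override
    public void onWaveFormDataCapture(Visualizer v, byte[] waveform, int samplingRate) {
        // waveform is 8-bit unsigned mono -- the quality limit I mean
    }

    @Override
    public void onFftDataCapture(Visualizer v, byte[] fft, int samplingRate) {
        // not used here
    }
}, Visualizer.getMaxCaptureRate(), true, false);
visualizer.setEnabled(true);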
Thanks very much!
The AudioFlinger::PlaybackThread::threadLoop calls AudioFlinger::EffectChain::process_l, which calls AudioFlinger::EffectModule::process, which finally calls the actual effect's process function.
As you can see in AudioFlinger::EffectModule::process, there's the call
int ret = (*mEffectInterface)->process(mEffectInterface,
&mConfig.inputCfg.buffer,
&mConfig.outputCfg.buffer);
mEffectInterface is an effect_handle_t, which is an effect_interface_s**. The effect_interface_s struct contains a number of function pointers (process, command, ...). These are filled in with pointers to the actual effect's functions when the effect is loaded. The effects provide these pointers through a struct (in EffectVisualizer it's gVisualizerInterface).
Note that the exact location of these functions may differ between different Android releases. So if you're looking at Android 4.0 you might find some of them in AudioFlinger.cpp (or somewhere else).
Related
Consider using libVLC for Android, built following the officially recommended way.
I went through the compilation process without problems (though it took some time).
I'd like to have the snapshot functionality, but I've found some very old threads (2-3 years old) which state that this feature is still not available (2016), at least "not out of the box", according to this thread (2014).
Snapshot functionality is available on other platforms.
There are also some solutions that switch from SurfaceView to TextureView.
However I prefer sticking to SurfaceView as TextureView brings some performance drawbacks (according to this topic).
Also, an official Android page states:
In API 24 and higher, it's recommended to implement SurfaceView instead of TextureView.
Based on the thread I mentioned earlier, in 2014 the snapshot function had only two dependencies:
enabling sout module
enabling png as encoder
Looking at VideoLAN's "VLC-Android" repository, there is a file responsible for building libVLC.
In line 396, sout module seems to be enabled by default.
Before compilation I've enabled png as encoder in vlc/contrib/src/ffmpeg/rules.mak as pointed out in the forum.
However there is still no function related to snapshot in either org.videolan.libvlc.MediaPlayer or in org.videolan.libvlc.VLCVideoLayout.
The question is how can I create a snapshot (either into file, or into buffer) on Android with libVLC, without using TextureView?
Update1:
Fact1:
Found the reason why it's unavailable on Android. In VLC's core source tree, in file lib/video.c on line 145, there is the snapshot function with a massive FIXME warning:
/* FIXME: This is not atomic. All parameters should be passed at once
* (obviously _not_ with var_*()). Also, the libvlc object should not be
* used for the callbacks: that breaks badly if there are concurrent
* media players in the instance. */
var_Create( p_vout, "snapshot-width", VLC_VAR_INTEGER );
var_SetInteger( p_vout, "snapshot-width", i_width);
var_Create( p_vout, "snapshot-height", VLC_VAR_INTEGER );
var_SetInteger( p_vout, "snapshot-height", i_height );
var_Create( p_vout, "snapshot-path", VLC_VAR_STRING );
var_SetString( p_vout, "snapshot-path", psz_filepath );
var_Create( p_vout, "snapshot-format", VLC_VAR_STRING );
var_SetString( p_vout, "snapshot-format", "png" );
var_TriggerCallback( p_vout, "video-snapshot" );
vlc_object_release( p_vout );
Fact2:
I wanted to go in another direction with this. If the snapshot function is not usable (and also not wise to use), I thought of some fallback solutions:
There is a video filter in VLC named scene, which produces still images of the video into a specific path. I tried using it, but video filters cannot be changed at runtime, so this attempt failed.
I also tried to do it from the MediaPlayer (via Media.addOption), but video filters cannot be changed at the MediaPlayer level on Android either.
I then tried passing the filter config as an argument at libVLC initialization, and that finally succeeded (a sketch follows below); however, creating a new libVLC instance every time I need a screenshot won't be efficient.
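For reference, this is roughly how the initialization-time scene filter can be set up (a sketch; the path, the ratio value, and the exact LibVLC constructor are assumptions that may vary by libVLC version):

ArrayList<String> options = new ArrayList<>();
options.add("--video-filter=scene");
options.add("--scene-format=png");
options.add("--scene-path=" + context.getExternalFilesDir(null).getAbsolutePath());
options.add("--scene-ratio=30"); // write every 30th frame
LibVLC libVLC = new LibVLC(context, options);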
A few ways to go about this...
Here's a cross-platform thumbnailer example using libvlc: https://code.videolan.org/mfkl/libvlcsharp-samples/blob/master/PreviewThumbnailExtractor.Skia/Program.cs. It should work on Android without much editing, since it doesn't use any OS-specific feature. You should be able to translate it to Java/Kotlin as well, I guess.
There is a libvlc function that takes a snapshot. Just seek to the time you want and call it: https://www.videolan.org/developers/vlc/doc/doxygen/html/group__libvlc__video.html#ga9b0a3870ce962aa0358050b2d5a59143
In VLC Android, the medialibrary now manages thumbnails.
LibVLC 4 now bundles a thumbnailer: https://github.com/videolan/vlc/blob/d40eb012b10cc355ea9ad7a13eaf494b8e826d78/include/vlc/libvlc_media.h#L845
Good luck.
I am not very experienced building Android apps and I am trying to make a small app using ExoPlayer, so hopefully you can pardon my ignorance. I am essentially trying to see if there is a way to get access to the buffered files. I searched around, but there doesn't seem to be an answer for this. I saw people talking about CacheDataSource, but then I thought: isn't the data already being cached by virtue of it buffering? For instance, when a video starts, it starts buffering, and it continues to do so even if pause is pressed. If I understand correctly, the video actually plays from the buffered data. I assume that this data must be stored somewhere. Is this the cache data in this case? If not, then what is cache data? What is the difference here? And finally, how can I actually get access to whatever this is? I've been trying to see where it's being stored and as what (some kind of file, maybe), and I reached the DefaultAllocator class, which seems to have this line:
availableAllocations[i] = new Allocation(initialAllocationBlock, allocationOffset); // is this it??
This is in the DefaultAllocator.java file. Not sure if I'm looking in the right place...
I am not able to make sense of what the buffer even is and how it's stored. YouTube stores .exo files. I can see a cache folder in data/data/myAppName/cache by printing getCacheDir(), but that seems to give out some java.io.fileAndSomeRandomChars. The buffer also gets deleted when the player is minimized or another app is opened.
Does the ExoPlayer also store files in chunks?
Any insight on this would be seriously super helpful! I've been stuck on this for a few days now. Super duper appreciate it!
Buffers are not files; buffers are stored in application memory, and in this case they are instances of the ByteBuffer class. ExoPlayer passes buffers through instances of MediaCodecRenderer via the processOutputBuffer() method.
Buffers are usually arrays of bytes (or possibly some other kind of data), while the ByteBuffer class adds some helpful methods around them, for tracking the buffer's size, its last accessed position via markers, and so on.
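A tiny illustration of that bookkeeping (java.nio.ByteBuffer, standard API):

ByteBuffer buf = ByteBuffer.allocate(1024); // fixed-capacity buffer in memory
buf.put((byte) 42);              // writing advances the position marker
buf.flip();                      // switch to reading: limit = position, position = 0
byte first = buf.get();          // reading advances the position again
int left = buf.remaining();      // bytes left between position and limit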
The way I access buffers is by extending the renderer implementation that I am using and overriding processOutputBuffer(), like this:
public class CustomMediaCodecAudioRenderer extends MediaCodecAudioRenderer
{
    // MediaCodecAudioRenderer has no no-arg constructor, so one is needed here;
    // the exact super() signature varies between ExoPlayer versions
    public CustomMediaCodecAudioRenderer( MediaCodecSelector mediaCodecSelector )
    {
        super( mediaCodecSelector );
    }

    @Override
    protected boolean processOutputBuffer( long positionUs, long elapsedRealtimeUs, MediaCodec codec, ByteBuffer buffer, int bufferIndex, int bufferFlags, long bufferPresentationTimeUs, boolean shouldSkip ) throws ExoPlaybackException
    {
        boolean fullyProcessed;

        // Here you use the buffer
        doSomethingWithBuffer( buffer );

        // Here we allow the renderer to do its normal work
        fullyProcessed = super.processOutputBuffer( positionUs,
                                                    elapsedRealtimeUs,
                                                    codec,
                                                    buffer,
                                                    bufferIndex,
                                                    bufferFlags,
                                                    bufferPresentationTimeUs,
                                                    shouldSkip );

        return fullyProcessed;
    }
}
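To make ExoPlayer use the custom renderer, you can construct the player with it yourself. A sketch for the ExoPlayer 2.x era (factory method signatures differ between versions, so treat this as an assumption to check against your version):

Renderer audioRenderer = new CustomMediaCodecAudioRenderer( MediaCodecSelector.DEFAULT );
ExoPlayer player = ExoPlayerFactory.newInstance(
        new Renderer[] { audioRenderer },
        new DefaultTrackSelector() );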
The video decoding code of the app is typical, just like the example code in the MediaCodec documentation; nothing special. The configuration statement looks like the following:
myMediaCodec.configure(myMediaFormat, mySurface, null, 0);
Everything works fine. However, if I change the above code to the following to decode the video to a buffer instead of a surface:
myMediaCodec.configure(myMediaFormat, null, null, 0);
then the following code:
int iOutputBufferIndex = myMediaCodec.dequeueOutputBuffer(myBufferInfo, 100000);
will always return MediaCodec.INFO_TRY_AGAIN_LATER. Even more strangely, any subsequent call to myMediaCodec.stop() or myMediaCodec.release() will hang (i.e. the call never returns, and no exception is thrown).
This happens on a generic (AGPTek) tablet (Allwinner A31S, 1.5 GHz Cortex-A7 quad core). On an emulator and on another tablet (Asus Memo Pad), everything works fine.
Any tip to help get around this problem would be appreciated.
Do you provide one single input buffer's worth of data before trying this, or do you pass as many packets as you can, until dequeueInputBuffer also blocks or returns INFO_TRY_AGAIN_LATER? A decoder might not output data after only one packet of input (if the decoder has some delay), but if it works with Surface output it should probably behave the same way here.
If that (queueing as many input buffers as possible) doesn't work, I would say that this sounds like a decoder bug.
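A sketch of that input-feeding pattern (readNextPacket() and presentationTimeUs are placeholders for however you obtain encoded frames, e.g. from a MediaExtractor):

ByteBuffer[] inputBuffers = myMediaCodec.getInputBuffers();
while (true) {
    int inIndex = myMediaCodec.dequeueInputBuffer(10000);
    if (inIndex < 0) {
        break; // no free input buffer right now (INFO_TRY_AGAIN_LATER)
    }
    inputBuffers[inIndex].clear();
    int size = readNextPacket(inputBuffers[inIndex]); // fill with one encoded packet
    myMediaCodec.queueInputBuffer(inIndex, 0, size, presentationTimeUs, 0);
}
// Only now try to pull decoded output
int outIndex = myMediaCodec.dequeueOutputBuffer(myBufferInfo, 100000);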
I'm using AudioManager.adjustSuggestedStreamVolume(int direction, int suggestedStreamType, int flags) with suggestedStreamType set to AudioManager.USE_DEFAULT_STREAM_TYPE. What this does is decide which stream type (STREAM_RING, STREAM_MUSIC, etc.) is the most relevant for a given context (my app) and adjust its volume accordingly. My question is: is there a way to "resolve" which stream is actually the most relevant in a given context?
I've been searching through the Android source code, there is a method AudioService.getActiveStreamType(int suggestedStreamType) which does exactly what I want, but it's marked private so I can't use it in my app.
Any ideas?
If this is all for adjusting the stream volume in your app, you're best off manually setting it during creation with:
activity.setVolumeControlStream(AudioManager.STREAM_MUSIC);
(Or any other stream type)
and then adjusting that stream as appropriate.
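For example, in the activity's onCreate:

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // While this activity is in the foreground, the hardware volume
    // keys will adjust STREAM_MUSIC
    setVolumeControlStream(AudioManager.STREAM_MUSIC);
}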
I have some audio data (raw AAC) inside a byte array for playback. During playback, I need to get its volume/amplitude to draw (something like an audio wave when playing).
What I'm thinking now is to get the volume/amplitude of the current audio every 200 milliseconds and use that for drawing (using a canvas), but I'm not sure how to do that.
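The drawing side seems straightforward, e.g. polling on a Handler every 200 ms (a sketch; getCurrentAmplitude() and waveView are hypothetical, and the former is exactly the part I don't know how to implement):

final Handler handler = new Handler();
handler.post(new Runnable() {
    @Override
    public void run() {
        float amplitude = getCurrentAmplitude(); // the missing piece
        waveView.addPoint(amplitude);            // hypothetical custom view
        handler.postDelayed(this, 200);          // repeat every 200 ms
    }
});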
** Update 2011/07/13 **
Sorry, I was delayed by another project until now.
What I tried was running the following code in a thread while playing my AAC audio:
AudioManager audio = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
while (true)
{
    // int v = audio.getStreamVolume(AudioManager.MODE_NORMAL);
    // int v = audio.getStreamVolume(AudioManager.STREAM_MUSIC);
    int v = audio.getStreamVolume(AudioManager.STREAM_DTMF);
    // Tried the 3 settings above
    Log.i(HiCardConstants.TAG, "Volume - " + v);
    try { Thread.sleep(200); }
    catch (InterruptedException ie) {}
}
But I only get a fixed value back, not a dynamic volume...
And I also found a class named Visualizer, but unfortunately, my target platform is Android 2.2 ... :-(
Any suggestions are welcome :-)
After days and nights, I found that an Android app project called ringdroid can solve my problem.
It helps me get an array of audio gain values, which I can use to draw my sound wave.
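In case it helps anyone, this is the kind of calculation I draw with (a minimal sketch, assuming you already have a window of decoded PCM samples, e.g. derived from ringdroid's frame gains or a decoder):

// RMS amplitude of one window of 16-bit PCM samples, to plot every ~200 ms
static double rmsAmplitude(short[] pcm, int offset, int length) {
    long sumSquares = 0;
    for (int i = offset; i < offset + length; i++) {
        sumSquares += (long) pcm[i] * pcm[i];
    }
    return Math.sqrt(sumSquares / (double) length);
}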
BTW, in my experience, some .AMR or .MP3 files can't be parsed correctly, due to their bitrate being too low...