I'm using an extra thread for AudioTrack to play back a wave file I have previously recorded in another activity.
When I start playback, I get the following output in LogCat:
09-13 09:13:13.207: WARN/Archive(1066): audioplayThread is running..., state is: READY
09-13 09:13:21.187: WARN/Archive(1066): wave file /mnt/sdcard/myrecording/myrecording_3_12-09-2011_14-14.wav
09-13 09:13:22.582: WARN/Archive(1066): audioPlayer bufferSize 4090
09-13 09:13:22.582: WARN/Archive(1066): audioPlayer wave length 206432
09-13 09:13:22.601: DEBUG/AudioHardware(75): AudioHardware pcm playback is exiting standby.
09-13 09:13:22.601: DEBUG/AudioHardware(75): openPcmOut_l() mPcmOpenCnt: 0
09-13 09:13:22.683: WARN/AudioFlinger(75): write blocked for 82 msecs, 1 delayed writes, thread 0xcb08
09-13 09:13:24.816: WARN/Archive(1066): playbackHeadposition is: 49591
09-13 09:13:24.816: WARN/Archive(1066): audiostate is: 3
09-13 09:13:26.089: WARN/AudioTrack(1066): obtainBuffer() track 0x298b18 disabled, restarting
09-13 09:13:27.214: WARN/AudioTrack(1066): obtainBuffer() track 0x298b18 disabled, restarting
09-13 09:13:28.332: WARN/AudioTrack(1066): obtainBuffer() track 0x298b18 disabled, restarting
09-13 09:13:29.457: WARN/AudioTrack(1066): obtainBuffer() track 0x298b18 disabled, restarting
...
The key line in the logcat is: obtainBuffer() track 0x298b18 disabled, restarting.
It means the AudioTrack has run out of data: the input stream stopped delivering samples, the track's buffer underran, and playback stalls waiting for more audio. Check your input stream and your read method, and make sure you read and write the audio data continuously so the track never starves.
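A minimal sketch of what "continuously" means here, using plain java.io; the AudioTrack call is shown only as a comment, since `track` would be a hypothetical MODE_STREAM instance:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class FeedLoop {
    // Read the source in fixed-size chunks until EOF, handing each chunk on
    // immediately. Returns the total number of bytes consumed.
    static int feed(InputStream in, int bufferSize) throws IOException {
        byte[] buffer = new byte[bufferSize];
        int total = 0;
        int n;
        while ((n = in.read(buffer)) > 0) {
            // track.write(buffer, 0, n);  // hypothetical AudioTrack in MODE_STREAM
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // 206432 bytes (the wave length from the log above) fed in 4090-byte chunks.
        InputStream in = new ByteArrayInputStream(new byte[206432]);
        System.out.println(feed(in, 4090)); // prints 206432
    }
}
```

The point is simply that the loop never pauses between chunks: as soon as one buffer is written, the next read begins.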
I've had the same problem before. I used the Vitamio bundle, which is a counterpart of Android's MediaPlayer with broader audio support. Try adding this permission to your manifest:
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
I hope this helped.
I have observed the same issue when the number of written samples is not divisible by the number of channels.
For example:
number of channels: 2
mAudioTrack.write(buffer, 0, 75);
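A simple guard against this (a sketch; alignToFrames is a name introduced here, not an Android API) is to round the write length down to a whole number of frames, i.e. a multiple of the channel count:

```java
public class FrameAlign {
    // Round a sample count down to a multiple of the channel count, so that
    // AudioTrack.write() is always handed whole frames.
    static int alignToFrames(int samples, int channels) {
        return samples - (samples % channels);
    }

    public static void main(String[] args) {
        // With 2 channels, a 75-sample write would be trimmed to 74:
        System.out.println(alignToFrames(75, 2)); // prints 74
        // mAudioTrack.write(buffer, 0, alignToFrames(75, 2));  // hypothetical use
    }
}
```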
Related
I'm playing a DAI-supported DASH stream with ExoPlayer 2.18.1 (on Fire TV 4K), where the multi-period manifest gets stitched with the ad period. The ad period sometimes has Dolby (stereo or 5.1) audio tracks, and when playback switches from live to the ad we see some video freezes and then a complete playback freeze at the end of the ad. The adb logcat at this time shows the following spurious-audio-timestamp logs:
09-21 17:34:57.312 19192 23498 W AudioTrack: Spurious audio timestamp (frame position mismatch): 28270119, 10923744015, 10923743917, 189528000, 9827328, 9827328
09-21 17:34:57.816 19192 23498 W AudioTrack: Spurious audio timestamp (frame position mismatch): 28294325, 10924248296, 10924248198, 189528000, 9827328, 9827328
09-21 17:34:58.211 19192 19231 V SessionStateManager: 2022-09-21 17:34:58,211 - Thread: [main] - player time 1663761496000ms program start time 1663759800000ms media session time 1696000ms
09-21 17:34:58.211 19192 19231 V SessionStateManager: 2022-09-21 17:34:58,211 - Thread: [main] - com.mobitv.client.connect.core.media.session.MobiMediaSession#34d0ee4 set media session playback state to PLAYING time 1696000ms
09-21 17:34:58.211 19192 19231 V SessionStateManager: 2022-09-21 17:34:58,211 - Thread: [main] - setting playback state PlaybackState {state=3, position=1696000, buffered position=0, speed=1.0, updated=10924643, actions=8963, error code=0, error message=null, custom actions=[], active item id=-1}
09-21 17:34:58.213 739 932 V Avrcp : MediaController playback changed: PlaybackState {state=3, position=1696000, buffered position=0, speed=1.0, updated=10924643, actions=8963, custom actions=[], active item id=-1, error=null}
09-21 17:34:58.322 19192 23498 W AudioTrack: Spurious audio timestamp (frame position mismatch): 28318587, 10924753767, 10924753671, 189528000, 9827328, 9827328
One interesting thing to note is that this exact ad plays well when played individually, i.e. like a DASH VoD asset, but not when stitched with the DASH live period.
Some more information from the stitched manifest that might help. The live period contains one video and one audio track:
<Representation id="L_5000_W" codecs="avc1.64001f" bandwidth="5120000" width="1280" height="720" frameRate="60/2" sar="1:1">
<SegmentTemplate initialization="$RepresentationID$/init.mp4" media="$RepresentationID$/$Time$.m4s" timescale="90000" presentationTimeOffset="150764576748000">
...
</SegmentTemplate>
</Representation>
<Representation id="L_384_ENG" codecs="ac-3" bandwidth="384000" audioSamplingRate="48000">
<AudioChannelConfiguration schemeIdUri="tag:dolby.com,2014:dash:audio_channel_configuration:2011" value="F801"/>
<SegmentTemplate initialization="$RepresentationID$/init.mp4" media="$RepresentationID$/$Time$.m4s" timescale="48000" presentationTimeOffset="80407774265537">
...
</SegmentTemplate>
</Representation>
The ad period also contains one video and one audio track:
<Representation id="A_5000_W" codecs="avc1.4d4028" bandwidth="5120000" width="1920" height="1080" frameRate="60/2" sar="1:1">
<SegmentTemplate initialization="/prod-ad-1/$RepresentationID$/init.mp4" media="/prod-ad-1/$RepresentationID$/$Number$.m4s" timescale="90000" startNumber="1">
...
</SegmentTemplate>
</Representation>
<Representation id="A_384_ENG" codecs="ac-3" bandwidth="384000" audioSamplingRate="48000">
<AudioChannelConfiguration schemeIdUri="tag:dolby.com,2014:dash:audio_channel_configuration:2011" value="F801"/>
<SegmentTemplate initialization="/prod-ad-1/$RepresentationID$/init.mp4" media="/prod-ad-1/$RepresentationID$/$Number$.m4s" timescale="48000" startNumber="1">
...
</SegmentTemplate>
</Representation>
As of now I'm struggling to understand the exact reason for this playback problem. Can anyone please help me understand this issue?
Edit: FireTV stick 4K with Fire OS 6.2.9.3 (NS6293/4731)
My app records audio from the phone's microphones and does some real-time processing on it. It works fine on physical devices, but acts "funny" in the emulator. It records something, but I'm not quite sure what.
On the emulator the audio samples appear to be read at about double the rate of actual devices. The app has a visual progress widget (a horizontally moving recording head), which moves about twice as fast in the emulator.
Here is the recording loop:
int FREQUENCY = 44100;
int BLOCKSIZE = 110;
int bufferSize = AudioRecord.getMinBufferSize(FREQUENCY,
        AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT) * 10;
AudioRecord audioRecord = new AudioRecord(MediaRecorder.AudioSource.CAMCORDER,
        FREQUENCY, AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT,
        bufferSize);
short[] signal = new short[BLOCKSIZE * 2]; // Times two for stereo
audioRecord.startRecording();
while (!isCancelled()) {
    int bufferReadResult = audioRecord.read(signal, 0, BLOCKSIZE * 2);
    if (bufferReadResult != BLOCKSIZE * 2)
        throw new RuntimeException("Recorded less than BLOCKSIZE x 2 samples: "
                + bufferReadResult);
    // process the `signal` array here
}
audioRecord.stop();
audioRecord.release();
The audio source is set to CAMCORDER and it records in stereo. The idea is that if the phone has multiple microphones, the app will process data from both and use whichever has the better SNR. But I have the same problem when recording mono from AudioSource.MIC. The data is read in a while loop; I am assuming that audioRecord.read() is a blocking call and will not let me read the same data twice.
The recorded data looks OK – the record buffer contains 16-bit PCM samples for two channels. The loop just seems to run at twice the speed it does on real devices, which leads me to think the emulator may be using a higher sampling rate than the specified 44100 Hz. However, querying audioRecord.getSampleRate() returns the correct value.
Also there are some interesting audio related messages in logcat while recording:
07-13 12:22:02.282 1187 1531 D AudioFlinger: mixer(0xf44c0000) throttle end: throttle time(154)
(...)
07-13 12:22:02.373 1187 1817 E audio_hw_generic: Error opening input stream format 1, channel_mask 0010, sample_rate 16000
07-13 12:22:02.373 1187 3036 I AudioFlinger: AudioFlinger's thread 0xf3bc0000 ready to run
07-13 12:22:02.403 1187 3036 W AudioFlinger: RecordThread: buffer overflow
(...)
07-13 12:22:24.792 1187 3036 W AudioFlinger: RecordThread: buffer overflow
07-13 12:22:30.677 1187 3036 W AudioFlinger: RecordThread: buffer overflow
07-13 12:22:37.722 1187 3036 W AudioFlinger: RecordThread: buffer overflow
I'm using an up-to-date Android Studio and Android SDK, and I have tried emulator images running API levels 21-24. My dev environment is Ubuntu 16.04.
Has anybody experienced something similar?
Am I doing something wrong in my recording loop?
I suspect it is caused by AudioFormat.CHANNEL_IN_STEREO. A device microphone is typically a mono audio source. If for some reason the emulator supports stereo, you will receive twice as much data on the emulator (one stream per channel). To verify this, switch to AudioFormat.CHANNEL_IN_MONO, which is guaranteed to work on all devices, and see whether you then receive the expected amount of data on the emulator.
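The factor of two is easy to sanity-check with raw-PCM arithmetic (a worked example, not device-specific): at the same sample rate and 16-bit encoding, stereo delivers exactly twice the bytes per second of mono.

```java
public class PcmRate {
    // Raw PCM data rate in bytes per second: sampleRate * channels * bytesPerSample.
    static int bytesPerSecond(int sampleRate, int channels, int bytesPerSample) {
        return sampleRate * channels * bytesPerSample;
    }

    public static void main(String[] args) {
        int mono = bytesPerSecond(44100, 1, 2);   // 88200 B/s
        int stereo = bytesPerSecond(44100, 2, 2); // 176400 B/s
        System.out.println(stereo / mono); // prints 2
    }
}
```

So a read loop consuming a fixed number of samples per iteration will complete twice as many iterations per second on a stereo stream, which matches the doubled speed of the progress widget.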
I am testing SoundPool on a Moto E running 5.1. It often starts with excellent latency, but then the audio begins hanging for a hundred milliseconds or more, with the following messages:
06-26 15:03:49.213 3865-9536/? E/DEBUG MESSAGE: Play Note BEFORE
06-26 15:03:49.331 299-876/? D/audio_hw_primary: out_set_parameters: enter: usecase(0: deep-buffer-playback) kvpairs: routing=8
06-26 15:03:49.331 299-876/? V/msm8916_platform: platform_get_output_snd_device: enter: output devices(0x8)
06-26 15:03:49.331 299-876/? V/msm8916_platform: platform_get_output_snd_device: exit: snd_device(headphones)
06-26 15:03:49.331 299-876/? D/audio_hw_extn: audio_extn_set_anc_parameters: anc_enabled:0
06-26 15:03:49.331 299-876/? E/soundtrigger: audio_extn_sound_trigger_set_parameters: str_params NULL
06-26 15:03:49.334 3865-9536/? E/DEBUG MESSAGE: Play Note AFTER
The DEBUG messages are mine. The others are system generated. Notice I am losing over 100ms. I checked my sample rate and it is good. It also doesn't happen for every note. May I ask if anyone is familiar with this type of error?
It is not an error. Your phone is entering sleep mode: music can be streamed with long buffers (via the deep_buffer output), and between buffer refills the CPU goes to sleep.
This is normal behavior intended to spare the battery.
A quick workaround is to comment out the section containing the AUDIO_OUTPUT_FLAG_DEEP_BUFFER flag in $system/etc/audio_policy.conf:
deep_buffer {
    sampling_rates 8000|11025|12000|16000|22050|24000|32000|44100|48000
    channel_masks AUDIO_CHANNEL_OUT_STEREO
    formats AUDIO_FORMAT_PCM_16_BIT
    devices AUDIO_DEVICE_OUT_EARPIECE|AUDIO_DEVICE_OUT_SPEAKER|AUDIO_DEVICE_OUT_WIRED_HEADSET|AUDIO_DEVICE_OUT_WIRED_HEADPHONE|AUDIO_DEVICE_OUT_ALL_SCO|AUDIO_DEVICE_OUT_AUX_DIGITAL|AUDIO_DEVICE_OUT_PROXY|AUDIO_DEVICE_OUT_LINE
    flags AUDIO_OUTPUT_FLAG_DEEP_BUFFER
}
Alternatively, use a tool like this one: https://forum.xda-developers.com/apps/magisk/module-universal-deepbuffer-remover-t3577067
I have a MusicService for media playback, which uses a MediaPlayer with these settings:
player.setWakeMode(getApplicationContext(), PowerManager.PARTIAL_WAKE_LOCK);
player.setAudioStreamType(AudioManager.STREAM_MUSIC);
and these listeners are set:
OnPreparedListener, OnCompletionListener, OnErrorListener, OnSeekCompleteListener
The MediaPlayer is used for MP3 playback. When one song is finished, onCompletion is called. Then PlayNext is called, which resets the MediaPlayer and sets the data source to the URI of the next track. The Uri is built with:
Uri trackUri = ContentUris
        .withAppendedId(android.provider.MediaStore.Audio.Media.EXTERNAL_CONTENT_URI, songId);
Then the player is prepared and started. This works fine, but sometimes, when the device is locked and has played about 1-3 songs while locked, the player doesn't prepare the next song until I hit the power button. I figured out that if I don't hit the power button, it takes the player about 2 minutes to prepare.
I've logged everything now and added a few custom outputs with Log.e(...).
This is logged before prepare() is called (prepareAsync() delivers the same result):
E/MusicService: Preparing now...
This is logged when onPrepared is called:
E/MusicService: Player prepared.
So this is the full device output after "Preparing now..." is logged:
04-02 13:54:55.506 12517-12517/at.htlleonding.musync E/MusicService: Preparing now.
04-02 13:54:55.525 811-888/? E/libsuspend: Error writing to /sys/power/state: Device or resource busy
04-02 13:54:55.544 246-14756/? D/offload_visualizer: thread exit
04-02 13:54:55.546 246-14754/? V/offload_effect_bundle: offload_effects_bundle_hal_stop_output output 1879 pcm_id 9
04-02 13:54:55.546 246-14754/? D/hardware_info: hw_info_append_hw_type : device_name = speaker
04-02 13:54:55.549 246-14752/? E/audio_hw_primary: offload_thread_loop: Compress handle is NULL
04-02 13:54:55.549 246-924/? D/audio_hw_primary: adev_close_output_stream: enter:stream_handle(0xb5bfa640)
04-02 13:54:55.549 246-924/? D/audio_hw_primary: out_standby: enter: stream (0xb5bfa640) usecase(3: compress-offload-playback)
04-02 13:54:55.555 246-924/? W/AudioFlinger: moveEffectChain_l() effect chain for session 0 not on source thread 0xb59fa000
04-02 13:54:55.611 246-15030/? I/FFmpegExtractor: android-source:0xb1834060
04-02 13:54:55.820 811-888/? E/libsuspend: Error writing to /sys/power/state: Device or resource busy
04-02 13:54:55.972 246-15030/? I/FFMPEG: [mp3 # 0xae2f4400] Skipping 0 bytes of junk at 2177007.
... Then there's no output until I hit the power button, at which point the song is prepared.
If anyone is interested in the full output from hitting the power button until "Player prepared" is logged, I created a Gist here.
Sidenote:
While the app is in use, a few album covers are displayed in some fragments. They are loaded with Picasso, so I don't need to worry about memory caching. Some ImageViews are filled without Picasso (for example, the ImageViews that hold the drawables of my player controls). Maybe there are problems with memory/resources?
I might have found the answer in another thread, where someone faced the same problem while streaming music, here.
My final solution is to use a WakeLock, which I acquire before preparing a song and release again in onPrepared, onError, and onDestroy of my Service. It's important to release it again to save battery, and to check whether the WakeLock is held before releasing it.
I create my WakeLock like this in onCreate of my Service:
PowerManager pm = (PowerManager) getApplicationContext().getSystemService(Context.POWER_SERVICE);
wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "MusicService");
acquire:
wakeLock.acquire();
release:
if (wakeLock.isHeld())
    wakeLock.release();
I've tested playback for about 10 minutes now and it hasn't stopped. I don't know whether this is a very battery-saving solution, though.
I'm trying to produce short sequential mp4 files from CameraPreview data via MediaCodec.createInputSurface(). However, recreating the MediaCodec and its associated Surface requires stopping the Camera to allow another call to mCamera.setPreviewTexture(...). This delay results in an unacceptable number of dropped frames.
Therefore I need to generate the CODEC_CONFIG and END_OF_STREAM data periodically without recreating the input Surface, and thus having to call mCamera.setPreviewTexture(...). Is this possible assuming the MediaFormat is unchanged?
(I'm adapting fadden's CameraToMpegTest example. My complete code is here)
Unsuccessful attempts:
Calling MediaCodec.signalEndOfInputStream(), draining the MediaCodec, and then calling MediaCodec.flush() between chunks produces an IllegalStateException on the 2nd call to MediaCodec.signalEndOfInputStream().
Calling MediaCodec.signalEndOfInputStream(), draining the MediaCodec, and then calling MediaCodec.stop(); MediaCodec.configure(...), MediaCodec.start() between chunks without again calling MediaCodec.createInputSurface() produces the following error:
09-30 13:12:49.889 17638-17719/x.xx.xxxx E/Surface﹕ queueBuffer: error queuing buffer to SurfaceTexture, -19
09-30 13:12:49.889 17638-17719/x.xx.xxxx E/IMGSRV﹕ :0: UnlockPostBuffer: Failed to queue buffer 0x592e1e70
09-30 13:12:49.889 17638-17719/x.xx.xxxx E/CameraToMpegTest﹕ Encoding loop exception!
09-30 13:12:49.889 17638-17719/x.xx.xxxx W/System.err﹕ java.lang.RuntimeException: eglSwapBuffers: EGL error: 0x300b
09-30 13:12:49.896 17638-17719/x.xx.xxxx W/System.err﹕ at x.xx.xxxx.ChunkedHWRecorder$CodecInputSurface.checkEglError(ChunkedHWRecorder.java:731)
09-30 13:12:49.896 17638-17719/x.xx.xxxx W/System.err﹕ at x.xx.xxxx.ChunkedHWRecorder$CodecInputSurface.swapBuffers(ChunkedHWRecorder.java:713)
09-30 13:12:49.896 17638-17719/x.xx.xxxx W/System.err﹕ at x.xx.xxxx.ChunkedHWRecorder.startRecording(ChunkedHWRecorder.java:164)
09-30 13:12:49.896 17638-17719/x.xx.xxxx W/System.err﹕ at x.xx.xxxx.HWRecorderActivity$CameraToMpegWrapper.run(HWRecorderActivity.java:76)
09-30 13:12:49.896 17638-17719/x.xx.xxxx W/System.err﹕ at java.lang.Thread.run(Thread.java:841)
Solved. Thanks, fadden. The complete solution source is here.
The signalEndOfInputStream() call updates the state of various layers in the MediaCodec stack. You can get some sense of what operations are valid from the comments above the tests in MediaCodecTest, but by and large the behavior of MediaCodec is simply not defined for "unusual" uses.
So you have to look at the code. The lifetime of the input surface is tied to that of the OMXNodeInstance; it's represented by GraphicBufferSource. Once you signal EOS, the GraphicBufferSource will ignore additional frames (see line 426). There's no way to reset the EOS flag without tearing down the GraphicBufferSource, but when you do that it disconnects the buffer queue that underlies the Surface.
So I don't think you're going to be able to stop/restart the MediaCodec and continue to use the Surface.
However... you shouldn't need to. CameraToMpegTest routes the camera preview to a SurfaceTexture, and then renders the texture onto the encoder's input surface with GLES. The SurfaceTexture is decoupled from the encoder and shouldn't need to change. I think what needs to change is CodecInputSurface, which calls eglCreateWindowSurface() with the Surface from the MediaCodec to tell GLES where to draw. If you add a new "update Surface" API there (destroy old EGLSurface, create new EGLSurface, eglMakeCurrent), and call it whenever you spin up a new MediaCodec, I think it'll all just work.
Update to address comments:
It's important that you only change the EGLSurface. The checkAndUpdateEglStateLocked() function in GLConsumer.cpp checks to make sure the EGLDisplay and EGLContext don't change once they've been set. You can't call release()/eglSetup() in CodecInputSurface because it changes the EGLContext. You just want to destroy and recreate the EGLSurface.