AndEngine RenderTexture Exception: GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT - android

I have developed an Android game which is played by many people. One user out of 100-200 faces an Exception that I cannot make any sense of.
I use a RenderTexture which throws the following Exception when I try to initialize it:
Fatal Exception: org.andengine.opengl.exception.RenderTextureInitializationException
org.andengine.opengl.exception.GLFrameBufferException: GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT
It works on 99% of all devices. The init-method looks like this:
public void init(final GLState pGLState) throws GLFrameBufferException, GLException {
    this.savePreviousFramebufferObjectID(pGLState);

    try {
        this.loadToHardware(pGLState);
    } catch (final IOException e) {
        /* Cannot happen. */
    }

    /* The texture to render to must not be bound. */
    pGLState.bindTexture(0);

    /* Generate FBO. */
    this.mFramebufferObjectID = pGLState.generateFramebuffer();
    pGLState.bindFramebuffer(this.mFramebufferObjectID);

    /* Attach texture to FBO. */
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, this.mHardwareTextureID, 0);

    try {
        pGLState.checkFramebufferStatus();
    } catch (final GLException e) {
        this.destroy(pGLState);
        throw new RenderTextureInitializationException(e);
    } finally {
        this.restorePreviousFramebufferObjectID(pGLState);
    }

    this.mInitialized = true;
}
It seems like something is wrong with the FrameBuffer-Status...
Update
A list of phones where the crash happened so far:
Sony - Sony Tablet S
TCT - ALCATEL ONE TOUCH 5020A
TCT - ALCATEL ONE TOUCH 6030N
VNPT Technology - VNPT Technology Smart Box
Q-Smart - S32
LGE - LG-E465g
LGE - LG-D682TR
LGE - LG-E451g
LGE - LG-D686
LGE - LG-E470f
HUAWEI - MediaPad 7 Youth
unknown - Bliss Pad B9712KB
samsung - GT-P5110
samsung - GT-I9505
samsung - Galaxy Nexus
samsung - GT-P3110
samsung - GT-P5100
samsung - GT-P3100
samsung - GT-I9105P
samsung - GT-I9082
samsung - GT-I9082L
samsung - GT-I9152
samsung - GT-P3113
E1A - E1A
LNV - LN1107
motorola - XT920
motorola - XT915
asus - ME172V

Based on the code you linked to, it looks like you are trying to render to an RGBA8888 texture. This isn't always available on OpenGL ES 2.0 devices, as the API dates back to a time when most devices were using 16-bit displays.
The only mandatory formats in OpenGL ES 2.x are documented in the specification under the error code you are getting:
https://www.khronos.org/opengles/sdk/docs/man/xhtml/glCheckFramebufferStatus.xml
RGBA8 render targets are only available if this extension is exposed:
https://www.khronos.org/registry/gles/extensions/OES/OES_rgb8_rgba8.txt
... so it's highly likely that some of your users are on an older device with a GPU which doesn't expose this extension. To check whether the extension is supported, use glGetString with GL_EXTENSIONS:
https://www.khronos.org/opengles/sdk/docs/man/xhtml/glGetString.xml
... and see if OES_rgb8_rgba8 is present in the list. If it isn't, your only real option is to fall back to something else in the mandatory ES 2.x format set, such as RGB565 or RGB5_A1.
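To sketch that check: the string returned by glGetString(GL_EXTENSIONS) is a space-separated list, so a simple token comparison is enough. Below is a minimal, Android-independent sketch; the class and method names are mine, and the example extension strings are made up, but the extension name GL_OES_rgb8_rgba8 comes from the registry above.

```java
public class RgbaSupportCheck {
    // Returns true if the given extension appears in a space-separated
    // extension string such as the one from glGetString(GL_EXTENSIONS).
    static boolean hasExtension(String extensionList, String name) {
        if (extensionList == null) return false;
        for (String ext : extensionList.split(" ")) {
            if (ext.equals(name)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        // Hypothetical extension strings from two devices.
        String newerGpu = "GL_OES_rgb8_rgba8 GL_OES_depth24 GL_OES_texture_npot";
        String olderGpu = "GL_OES_compressed_ETC1_RGB8_texture GL_OES_depth24";

        System.out.println(hasExtension(newerGpu, "GL_OES_rgb8_rgba8")); // true
        System.out.println(hasExtension(olderGpu, "GL_OES_rgb8_rgba8")); // false
        // If false, fall back to a mandatory format such as RGB565 or RGB5_A1.
    }
}
```

Note the exact-token comparison rather than a substring search: a plain contains() would also match longer extension names that merely embed the queried one.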

Related

MediaRecorder and VideoSource.SURFACE, stop failed: -1007 (a serious Android bug)

I'm trying to use MediaRecorder without a Camera instance, using a Surface video source instead (yes, it's possible, but it turns out not to be that reliable) - mediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
The issue in short:
The code below works on some devices, works only temporarily on others (right after a device reboot), or doesn't work at all.
When it fails, the MediaRecorder.stop() method throws the following error:
E/MediaRecorder: stop failed: -1007 W/System.err:
java.lang.RuntimeException: stop failed. at
android.media.MediaRecorder.stop(Native Method)
The recorded MP4 file is only a few kilobytes in size and cannot be played.
Tested devices:
works on Lenovo P2, Xiaomi Mi A1
doesn't work on Xiaomi Redmi 5, Sony Xperia, Xiaomi Redmi 4 Prime
Also see the comments in my code for more detail on the issue.
new Thread(() -> {
    MediaRecorder mediaRecorder = new MediaRecorder();
    File file = new File(Environment.getExternalStorageDirectory()
            + File.separator + "test_media_recorder_surface_source.mp4");
    if (file.exists()) {
        file.delete();
    }
    mediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
    mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    mediaRecorder.setOutputFile(file.getAbsolutePath());
    mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    mediaRecorder.setVideoSize(1280, 720);
    mediaRecorder.setCaptureRate(24);
    try {
        mediaRecorder.prepare();
        int sleepTime = 1000 / 24;
        Surface surface = mediaRecorder.getSurface();
        mediaRecorder.start();
        // record something (we could also record frames here from onPreviewFrame byte arrays,
        // e.g. convert a raw frame byte[] to a Bitmap, maybe with OpenCV, and then draw the
        // bitmap on the canvas using canvas.drawBitmap(...))
        // here we record just a blue background...
        for (int i = 0; i < 120; i++) { // 5 seconds, 24 fps
            Canvas canvas = surface.lockCanvas(null);
            canvas.drawColor(Color.BLUE);
            surface.unlockCanvasAndPost(canvas);
            try {
                Thread.sleep(sleepTime);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        // on many devices stop fails with a RuntimeException (-1007 error code)
        // I guess it works 100% only on modern, powerful devices...
        mediaRecorder.stop();
        // E/MediaRecorder: stop failed: -1007
        // W/System.err: java.lang.RuntimeException: stop failed.
        // at android.media.MediaRecorder.stop(Native Method)
        // recorder.reset();
        mediaRecorder.release();
        // I get a file with a very small size (kilobytes) and it can't be played
        // ######## RESULTS ######
        // WORKS OK ON:
        // - Lenovo P2 (Android 7)
        // - Xiaomi Mi A1 (Android 8)
        // DOESN'T WORK ON (stop fails with -1007, small video file that can't be played):
        // - Xiaomi Redmi 5 (Android 7)
        // - Sony Xperia (I don't remember the exact model and Android OS)
        // - Xiaomi Redmi 4 Prime (Android 6) *
        // * p.s. on the Xiaomi Redmi 4 Prime it works for some time after rebooting the device;
        // if I leave the phone for a while and try again, it will fail again
        // until I reboot the device...
    } catch (Exception e) {
        e.printStackTrace();
    }
}).start();
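Incidentally, one source of irregular frame timing in the loop above is its fixed, truncated sleep interval (1000 / 24 = 41 ms), whose rounding error accumulates over the run. This does not explain the -1007 failures, but pacing frames against absolute deadlines is cheap to do. A minimal, Android-independent sketch (class and method names are mine):

```java
public class FramePacing {
    static final int FPS = 24;

    // Absolute deadline (in nanoseconds) for frame i, measured from startNanos.
    // Scheduling each frame against an absolute deadline avoids the drift that
    // accumulates when every frame just sleeps a fixed, truncated interval.
    static long frameDeadlineNanos(long startNanos, int frameIndex) {
        return startNanos + frameIndex * 1_000_000_000L / FPS;
    }

    public static void main(String[] args) {
        // 24 frames should span exactly one second, with no rounding drift.
        System.out.println(frameDeadlineNanos(0L, 24)); // 1000000000
        // Compare: 24 sleeps of the truncated 41 ms interval cover only 984 ms.
        System.out.println(24 * (1000 / FPS)); // 984
    }
}
```

In the recording loop, each iteration would then sleep until frameDeadlineNanos(start, i) rather than for a fixed sleepTime.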
UPDATE #1
Some progress on what the issue could be - it seems to be a codec issue (MP4/H264).
It works better with WEBM/VP8 - the videos can be played now, but something is wrong with the fps: the file properties show 1000.
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.WEBM);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.VP8);
Also, MediaRecorder doesn't record audio when using
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.VORBIS);
Check Android MediaRecorder crashes on stop when using MP4/H264 and a resolution bigger than 720p
This also happens when you use MediaRecorder and MediaProjection to record/capture the device screen (because that also uses a Surface...).
UPDATE #2
Yes, the VP8 codec seems to work fine, but there is one problem with the WebM container - NO AUDIO!
Android just doesn't support VORBIS/OGG audio encoding... https://developer.android.com/guide/topics/media/media-formats#audio-formats
I guess there is no solution.
So the answer: MediaRecorder/Android is buggy, or the device manufacturers didn't take care of all Android features while developing their devices.
Update
MediaCodec is also buggy with Canvas:
mSurface = mMediaCodec.createInputSurface();
mSurface.lockHardwareCanvas();
With MediaCodec it works on many more devices, but some devices may still fail to record video correctly using this method.
So the final answer: don't ever use lockCanvas or lockHardwareCanvas when working with MediaCodec or MediaRecorder - it's buggy.
The only reliable way is OpenGL ES.
Other links about the issue:
https://github.com/googlesamples/android-Camera2Video/issues/86
https://issuetracker.google.com/issues/111433520

Android microphone constantly gives 32639 or -32640 on newer devices

I've implemented code similar to this. I have a noise alert go off in the Log, but it always gives 32639 or -32640 regardless of the noise level outside.
short[] buffer = new short[minSize];
boolean thresholdMet = false;
int threshold = sliderThreshold.getProgress();
ar.read(buffer, 0, minSize); // ar is an AudioRecord instance
// Iterate through each chunk of amplitude data
// and check whether the amplitude exceeds the threshold
for (short s : buffer) {
    if (Math.abs(s) > threshold) {
        thresholdMet = true;
        Log.w("NoiseThreshold", String.valueOf(s));
        break;
    }
}
I've tested it on three phones (none of which are rooted):
Samsung Galaxy S3 (API 19)
HTC One M9 (API 23)
Samsung Galaxy S7 (API 24)
It works on the S3, but not the others. I've tried using Sensor Sense on the HTC and it doesn't work for the mic sensor. However, it used to work, and now seems to detect one sample every five seconds or so in the graph view.
Oddly enough, the microphone still works fine for phone calls and video recording on the malfunctioning phones.
You said it works on the S3, which is API 19, and doesn't on the devices with API >= 23. So it's possible that you have a problem with the runtime permissions introduced in API 23.
The new behaviour (for "old apps" that use the static permission model) is to return dummy data if the runtime permission is not granted.
Check out this answer:
Request permission for microphone on Android M
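The proper fix is to request the RECORD_AUDIO runtime permission, as the linked answer shows. As a quick diagnostic, you can also flag buffers that match the dummy-data pattern. The sketch below is a heuristic only, based on the 32639 / -32640 values reported in the question; the class and method names are mine:

```java
public class PinnedInputCheck {
    // Heuristic: the dummy samples sit at (or next to) full scale -
    // the 32639 / -32640 values from the question - while real
    // microphone input spends most of its time far below the rails.
    static boolean looksPinned(short[] buffer) {
        if (buffer.length == 0) return false;
        for (short s : buffer) {
            if (Math.abs(s) < 32639) return false; // real audio varies
        }
        return true;
    }

    public static void main(String[] args) {
        short[] stuck = {32639, -32640, 32639, -32640};
        short[] real = {12, -340, 87, 5021, -19};
        System.out.println(looksPinned(stuck)); // true
        System.out.println(looksPinned(real));  // false
    }
}
```

If looksPinned() keeps returning true, check whether the RECORD_AUDIO permission was actually granted at runtime rather than only declared in the manifest.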

Qt5, opengl es 2 extensions required

I am developing an Android application based on Qt 5.4/QtQuick 2 and OpenGL ES 2.
I installed the textureinsgnode example to check how well it runs on my target device (as my app makes massive use of FBOs), and I am getting 18 FPS. I checked on a Samsung SM-T535 and get around 47-48 FPS, and on the target my application appears to freeze when I try to perform any user action.
I checked the extensions available on both devices (my target and the Samsung tablet) with:
QOpenGLFramebufferObject *createFramebufferObject(const QSize &size) {
    QSet<QByteArray> extensions = QOpenGLContext::currentContext()->extensions();
    foreach (QByteArray extension, extensions) {
        qDebug() << extension;
    }

    QOpenGLFramebufferObjectFormat format;
    format.setAttachment(QOpenGLFramebufferObject::CombinedDepthStencil);
    format.setSamples(4);
    return new QOpenGLFramebufferObject(size, format);
}
And I am getting a very short list of extensions on the target device compared with the Samsung tablet's list at the same point:
"GL_OES_rgb8_rgba8"
"GL_EXT_multisampled_render_to_texture" "GL_OES_packed_depth_stencil"
"GL_ARM_rgba8" "GL_OES_vertex_half_float" "GL_OES_EGL_image"
"GL_EXT_discard_framebuffer" "GL_OES_compressed_ETC1_RGB8_texture"
"GL_OES_depth_texture" "GL_KHR_debug" "GL_ARM_mali_shader_binary"
"GL_OES_depth24" "GL_EXT_texture_format_BGRA8888"
"GL_EXT_blend_minmax" "GL_EXT_shader_texture_lod"
"GL_OES_EGL_image_external" "GL_EXT_robustness" "GL_OES_texture_npot"
"GL_OES_depth_texture_cube_map" "GL_ARM_mali_program_binary"
"GL_EXT_debug_marker" "GL_OES_get_program_binary"
"GL_OES_standard_derivatives" "GL_OES_EGL_sync"
I installed an NME (3.4.4, so based on OpenGL ES 1.1) application (BunnyMark) and I get around 45-48 FPS.
Based on these tests, I suspect the target device has some problem with OpenGL ES 2, but I have not been able to find anywhere (despite googling) which OpenGL ES 2 extensions Qt 5.4 requires to work properly.
The question: what are the OpenGL ES 2 extensions required by Qt 5.4, and is there a way to check for them?

ConsumerIrManager.transmit broken in Lollipop?

I upgraded my Samsung Galaxy S4 from latest KitKat to Lollipop (5.0.1) yesterday and my IR remote control app that I have used for months stopped working.
Since I was using a late copy of the KitKat ConsumerIrManager, the transmit() function was sending the number of pulses computed by the code below. It worked very nicely.
private void irSend(int freqHz, int[] pulseTrainInMicroS) {
    int[] pulseCounts = new int[pulseTrainInMicroS.length];
    for (int i = 0; i < pulseTrainInMicroS.length; i++) {
        // cast to long before multiplying to avoid int overflow
        long iValue = (long) pulseTrainInMicroS[i] * freqHz / 1000000;
        pulseCounts[i] = (int) iValue;
    }
    m_IRService.transmit(freqHz, pulseCounts);
}
When it stopped working yesterday, I began looking at it closely.
I noticed that the transmitted waveform bears no relationship to the requested pulse train; even the code below doesn't work correctly:
private void TestSend() {
    int[] pulseCounts = {100, 100, 100};
    m_IRService.transmit(38000, pulseCounts);
}
The resulting waveforms had many problems and so are entirely useless:
the waveforms were entirely wrong
the frequency was wrong and the pulse spacing was not regular
they were not repeatable
Looking at the demodulated waveform:
If my 100, 100, 100 were rendered correctly, I should have seen two pulses 2.6 ms long (before 4.4.3(?), 100 us). Instead I received (see attached) "[demodulated] not repeatable 1.BMP" and "[demodulated] not repeatable 2.BMP". Note that the waveform isn't 2 pulses... in fact, it's not even repeatable.
As for the captures below, the signal goes low when the IR is detected.
We should have seen two pulses going low for 2.6 ms, with 2.6 ms between them (see green line below).
I had also tried shorter pulses using 50, 50, 50 and observed that the first pulse isn't correct either (see below).
Looking at the modulated waveform:
the frequency was not correct; instead, it was about 18 kHz and irregular.
I'm quite experienced with this and have formal education in electronics.
It seems to me there's a bug in ConsumerIrManager.transmit( )...
curiously, the "WatchOn" application that comes with the phone still works.
thank you for any insights you can give.
Test equipment:
Tektronix TDS-2014B, 100 MHz, used in peak-detect mode.
As @IvanTellez says, a change was made in Android with respect to this functionality. Strangely, when I had it outputting simple IR signals (for troubleshooting purposes), the function behaved as shown above (erratically, with the wrong carrier frequency, etc.). When I eventually returned to normal types of IR signals, it worked correctly.
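The change in question altered the units transmit() expects, which is why legacy pulse tables need converting. The question's irSend() converts microseconds to carrier-cycle counts; the sketch below shows that conversion and its inverse, with a long cast to avoid int overflow on long marks. Class and method names are mine:

```java
public class IrPulseUnits {
    // Convert a pulse duration in microseconds to carrier-cycle counts,
    // as the question's irSend() does for the older transmit() behaviour.
    // The cast to long avoids int overflow for long marks at high carriers.
    static int microsToCount(int micros, int carrierHz) {
        return (int) ((long) micros * carrierHz / 1_000_000L);
    }

    // Inverse conversion: carrier-cycle counts back to microseconds,
    // for reusing legacy count-based pulse tables where transmit()
    // expects durations in microseconds.
    static int countToMicros(int count, int carrierHz) {
        return (int) ((long) count * 1_000_000L / carrierHz);
    }

    public static void main(String[] args) {
        // A 2632 us mark at a 38 kHz carrier is about 100 cycles
        // (matching the ~2.6 ms figure from the question).
        System.out.println(microsToCount(2632, 38000)); // 100
        System.out.println(countToMicros(100, 38000));  // 2631
    }
}
```

Note the round trip loses up to one unit to integer truncation, which is harmless at IR timescales.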

How to get CamcorderProfile.videoBitRate for an Android device?

My app uses HLS to stream video from a server, but when I request the HLS stream from the server I need to pass it the maximum video bitrate the device can handle. The Android API guides say that "a device's available video recording profiles can be used as a proxy for media playback capabilities," but when I try to retrieve the videoBitRate for the device's back-facing camera it always comes back as 12 Mb/s regardless of the device (Galaxy Nexus, Galaxy Tab Plus 7", Galaxy Tab 8.9), despite the fact that they have 3 different GPUs (PowerVR SGX540, Mali-400 MP, Tegra 250 T20). Here's my code; am I doing something wrong?
CamcorderProfile camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
targetVideoBitRate = camcorderProfile.videoBitRate;
If I try this on the Galaxy Tab Plus:
boolean hasProfile = CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_HIGH);
it returns True, despite the fact that QUALITY_HIGH is for 1080p recording and the specs say it can only record at 720p.
It looks like I've found the answer to my own question.
I didn't read the documentation closely enough, QUALITY_HIGH is not equivalent to 1080p it is simply a way of specifying the highest quality profile the device supports. Therefore, by definition, CamcorderProfile.hasProfile( CamcorderProfile.QUALITY_HIGH ) is always true. I should have written something like this:
public enum mVideoQuality {
    FullHD, HD, SD
}
mVideoQuality mMaxVideoQuality;
int mTargetVideoBitRate;

private void initVideoQuality() {
    if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_1080P)) {
        mMaxVideoQuality = mVideoQuality.FullHD;
    } else if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)) {
        mMaxVideoQuality = mVideoQuality.HD;
    } else {
        mMaxVideoQuality = mVideoQuality.SD;
    }

    CamcorderProfile cProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
    mTargetVideoBitRate = cProfile.videoBitRate;
}
Most of my devices still report support for 1080p encoding, which I'm skeptical of; however, I ran this code on a Sony Xperia Tipo (my low-end test device) and it reported a maximum encode quality of 480p with a videoBitRate of 720 Kb/s.
As I said, I'm not sure every device can be trusted, but I have seen a range of video bitrates from 720 Kb/s to 17 Mb/s and profile qualities from 480p to 1080p. Hopefully other people will find this information useful.
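Once the device's profile has been read this way, the reported videoBitRate can simply cap whatever bitrate is requested from the HLS server. A trivial sketch of that use (the method and parameter names are mine, illustrating the question's scenario rather than any real API):

```java
public class HlsBitrateCap {
    // Clamp the bitrate requested from the streaming server to the
    // device's reported encoder bitrate, used here as a playback proxy
    // as the question describes. Names are illustrative.
    static int requestedBitrate(int serverMaxBps, int deviceVideoBitRate) {
        return Math.min(serverMaxBps, deviceVideoBitRate);
    }

    public static void main(String[] args) {
        // e.g. the 720 Kb/s reported by the low-end test device above
        System.out.println(requestedBitrate(17_000_000, 720_000)); // 720000
        // a server offering less than the device can handle is unaffected
        System.out.println(requestedBitrate(500_000, 12_000_000)); // 500000
    }
}
```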
