I have created a simple image slider 'Live Wallpaper' that will call in an image upon DoubleTap. I know there have been many different OOM issues when dealing with images, in that the Android system must decode the image into a raw bitmap, which substantially increases its in-memory size.
My question is when I activate the Live Wallpaper from the Live Wallpaper Menu screen and view it in Preview mode, it works perfectly. I am able to browse through all my images without any lag or problems. However, when I go to SET the Live Wallpaper, it crashes with an OOM error. Any reason why this could be?
Preview mode is drawn on an opaque surface by itself, whereas once the wallpaper is set, your home screen draws icons, updates, and animations over it.
Have you optimized the graphics? That could help quite a bit ... something like http://trimage.org if you are using JPG/PNG.
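One caveat worth adding: optimizing the JPG/PNG only shrinks the file on disk, not the decoded bitmap. Android decodes to raw pixels, typically ARGB_8888 at 4 bytes per pixel, which is the "BMP" inflation the question mentions, and that cost depends only on pixel dimensions. A quick plain-Java sanity check (no Android dependencies):

```java
public class BitmapMemory {
    // Bytes used by a decoded ARGB_8888 bitmap: 4 bytes per pixel.
    static long decodedBytes(int width, int height) {
        return (long) width * height * 4;
    }

    public static void main(String[] args) {
        // A 2048x2048 wallpaper image takes 16 MB once decoded,
        // regardless of how small the JPEG/PNG file was on disk.
        System.out.println(decodedBytes(2048, 2048)); // 16777216
    }
}
```

So if the home screen needs extra memory on top of the wallpaper, a handful of full-resolution slides can push the process over its heap limit even though the files themselves are tiny.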
I have tried writing my own code for accessing the camera via the camera2 API on Android instead of using Google's example. On one hand, I've wasted way too much time understanding what exactly is going on, but on the other hand, I have noticed something quite weird:
I want the camera to produce vertical images. However, despite the fact that the ImageReader is initialized with height larger than width, the Image that I get in the onCaptureCompleted has the same size, except it is rotated 90 degrees. I was struggling trying to understand my mistake, so I went exploring Google's code. And what I have found is that their images are rotated 90 degrees as well! They compensate for that by setting a JPEG_ORIENTATION key in the CaptureRequestBuilder (if you comment that single line, the images that get saved will be rotated). This is unrelated to device orientation - in my case, screen rotation is disabled for the app entirely.
The problem is that for the purposes of the app I am making I need uncompressed, precise data from the camera, so since JPEG a) compresses images and b) does so lossily, I cannot use it. Instead I use the YUV_420_888 format, which I later convert to a Bitmap. But while the JPEG_ORIENTATION flag can fix the orientation for JPEG images, it seems to do nothing for YUV ones. So how do I get the images to be correctly rotated?
One obvious solution is to rotate the resulting Bitmap, but I'm unsure by what angle I should rotate it on different devices. And more importantly, what causes such strange behavior?
Update: rotating the Bitmap and scaling it to proper size takes way too much time for the preview (the context is as follows: I need both high-res images from camera to process and a downscaled version of these same images in preview. Let's just say I'm making something similar to QR code recognition). I have even tried using RenderScripts to manipulate the image efficiently, but this is still too long. Also, I've read here that when I set multiple output surfaces simultaneously, the same resolution will be used for all of them, which is quite bad for me.
Android stores every image in landscape orientation, regardless of whether it was taken in landscape or portrait. It also stores metadata that tells you whether the image should be displayed in portrait or landscape.
If you don't rotate the image according to that metadata, you will end up with every image in landscape. I had that problem too (but I wanted compression, so my solution doesn't work for you).
You need to read the metadata and rotate the image accordingly.
I hope this helps at least a bit.
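The rotation the metadata calls for can also be computed directly from the camera's sensor orientation and the device rotation, following the formula given in the CameraCharacteristics documentation. This is a plain-Java sketch of that formula; on a real device you would read sensorOrientation from CameraCharacteristics.SENSOR_ORIENTATION, facingFront from LENS_FACING, and then rotate the Bitmap by the result with a Matrix.

```java
public class CameraRotation {
    /**
     * Degrees to rotate a captured image so it displays upright,
     * per the formula in the CameraCharacteristics docs.
     *
     * @param sensorOrientation value of CameraCharacteristics.SENSOR_ORIENTATION
     *                          (typically 90 or 270 on phones)
     * @param deviceOrientation device rotation in degrees (0 for a portrait-locked app)
     * @param facingFront       true for the front-facing camera
     */
    static int rotationDegrees(int sensorOrientation, int deviceOrientation, boolean facingFront) {
        // Round device orientation to a multiple of 90
        deviceOrientation = (deviceOrientation + 45) / 90 * 90;
        // The front camera rotates the opposite way
        if (facingFront) deviceOrientation = -deviceOrientation;
        return (sensorOrientation + deviceOrientation + 360) % 360;
    }
}
```

For a portrait-locked app on a typical back camera (sensorOrientation = 90) this yields 90 degrees, which matches the rotation observed in the question.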
I've been trying to make a 360 photo viewer similar to the Oculus 360 Photos app. The only problem is that when projecting onto a sphere with inverted normals, the image "warps" or "bends" as the sphere does, and straight lines such as door frames turn into curves; a bad result.
Changing the size of the sphere does nothing, and obviously the picture has to bend somewhere to fit onto the inner surface of the sphere, so I don't think this solution will work.
I then tried turning the photo into a cylindrical skybox and using it as a skybox component of the camera, which works great: no bending lines, everything looks as desired. Except for one thing: there is a shimmering/aliasing effect on the texture unless I enable mip maps, which then results in a blurred image.
Does anybody know how I could apply my image to appear similar to those in the Oculus 360 Photo app? They render with perfect quality and no bending lines, no shimmering. How do they achieve this result?
I've tried different compression types and different shapes. The only thing I haven't tried is slicing the photo into 6 pieces and rendering it on the inside of a cube around the camera, which, due to its proximity, might not produce the shimmering that could be caused by distance from the camera?
Thoughts, suggestions, questions? Any assistance or discussion is appreciated
I was able to get good results by increasing the render scale to 1.5 or higher, which eliminated the shimmering/aliasing effect. I'm not 100% sure whether this was an issue due to the Samsung S6's resolution, but I now always work with an increased render scale for higher quality regardless, and optimise elsewhere to save on framerate.
I know this question is old now, but I had the same problem on the Oculus Go, and it was solved thanks to these instructions here & here
I am making a game in Android using Android Studio. My game is being designed in pixel art.
The images that I put in the game are shown smoothed in the app; they are not pixelated as I designed them.
Is there a way to put in the images and have them shown correctly? Here is an example of what happens. At left, how my app shows the images. At right, how I need the images to be shown.
https://stuartspixelgames.files.wordpress.com/2015/10/comparison.png?w=700
What is the intended size of this image on screen? Perhaps your image file's dimensions are too big; sometimes an image becomes blurry depending on how you display it.
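On Android, the usual fix for smoothed pixel art is to disable bitmap filtering when scaling, e.g. passing filter = false to Bitmap.createScaledBitmap, or calling setFilterBitmap(false) on the Paint you draw with, so scaling uses nearest-neighbor sampling instead of bilinear blur. Conceptually, nearest-neighbor scaling works like this (plain Java, a hypothetical helper operating on a raw ARGB pixel array):

```java
public class NearestNeighbor {
    /** Upscale src (srcW x srcH, row-major ARGB ints) to dstW x dstH without smoothing. */
    static int[] scale(int[] src, int srcW, int srcH, int dstW, int dstH) {
        int[] dst = new int[dstW * dstH];
        for (int y = 0; y < dstH; y++) {
            int sy = y * srcH / dstH;                 // nearest source row
            for (int x = 0; x < dstW; x++) {
                int sx = x * srcW / dstW;             // nearest source column
                dst[y * dstW + x] = src[sy * srcW + sx]; // copy pixel, no blending
            }
        }
        return dst;
    }
}
```

Each output pixel is an exact copy of one input pixel, so hard edges stay hard; bilinear filtering would instead average neighboring pixels and produce the smoothed look in the question.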
I've been playing around with Xcode and Unity for a while, and have made an app in Xcode which has a scrolling background. I want to basically re-create the app in Unity 2d, using the same set of images (if possible). I've imported all of the iPad sized images that I used in Xcode, but have noticed that the quality is a lot poorer when I build using Unity remote.
The image I'm most bothered about is the background, which is a repeating texture that scrolls across the screen. Do I have to mess around with the settings in Unity to get a better image quality? I'm using PNG. Surely the image can't be too small and therefore stretching because I'm using iPad sized images and running the app on Samsung Galaxy S6. Thanks in advance.
Unity Remote streams screenshots to your device, so the quality is not final.
Maybe you can try to change the settings of UnityRemote.
See
http://docs.unity3d.com/es/current/Manual/UnityRemote4.html
More info/same question:
http://answers.unity3d.com/questions/408896/android-image-quality-is-poor-compared-to-the-qual.html
Find your image in the Project tab and select it; you should then see the import settings in the Inspector. Experiment with the values until you get results you are happy with.
In a game we're developing we need to animate a large sprite, of size 485x485, and the animation has about 30 frames. We are having trouble animating sprites this large. I have tried the solutions listed below, but unfortunately we haven't been able to find one that works yet.
Tiling
It looks like putting every frame in one big tile is not an option because:
The texture size needs to be a power of two; otherwise it shows up as black on most devices
When I make the texture size a power of two, it becomes too big for most devices to handle
Recommended maximum texture size of AndEngine seems to be 1024x1024.
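The constraint above is easy to check with arithmetic: in a 1024x1024 atlas, only floor(1024/485) = 2 frames fit per row and per column, so a single atlas holds at most 4 frames, and 30 frames would need 8 atlases. A small sketch of that calculation, with hypothetical helper names:

```java
public class AtlasMath {
    /** How many frames of the given size fit in one square atlas. */
    static int framesPerAtlas(int atlasSize, int frameSize) {
        int perRow = atlasSize / frameSize;   // integer division = floor
        return perRow * perRow;
    }

    /** How many atlases are needed for the whole animation. */
    static int atlasesNeeded(int frameCount, int atlasSize, int frameSize) {
        int perAtlas = framesPerAtlas(atlasSize, frameSize);
        return (frameCount + perAtlas - 1) / perAtlas;  // ceiling division
    }
}
```

Packing all 30 frames into one power-of-two texture would instead require a 4096x4096 atlas, which is exactly the "too big for most devices" case described above.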
Separate sprites
The other option is loading each texture, and thus each frame, separately and putting it in a Sprite (as described here). This works quite well, and toggling the visibility of each sprite at the right time causes the user to see the animation.
The problem with this method is that loading the whole animation takes quite some time. This is not visible when initially loading the game because of the loading screen, but later in the game the animation needs to be changed and the game needs then about 2-3 seconds to load. Putting a loading screen up is not an option.
Loading on a separate thread
I tried to put the texture loading in a separate, newly created thread, but even while that thread loads the textures, the drawing of the game seems to be paused.
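One pattern that avoids stalling the render loop is to do the heavy decoding on a worker thread and hand only the finished result back to the engine's update thread (in AndEngine, via engine.runOnUpdateThread(...); whether the GL texture upload itself can happen off-thread depends on the engine version). The general shape, using only standard Java and hypothetical names:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class BackgroundLoader {
    private final ExecutorService worker = Executors.newSingleThreadExecutor();

    /** Stands in for the expensive step (in the real app: decoding frame bitmaps). */
    static String decode(String assetName) {
        return "decoded:" + assetName;
    }

    /** Kick off decoding on the worker thread; the render loop keeps running meanwhile. */
    Future<String> loadAsync(String assetName) {
        // Only the cheap hand-off of the finished result should touch the
        // engine's update thread (engine.runOnUpdateThread(...) in AndEngine).
        return worker.submit(() -> decode(assetName));
    }

    void shutdown() { worker.shutdown(); }
}
```

If the drawing still pauses with this structure, the stall is likely happening inside the engine's own texture-load call rather than in the decoding, which narrows down where to look.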
Other options?
I don't know of any other options, and it seems hardly anyone else has tried to animate a texture larger than 50x50 pixels, because it is very difficult to find anyone with a similar case.
My question is: Is it even properly possible to animate large textures in AndEngine?
I think your problem is going to run up against device limitations, not AndEngine limitations. Designing for mobile, there are few Android devices that could handle that.
However, you may be able to come up with an alternative solution using vertex shaders and fragment shaders. This is an important feature of AndEngine GLES2.
Here is an article describing that approach:
http://code.zynga.com/2011/11/mesh-compression-in-dream-zoo/