libstreaming - how to rotate streamed video? - android

libstreaming works fine - in landscape mode. Unfortunately, my app will have to run in portrait mode. No problem for the preview window - I can set it upright with
session.setPreviewOrientation(90);
But the receiver of the stream will still have the video sideways. Is there a solution for that?
In Android's MediaRecorder, there is a method
setOrientationHint(int degrees)
that will rotate the streamed/recorded video. But I did not find anything like that in libstreaming...
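For comparison, the MediaRecorder approach mentioned above looks roughly like this (a minimal sketch using the standard Android API; the file path and setup order are illustrative). Note that the hint only writes rotation metadata, which is exactly why it has no libstreaming equivalent:

```java
import android.media.MediaRecorder;
import java.io.IOException;

public class PortraitRecorder {
    // Sketch: MediaRecorder writes an orientation hint into the MP4
    // metadata; players rotate the video on playback. The frames
    // themselves are NOT rotated, which is why an RTSP receiver that
    // ignores this metadata still shows the stream sideways.
    static MediaRecorder startPortraitRecording(String path) throws IOException {
        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOrientationHint(90); // must be called before prepare()
        recorder.setOutputFile(path);
        recorder.prepare();
        recorder.start();
        return recorder;
    }
}
```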

I have encountered this problem before. There are three possible solutions that I implemented.
1. Convert each YUV frame that comes out of the onPreviewFrame API to a bitmap, rotate the bitmap, and finally convert the bitmap back to a YUV frame. The disadvantage of this solution is that a lot of video frames are dropped (in my case, from 24 down to 4-5 FPS).
2. Rotate each YUV frame 90/270 degrees (based on the camera orientation) clockwise using code from here. The disadvantages of this solution are that images may be distorted and video frames might be dropped as well.
3. Use an open source library from Google called libyuv. If you have knowledge of JNI, it will be easy for you. BTW, you can see demos of this lib here and here. Because the rotation is handled at the native layer, this is an efficient solution and the best one so far.
Hope this info is helpful for anyone encountering this problem.
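Option 2 above (rotating the raw frame in Java) can be sketched like this for NV21, the default preview format. This is a minimal, unoptimized version with illustrative class and method names; for real-time use you would do this in native code (libyuv), as the answer recommends:

```java
class Nv21Rotator {
    // Rotates an NV21 frame 90 degrees clockwise.
    // NV21 layout: w*h luma (Y) bytes, followed by (w/2)*(h/2)
    // interleaved V/U byte pairs. Width and height must be even.
    static byte[] rotate90(byte[] in, int w, int h) {
        byte[] out = new byte[in.length];
        // Luma plane: the rotated image is h wide and w tall.
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                out[x * h + (h - 1 - y)] = in[y * w + x];
            }
        }
        // Chroma plane: rotate the V/U pairs the same way at half resolution.
        int offset = w * h;
        for (int y = 0; y < h / 2; y++) {
            for (int x = 0; x < w / 2; x++) {
                int src = offset + y * w + x * 2;
                int dst = offset + (x * (h / 2) + (h / 2 - 1 - y)) * 2;
                out[dst] = in[src];         // V
                out[dst + 1] = in[src + 1]; // U
            }
        }
        return out;
    }
}
```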

Change the video quality to (1280,720) from (320,240). Then, once you start, it should be changed and rotated. BTW, how did you implement your receiver?

Related

Wrong rotation of images using Android camera2 API in Google's example code?

I have tried writing my own code for accessing the camera via the camera2 API on Android instead of using Google's example. On the one hand, I've wasted way too much time understanding what exactly is going on, but on the other hand, I have noticed something quite weird:
I want the camera to produce vertical images. However, despite the fact that the ImageReader is initialized with height larger than width, the Image that I get in the onCaptureCompleted has the same size, except it is rotated 90 degrees. I was struggling trying to understand my mistake, so I went exploring Google's code. And what I have found is that their images are rotated 90 degrees as well! They compensate for that by setting a JPEG_ORIENTATION key in the CaptureRequestBuilder (if you comment that single line, the images that get saved will be rotated). This is unrelated to device orientation - in my case, screen rotation is disabled for the app entirely.
The problem is that for the purposes of the app I am making I need non-compressed, precise data from the camera, and since JPEG a) compresses images b) lossily, I cannot use it. Instead I use the YUV_420_888 format, which I later convert to a Bitmap. But while the JPEG_ORIENTATION flag can fix the orientation for JPEG images, it seems to do nothing for YUV ones. So how do I get the images to be correctly rotated?
One obvious solution is to rotate the resulting Bitmap, but I'm unsure what angle I should rotate it by on different devices. And more importantly, what causes such strange behavior?
Update: rotating the Bitmap and scaling it to proper size takes way too much time for the preview (the context is as follows: I need both high-res images from camera to process and a downscaled version of these same images in preview. Let's just say I'm making something similar to QR code recognition). I have even tried using RenderScripts to manipulate the image efficiently, but this is still too long. Also, I've read here that when I set multiple output surfaces simultaneously, the same resolution will be used for all of them, which is quite bad for me.
Android stores every image, no matter if it is taken in landscape or in portrait, in landscape mode. It also stores metadata that tells you if the image should be displayed in portrait or landscape.
If you don't turn the image according to the metadata, you will end up with every image in landscape. I had that problem too (but I wanted compression, so my solution doesn't work for you).
You need to read the metadata and turn it accordingly.
I hope this helps at least a bit.

How could I delay the camera preview so that it shows what's happened x seconds before?

I have a simple android app with a camera preview.
I would like to set the preview so that it shows what happened x seconds before.
I'm trying to do a buffer but it looks like there are no way to control what's inside the preview.
I use camera2 and a textureView for the preview.
Do you have any ideas, or a library that could help me?
Thanks!
You need to cache ~30 preview buffers somehow.
One possible way is to use an ImageReader where you wait for 30 onImageAvailable callbacks to fire before you acquire the first Image. But this requires you to draw the Image yourself to the preview TextureView once you start acquiring them, which is difficult to do correctly before Android Q's ImageReader with usage flags constructor.
You can also cache things in OpenGL: use a SurfaceTexture and a GLSurfaceView, copy the SurfaceTexture frames to a list of 30 other textures in a circular buffer, then start rendering once the 30th one is drawn. But this requires quite a bit of scaffolding code to implement.
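The caching idea behind both approaches can be sketched independently of the graphics API: a fixed-size ring buffer that only starts handing frames out once it is full. This is a minimal sketch; the class name, the capacity, and the byte[] frame type are illustrative assumptions, not camera2 API names:

```java
import java.util.ArrayDeque;

// Sketch: holds the last `capacity` frames; returns null until the
// buffer is full, after which each new frame pushes out the oldest
// one - i.e. the preview lags `capacity` frames behind the camera.
class DelayBuffer {
    private final ArrayDeque<byte[]> frames = new ArrayDeque<>();
    private final int capacity; // e.g. 30 frames ~ 1 second at 30 FPS

    DelayBuffer(int capacity) {
        this.capacity = capacity;
    }

    // Offer the newest camera frame; get back the frame from
    // `capacity` frames ago, or null while still filling up.
    byte[] offer(byte[] frame) {
        frames.addLast(frame);
        if (frames.size() <= capacity) {
            return null; // still buffering - nothing to display yet
        }
        return frames.removeFirst();
    }
}
```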

Image Orientation on Camera API

I am making an app using the Camera API. Although I can capture images, they are not oriented properly. I already tried using the screen orientation to correctly orient the images, but it doesn't work on all devices, especially with the front camera. I am new to Android development; any help would be appreciated. Thank you.
You can get the rotation of the captured bitmap using ExifInterface and create another bitmap with the rotation fixed - here you can find an example of usage.
Moreover, I recommend using the inBitmap option to reuse an existing bitmap for the rotated one.
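The ExifInterface approach can be sketched roughly like this (a minimal version using the framework classes; it omits the inBitmap reuse and downsampling you would want in production, and the method name is illustrative):

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Matrix;
import android.media.ExifInterface;
import java.io.IOException;

public class OrientedDecoder {
    // Sketch: read the EXIF orientation tag from the captured JPEG
    // and bake the rotation into the decoded Bitmap.
    public static Bitmap decodeOriented(String path) throws IOException {
        ExifInterface exif = new ExifInterface(path);
        int orientation = exif.getAttributeInt(
                ExifInterface.TAG_ORIENTATION,
                ExifInterface.ORIENTATION_NORMAL);
        int degrees;
        switch (orientation) {
            case ExifInterface.ORIENTATION_ROTATE_90:  degrees = 90;  break;
            case ExifInterface.ORIENTATION_ROTATE_180: degrees = 180; break;
            case ExifInterface.ORIENTATION_ROTATE_270: degrees = 270; break;
            default: degrees = 0;
        }
        Bitmap source = BitmapFactory.decodeFile(path);
        if (degrees == 0) return source;
        Matrix matrix = new Matrix();
        matrix.postRotate(degrees);
        return Bitmap.createBitmap(source, 0, 0,
                source.getWidth(), source.getHeight(), matrix, true);
    }
}
```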
The Camera1 and Camera2 APIs are kinda tricky. It is useful to know how they work inside, but there are plenty of ready-made solutions. I can recommend the following:
https://camerakit.io - has fixes for rotation issues inside, but is currently in beta; supports Camera2 features.
https://github.com/RedApparat/Fotoapparat - based on Camera1.
https://github.com/natario1/CameraView - based on Camera1, can capture video.

OpenCV android native camera motion blur

I'm using the OpenCV Android native camera libraries for image capture in NDK, but while the image acquisition is happening the phone is (necessarily) moving so the images are quite blurry. What options are there to reduce this blur? I don't actually care too much about color fidelity so can sacrifice quality in that area. There doesn't appear to be direct exposure control, but I did find that doing
cap.set(CV_CAP_PROP_EXPOSURE, -4)
seemed to help a bit, although -4 is the furthest it would go. I'm not terribly familiar with camera terminology in general, so I am not sure what effect the other properties might have for me.
Also, I noticed that in the default camera application (Galaxy SIII), the camera motion blur occurs during image preview, which makes sense since OpenCV also grabs preview images. But in video preview (non-recording) mode the image is a lot crisper. Is anyone aware of how I can access this stream?

Sample camera images as bitmap Android

I am trying to create an app which periodically samples an image from the camera (preview?) and then does some processing on it (i.e. face detection). I think this is the way to go about it. I have looked into OpenCV, but I don't think my knowledge is quite up to scratch to implement it well enough. My idea is to sample the image (raw format?), convert it to a bitmap, in which a FaceDetector object can then detect the faces and indicate them on screen.
Very much like the Native Camera app on the HTC Desire, which puts a grey square around the faces it sees before taking the picture.
Sam,
A sample is provided for capturing the preview stream from the camera: CameraPreview
This would be a great place to start.
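Building on that sample, the sampling-plus-detection idea could be sketched like this with the old Camera API and android.media.FaceDetector. The NV21-to-RGB_565 conversion (FaceDetector requires RGB_565 bitmaps) is left as `toRgb565Bitmap`, a hypothetical helper, e.g. via YuvImage and BitmapFactory:

```java
import android.graphics.Bitmap;
import android.hardware.Camera;
import android.media.FaceDetector;

public class PreviewFaceSampler implements Camera.PreviewCallback {
    // Sketch: run android.media.FaceDetector on a periodically
    // sampled preview frame.
    private static final int MAX_FACES = 5;
    private int frameCount = 0;

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        if (++frameCount % 10 != 0) return; // sample every 10th frame
        Camera.Size size = camera.getParameters().getPreviewSize();
        Bitmap bmp = toRgb565Bitmap(data, size.width, size.height);
        FaceDetector detector =
                new FaceDetector(bmp.getWidth(), bmp.getHeight(), MAX_FACES);
        FaceDetector.Face[] faces = new FaceDetector.Face[MAX_FACES];
        int found = detector.findFaces(bmp, faces);
        // draw `found` rectangles over the preview here
    }

    // Hypothetical helper: convert the NV21 preview bytes to an
    // RGB_565 Bitmap, which FaceDetector requires.
    private Bitmap toRgb565Bitmap(byte[] nv21, int w, int h) {
        throw new UnsupportedOperationException("NV21 -> RGB_565 conversion");
    }
}
```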
