I have an onPreviewFrame callback set up. It receives a byte[] with NV21 data in it. I have set the preview size to 176*144. When the device is held in landscape mode the 176*144 byte[] is fine, but when the device is held in portrait mode I still get a byte[] with the same dimensions.
I want to rotate the byte[] by 90 degrees and obtain a byte[] with dimensions 144*176.
So the question is: how do I rotate the data, not just the preview image? Camera.Parameters.setRotation only affects taking pictures, not video. Camera.setDisplayOrientation specifically says it only affects the displayed preview, not the frame bytes:
This does not affect the order of byte array passed in
onPreviewFrame(byte[], Camera), JPEG pictures, or recorded videos.
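For reference, the rotation itself can be done by remapping the bytes in Java, although it is slow on the CPU. A rough sketch of a clockwise 90-degree rotation (rotateNV21Cw90 is just a placeholder name):

public static byte[] rotateNV21Cw90(byte[] input, int width, int height) {
    byte[] output = new byte[input.length];
    int frameSize = width * height;
    // Rotate the Y plane: source pixel (x, y) moves to (height - 1 - y, x).
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            output[x * height + (height - 1 - y)] = input[y * width + x];
        }
    }
    // Rotate the interleaved VU plane; chroma is subsampled 2x2, so VU pairs move together.
    for (int y = 0; y < height / 2; y++) {
        for (int x = 0; x < width / 2; x++) {
            int src = frameSize + y * width + x * 2;
            int dst = frameSize + x * height + (height / 2 - 1 - y) * 2;
            output[dst] = input[src];         // V
            output[dst + 1] = input[src + 1]; // U
        }
    }
    return output;
}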
After checking out various posts I found this one, which suggests using ConvertToI420 from libyuv.
Now the deal is, I have compiled libyuv and I am able to call the libyuv::ConvertToI420 method, but the resulting I420 is badly messed up in terms of color and shows line artifacts. However, the dimensions I now get are 144*176; you can check the image here.
The code snippet I've used is as follows.
//sourceWidth = 176 and sourceHeight = 144
unsigned char* I420M = new unsigned char[(int)(sourceWidth*sourceHeight*1.5)];
unsigned int YSize = sourceWidth * sourceHeight;
// yuvPtr is the NV21 data passed from onPreviewCallback (from JAVA layer)
const uint8* src_frame = const_cast<const uint8*>(yuvPtr);
size_t src_size = YSize;
uint8* pDstY = I420M;
uint8* pDstU = I420M + YSize;
uint8* pDstV = I420M + YSize + (YSize/4);
libyuv::RotationMode mode;
if(landscapeLeft){
mode = libyuv::kRotate90;
}else{
mode = libyuv::kRotate270;
}
uint32 format = libyuv::FOURCC_NV21;
int retVal = libyuv::ConvertToI420(src_frame, src_size,
pDstY, sourceHeight,
pDstU, (sourceHeight/2),
pDstV, (sourceHeight/2),
0, 0,
sourceWidth, sourceHeight,
sourceWidth, sourceHeight,
mode,
format);
I don't wish to crop the image, just rotate it by 90 degrees (clockwise/anticlockwise); the attached image is for kRotate90.
Could anyone please point out where I am going wrong? I strongly suspect it has to do with the parameters I am passing to the ConvertToI420 method.
Any help is appreciated.
Use sourceWidth, not sourceHeight, for the destination strides:
int retVal = libyuv::ConvertToI420(src_frame, src_size,
pDstY, sourceWidth,
pDstU, (sourceWidth/2),
pDstV, (sourceWidth/2),
0, 0,
sourceWidth, sourceHeight,
sourceWidth, sourceHeight,
mode,
format);
I have figured out what was going wrong. The above code snippet works perfectly well and I420M contains the rotated YUV with 144*176 dimensions.
The problem was in the way I was converting I420M to a jbyte[] while passing it back to the Java layer.
I'm trying to get the black and white values (the Y-plane) from the preview frame in the camera2 API. This is what I have so far:
public void onImageAvailable(ImageReader reader) {
Image image = reader.acquireLatestImage();
Image.Plane[] planes = image.getPlanes();
ByteBuffer yPlane = planes[0].getBuffer();
if (firstRun) {
ySize = yPlane.remaining();
nv21 = new byte[ySize];
}
yPlane.get(nv21, 0, ySize);
Log.i(TAG, String.valueOf(nv21.length) + " " + String.valueOf(nv21[0]));
image.close();
}
However, the length of the array is not what I expect (1280*960 = 1,228,800, but nv21.length returns 12,979,200) and nv21[0] gives seemingly random values.
What am I doing wrong?
Thank you in advance
The size of the buffer doesn't have to be exactly 1280*960, since there can be row stride (padding) between each row of pixels. That said, a 10x difference in total size is surprising, but not infeasible - check what the row stride actually is (planes[0].getRowStride()).
I'd recommend trying to actually draw the Y plane into an ImageView (for debugging this doesn't need to be efficient, so you can just use a Bitmap and a Canvas and drawColor), to see what it looks like. Is it just complete garbage, or is it a real Y plane with weird padding, etc?
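For example, something along these lines is enough for a quick visual check (a rough, unoptimized sketch using Bitmap.createBitmap from an int[] rather than a Canvas; drawLumaForDebug and debugView are placeholder names):

private void drawLumaForDebug(Image image, ImageView debugView) {
    Image.Plane yPlane = image.getPlanes()[0];
    ByteBuffer yBuffer = yPlane.getBuffer();
    int width = image.getWidth();
    int height = image.getHeight();
    int rowStride = yPlane.getRowStride(); // often larger than width
    int[] argb = new int[width * height];
    byte[] row = new byte[rowStride];
    for (int y = 0; y < height; y++) {
        yBuffer.position(y * rowStride);
        // The last row may be shorter than a full stride, so clamp the read.
        yBuffer.get(row, 0, Math.min(rowStride, yBuffer.remaining()));
        for (int x = 0; x < width; x++) {
            int luma = row[x] & 0xFF; // bytes are signed in Java, mask to 0..255
            argb[y * width + x] = Color.rgb(luma, luma, luma);
        }
    }
    // Call on the UI thread.
    debugView.setImageBitmap(
            Bitmap.createBitmap(argb, width, height, Bitmap.Config.ARGB_8888));
}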
Good morning.
I am making a camera video player using ffmpeg.
During development I have run into one problem.
If I grab a frame through ffmpeg, decode it, and sws_scale it to fit the screen size, it takes too long and the camera stream stutters.
For example, when the incoming input resolution is 1920*1080 and the resolution of my phone is 2550*1440, sws_scale is about 6 times slower.
[Compared with scaling to the same size]
Currently, the NDK code runs sws_scale at the resolution that came in from the camera, so the speed is fine and the image is not interrupted.
However, the SurfaceView is full screen while the video only covers part of it, because the input resolution is below the screen resolution.
Scale AVFrame:
ctx->m_SwsCtx = sws_getContext(
ctx->m_CodecCtx->width,
ctx->m_CodecCtx->height,
ctx->m_CodecCtx->pix_fmt,
//width, // 2550 (SurfaceView)
//height, // 1440
ctx->m_CodecCtx->width, // 1920 (Camera)
ctx->m_CodecCtx->height, // 1080
AV_PIX_FMT_RGBA,
SWS_FAST_BILINEAR,
NULL, NULL, NULL);
if(ctx->m_SwsCtx == NULL)
{
__android_log_print(
ANDROID_LOG_DEBUG,
"[ VideoStream::SetResolution Fail ] ",
"[ Error Message : %s ]",
"SwsContext Alloc fail");
SET_FIELD_TO_INT(pEnv, ob, err, 0x40);
return ob;
}
sws_scale(
ctx->m_SwsCtx,
(const uint8_t * const *)ctx->m_SrcFrame->data,
ctx->m_SrcFrame->linesize,
0,
ctx->m_CodecCtx->height,
ctx->m_DstFrame->data,
ctx->m_DstFrame->linesize);
PDRAWOBJECT drawObj = (PDRAWOBJECT)malloc(sizeof(DRAWOBJECT));
if(drawObj != NULL)
{
drawObj->m_Width = ctx->m_Width;
drawObj->m_Height = ctx->m_Height;
drawObj->m_Format = WINDOW_FORMAT_RGBA_8888;
drawObj->m_Frame = ctx->m_DstFrame;
SET_FIELD_TO_INT(pEnv, ob, err, -1);
SET_FIELD_TO_LONG(pEnv, ob, addr, (jlong)drawObj);
}
Draw SurfaceView:
PDRAWOBJECT d = (PDRAWOBJECT)drawObj;
long long curr1 = CurrentTimeInMilli();
ANativeWindow *window = ANativeWindow_fromSurface(pEnv, surface);
ANativeWindow_setBuffersGeometry(window, 0, 0, WINDOW_FORMAT_RGBA_8888);
ANativeWindow_setBuffersGeometry(
window,
d->m_Width,
d->m_Height,
WINDOW_FORMAT_RGBA_8888);
ANativeWindow_Buffer windowBuffer;
ANativeWindow_lock(window, &windowBuffer, 0);
uint8_t * dst = (uint8_t*)windowBuffer.bits;
int dstStride = windowBuffer.stride * 4;
uint8_t * src = (uint8_t*) (d->m_Frame->data[0]);
int srcStride = d->m_Frame->linesize[0];
for(int h = 0; h < d->m_Height; ++h)
{
// Draw SurfaceView;
memcpy(dst + h * dstStride, src + h * srcStride, srcStride);
}
ANativeWindow_unlockAndPost(window);
ANativeWindow_release(window);
Result: the frame is drawn at the camera resolution and does not fill the SurfaceView (screenshot omitted).
I would like the video to fill the whole screen. Is there a way to scale up to the size of the SurfaceView in the NDK or in Android, rather than using sws_scale?
Thank you.
You don't need to scale your video. Actually, you don't even need to convert it to RGB (this is also a significant burden for the CPU).
The trick is to use an OpenGL renderer with a shader that takes YUV input and displays the texture scaled to your screen.
Start with this solution (reusing code from Android system): https://stackoverflow.com/a/14999912/192373
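To give a sense of scale, the YUV-to-RGB step itself is a tiny fragment shader. A sketch (my own names, BT.601-style coefficients, one single-channel texture per plane) that could be fed to GLES20.glShaderSource:

// Fragment shader sketch: converts planar YUV to RGB on the GPU while the quad
// it is drawn on is scaled to the full SurfaceView, so neither sws_scale nor an
// RGBA conversion is needed on the CPU.
private static final String YUV_FRAGMENT_SHADER =
        "precision mediump float;\n"
        + "varying vec2 vTexCoord;\n"
        + "uniform sampler2D uTexY;\n"
        + "uniform sampler2D uTexU;\n"
        + "uniform sampler2D uTexV;\n"
        + "void main() {\n"
        + "    float y = texture2D(uTexY, vTexCoord).r;\n"
        + "    float u = texture2D(uTexU, vTexCoord).r - 0.5;\n"
        + "    float v = texture2D(uTexV, vTexCoord).r - 0.5;\n"
        + "    gl_FragColor = vec4(y + 1.402 * v,\n"
        + "                        y - 0.344 * u - 0.714 * v,\n"
        + "                        y + 1.772 * u,\n"
        + "                        1.0);\n"
        + "}\n";

The Y, U and V planes are uploaded as GL_LUMINANCE textures (U and V at half resolution), and the textured quad is drawn to cover the whole SurfaceView, which handles the scaling for free.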
I'm using Camera 2 API to save JPEG images on disk. I currently have 3-4 fps on my Nexus 5X, I'd like to improve it to 20-30. Is it possible?
By changing the image format to YUV I manage to capture 30 fps. Is it possible to save the images at this frame rate, or should I give up and live with my 3-4 fps?
Obviously I can share code if needed, but if everyone agrees that it's not possible, I'll just give up. Using the NDK (with libjpeg for instance) is an option (but obviously I'd prefer to avoid it...).
Thanks
EDIT: here is how I convert the YUV android.media.Image to a single byte[]:
private byte[] toByteArray(Image image, File destination) {
ByteBuffer buffer0 = image.getPlanes()[0].getBuffer();
ByteBuffer buffer2 = image.getPlanes()[2].getBuffer();
int buffer0_size = buffer0.remaining();
int buffer2_size = buffer2.remaining();
byte[] bytes = new byte[buffer0_size + buffer2_size];
buffer0.get(bytes, 0, buffer0_size);
buffer2.get(bytes, buffer0_size, buffer2_size);
return bytes;
}
EDIT 2: another method I found to convert the YUV image into a byte[]:
private byte[] toByteArray(Image image, File destination) {
Image.Plane yPlane = image.getPlanes()[0];
Image.Plane uPlane = image.getPlanes()[1];
Image.Plane vPlane = image.getPlanes()[2];
int ySize = yPlane.getBuffer().remaining();
// be aware that this size does not include the padding at the end, if there is any
// (e.g. if pixel stride is 2 the size is ySize / 2 - 1)
int uSize = uPlane.getBuffer().remaining();
int vSize = vPlane.getBuffer().remaining();
byte[] data = new byte[ySize + (ySize/2)];
yPlane.getBuffer().get(data, 0, ySize);
ByteBuffer ub = uPlane.getBuffer();
ByteBuffer vb = vPlane.getBuffer();
int uvPixelStride = uPlane.getPixelStride(); //stride guaranteed to be the same for u and v planes
if (uvPixelStride == 1) {
uPlane.getBuffer().get(data, ySize, uSize);
vPlane.getBuffer().get(data, ySize + uSize, vSize);
}
else {
// if pixel stride is 2 there is padding between each pixel
// converting it to NV21 by filling the gaps of the v plane with the u values
vb.get(data, ySize, vSize);
for (int i = 0; i < uSize; i += 2) {
data[ySize + i + 1] = ub.get(i);
}
}
return data;
}
The dedicated JPEG encoder units on mobile phones are efficient, but not generally optimized for throughput. (Historically, users took one photo every second or two.) At full resolution, the 5X's camera pipeline will not generate JPEGs faster than a few FPS.
If you need higher rates, you need to capture in uncompressed YUV. As mentioned by CommonsWare, there's not enough disk bandwidth to stream full-resolution uncompressed YUV to disk, so you can only hold on to some number of frames before you run out of memory.
You can use libjpeg-turbo or some other high-efficiency JPEG encoder and see how many frames per second you can compress yourself - this may be higher than the hardware JPEG unit. The simplest way to maximize the rate is to capture YUV at 30fps and run some number of JPEG encoding threads in parallel. For maximum speed, you'll want to hand-write the code talking to the JPEG encoder, because your source data is YUV, not the RGB that most JPEG encoding interfaces expect (even though the colorspace of an encoded JPEG is typically YUV as well).
Whenever an encoder thread finishes the previous frame, it can grab the next frame that comes from the camera (you can maintain a small circular buffer of the latest YUV Images to make this simpler).
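As a rough baseline before hand-writing a libjpeg-turbo path, the thread-pool idea can be sketched in plain Java with the platform YuvImage encoder (all names below are placeholders; the input is an NV21-style byte[] such as the one built in the question's second edit):

private final ExecutorService encoderPool = Executors.newFixedThreadPool(4);

private void encodeAsync(final byte[] nv21, final int width, final int height,
                         final File outFile) {
    encoderPool.execute(new Runnable() {
        @Override
        public void run() {
            try {
                FileOutputStream out = new FileOutputStream(outFile);
                // YuvImage accepts the NV21 buffer directly, no RGB conversion needed.
                YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
                yuv.compressToJpeg(new Rect(0, 0, width, height), 90, out);
                out.close();
            } catch (IOException e) {
                Log.e("Encoder", "Failed to write " + outFile, e);
            }
        }
    });
}

Each task that finishes can then grab the next frame from the small circular buffer of YUV Images mentioned above.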
I'm trying to use the NDK to do some image processing. I am NOT using opencv.
I am fairly new to Android so I was doing this in steps. I started by writing a simple app that would let me capture video from the camera and display it to the screen. I have this done.
Then I tried to manipulate the camera data in native. However, onPreviewFrame uses a byte array to capture frame information. This is my code -
public void onPreviewFrame(byte[] arg0, Camera arg1)
{
if (imageFormat == ImageFormat.NV21)
{
if ( !bProcessing )
{
FrameData = arg0;
mHandler.post(callnative);
}
}
}
And the callnative runnable is like so -
private Runnable callnative = new Runnable()
{
public void run()
{
bProcessing = true;
String returnNative = callTorch(MainActivity.assetManager, PreviewSizeWidth, PreviewSizeHeight, FrameData, pixels);
bitmap.setPixels(pixels, 0, PreviewSizeWidth, 0, 0, PreviewSizeWidth, PreviewSizeHeight);
MycameraClass.setImageBitmap(bitmap);
bProcessing = false;
}
};
The problem is, I need to use FrameData in native code as floats. However, it is in the form of a byte array. I wanted to know how the frame data is stored. Is this a 2-dimensional array of bytes? So the camera returns an 8-bit image and stores it as 640x480 bytes? If that is so, how does C interpret this byte data type? Can I simply convert it to float? I have this in native code -
jbyte *nativeData;
nativeData = (env)->GetByteArrayElements(NV21FrameData,NULL);
__android_log_print(ANDROID_LOG_INFO, "Nativeprint", "nativedata is: %d",(int)nativeData[0]);
However, this prints -22 which leads me to believe that I am trying to print out a pointer. I am not sure why that is the case though.
I would appreciate any help on this.
You will not be able to get any float data type from the pixel buffer. The data is in bytes, which in C is the char datatype.
So this:
jbyte *nativeData = (env)->GetByteArrayElements(NV21FrameData,NULL);
is the same as this:
char *nativeData = (char *)((env)->GetByteArrayElements(NV21FrameData, NULL));
The data is stored as a one-dimensional array, so you retrieve each pixel by computing its offset from the width, height, and the x and y coordinates.
Also remember that the preview camera frames in your sample are in YUV420sp (NV21), which means you will need to convert the data from YUV to RGB before you can set them in a bitmap.
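For reference, a widely used Java version of that YUV420sp-to-ARGB conversion looks like this (fixed-point math; the same arithmetic applies if you do it in native code instead):

public static void decodeYUV420SP(int[] argb, byte[] yuv420sp, int width, int height) {
    int frameSize = width * height;
    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width;
        int u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xFF & yuv420sp[yp]) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {
                // NV21 stores interleaved VU pairs after the Y plane.
                v = (0xFF & yuv420sp[uvp++]) - 128;
                u = (0xFF & yuv420sp[uvp++]) - 128;
            }
            int y1192 = 1192 * y;
            int r = y1192 + 1634 * v;
            int g = y1192 - 833 * v - 400 * u;
            int b = y1192 + 2066 * u;
            r = Math.max(0, Math.min(r, 262143));
            g = Math.max(0, Math.min(g, 262143));
            b = Math.max(0, Math.min(b, 262143));
            argb[yp] = 0xFF000000 | ((r << 6) & 0xFF0000)
                    | ((g >> 2) & 0xFF00) | ((b >> 10) & 0xFF);
        }
    }
}

The resulting int[] can be handed to bitmap.setPixels, matching the pixels array already used in the Runnable above.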
In the camera preview frame I get data in YV12 format on the Android side. I need to convert it to YUV420P on the JNI side. How can I do it? From what I have read, in YUV420P the Y samples appear first, followed by the U samples and then the V samples. YV12 is the same as YUV420P except that the U and V planes appear in reverse order, i.e. Y is followed by V and then U. Keeping that in mind, I have used the following swapping code to produce YUV420P data from the YV12 data before encoding.
avpicture_fill((AVPicture*)outframe, (uint8_t*)camData, codecCtx->pix_fmt, codecCtx->width, codecCtx->height);
uint8_t * buf_store = outframe->data[1];
outframe->data[1]=outframe->data[2];
outframe->data[2]=buf_store;
But it does not seem to be working. How should I adjust my code?
Don't use avpicture_fill. I have implemented it in my application like this and it's working fine:
picture->linesize[0] = frameWidth;
picture->linesize[1] = frameWidth/2;
picture->linesize[2] = frameWidth/2;
picture->data[0] = camData;
picture->data[1] = camData + picture->linesize[0]*frameHeight+picture->linesize[1]*frameHeight/2;
picture->data[2] = camData + picture->linesize[0]*frameHeight;
Maybe you need this method:
//yv12 to yuv420p
public static void swapYV12toI420(final byte[] yv12bytes, final byte[] i420bytes, int width, int height) {
int size = width * height;
int part = size / 4;
System.arraycopy(yv12bytes, 0, i420bytes, 0, size);
System.arraycopy(yv12bytes, size + part, i420bytes, size, part);
System.arraycopy(yv12bytes, size, i420bytes, size + part, part);
}
For every YV12 data packet you receive, swap it to I420.
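For example, from onPreviewFrame (previewWidth, previewHeight and encodeFrameNative are placeholders for your own fields and native method):

public void onPreviewFrame(byte[] data, Camera camera) {
    // data arrives as YV12 because the preview format was set to YV12.
    byte[] i420 = new byte[previewWidth * previewHeight * 3 / 2];
    swapYV12toI420(data, i420, previewWidth, previewHeight);
    encodeFrameNative(i420); // hand the I420 buffer to the JNI/ffmpeg side
}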