Our application displays live video data received from the other end; we need to refresh the live feed every 40 ms (25 fps).
The data arrives in YUV format, and Android does not seem to have any built-in support for displaying YUV data directly.
Below is the code that manages the data and draws it to the screen:
// Convert the YUV data to RGB (done in JNI)
feedProcess.decode(yuvBuffer, yuvBuffer.length, imgInfo, imgRaw, ref, webMIndex);

currentTime = new Date().getTime();
System.out.println("took " + (currentTime - lastTime) + " ms to decode the buffer");

// Queue the decoded frame for the drawing thread
imgQ.add(imgRaw);
In another thread I poll the queued data and convert it into a Bitmap:
public void run() {
    while (myThreadRun) {
        if (!imgQ.isEmpty()) {
            try {
                // Take the next decoded frame and wrap it in a Bitmap
                byte[] arry = imgQ.poll();
                Bitmap b = createImgae(arry, imgInfo[0], imgInfo[1], 1, width, height);
                myThreadSurfaceView.setBitmap(b);
                try {
                    // Draw the image to the SurfaceView's canvas
                    c = myThreadSurfaceHolder.lockCanvas(null);
                    synchronized (myThreadSurfaceHolder) {
                        myThreadSurfaceView.onDraw(c);
                    }
                } finally {
                    if (c != null) {
                        myThreadSurfaceHolder.unlockCanvasAndPost(c);
                    }
                }
            } catch (NoSuchElementException ex) {
            }
        }
    }
}
This entire logic takes approximately 100 ms to refresh the screen with a new image. Are there any other approaches I could try?
The decode function does the decompression, which takes 10-15 ms, plus the YUV-to-RGB conversion (20-30 ms); this is done in JNI code.
My understanding is that if the YUV data could be shown directly, we could save some time there.
Please share your views.
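One approach worth trying is to skip the CPU-side YUV-to-RGB conversion entirely and let the GPU do it: render to a GLSurfaceView, upload the Y, U and V planes as separate luminance textures, and convert in a fragment shader. Below is a minimal sketch of such a shader (as a Java string constant), assuming planar YUV (I420) input and a renderer that binds the three planes to the yTex/uTex/vTex samplers; none of these names come from the original code.

// Sketch only: GLSL fragment shader that converts planar YUV to RGB on the
// GPU. Assumes the renderer uploads each plane as a GL_LUMINANCE texture
// (glTexImage2D / glTexSubImage2D) and binds them to the samplers below.
private static final String YUV_FRAGMENT_SHADER =
        "precision mediump float;\n"
      + "varying vec2 vTexCoord;\n"
      + "uniform sampler2D yTex;\n"   // Y plane, full resolution
      + "uniform sampler2D uTex;\n"   // U plane, half resolution
      + "uniform sampler2D vTex;\n"   // V plane, half resolution
      + "void main() {\n"
      + "    float y = texture2D(yTex, vTexCoord).r;\n"
      + "    float u = texture2D(uTex, vTexCoord).r - 0.5;\n"
      + "    float v = texture2D(vTex, vTexCoord).r - 0.5;\n"
      + "    float r = y + 1.402 * v;\n"
      + "    float g = y - 0.344 * u - 0.714 * v;\n"
      + "    float b = y + 1.772 * u;\n"
      + "    gl_FragColor = vec4(r, g, b, 1.0);\n"
      + "}\n";

With this approach the JNI step only hands the raw planes to the renderer, so the 20-30 ms conversion moves to the GPU, and the per-frame Bitmap allocation disappears as well.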
I was trying to process the live camera feed from Flutter, so I needed to send the byte data to native Android for processing.
I was concatenating the 3 planes as suggested by flutterfire. I needed the data to create an InputImage for the Google ML Kit.
Concatenate Plane method
import 'dart:typed_data';
import 'package:camera/camera.dart';
import 'package:flutter/foundation.dart';

static Uint8List _concatenatePlanes(List<Plane> planes) {
  final WriteBuffer allBytes = WriteBuffer();
  for (Plane plane in planes) {
    allBytes.putUint8List(plane.bytes);
  }
  return allBytes.done().buffer.asUint8List();
}
Input Image created from the image data in native Android
public void fromByteBuffer(Map<String, Object> imageData, final MethodChannel.Result result) {
    byte[] bytes = (byte[]) imageData.get("bytes");
    int rotationCompensation = ((int) imageData.get("rotation")) % 360;
    // Create an input image
    InputImage inputImage = InputImage.fromByteArray(bytes,
            (int) imageData.get("width"),
            (int) imageData.get("height"),
            rotationCompensation,
            InputImage.IMAGE_FORMAT_NV21);
}
I am not getting any results after processing a frame from the camera stream, but if the same frame is captured, stored, and then processed by creating the InputImage from the file path, the image is processed properly.
Is there anything wrong with the way I am creating the input image?
Any help is appreciated.
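One thing to check: simply concatenating the three planes only produces valid NV21 when the row and pixel strides happen to line up, and NV21 additionally expects the chroma bytes interleaved in V/U order. On many devices the camera delivers YUV_420_888 with padding, so the concatenated buffer is not what InputImage.IMAGE_FORMAT_NV21 expects; the file path works because decoding a stored JPEG bypasses the raw buffer layout entirely. Below is a minimal sketch of a stride-aware conversion on the native side, assuming each plane's bytes, rowStride, and pixelStride are forwarded from Flutter; the helper name and parameters are illustrative.

// Sketch only: build an NV21 buffer from the three YUV_420_888 planes,
// honouring row and pixel strides. yData/uData/vData are the plane byte
// arrays sent over from Flutter (hypothetical parameter names).
static byte[] toNv21(int width, int height,
                     byte[] yData, int yRowStride,
                     byte[] uData, byte[] vData,
                     int uvRowStride, int uvPixelStride) {
    byte[] nv21 = new byte[width * height * 3 / 2];
    int pos = 0;
    // Copy the Y plane row by row, dropping any padding at the end of each row
    for (int row = 0; row < height; row++) {
        System.arraycopy(yData, row * yRowStride, nv21, pos, width);
        pos += width;
    }
    // Interleave chroma as V then U (NV21 order), stepping by pixelStride
    for (int row = 0; row < height / 2; row++) {
        for (int col = 0; col < width / 2; col++) {
            int uvIndex = row * uvRowStride + col * uvPixelStride;
            nv21[pos++] = vData[uvIndex];
            nv21[pos++] = uData[uvIndex];
        }
    }
    return nv21;
}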
I have implemented a loop buffer (or circular buffer) storing 250 frames of raw video data in total (frame resolution 1280x720). As the buffer I am using the ByteBuffer class. The buffer runs in a separate thread using a Looper; every new frame is passed via a message to the thread's Handler object. When the limit is reached, the position is set to 0 and the whole buffer is overwritten from the beginning. That way, the buffer always contains the last 250 video frames.
As the amount of required heap space is huge (around 320 MB), I am using the tag android:largeHeap="true" in the manifest.
Now we come to the problem. The loop runs well and consumes slightly less than the allowed heap space (which is acceptable for me). But at some point I want to store the whole buffer to a raw binary file while respecting the current position of the buffer.
Let me explain that with a small graph:
The loop buffer looks like this:
|========== HEAD ==========|============== TAIL ==============|
0                  buffer.position()                buffer.limit()
At the time of saving, I want to first store the tail to the file (because it contains the beginning of the video) and afterwards the head up to the current buffer.position(). I cannot allocate any more byte arrays for extracting the data from the ByteBuffer (heap space is full), so I have to write the ByteBuffer directly to the file.
As far as I can tell, ByteBuffer only allows itself to be written to a file completely (the write() method). Does anybody know what the solution could be? Or is there an even better approach for my task?
I will give my code below:
public class FrameRecorderThread extends Thread {
    public int MAX_NUMBER_FRAMES_QUEUE = 25 * 10; // 25 fps * 10 seconds
    public Handler frameHandler;
    private ByteBuffer byteBuffer;
    byte[] image = new byte[1382400]; // bytes for one 1280x720 frame

    @Override
    public void run() {
        Looper.prepare();
        byteBuffer = ByteBuffer.allocateDirect(MAX_NUMBER_FRAMES_QUEUE * 1382400); // a lot of memory is allocated
        frameHandler = new Handler() {
            @Override
            public void handleMessage(Message msg) {
                if (msg.what == 0) { // STORE FRAME TO BUFFER
                    if (byteBuffer.position() >= MAX_NUMBER_FRAMES_QUEUE * 1382400) {
                        byteBuffer.position(0); // wrap around: start overwriting from the beginning
                    }
                    byteBuffer.put((byte[]) msg.obj);
                } else if (msg.what == 1) { // SAVE IMAGES
                    String fileName = "VIDEO_BUF_1.raw";
                    File directory = new File(Environment.getExternalStorageDirectory()
                            + "/FrameRecorder/");
                    directory.mkdirs();
                    try {
                        FileOutputStream outStream = new FileOutputStream(
                                Environment.getExternalStorageDirectory()
                                        + "/FrameRecorder/" + fileName);
                        // This is the current position of the split between head and tail
                        int position = byteBuffer.position();
                        try {
                            // This stores the whole buffer in a file but does
                            // not respect the order (tail before head)
                            outStream.getChannel().write(byteBuffer);
                        } catch (IOException e) {
                            e.printStackTrace();
                        }
                    } catch (FileNotFoundException e) {
                        Log.e("FMT", "File not found. (" + e.getLocalizedMessage() + ")");
                    }
                } else if (msg.what == 2) { // STOP LOOPER
                    Looper looper = Looper.myLooper();
                    if (looper != null) {
                        looper.quit();
                        byteBuffer = null;
                        System.gc();
                    }
                }
            }
        };
        Looper.loop();
    }
}
Thank you very much in advance!
Just create a subsection (e.g. a duplicate() of the buffer with its own position and limit) and write that to the file.
Or set the buffer's limit, write it, and then set it back.
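A minimal sketch of that idea, assuming the buffer layout from the question (position marks the head/tail split); duplicate() shares the backing memory, so no extra frame-sized allocations are needed:

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;

// Sketch only: write the tail (position..limit) first, then the head
// (0..position), using duplicates so the original buffer is untouched.
static void saveInOrder(ByteBuffer buffer, FileChannel channel) throws IOException {
    int split = buffer.position();

    // Tail: from the split point to the end of the buffer (oldest frames)
    ByteBuffer tail = buffer.duplicate();
    tail.position(split);
    tail.limit(buffer.capacity());
    while (tail.hasRemaining()) {
        channel.write(tail);
    }

    // Head: from the start of the buffer up to the split point (newest frames)
    ByteBuffer head = buffer.duplicate();
    head.position(0);
    head.limit(split);
    while (head.hasRemaining()) {
        channel.write(head);
    }
}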
OK, in the meanwhile I have investigated a little further and found a solution.
Instead of a ByteBuffer object I am using a simple byte[] array. At the beginning I allocate all the heap space required for the frames. At the time of storing, I can then write the head and tail of the buffer using the current position. This works and is easier than expected. :)
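For reference, a minimal sketch of that byte[] variant (names are illustrative): FileOutputStream.write(byte[], off, len) makes writing the two slices straightforward.

import java.io.FileOutputStream;
import java.io.IOException;

// Sketch only: buffer is the preallocated frame array, writePos is the
// current insert position, i.e. the head/tail split point.
static void saveInOrder(byte[] buffer, int writePos, String path) throws IOException {
    try (FileOutputStream out = new FileOutputStream(path)) {
        out.write(buffer, writePos, buffer.length - writePos); // tail: oldest frames
        out.write(buffer, 0, writePos);                        // head: newest frames
    }
}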
See all 3 edits below
Update: Changed read() to readFully() and got some things working. decodeByteArray() no longer returns null, and the JPEG written to the card is actually a full image. The image is still not being drawn to my view; I will keep working and post another question if need be.
I am writing a video streaming client program that takes in images from a DataInputStream.
Because the server calls compressToJpeg() on these images, I figured I could simply take the byte array from my DataInputStream, decode it, and draw the Bitmap from decodeByteArray() to my SurfaceView. Unfortunately, that method only leaves me with decodeByteArray() returning null.
So I added an intermediate step. I created a compressFrame() method which simply takes the byte array from the stream, creates a new YuvImage instance, and uses that to call compressToJpeg(). When run, however, this method hangs on compressToJpeg() and closes my client activity. There is no force-close popup, and I get no errors in my log.
I have provided the run() method of the thread that calls the problem method, the method itself, and the log printed before the app closes.
EDIT: I've also added the ReceiverUtils.getFrame() just in case.
@Override
public void run() {
    String fromServer = null;
    int jpgSize;
    byte[] buffer;
    byte[] jpgData;
    Bitmap image;
    ByteArrayOutputStream jpgBaos = null;
    while (isItOK && (receiver.textIn != null)) {
        if (!holder.getSurface().isValid()) {
            continue;
        }
        ReceiverUtils.init(holder);
        // Receive 'ready' flag from camcorder
        fromServer = ReceiverUtils.receiveText(receiver);
        while (fromServer != null) {
            Log.e("From server: ", fromServer); // logging here because of null pointers when done in Utils
            if (fromServer.startsWith("ANDROID_JPG_READY")) {
                // Obtain size for byte array
                jpgSize = Integer.parseInt(fromServer.substring(18));
                Log.e("JPG SIZE", String.valueOf(jpgSize));
                // Create byte array of size jpgSize
                buffer = new byte[jpgSize];
                Log.e("BUFFER SIZE", String.valueOf(buffer.length));
                // Send flag and receive image data from camcorder
                ReceiverUtils.sendText(receiver, "ANDROID_JPG_SEND");
                jpgData = ReceiverUtils.receiveData(receiver, buffer);
                // Compress jpgData and write result to jpgBaos
                jpgBaos = ReceiverUtils.compressFrame(imgRect, jpgData);
                // Decode jpgData into Bitmap
                image = ReceiverUtils.getFrame(jpgBaos.toByteArray(), jpgSize);
                //image = ReceiverUtils.getFrame(jpgData, jpgSize);
                // Set buffer and jpgData to null since we aren't using them
                buffer = null;
                jpgData = null;
                // Draw Bitmap to canvas
                ReceiverUtils.draw(image, holder, imgRect);
                //break; (testing one run-through)
            }
            // Receive 'ready' flag from camcorder
            fromServer = ReceiverUtils.receiveText(receiver);
            if (isItOK == false) {
                break;
            }
        }
        //break; (testing one run-through)
    }
}
This is the ReceiverUtils.compressFrame() called in the previous snippet.
/*----------------------------------------
 * compressFrame(Rect imgRect, byte[] input): Creates a new instance of YuvImage used to compress NV21
 * byte array data to JPEG. Returns a ByteArrayOutputStream with the data.
 */
public static ByteArrayOutputStream compressFrame(Rect imgRect, byte[] input) {
    boolean success = false;
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    YuvImage yuvImage = new YuvImage(input, ImageFormat.NV21,
            imgRect.width(), imgRect.height(), null);
    Log.e("HEY", "HEY");
    success = yuvImage.compressToJpeg(imgRect, 80, baos);
    Log.e("Compression Success:", String.valueOf(success));
    return baos;
}
Finally, this is the output I'm getting from my LogCat. (Beware some obviously hastily written debug logs.)
07-03 15:01:19.754: E/From server:(1634): ANDROID_JPG_READY_26907
07-03 15:01:19.754: E/JPG SIZE(1634): 26907
07-03 15:01:19.764: E/BUFFER SIZE(1634): 26907
07-03 15:01:19.764: E/To server:(1634): ANDROID_JPG_SEND
07-03 15:01:19.834: E/jpgIn(1634): Data received successfully.
07-03 15:01:19.834: E/HEY(1634): HEY
07-03 15:01:19.844: D/skia(1634): onFlyCompress
EDIT: ReceiverUtils.getFrame()
/*---------------------------------------
 * getFrame(byte[] jpgData, int jpgSize): Decodes a byte array into a Bitmap and
 * returns the Bitmap.
 */
public static Bitmap getFrame(byte[] jpgData, int jpgSize) {
    Bitmap res = null;
    res = BitmapFactory.decodeByteArray(jpgData, 0, jpgSize);
    Log.e("Decode success", String.valueOf(!(res == null)));
    return res;
}
EDIT 2: Code before adding compression
//Compress jpgData and write result to jpgBaos
//jpgBaos = ReceiverUtils.compressFrame(imgRect, jpgData);
//Decode jpgData into Bitmap
//image = ReceiverUtils.getFrame(jpgBaos.toByteArray(), jpgSize);
image = ReceiverUtils.getFrame(jpgData, jpgSize);
EDIT 3: Image after being saved to SD card
The image that Android holds in jpgData in the first code snippet is not the entire image. I can't post it due to being a new user. I added a writeToSD method to test. I assume the reason it cannot be decoded is that the majority of the image is just blank space, and only a portion of the image data is there.
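A note tying this together: the symptom in EDIT 3 (only a portion of the image arriving) is exactly what the update at the top addresses. A single InputStream.read() may return after fewer than jpgSize bytes, while DataInputStream.readFully() blocks until the whole buffer is filled. A minimal sketch of a receive helper along those lines, assuming the receiver exposes a DataInputStream (the names here are illustrative, not the original ReceiverUtils code):

import java.io.DataInputStream;
import java.io.IOException;

// Sketch only: fill 'buffer' completely before returning, so a partially
// transmitted JPEG is never handed to BitmapFactory.decodeByteArray().
public static byte[] receiveData(DataInputStream in, byte[] buffer) throws IOException {
    in.readFully(buffer); // blocks until buffer.length bytes have been read
    return buffer;
}

Separately, note that the server already sends JPEG bytes, so wrapping them in a YuvImage as NV21 in compressFrame() feeds compressToJpeg() data in the wrong format; once the full JPEG arrives, decoding it directly (the EDIT 2 path) seems like the right route.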
I'm working on a camera app on Android. I'm currently taking my capture with the JPEG callback. I'd like to know if there's a way to get the raw data of the capture. I know there is a raw callback for the capture, but it always returns null.
So, from the JPEG callback, can I get access to the raw data (a succession of RGB pixels)?
EDIT:
So, from the JPEG callback, can I get access to the raw data (a succession of YUV pixels)?
I was successfully able to get a "raw" (YUV422) picture with Android 5.1 running on an RK3288.
3 steps to get the YUV image:
init the buffer
call addRawImageCallbackBuffer by reflection
get the YUV picture in the dedicated callback
code sample
val weight = size.width * size.height * ImageFormat.getBitsPerPixel(ImageFormat.NV21) / 8
val bytes = ByteArray(weight)
val camera = android.hardware.Camera.open()

try {
    // The method is hidden, so look it up and invoke it via reflection
    val addRawImageCallbackBuffer = camera.javaClass
            .getDeclaredMethod("addRawImageCallbackBuffer", bytes.javaClass)
    addRawImageCallbackBuffer.invoke(camera, bytes)
} catch (e: Exception) {
    Log.e("RNG", "Error", e)
}

...

camera.takePicture(null, { data, camera ->
    // Raw callback: 'data' holds the NV21 frame; compress it to a JPEG file
    val file = File("/sdcard/output.jpg")
    file.createNewFile()
    val yuv = YuvImage(data, ImageFormat.NV21, size.width, size.height, null)
    yuv.compressToJpeg(Rect(0, 0, size.width, size.height), 80, file.outputStream())
}, null)
Explanation
The Camera.takePicture() method takes a callback for raw data as its second parameter:
camera.takePicture(shutterCallback, rawCallback, jpegCallback);
This callback will return a null byte array unless I explicitly add a buffer for the raw image first.
So you're supposed to call camera.addRawImageCallbackBuffer for this purpose.
Nevertheless, the method is not available publicly (public but not exported, so you cannot call it directly).
Fortunately, the code sample above demonstrates how to force a call to this method via reflection.
With the buffer in place, the raw callback receives a consistent YUV picture as its parameter.
My application is running out of memory when switching between two activities. The first activity runs an OpenGL scene, the second activity does not. I want to make sure I am releasing all of the textures used by the OpenGL scene.
Right now I am using this method:
getNativeHeapAllocatedSize()
to track the relative amount of memory used by the textures. This number goes up by about 4 MB when I allocate textures. However, it never seems to go back down again.
In my first activity's onPause() I have the following code:
SurfaceView.onPause();
mTexture = null;
In the second activity I then call getNativeHeapAllocatedSize() several times. Even after the GC has run, the memory still has not dropped.
Edit:
After more research, it appears to be something in the code that loads the image data. I have removed OpenGL from the equation and the memory is still not being released.
// (Method signature assumed; the snippet starts inside a boolean load method.)
private boolean loadImageData(String fileName) {
    try {
        InputStream is = null;
        {
            AssetManager am = MyActivity.getAssetMgr();
            is = am.open(fileName);
        }
        Bitmap b = BitmapFactory.decodeStream(is);
        if (b != null) {
            mResX = b.getWidth();
            mResY = b.getHeight();
            Bitmap.Config bc = b.getConfig();
            if (bc == Bitmap.Config.ARGB_8888)
                mBPP = 4;
            else
                mBPP = 2;
            mImageData = ByteBuffer.allocateDirect(mResX * mResY * mBPP);
            mImageData.order(ByteOrder.nativeOrder());
            b.copyPixelsToBuffer(mImageData);
            mImageData.position(0);
            return true;
        }
    } catch (IOException e) {
        e.printStackTrace();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return false;
}
Edit 2:
I did end up applying all of your ideas. However, this seemed to be the problem in my case...
ByteBuffer not releasing memory
I am assuming you mean textures loaded to the GPU via gl.glTexImage* or another helper method. In that case the GC won't help you; it does not clean the internal memory used by textures.
Have you tried manually deleting your textures via gl.glDeleteTextures?
Edit, according to the new code:
There are several leaks in your code:
close the input stream
recycle your bitmap after you have copied its data to the ByteBuffer
I guess you use the ByteBuffer with the image data to upload the texture to the GPU; make sure you do not keep references to those buffers after the data has been uploaded.
I do not see any other problems in this code. If it still doesn't work after these fixes, then look closely at any Bitmap usage in your app. A sketch with these fixes applied is below.
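A minimal sketch of the loader from the question with those two leaks fixed (stream closed, bitmap recycled); the method signature is assumed, since the original snippet starts mid-method:

private boolean loadImageData(String fileName) {
    InputStream is = null;
    Bitmap b = null;
    try {
        is = MyActivity.getAssetMgr().open(fileName);
        b = BitmapFactory.decodeStream(is);
        if (b != null) {
            mResX = b.getWidth();
            mResY = b.getHeight();
            mBPP = (b.getConfig() == Bitmap.Config.ARGB_8888) ? 4 : 2;
            mImageData = ByteBuffer.allocateDirect(mResX * mResY * mBPP);
            mImageData.order(ByteOrder.nativeOrder());
            b.copyPixelsToBuffer(mImageData);
            mImageData.position(0);
            return true;
        }
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        // Leak fix 1: always close the asset stream
        if (is != null) {
            try { is.close(); } catch (IOException ignored) {}
        }
        // Leak fix 2: release the bitmap's pixel memory once it has been copied out
        if (b != null) {
            b.recycle();
        }
    }
    return false;
}

Once the texture has been uploaded to the GPU, drop the mImageData reference as well so the direct ByteBuffer can be reclaimed.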