How to interface the camera preview stream from Android to Qt5?

Does anyone know how to interface camera preview stream data from Android to Qt5?
I want to display the preview stream in Qt, and then send the reformatted stream back to Android.
Can anyone help?
Thanks!

If you are using QCamera, you can set a custom viewfinder (a class derived from QAbstractVideoSurface) on the QCamera:
void QCamera::setViewfinder(QAbstractVideoSurface *surface)
Then, when the stream is available, the viewfinder's present() method gets called with each QVideoFrame, from which you can get the image data and do whatever you want:
bool QAbstractVideoSurface::present(const QVideoFrame &frame)

Related

Is there any way to get byte data from android camera2 for each frame?

With the old Camera API, this can be done with the code below.
mCameraInstance.camera.addCallbackBuffer(imageBuffer);
mCameraInstance.camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        processData(data);
        if (mCameraInstance != null)
            mCameraInstance.camera.addCallbackBuffer(imageBuffer);
    }
});
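For addCallbackBuffer() to work, imageBuffer has to be large enough for one full preview frame. For NV21, the default preview format, that is width × height × 12 bits per pixel. A minimal pure-Java sketch (the helper name is mine, not from the Android API):

```java
public class PreviewBufferSize {
    // NV21 stores a full-resolution Y plane followed by interleaved,
    // quarter-resolution V/U samples: 12 bits per pixel on average.
    static final int NV21_BITS_PER_PIXEL = 12;

    // Size in bytes of one preview frame, as expected by
    // Camera.addCallbackBuffer(). This matches what
    // ImageFormat.getBitsPerPixel(ImageFormat.NV21) would give you on-device.
    static int nv21BufferSize(int width, int height) {
        return width * height * NV21_BITS_PER_PIXEL / 8;
    }

    public static void main(String[] args) {
        // A 640x480 preview needs a 460800-byte callback buffer.
        System.out.println(nv21BufferSize(640, 480));
    }
}
```

In the snippet above, imageBuffer would then be allocated as `new byte[nv21BufferSize(previewWidth, previewHeight)]` before the first addCallbackBuffer() call.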
Using the code above I can get the byte data for every frame. Is there any way to achieve the same functionality with the camera2 API?
The simplest way would probably be to use an ImageReader hooked to the camera device. Attach an ImageReader.OnImageAvailableListener and get new images as they arrive. You can get the Plane[] image data and process it according to the format.
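One thing to watch out for: the Image you get from an ImageReader is typically YUV_420_888, and each Plane has its own row stride, so the bytes are not necessarily packed the way the old onPreviewFrame byte[] was. A sketch of repacking one plane into a tight array (pure Java; in real code the data and stride come from Image.getPlanes()[i].getBuffer() and getRowStride(), and chroma planes additionally need getPixelStride() handling, which this ignores):

```java
public class PlanePacker {
    // Copy a plane whose rows may be padded (rowStride >= width)
    // into a tightly packed width*height array, row by row.
    static byte[] packPlane(byte[] planeData, int rowStride, int width, int height) {
        byte[] packed = new byte[width * height];
        for (int row = 0; row < height; row++) {
            System.arraycopy(planeData, row * rowStride, packed, row * width, width);
        }
        return packed;
    }

    public static void main(String[] args) {
        // A 4x2 plane stored with a row stride of 6 (2 padding bytes per row).
        byte[] padded = {0, 1, 2, 3, -1, -1, 6, 7, 8, 9, -1, -1};
        byte[] tight = packPlane(padded, 6, 4, 2);
        System.out.println(java.util.Arrays.toString(tight));
    }
}
```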

How to associate the Existing Surface to MediaCodec encoder surface?

I am a bit confused after reading a lot of resources on the internet.
1) I have a TextureView in the application ((tv)).
2) I have attached a SurfaceTextureListener to ((tv)).
tv.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        mySurface = new Surface(surface); // this surface is sent to the native layer
        ....
    }
3) Passing mySurface to the native layer, I get an ANativeWindow for this surface and use ANativeWindow_lock and ANativeWindow_unlockAndPost to copy the data into the surface.
So far so good: it displays the data I copied into the ANativeWindow.
Now I want to record all these frames into an MP4 file.
What did I do?
1)
I used the link below for reference:
http://www.bigflake.com/mediacodec/EncodeAndMuxTest.java.txt
2)
I retrieved a surface from the MediaCodec encoder and passed it to the native layer, copying into it the same way as into the display surface. I see no output in the MP4 file, just a black screen.
QUESTIONS:
1)
Is this the right approach? What I mean is: you have raw data and copy it into two surfaces, one that came from the application's TextureView and the other from the MediaCodec encoder.
2)
Or did I overlook some easier way of recording the data into MP4 format?
3)
Or are there any concepts I missed completely?
Kindly provide your valuable input.
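One detail worth checking against the EncodeAndMuxTest approach: every frame fed through the encoder/muxer needs a monotonically increasing presentation timestamp, or the resulting MP4 may play back wrong in some players. This may or may not be the cause of the black screen, but it is cheap to verify. A minimal sketch of deriving timestamps from a frame index and frame rate (my own helper, not code from the linked test):

```java
public class PresentationTime {
    // Presentation time in nanoseconds for frame `frameIndex`
    // at a constant `frameRate` frames per second.
    static long presentationTimeNs(int frameIndex, int frameRate) {
        return frameIndex * 1_000_000_000L / frameRate;
    }

    public static void main(String[] args) {
        // At 30 fps, frame 30 lands at exactly one second.
        System.out.println(presentationTimeNs(30, 30));
    }
}
```

MediaCodec.BufferInfo.presentationTimeUs is in microseconds, so divide the nanosecond value by 1000 before handing it to the muxer.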

Android MediaCodec encode and decode using vp8 format

I want to develop an app that has two buttons and a SurfaceView (actually a class that extends SurfaceView and implements SurfaceHolder.Callback).
When the user clicks button1, I take a screen capture, encode it as VP8 using MediaCodec, and keep the output in a ByteBuffer (I am not saving it to a file).
When the user clicks button2, I need to show the captured output from that ByteBuffer on the SurfaceView.
I have tried:
MediaCodec decoder = MediaCodec.createDecoderByType("video/x-vnd.on2.vp8");
decoder.dequeueOutputBuffer(mBufferInfo, DEFAULT_TIMEOUT_US);
.....
but I am not able to update the SurfaceView.
How can I update the SurfaceView using the ByteBuffer data?
I got the answer:
mMediaCodec.releaseOutputBuffer(index, true);
I had been setting the render flag to false here; with render set to true, the captured image is drawn to the surface.
With releaseOutputBuffer(int index, long renderTimestampNs) you can render the frame at a given timestamp, but that overload is only supported from API level 21.
Thanks.

Editing android VideoView frames

Environment:
Nexus 7 Jelly Bean 4.1.2
Problem:
I'm trying to make a motion-detection application that works with RTSP using VideoView.
I wish there were something like an onNewFrameListener:
videoView.onNewFrame(Frame frame)
I've tried to get access to the raw frames of an RTSP stream via VideoView but couldn't find any support for that in the Android SDK.
I found out that VideoView encapsulates Android's MediaPlayer class, so I dived into the media_jni library to try to find a way to access the raw frames, but couldn't find the byte buffer or whatever it is that represents a frame.
Question:
Does anyone have an idea where this buffer is and how to get access to it, or any other idea for implementing motion detection over a VideoView?
Even if it means I need to recompile the AOSP.
You can extend VideoView and override its draw(Canvas canvas) method:
Set your own bitmap on the canvas received through draw().
Call super.draw(), which will get the frame drawn onto your bitmap.
Access the frame pixels from the bitmap.
class MotionDetectorVideoView extends VideoView {
    public Bitmap mFrameBitmap;
    ...
    @Override
    public void draw(Canvas canvas) {
        // set your own member bitmap on the canvas
        canvas.setBitmap(mFrameBitmap);
        super.draw(canvas);
        // do whatever you want with mFrameBitmap; it now contains the frame
        ...
        // allocate `buffer` big enough to hold the whole frame
        mFrameBitmap.copyPixelsToBuffer(buffer);
        ...
    }
}
I don't know whether this will work. Avoid doing heavy calculations in draw(); start a thread there instead.
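For the motion-detection part itself, once you have the pixels out of mFrameBitmap (e.g. via getPixels()), a crude difference check between consecutive frames is pure array work. An illustrative sketch (the threshold values here are arbitrary placeholders, not tuned numbers):

```java
public class MotionDetector {
    // Fraction of pixels whose luma changed by more than `lumaThreshold`
    // between two ARGB frames of equal size.
    static double changedFraction(int[] prev, int[] cur, int lumaThreshold) {
        int changed = 0;
        for (int i = 0; i < cur.length; i++) {
            if (Math.abs(luma(prev[i]) - luma(cur[i])) > lumaThreshold) changed++;
        }
        return (double) changed / cur.length;
    }

    // Integer approximation of BT.601 luma from a packed ARGB pixel.
    static int luma(int argb) {
        int r = (argb >> 16) & 0xff, g = (argb >> 8) & 0xff, b = argb & 0xff;
        return (299 * r + 587 * g + 114 * b) / 1000;
    }

    public static void main(String[] args) {
        int[] dark = new int[100];                          // all black
        int[] half = new int[100];
        for (int i = 0; i < 50; i++) half[i] = 0xffffffff;  // half white
        // Half the pixels changed from black to white.
        System.out.println(changedFraction(dark, half, 25));
    }
}
```

Motion could then be flagged when changedFraction() exceeds some fraction, say 0.05, between consecutive frames.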
In your case, if you are working with live motion rather than recorded videos, I would use the camera preview instead of the VideoView. You can use a camera preview callback to catch every frame captured by your camera. The callback implements:
onPreviewFrame(byte[] data, Camera camera)
Called as preview frames are displayed.
I think this could be useful for you:
http://developer.android.com/reference/android/hardware/Camera.PreviewCallback.html
Tell me if that is what you are searching for.
Good luck.

Android filtered video from camera

I can't understand how to correctly display filtered video from the camera on Android.
I wrote for SDK 8, so I used the scheme below:
Camera.setPreviewDisplay(null); // a null surface holder, to indicate that I don't want to see the raw camera preview
Camera.setPreviewCallbackWithBuffer() + Camera.addCallbackBuffer() // to get the camera data, modify it, and draw it on my GLSurfaceView
This scheme works wonderfully on Android 2.2.*, and I was happy until I tried the application on 4.*: my frame-data callback is not called at all!
According to the documentation, I shouldn't pass null to setPreviewDisplay; without a surface instance the video stream will not run. But if I give it a surface, it starts drawing the raw camera preview on that surface.
The question is: how can I correctly draw camera video that I have filtered myself?
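Whatever surface arrangement ends up working on 4.*, the "modify it and draw" step typically involves converting the NV21 preview bytes to RGB before drawing on the GLSurfaceView. The per-pixel conversion is plain integer math; a sketch using a common fixed-point BT.601 approximation (this helper is illustrative, not from the question):

```java
public class YuvPixel {
    // Convert one YUV pixel (as sampled from an NV21 preview frame)
    // to a packed ARGB int, using a fixed-point BT.601 approximation.
    static int yuvToArgb(int y, int u, int v) {
        int c = y - 16, d = u - 128, e = v - 128;
        int r = clamp((298 * c + 409 * e + 128) >> 8);
        int g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8);
        int b = clamp((298 * c + 516 * d + 128) >> 8);
        return 0xff000000 | (r << 16) | (g << 8) | b;
    }

    // Clamp an intermediate value into the valid 0..255 channel range.
    static int clamp(int x) {
        return x < 0 ? 0 : (x > 255 ? 255 : x);
    }

    public static void main(String[] args) {
        // Video black (Y=16) and video white (Y=235) with neutral chroma.
        System.out.printf("%08x %08x%n", yuvToArgb(16, 128, 128), yuvToArgb(235, 128, 128));
    }
}
```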
