I want to make some changes to the video that comes from the camera.
So I am using a class that extends SurfaceView and implements SurfaceHolder.Callback.
However, I still haven't found a way to make these changes.
How can I do it?
You can try SurfaceTexture instead of SurfaceView and implement the SurfaceTexture.OnFrameAvailableListener interface with its onFrameAvailable(...) method. When a video frame arrives, the SurfaceTexture will call this method back, and you can get the current frame data.
Please refer to the PanoramaActivity class in the Android Camera APK source code for sample code.
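If it helps, here is a minimal sketch of that approach (not the PanoramaActivity code itself; the class and field names are my own): a GLSurfaceView.Renderer that binds the camera to a SurfaceTexture backed by a GL_TEXTURE_EXTERNAL_OES texture and is notified through onFrameAvailable(...) whenever a new frame is ready.

import java.io.IOException;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;

public class CameraFrameRenderer implements GLSurfaceView.Renderer,
        SurfaceTexture.OnFrameAvailableListener {

    private SurfaceTexture surfaceTexture;   // consumer side of the camera preview
    private Camera camera;
    private int oesTextureId;                // GL_TEXTURE_EXTERNAL_OES texture holding the frame
    private volatile boolean frameAvailable;

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        oesTextureId = tex[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, oesTextureId);

        surfaceTexture = new SurfaceTexture(oesTextureId);
        surfaceTexture.setOnFrameAvailableListener(this);
        try {
            camera = Camera.open();
            camera.setPreviewTexture(surfaceTexture);   // API 11+
            camera.startPreview();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        frameAvailable = true;   // a new camera frame has been queued
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        if (frameAvailable) {
            surfaceTexture.updateTexImage();   // latch the newest frame into the OES texture
            frameAvailable = false;
        }
        // Draw a textured quad sampling the OES texture here; apply your changes in the shader.
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
    }
}

If the GLSurfaceView uses RENDERMODE_WHEN_DIRTY, you would also call its requestRender() from onFrameAvailable so each new frame triggers a redraw.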
This is complicated and difficult to do in real time. Basically, you need to grab the camera data, modify it, and then write it to the SurfaceView. And you realistically have half a second to do it, otherwise the lag is unbearable.
Apps that overlay things on a camera view (think of the ZXing barcode scanner) typically do it by providing a view bound to the camera, then grabbing a second copy of the camera image data a few times per second and overlaying additional data on top of the first view.
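As a rough illustration of that overlay layout (the class and view names here are illustrative, not ZXing's actual code): a FrameLayout stacks a drawing View on top of the SurfaceView that the camera is bound to.

import android.app.Activity;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.os.Bundle;
import android.view.SurfaceView;
import android.view.View;
import android.widget.FrameLayout;

public class OverlayActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        FrameLayout root = new FrameLayout(this);

        // The camera preview goes to this SurfaceView via Camera.setPreviewDisplay(...).
        SurfaceView preview = new SurfaceView(this);
        root.addView(preview);

        // A plain View on top of it; whatever you draw here appears over the preview.
        View overlay = new View(this) {
            private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
            @Override
            protected void onDraw(Canvas canvas) {
                paint.setStyle(Paint.Style.STROKE);
                paint.setColor(Color.GREEN);
                canvas.drawRect(100, 100, 300, 300, paint);   // e.g. a detection result
            }
        };
        root.addView(overlay);   // children added later are drawn on top

        setContentView(root);
    }
}

Each time you process a new copy of the frame data, update whatever the overlay draws and call overlay.invalidate().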
I have been working on modifying libstreaming so that I can stream a surfaceview up to Wowza instead of the camera. I feel that I am now at the final stage and just missing one piece that I cannot seem to figure out.
Here is what I need to replace. Everything works correctly when I use the camera and do the following,
cameraInstance.setPreviewTexture(glSurfaceView.getSurfaceTexture());
cameraInstance.startPreview();
glSurfaceView is a custom class that contains the code to create a texture using the glGenTextures method.
At this stage, I can start the app and images are transmitted to Wowza via libstreaming. All happy.
Now, I want to replace the cameraInstance with my own code to draw directly to the texture. How do I go about doing that? How does the camera do it to draw directly to the texture?
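One way I can imagine doing this (a sketch under the assumption that glSurfaceView exposes the same SurfaceTexture it hands to the camera): wrap the SurfaceTexture in an android.view.Surface and draw into it with a Canvas. Each unlockCanvasAndPost() queues a new frame to the texture, much like a camera preview frame would.

SurfaceTexture surfaceTexture = glSurfaceView.getSurfaceTexture();
surfaceTexture.setDefaultBufferSize(1280, 720);      // the frame size you want to stream

Surface surface = new Surface(surfaceTexture);       // producer side of the texture
try {
    Canvas canvas = surface.lockCanvas(null);
    canvas.drawColor(Color.BLACK);
    Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
    paint.setColor(Color.WHITE);
    paint.setTextSize(48f);
    canvas.drawText("frame " + System.currentTimeMillis(), 50, 100, paint);
    surface.unlockCanvasAndPost(canvas);             // publishes the frame to the texture
} catch (Exception e) {
    e.printStackTrace();
}

You would call lockCanvas()/unlockCanvasAndPost() repeatedly from your own render loop to keep producing frames. For GPU rendering you could instead create an EGL window surface from that Surface and draw with OpenGL, which is closer to how the camera fills the producer side of the same kind of buffer queue.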
First, I only started with Android development last week, so please be thorough in your explanations as I'm still a noob.
I've managed to create an app that uses the JavaCameraView to show the user what the back camera is seeing. I created a new button in the action bar to take a picture. When the user clicks this button, I want to capture that frame and then send it to the picture library I am using for the face recognizer. Thus far I haven't been able to succeed with this implementation.
So for the questions...
How can I capture a frame from the JavaCameraView when the take picture button is pressed?
From there do I just output the image to my image library using OutputStream?
Thanks everyone
In your class you have to add implements CvCameraViewListener2. Your class then has to provide the method public Mat onCameraFrame(CvCameraViewFrame cameraviewframe), which is called for every preview frame; the Mat you return is what gets displayed.
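A minimal sketch of how that could look (the field and method names other than the CvCameraViewListener2 callbacks are my own; imwrite lives in Imgcodecs for OpenCV 3.x and in Highgui for 2.4.x): keep a copy of the latest frame in onCameraFrame and write it out when the action-bar button is pressed.

import android.app.Activity;
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.core.Mat;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;

public class FaceCaptureActivity extends Activity
        implements CameraBridgeViewBase.CvCameraViewListener2 {

    private Mat latestFrame;                         // most recent RGBA preview frame

    @Override
    public void onCameraViewStarted(int width, int height) {
        latestFrame = new Mat();
    }

    @Override
    public void onCameraViewStopped() {
        latestFrame.release();
    }

    @Override
    public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
        Mat rgba = inputFrame.rgba();
        rgba.copyTo(latestFrame);                    // keep a copy for the "take picture" button
        return rgba;                                 // this is what the JavaCameraView displays
    }

    // Call this from the action-bar button's click handler
    // (synchronization with onCameraFrame omitted for brevity).
    private void savePicture(String path) {
        Mat bgr = new Mat();
        Imgproc.cvtColor(latestFrame, bgr, Imgproc.COLOR_RGBA2BGR);   // imwrite expects BGR
        Imgcodecs.imwrite(path, bgr);
    }
}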
I'm trying to do camera recording and draw on top of it on Google Glass, using a LiveCard.
In a regular activity, this would be achieved with a FrameLayout, with a SurfaceView for the camera preview 'in the back' and another View in front of it used for drawing.
But with a LiveCard, if one needs sub-second updates, one has to use the LiveCard itself as a Surface, according to the documentation: https://developers.google.com/glass/develop/gdk/reference/com/google/android/glass/timeline/LiveCard
If your application requires more frequent updates (several times per second) or rendering more elaborate graphics than the standard widgets support, enable direct rendering and add a SurfaceHolder.Callback to the card's surface.

LiveCard liveCard; // initialized elsewhere
liveCard.setDirectRenderingEnabled(true);
liveCard.getSurfaceHolder().addCallback(callback);

You can then draw directly on the surface inside a background thread or in response to external events (for example, sensor or location updates). Use the surfaceCreated and surfaceDestroyed methods to start and stop your rendering logic when the card is displayed or hidden.
Now I can either draw my own stuff on this Surface, or I can hand it to the MediaRecorder as the camera preview surface, but I can't do both, as it fails with an error.
I wonder if anyone has ideas on how to make this work?
The way I'd draw into the LiveCard myself is to 'manually' lock the canvas and call FrameLayout.draw(canvas). One option would be to have a layout that contains two SurfaceViews, one for the camera preview and one for my own drawings, and use the same approach. But even if I define such a layout in XML, I can't get the SurfaceViews created (e.g. the appropriate SurfaceView callbacks are never called, and any attempt to draw on them fails).
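For reference, a sketch of that 'manual' path as I understand it (assuming an off-screen FrameLayout built from ordinary Views, since the SurfaceView children never come up): measure and lay out the hierarchy to the surface size, then render it into the LiveCard's canvas.

private void drawFrame(SurfaceHolder holder, FrameLayout layout) {
    Canvas canvas = null;
    try {
        canvas = holder.lockCanvas();
        if (canvas == null) {
            return;   // surface not ready yet
        }
        // Size the off-screen view hierarchy to the surface, then draw it into the canvas.
        layout.measure(
                View.MeasureSpec.makeMeasureSpec(canvas.getWidth(), View.MeasureSpec.EXACTLY),
                View.MeasureSpec.makeMeasureSpec(canvas.getHeight(), View.MeasureSpec.EXACTLY));
        layout.layout(0, 0, canvas.getWidth(), canvas.getHeight());
        layout.draw(canvas);
    } finally {
        if (canvas != null) {
            holder.unlockCanvasAndPost(canvas);
        }
    }
}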
In a camera preview callback, I've been using yuv2rgb and working with the resulting bitmap.
This is slow, so I want to display the picture as it is.
I use this example class:
public abstract class ViewBase extends SurfaceView implements SurfaceHolder.Callback, Runnable {}
Not sure why you want to render the preview frames yourself, given that the OS already has an optimized path for the preview frames from the camera driver to the video driver.
However, if you need to do it yourself, you can use OpenGL to create a YUV texture and then blit it to a plane. Check this SO question for sample code.
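For what it's worth, here is a rough sketch of that idea, assuming NV21 preview data and that the two texture ids were created beforehand with their MIN/MAG filters set to GL_LINEAR: upload the Y plane as a GL_LUMINANCE texture and the interleaved VU plane as a GL_LUMINANCE_ALPHA texture, then convert to RGB in the fragment shader while drawing a full-screen quad.

private static final String YUV_FRAGMENT_SHADER =
        "precision mediump float;\n" +
        "varying vec2 vTexCoord;\n" +
        "uniform sampler2D yTexture;\n" +
        "uniform sampler2D vuTexture;\n" +
        "void main() {\n" +
        "    float y = texture2D(yTexture, vTexCoord).r;\n" +
        "    vec2 vu = texture2D(vuTexture, vTexCoord).ra - vec2(0.5);\n" +   // .r = V, .a = U for NV21
        "    gl_FragColor = vec4(y + 1.402 * vu.x,\n" +
        "                        y - 0.714 * vu.x - 0.344 * vu.y,\n" +
        "                        y + 1.772 * vu.y,\n" +
        "                        1.0);\n" +
        "}\n";

// Upload the planes each frame; data is the NV21 byte[] from onPreviewFrame,
// w and h are the preview size, and this must run on the GL thread.
private void uploadYuvTextures(byte[] data, int w, int h, int yTexId, int vuTexId) {
    ByteBuffer yBuf = ByteBuffer.wrap(data, 0, w * h);                 // Y plane
    ByteBuffer vuBuf = ByteBuffer.wrap(data, w * h, w * h / 2);        // interleaved VU plane

    GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1);

    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, yTexId);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
            w, h, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, yBuf);

    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, vuTexId);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE_ALPHA,
            w / 2, h / 2, 0, GLES20.GL_LUMINANCE_ALPHA, GLES20.GL_UNSIGNED_BYTE, vuBuf);
}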
On some devices, onPreviewFrame is not called if no SurfaceView was set to display the camera preview.
However, I handle the camera in a service, so I can't set a SurfaceView, and I don't want a visible preview anyway.
How can this be done? Can I programmatically create a SurfaceView and set it with Camera::setPreviewDisplay?
Surely this must be possible?
It works on almost every phone without a SurfaceView, but not on the HTC One X or the Google Nexus One...
According to this question, creating a SurfaceView in code works fine. Though I don't think you can create it through a service.
Another approach is to create a 1px x 1px SurfaceView inside a RelativeLayout and hide it with some other view on top of it (its visibility should still be VISIBLE). We use this trick for the Path camera UI, where we render preview buffers through OpenGL, and it works fine.
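A hedged sketch of that trick in code, building the layout programmatically inside an activity's onCreate for brevity (the view names are illustrative):

RelativeLayout root = new RelativeLayout(this);

SurfaceView dummyPreview = new SurfaceView(this);
root.addView(dummyPreview, new RelativeLayout.LayoutParams(1, 1));   // 1x1 px, still VISIBLE

View cover = new View(this);
cover.setBackgroundColor(Color.BLACK);                               // drawn on top, hides the preview
root.addView(cover, new RelativeLayout.LayoutParams(
        RelativeLayout.LayoutParams.MATCH_PARENT,
        RelativeLayout.LayoutParams.MATCH_PARENT));

setContentView(root);

// Later, once dummyPreview's surface has been created:
// camera.setPreviewDisplay(dummyPreview.getHolder());
// camera.startPreview();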
According to the documentation, a readily configured, visible, and displayed SurfaceView is necessary to activate the camera preview. It may be overlaid, though.
From API 11 on, you can use a SurfaceTexture instead of a SurfaceView to get frames from the camera. Then, instead of using Camera.setPreviewDisplay, simply use Camera.setPreviewTexture.
This answer as well as this one discuss this point.
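For example, something along these lines avoids the need for any visible preview surface (a sketch; the texture name 10 is arbitrary since the texture is never rendered):

SurfaceTexture dummyTexture = new SurfaceTexture(10);   // API 11+
Camera camera = Camera.open();
try {
    camera.setPreviewTexture(dummyTexture);             // instead of setPreviewDisplay(...)
} catch (IOException e) {
    e.printStackTrace();
}
camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        // NV21 frame data arrives here even though nothing is displayed.
    }
});
camera.startPreview();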