How to open the camera using OpenCV for Android VideoCapture

I am working with OpenCV4Android, and I am trying to use the VideoCapture class to open the Android camera and perform further processing on each captured frame.

Hi, I'm working on Android with OpenCV, and I'm sorry to tell you that you cannot open a camera stream with OpenCV in C++. The Android NDK historically provided no API to access the camera, so OpenCV cannot open any stream. I once saw an API for Android 4.4, if I remember correctly, but I never managed to open anything with it.
Since the release of Android 7.0, however, you have access to C functions that let you take a picture; check out this header: camera/NdkCameraManager.h.
And if you want a starting point:
#include <camera/NdkCameraManager.h>
#include <android/log.h>

#define LOGI(...) ((void)__android_log_print(ANDROID_LOG_INFO, "gandoulf", __VA_ARGS__))
#define LOGW(...) ((void)__android_log_print(ANDROID_LOG_WARN, "gandoulf", __VA_ARGS__))

void AndroidCamera()
{
    ACameraIdList *cameraList = nullptr;                     // list of available cameras
    ACameraManager *cameraManager = ACameraManager_create(); // instantiate the Android camera manager

    // Get the list of available cameras; returns a camera_status_t error code.
    camera_status_t cameraStatus = ACameraManager_getCameraIdList(cameraManager, &cameraList);
    if (cameraStatus == ACAMERA_OK) {
        LOGI("cameraList ok\n");
        LOGI("num of cameras = %d", cameraList->numCameras);
        ACameraManager_deleteCameraIdList(cameraList);       // free the list when done
    } else {
        LOGW("ERROR with cameraList\n");
    }
    ACameraManager_delete(cameraManager);                    // release the manager
}
With that you have the list of cameras, and you can then open a device and take a picture with the functions you can find in the header.
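As a follow-up sketch of the next step (untested here; it assumes API level 24+, the CAMERA permission already granted, and placeholder callback bodies), opening the first camera from that list looks roughly like this:

```cpp
#include <camera/NdkCameraManager.h>
#include <camera/NdkCameraDevice.h>

// Placeholder callbacks; a real app would handle disconnection and errors.
static void onDisconnected(void* /*context*/, ACameraDevice* /*device*/) {}
static void onError(void* /*context*/, ACameraDevice* /*device*/, int /*error*/) {}

ACameraDevice* OpenFirstCamera(ACameraManager* manager, ACameraIdList* list)
{
    if (list == nullptr || list->numCameras < 1)
        return nullptr;

    ACameraDevice_StateCallbacks callbacks = {};
    callbacks.context        = nullptr;
    callbacks.onDisconnected = onDisconnected;
    callbacks.onError        = onError;

    ACameraDevice* device = nullptr;
    camera_status_t status = ACameraManager_openCamera(
        manager, list->cameraIds[0], &callbacks, &device);
    return (status == ACAMERA_OK) ? device : nullptr;
}
```

The returned device should eventually be released with ACameraDevice_close(); from there, capture sessions are built with the functions in camera/NdkCameraDevice.h and camera/NdkCameraCaptureSession.h.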

Related

Keep the phone screen ON in Cocos2dx

I created a game for Android using Cocos2DX 3.4. I'm using the accelerometer for the player to move around the screen, so I don't need to touch the screen. The problem is that the screen turns off when I play for a while. I need to know how to keep the phone awake even if I don't touch the screen.
Just write
cocos2d::Device::setKeepScreenOn(true);
in the first scene you load.
I found one solution, but I'm still waiting for a better one, if any:
Just added this to my CPP:
#if (CC_TARGET_PLATFORM == CC_PLATFORM_ANDROID)
#include "../cocos2d/cocos/platform/android/jni/Java_org_cocos2dx_lib_Cocos2dxHelper.h"
#endif
Then I added this init method to my main Scene:
bool HelloWorld::init()
{
    if (!LayerColor::initWithColor(Color4B(204, 204, 204, 255))) {
        return false;
    }

#if (CC_TARGET_PLATFORM == CC_PLATFORM_ANDROID)
    setKeepScreenOnJni(true);
#endif

    return true;
}

Displaying a texture on Android using OpenGL ES 2.0 and SDL 2.0 in C (native code)

I am trying to display a texture on the Android screen using the OpenGL ES 2.0 library together with SDL 2.0. Since I am getting frames from ffmpeg, it was suggested that using OpenGL would reduce the delay when updating the texture. Now I have my frames ready to update the screen. Can anyone tell me how to display those raw ffmpeg frames on the Android display? Here is my sample code, where I got stuck; I don't know how to proceed after this.
SDL_Window *screen = 0;
SDL_GLContext maincontext = 0;
SDL_Surface *image = NULL;

if (SDL_Init(SDL_INIT_EVERYTHING)) {
    LOGD("Could not initialize SDL - %s\n", SDL_GetError());
    SDL_Quit();
    exit(1);
}
LOGD("SDL Initialized..");

// OpenGL initialization:
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_ES);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 0);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);
//SDL_GL_SetAttribute(SDL_GL_SHARE_WITH_CURRENT_CONTEXT, 0);

screen = SDL_CreateWindow("Window", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, 0, 0,
                          SDL_WINDOW_FULLSCREEN | SDL_WINDOW_OPENGL | SDL_WINDOW_RESIZABLE);
LOGD("SDL Screen Created ..");
//image = SDL_LoadBMP("img.bmp");

maincontext = SDL_GL_CreateContext(screen);

/* Clear our buffer with a red background */
glViewport(0, 0, 240, 240);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
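One concrete step toward displaying raw ffmpeg frames is converting a YUV420p frame to packed RGB before uploading it as a texture. A fragment shader would be faster in practice; this plain C++ sketch (BT.601 integer approximation, no SDL or ffmpeg dependency, even width/height assumed) is only meant to show the conversion itself:

```cpp
#include <cstdint>
#include <vector>
#include <algorithm>

static inline uint8_t clamp8(int v) { return (uint8_t)std::min(255, std::max(0, v)); }

// Convert one planar YUV420p frame to packed RGB24 (3 bytes per pixel).
std::vector<uint8_t> Yuv420pToRgb24(const uint8_t* y, const uint8_t* u,
                                    const uint8_t* v, int width, int height)
{
    std::vector<uint8_t> rgb(width * height * 3);
    for (int row = 0; row < height; ++row) {
        for (int col = 0; col < width; ++col) {
            int Y = y[row * width + col];
            // Chroma planes are subsampled 2x2.
            int U = u[(row / 2) * (width / 2) + col / 2] - 128;
            int V = v[(row / 2) * (width / 2) + col / 2] - 128;
            uint8_t* out = &rgb[(row * width + col) * 3];
            out[0] = clamp8(Y + ((91881 * V) >> 16));             // R = Y + 1.402 V
            out[1] = clamp8(Y - ((22554 * U + 46802 * V) >> 16)); // G = Y - 0.344 U - 0.714 V
            out[2] = clamp8(Y + ((116130 * U) >> 16));            // B = Y + 1.772 U
        }
    }
    return rgb;
}
```

The resulting buffer can then be uploaded with glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, rgb.data()) and drawn on a fullscreen quad.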

how to interface camera preview stream from android to Qt5?

Who knows how to interface camera preview stream data from Android to Qt5?
I want to display the preview stream in Qt, and then I'll send the reformatted stream back to Android.
Can anyone help?
Thanks!
If you are using QCamera, then you can set a custom viewfinder (a class derived from QAbstractVideoSurface) on the QCamera:
void QCamera::setViewfinder(QAbstractVideoSurface * surface)
And then, when the stream is available, the viewfinder's present method will be called with a QVideoFrame, from which you can get the image data and do whatever you want:
bool QAbstractVideoSurface::present(const QVideoFrame & frame)
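A minimal sketch of such a surface (untested here; it assumes Qt 5 with the multimedia module, and the class name CameraFrameSurface is an illustrative choice) might look like this:

```cpp
#include <QAbstractVideoSurface>
#include <QAbstractVideoBuffer>
#include <QVideoFrame>
#include <QImage>

class CameraFrameSurface : public QAbstractVideoSurface
{
public:
    QList<QVideoFrame::PixelFormat> supportedPixelFormats(
        QAbstractVideoSurface::HandleType type) const override
    {
        Q_UNUSED(type);
        return { QVideoFrame::Format_RGB32, QVideoFrame::Format_ARGB32 };
    }

    bool present(const QVideoFrame &frame) override
    {
        QVideoFrame copy(frame);
        if (!copy.map(QAbstractVideoBuffer::ReadOnly))
            return false;
        // Wrap the mapped bytes in a QImage; process or display it here.
        QImage image(copy.bits(), copy.width(), copy.height(),
                     copy.bytesPerLine(),
                     QVideoFrame::imageFormatFromPixelFormat(copy.pixelFormat()));
        // ... use image ...
        copy.unmap();
        return true;
    }
};
```

Hooking it up is then camera->setViewfinder(new CameraFrameSurface) followed by camera->start().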

Android: open camera and get raw data in C level

I want to open an Android tablet's camera and get the data from the camera at the C level. After that I will modify the data, and working at the C level will be efficient.
Right now I'm thinking of using V4L2 C code. But I find that V4L2's open function needs the camera's device name, such as '/dev/video0'. However, I can't find anything like that in my tablet's /dev folder. Besides, I'm not sure whether V4L2 is the right solution at all.
Does anyone know anything about this?
On my device, "OpenCV for Android" does not provide the required performance in either 'native' or 'java' mode: it gives 2 FPS at 1920x1080, while the Java MediaRecorder can record 1920x1080 at 15 FPS.
I'm trying to solve it using code from the Android Open Source Project, as used by the native Camera application:
static void android_hardware_Camera_native_setup(JNIEnv *env, jobject thiz,
        jobject weak_this, jint cameraId)
{
    sp<Camera> camera = Camera::connect(cameraId);
    if (camera == NULL) {
        jniThrowRuntimeException(env, "Fail to connect to camera service");
        return;
    }
    // make sure camera hardware is alive
    if (camera->getStatus() != NO_ERROR) {
        jniThrowRuntimeException(env, "Camera initialization failed");
        return;
    }
    jclass clazz = env->GetObjectClass(thiz);
    if (clazz == NULL) {
        jniThrowRuntimeException(env, "Can't find android/hardware/Camera");
        return;
    }
    // We use a weak reference so the Camera object can be garbage collected.
    // The reference is only used as a proxy for callbacks.
    sp<JNICameraContext> context = new JNICameraContext(env, weak_this, clazz, camera);
    context->incStrong(thiz);
    camera->setListener(context);
    // save context in opaque field
    env->SetIntField(thiz, fields.context, (int)context.get());
}
You can always build a JNI method for the Java classes to get access from C. Another way could be using OpenCV for Android: OpenCV4Android.
This gives you an interface to the camera, but as far as I remember, there was at the time no support for Android 4.3+.
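As a quick sanity check for the V4L2 route mentioned in the question, a small portable C helper can count the /dev/video* nodes; on most stock Android devices these nodes are hidden behind SELinux or require root, so an empty result is common. This is an illustrative sketch, not an Android-specific API:

```c
#include <dirent.h>
#include <string.h>

/* Count /dev/video* nodes; returns 0 if none exist or /dev is unreadable. */
int count_video_devices(void)
{
    DIR *dev = opendir("/dev");
    if (dev == NULL)
        return 0;

    int count = 0;
    struct dirent *entry;
    while ((entry = readdir(dev)) != NULL) {
        if (strncmp(entry->d_name, "video", 5) == 0)
            ++count;
    }
    closedir(dev);
    return count;
}
```

If this reports zero nodes even as root, V4L2 is not exposed on that device and the camera HAL or Java/JNI route above is the only option.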

combine videoplayback+image target in vuforia

I was able to show two kinds of 3D objects on one marker in the Image Targets sample, but now how can I combine video playback with image targets using Android? Too many parameters confuse me.
Any help will be appreciated.
First, carefully read both Vuforia SDK tutorials, for Image Targets and for Video Playback.
Open ImageTargets.cpp and update the function Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargetsRenderer_renderFrame()
according to your functionality.
Here you can add the video playback code.
Change this function:
JNIEXPORT void JNICALL
Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargetsRenderer_renderFrame(JNIEnv *, jobject)
{
    // Clear the color and depth buffers
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Get the state from QCAR and mark the beginning of a rendering section
    QCAR::State state = QCAR::Renderer::getInstance().begin();

    // Explicitly render the video background
    QCAR::Renderer::getInstance().drawVideoBackground();

#ifdef USE_OPENGL_ES_1_1
    // Set GL11 flags here
#endif

    // ... render your image targets and add the video playback code here ...

    QCAR::Renderer::getInstance().end();
}
