DJI Phantom 3 camera having problems with OpenCV in Android Studio - android

I am trying to get the camera feed from DJI and use OpenCV with it; the problem lies in how to point OpenCV at the video preview that the DJI SDK renders while the drone is active. The drone does stream video to my cellphone, but when my OpenCV code grabs the video previewer's ID from the layout of my Android Studio project, the app crashes every time I navigate to the camera view part of the app. Here is the code I use to bind the OpenCV object to the video previewer captured by the DJI camera.
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
    setContentView(R.layout.activity_main);

    openCvCameraView = (JavaCameraView) findViewById(R.id.video_previewer_surface);
    openCvCameraView.setVisibility(SurfaceView.VISIBLE);
    openCvCameraView.setCvCameraViewListener(this);
    initUI();

    // The callback for receiving the raw H264 video data for camera live view
    mReceivedVideoDataCallBack = new CameraReceivedVideoDataCallback() {
        @Override
        public void onResult(byte[] videoBuffer, int size) {
            if (mCodecManager != null) {
                // Send the raw H264 video data to codec manager for decoding
                mCodecManager.sendDataToDecoder(videoBuffer, size);
            } else {
                Log.e(TAG, "mCodecManager is null");
            }
        }
    };

    DJICamera camera = FPVDemoApplication.getCameraInstance();
    if (camera != null) {
        camera.setDJICameraUpdatedSystemStateCallback(new DJICamera.CameraUpdatedSystemStateCallback() {
            @Override
            public void onResult(CameraSystemState cameraSystemState) {
                if (null != cameraSystemState) {
                    int recordTime = cameraSystemState.getCurrentVideoRecordingTimeInSeconds();
                    int minutes = (recordTime % 3600) / 60;
                    int seconds = recordTime % 60;
                    final String timeString = String.format("%02d:%02d", minutes, seconds);
                    final boolean isVideoRecording = cameraSystemState.isRecording();

                    MainActivity.this.runOnUiThread(new Runnable() {
                        @Override
                        public void run() {
                            recordingTime.setText(timeString);
                            /*
                             * Update recordingTime TextView visibility and mRecordBtn's check state
                             */
                            if (isVideoRecording) {
                                recordingTime.setVisibility(View.VISIBLE);
                            } else {
                                recordingTime.setVisibility(View.INVISIBLE);
                            }
                        }
                    });
                }
            }
        });
    }
}

It may be that you are using JavaCameraView, which, according to this post: What is the difference between `opencv.android.JavaCameraView` and `opencv.android.NativeCameraView`?
The org.opencv.android.JavaCameraView class is implemented inside the OpenCV library. It is inherited from CameraBridgeViewBase, which extends SurfaceView and uses the standard Android camera API.
You are using the video feed from the DJI SDK, not the phone's hardware camera, and that may explain the crash: when OpenCV's camera view starts, it tries to open the device camera, which conflicts with the incoming feed.
As I don't have a drone, my only suggestion is to look at the other DJI sample on Video Stream Decoding:
https://github.com/DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample
and, instead of rendering the decoded stream, pass the frame data to OpenCV, perhaps via JNI (C/C++).
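For the OpenCV side, here is a minimal sketch of what that hand-off could look like once you have a raw YUV frame as a byte array on the Java side. The helper name, the frame dimensions, and the NV21 pixel format are assumptions for illustration, not part of the DJI API:

import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;

// Hypothetical helper: wrap one NV21-encoded frame in an OpenCV Mat and
// convert it to RGBA for processing. frameWidth/frameHeight must match the
// decoded frame's dimensions.
private Mat nv21ToRgba(byte[] yuvBytes, int frameWidth, int frameHeight) {
    // NV21 stores a full-resolution Y plane plus half-resolution interleaved
    // VU data, so the single-channel Mat is 1.5x the frame height.
    Mat yuv = new Mat(frameHeight + frameHeight / 2, frameWidth, CvType.CV_8UC1);
    yuv.put(0, 0, yuvBytes);
    Mat rgba = new Mat();
    Imgproc.cvtColor(yuv, rgba, Imgproc.COLOR_YUV2RGBA_NV21);
    yuv.release();
    return rgba;
}

Done this way, you can drop JavaCameraView entirely and process frames from the DJI feed without OpenCV ever touching the phone's camera.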

Related

Xamarin Forms - Take photograph without any user interaction

I have a requirement to take a photograph of a user in Xamarin Forms without them having to press the shutter button. For example, when the app launches it should show a preview and count down from 5 seconds (to give the user a chance to get in position), then take a picture automatically.
I have tried the Xamarin Media Plugin library, however this Stack Overflow post and this GitHub issue state that this feature is not supported.
I have seen a number of dead discussions, such as this, with people asking similar questions without resolution.
I tried the LeadTools AutoCapture sample, but this only seems to work for documents/text and not people (unless I am missing something??).
I am now working my way through the Camera2Basic sample, which is quite old and only targets Android via android.hardware.camera2.
Are there any samples out there (or 3rd-party libraries) that can achieve this requirement? Ideally I would like it to be cross-platform (iOS and Android), but currently the main focus is Android.
You can create a custom view renderer on Android to achieve that.
Building on this official sample is the most convenient route; modifying its code as follows achieves what you want.
The official sample previews the camera in a Xamarin Forms app; we just need to add a timer that grabs the current frame from the camera after 5 seconds. The modified renderer code is as follows:
public class CameraPreviewRenderer : ViewRenderer<CustomRenderer.CameraPreview, CustomRenderer.Droid.CameraPreview>, Camera.IPreviewCallback
{
    CameraPreview cameraPreview;
    byte[] tmpData;

    public CameraPreviewRenderer(Context context) : base(context)
    {
    }

    protected override void OnElementChanged(ElementChangedEventArgs<CustomRenderer.CameraPreview> e)
    {
        base.OnElementChanged(e);

        if (e.OldElement != null)
        {
            // Unsubscribe
            cameraPreview.Click -= OnCameraPreviewClicked;
        }
        if (e.NewElement != null)
        {
            if (Control == null)
            {
                cameraPreview = new CameraPreview(Context);
                SetNativeControl(cameraPreview);
            }
            Control.Preview = Camera.Open((int)e.NewElement.Camera);

            // Subscribe
            cameraPreview.Click += OnCameraPreviewClicked;
        }
    }

    protected override void OnAttachedToWindow()
    {
        base.OnAttachedToWindow();

        // Start a timer to grab the current frame after 5 seconds.
        Device.StartTimer(new TimeSpan(0, 0, 5), () =>
        {
            // Runs once, 5 seconds after the view is attached.
            Device.BeginInvokeOnMainThread(() =>
            {
                Console.WriteLine("get data" + tmpData);
                // Use MessagingCenter to pass the frame data to Forms.
                MessagingCenter.Send<object, byte[]>(this, "CameraData", tmpData);
                cameraPreview.Preview.StopPreview();
                cameraPreview.IsPreviewing = false;
            });
            return false; // return true to keep the timer running, false to stop it
        });
    }

    void OnCameraPreviewClicked(object sender, EventArgs e)
    {
        if (cameraPreview.IsPreviewing)
        {
            cameraPreview.Preview.StopPreview();
            cameraPreview.IsPreviewing = false;
        }
        else
        {
            cameraPreview.Preview.SetPreviewCallback(this);
            cameraPreview.Preview.StartPreview();
            cameraPreview.IsPreviewing = true;
        }
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            Control.Preview.Release();
        }
        base.Dispose(disposing);
    }

    // Preview callback: stores the latest frame (raw NV21 bytes).
    public void OnPreviewFrame(byte[] data, Camera camera)
    {
        tmpData = data;
    }
}
Now, Xamarin Forms can receive the data from MessagingCenter:
MessagingCenter.Subscribe<object, byte[]>(this, "CameraData", async (sender, arg) =>
{
    MemoryStream stream = new MemoryStream(arg);
    if (stream != null)
    {
        // image is defined in XAML
        image.Source = ImageSource.FromStream(() => stream);
    }
});
image is defined in XAML: <Image x:Name="image" WidthRequest="200" HeightRequest="200"/>
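One caveat: OnPreviewFrame delivers raw NV21 bytes from the Camera1 API, and ImageSource.FromStream expects an encoded image, so you may need to compress the frame to JPEG before sending it through MessagingCenter. A minimal sketch, assuming the preview width and height are known (the helper name is mine, not from the sample):

// Hypothetical helper: compress a raw NV21 preview frame to JPEG so that
// ImageSource.FromStream can decode it on the Forms side.
byte[] Nv21ToJpeg(byte[] nv21, int previewWidth, int previewHeight)
{
    using (var yuv = new Android.Graphics.YuvImage(
        nv21, Android.Graphics.ImageFormatType.Nv21, previewWidth, previewHeight, null))
    using (var stream = new System.IO.MemoryStream())
    {
        // Compress the full frame; 90 is the JPEG quality.
        yuv.CompressToJpeg(new Android.Graphics.Rect(0, 0, previewWidth, previewHeight), 90, stream);
        return stream.ToArray();
    }
}

The preview size can be read from cameraPreview.Preview.GetParameters().PreviewSize before the frame arrives.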

Problems with playing videos in ARCore

I followed the example code listed in the AugmentedImageController for ARCore Unity on GitHub at: https://github.com/google-ar/arcore-unity-sdk/blob/master/Assets/GoogleARCore/Examples/AugmentedImage/Scripts/AugmentedImageExampleController.cs. Even after following the code in this example, it doesn't play the video from the video player, as shown in the AugmentedImageVisualizer code below:
The video plays if I drag and drop the AugmentedImageVisualizer onto the scene and set playOnAwake. However, it doesn't play when I turn playOnAwake off, send the app to my phone, and then point the camera at the augmented image (in my case an empty milk bottle label). I want an object such as a ghost to appear coming out of the milk bottle.
using GoogleARCore;
using UnityEngine;
using UnityEngine.Video;

public class AugmentedImageVisualizer : MonoBehaviour {

    private VideoPlayer vidPlayer;
    public VideoClip[] vidClips;
    public AugmentedImage Image;

    // Use this for initialization
    void Start () {
        vidPlayer = GetComponent<VideoPlayer>();
        vidPlayer.loopPointReached += OnStop;
    }

    private void OnStop(VideoPlayer source)
    {
        gameObject.SetActive(false);
    }

    // Update is called once per frame
    void Update () {
        if (Image == null || Image.TrackingState != TrackingState.Tracking)
        {
            return;
        }

        if (!vidPlayer.isPlaying)
        {
            vidPlayer.clip = vidClips[Image.DatabaseIndex];
            vidPlayer.Play();
        }

        transform.localScale = new Vector3(Image.ExtentX, Image.ExtentZ, 1f);
    }
}
No console errors are shown, but no videos appear.
Be sure that your camera's position is at the origin.
Is your video player's GameObject active? (video.SetActive(true))
My working solution:
public class AugmentedImageVisualizer : MonoBehaviour
{
    private const string TAG = "AugmentedImageVisualizer";

    public AugmentedImage image;
    public GameObject video;
    public VideoClip[] videoClips;
    private VideoPlayer videoPlayer;

    private void Start() {
        videoPlayer = video.GetComponent<VideoPlayer>();
    }

    private void Update() {
        if (Session.Status != SessionStatus.Tracking) return;
        if (image == null || image.TrackingState != TrackingState.Tracking) {
            video.SetActive(false);
            return;
        }
        UpdateVideo();
    }

    private void UpdateVideo() {
        if (!videoPlayer.isPlaying) {
            videoPlayer.clip = videoClips[image.DatabaseIndex];
            videoPlayer.Play();
            video.SetActive(true);
        }
        transform.localScale = new Vector3(image.ExtentX, 1, image.ExtentZ);
    }
}
Don't forget to add a VideoPlayer component to your GameObject.
EDIT: I had to edit the answer to give more explanation:
I used the Augmented Image example that comes with GoogleARCore to create the AR. Here the controller needs an augmented image visualizer prefab. This prefab is a Quad (right-click in the hierarchy area, then 3D Object > Quad) moved into the Prefabs folder (this creates a prefab from the quad). This quad/prefab has a VideoPlayer (added in the inspector). It also has a script (AugmentedImageVisualizer) which contains your code snippet. So in the inspector (with the AugmentedImageVisualizer script) the quad already has video clips where you can set your videos.
In the hierarchy there is an ARCore Device, and inside it the camera. This camera has a Tracked Pose Driver (set it in the inspector) with Pose Source: Color Camera, plus the AR Core Background Renderer script.
I found 2 similar videos on YouTube.
One that shows a single video: https://www.youtube.com/watch?v=yjj5dV2v9Fs
One that shows multiple videos on multiple reference images: https://www.youtube.com/watch?v=GkzMFNmvums
The second one does the same thing and contains your code as well, so it is very descriptive regarding your question. It was tricky for me: the guy in the video created a quad that he renamed to AugmentedImageVisualizer and then placed it in Prefabs. Once I realized this, my videos appeared on the reference images.
I used Unity 2019.3.15 and arcore-unity-sdk-1.17.0.unitypackage

How to restart the camera preview in Xamarin Forms when using a custom renderer view for a camera preview

I am trying to resume my camera preview on Android after putting the app to sleep or switching between apps, or after starting a different app that uses the camera, but the camera crashes with GetParameters() returning null.
Is there a way to regain control over the camera preview when resuming in a Xamarin Forms application?
I tried Camera.Restart(), and it didn't work.
public void SurfaceCreated(ISurfaceHolder holder)
{
    try
    {
        if (Preview != null)
        {
            Preview.StopPreview();
            Preview.Reconnect();
            Preview.SetPreviewDisplay(holder);
            Preview.EnableShutterSound(true);
        }
    }
    catch (Exception ex)
    {
        System.Diagnostics.Debug.WriteLine(@" ERROR: ", ex.Message);
    }
}

public void SurfaceDestroyed(ISurfaceHolder holder)
{
    Preview.StopPreview();
    Preview.Release();
}

public void SurfaceChanged(ISurfaceHolder holder, Android.Graphics.Format format, int width, int height)
{
    Camera.Parameters parameters = Preview.GetParameters();
    parameters.FocusMode = Camera.Parameters.FocusModeContinuousPicture;

    IList<Camera.Size> previewSizes = parameters.SupportedPreviewSizes;
    // You need to choose the most appropriate previewSize for your app
    Camera.Size previewSize = previewSizes[0];
    parameters.SetPreviewSize(previewSize.Width, previewSize.Height);

    Preview.SetParameters(parameters);
    Preview.StartPreview();
}
I was able to get it to work by reading more thoroughly about the behaviour of the Android camera (Camera1) hardware.
If you are working in Xamarin and trying to create a camera view within the app, the best way to do it is to make a custom renderer and create a camera view on each platform, as shown here:
https://learn.microsoft.com/en-ca/xamarin/xamarin-forms/app-fundamentals/custom-renderer/view
But this example only shows how to create the camera preview; there is no camera hardware lifecycle or picture-taking included.
To solve the issue in the question above, I simply had to call Camera.Open(0) to regain control over the camera within the lifecycle of the Xamarin Forms pages.
Here is what I did (in the CameraPreview class in Xamarin Forms):
Created a camera open event handler:
public event EventHandler OpenCameraRequest;
Created a method to invoke the event:
public void OpenCamera()
{
    OpenCameraRequest?.Invoke(this, EventArgs.Empty);
}
Registered the handler in the Android camera native class:
protected override void OnElementChanged(ElementChangedEventArgs<CameraPreview> e)
{
    base.OnElementChanged(e);

    if (Control == null)
    {
        _nativeCameraPreview = new NativeCameraPreview(Context);
        _nativeCameraPreview.PhotoCaptured += OnPhotoCaptured;
        SetNativeControl(_nativeCameraPreview);
    }

    Control.Preview = Camera.Open(0);

    if (e.OldElement != null)
    {
        e.OldElement.OpenCameraRequest -= OnOpenCameraRequest;
    }
    if (e.NewElement != null)
    {
        e.NewElement.OpenCameraRequest += OnOpenCameraRequest;
    }
}

private void OnOpenCameraRequest(object sender, EventArgs e)
{
    Control.Preview = Camera.Open(0);
}
Invoked the request all the way from the Xamarin Forms page's OnAppearing method:
protected override void OnAppearing()
{
    base.OnAppearing();
    CameraPreview.OpenCamera();
}
This fixed the issue of resuming the camera preview after opening another application that uses the camera, or after putting the app to sleep, where the camera preview would otherwise time out.
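For completeness, the symmetric close path is worth wiring too, so the camera is released when the page goes away rather than held until the process dies. This is a sketch following the same event pattern as above; the CloseCameraRequest/CloseCamera names are my own, not from the original code:

// In the Forms CameraPreview view: hypothetical mirror of OpenCameraRequest.
public event EventHandler CloseCameraRequest;

public void CloseCamera()
{
    CloseCameraRequest?.Invoke(this, EventArgs.Empty);
}

// In the Android renderer, subscribed the same way as OnOpenCameraRequest:
private void OnCloseCameraRequest(object sender, EventArgs e)
{
    // Stop and release the Camera1 handle so other apps (and our own
    // resume path) can reopen it cleanly.
    Control.Preview?.StopPreview();
    Control.Preview?.Release();
    Control.Preview = null;
}

// In the Xamarin Forms page:
protected override void OnDisappearing()
{
    base.OnDisappearing();
    CameraPreview.CloseCamera();
}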

Use two DJICodecManagers at the same time

My drone: Matrice 210.
DJI Android SDK 4.7.1
Device: CrystalSky CS785, Android 5.1.1
I need to display the video streams from two cameras at the same time, like DJI Pilot does.
My solution:
I create two different DJICodecManagers and use them in different VideoFeeder callbacks.
DJICodecManager primaryDJICodecManager = new DJICodecManager(Activity,
        pramirySurfaceTexture,
        pramirySurfaceTextureTextureWidth,
        pramirySurfaceTextureTextureHeight);

DJICodecManager secondaryDJICodecManager = new DJICodecManager(Activity,
        secondarySurfaceTexture,
        secondarySurfaceTextureTextureWidth,
        secondarySurfaceTextureTextureHeight);

pramirySurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        surfaceTexture.updateTexImage();
    }
});

secondarySurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        surfaceTexture.updateTexImage();
    }
});

VideoFeeder.VideoFeed videoFeed = VideoFeeder.getInstance().getPrimaryVideoFeed();
VideoFeeder.VideoFeed secondaryVideoFeed = VideoFeeder.getInstance().getSecondaryVideoFeed();

secondaryVideoFeed.setCallback(new VideoFeeder.VideoDataCallback() {
    @Override
    public void onReceive(byte[] videoBuffer, int size) {
        if (DjiManagers.mSecondaryCodecManager != null) {
            secondaryDJICodecManager.sendDataToDecoder(videoBuffer, size);
        }
    }
});

videoFeed.setCallback(new VideoFeeder.VideoDataCallback() {
    @Override
    public void onReceive(byte[] videoBuffer, int size) {
        if (DjiManagers.mCodecManager != null) {
            primaryDJICodecManager.sendDataToDecoder(videoBuffer, size);
        }
    }
});
But the pramirySurfaceTexture callback does not fire. And on the second texture, images from the different cameras (color, and grayscale from the thermal imaging camera I use) appear alternately, but most often the texture is green.
Is it possible to create and use two DJICodecManagers?
And if not, how can I show both video streams simultaneously?
DJI support answered me.
To use two DJICodecManagers, you must use the other constructor:
primaryDJICodecManager = new DJICodecManager(Activity,
        djiSdkWrapper.getSurfaceTexture(),
        djiSdkWrapper.getSurfaceTextureWidth(),
        djiSdkWrapper.getSurfaceTextureHeight(),
        videoStreamSource);
where videoStreamSource is one of these:
UsbAccessoryService.VideoStreamSource.Camera
UsbAccessoryService.VideoStreamSource.Fpv
UsbAccessoryService.VideoStreamSource.SecondaryCamera
And when you send data for decoding, you must use another sendDataToDecoder method:
primaryDJICodecManager.sendDataToDecoder(array, size, index);
where index is one of these:
UsbAccessoryService.VideoStreamSource.Camera.getIndex()
UsbAccessoryService.VideoStreamSource.Fpv.getIndex()
UsbAccessoryService.VideoStreamSource.SecondaryCamera.getIndex()
in accordance with what you specified when creating the DJICodecManager.
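Putting the support answer together, a rough sketch of the full wiring might look like this (the activity, surface textures, and size variables are assumptions for illustration; the five-argument constructor and the indexed sendDataToDecoder are the ones named above):

// Sketch: one DJICodecManager per stream source, each fed from its own
// VideoFeeder callback with the matching stream index.
final DJICodecManager primaryCodecManager = new DJICodecManager(activity,
        primarySurfaceTexture, primaryWidth, primaryHeight,
        UsbAccessoryService.VideoStreamSource.Camera);
final DJICodecManager secondaryCodecManager = new DJICodecManager(activity,
        secondarySurfaceTexture, secondaryWidth, secondaryHeight,
        UsbAccessoryService.VideoStreamSource.SecondaryCamera);

VideoFeeder.getInstance().getPrimaryVideoFeed().setCallback(new VideoFeeder.VideoDataCallback() {
    @Override
    public void onReceive(byte[] videoBuffer, int size) {
        // Route the primary feed to the decoder slot of the main camera.
        primaryCodecManager.sendDataToDecoder(videoBuffer, size,
                UsbAccessoryService.VideoStreamSource.Camera.getIndex());
    }
});

VideoFeeder.getInstance().getSecondaryVideoFeed().setCallback(new VideoFeeder.VideoDataCallback() {
    @Override
    public void onReceive(byte[] videoBuffer, int size) {
        // Route the secondary feed (e.g. the thermal camera) to its own slot.
        secondaryCodecManager.sendDataToDecoder(videoBuffer, size,
                UsbAccessoryService.VideoStreamSource.SecondaryCamera.getIndex());
    }
});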

How to change Vuforia AR camera focus mode?

I am using the Vuforia 6.2 AR SDK in Unity, but when I test the application on an Android phone the camera seems blurry. I searched Vuforia's developer website and found some camera focus modes, but I can't implement them because that guideline was for an older Vuforia SDK, and I can't find the script they mention on their website. Here is their code sample, but it's not working. I created a separate script and ran this line in the Start() function, but it is still not working.
CameraDevice.Instance.SetFocusMode(
CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
Try this:
void Start()
{
    VuforiaARController.Instance.RegisterVuforiaStartedCallback(OnVuforiaStarted);
    VuforiaARController.Instance.RegisterOnPauseCallback(OnPaused);
}

private void OnVuforiaStarted()
{
    CameraDevice.Instance.SetFocusMode(
        CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
}

private void OnPaused(bool paused)
{
    if (!paused) // resumed
    {
        // Set autofocus mode again when the app is resumed
        CameraDevice.Instance.SetFocusMode(
            CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
    }
}
This code is the right code.
bool cameramode = false;

public void OnCameraChangeMode()
{
    Vuforia.CameraDevice.CameraDirection currentDir = Vuforia.CameraDevice.Instance.GetCameraDirection();
    if (!cameramode) {
        RestartCamera(Vuforia.CameraDevice.CameraDirection.CAMERA_FRONT);
        camBtnTxt.text = "Back Camera";
    } else {
        RestartCamera(Vuforia.CameraDevice.CameraDirection.CAMERA_BACK);
        camBtnTxt.text = "Front Camera";
    }
    cameramode = !cameramode; // toggle so the next call switches back
}

private void RestartCamera(Vuforia.CameraDevice.CameraDirection newDir)
{
    Vuforia.CameraDevice.Instance.Stop();
    Vuforia.CameraDevice.Instance.Deinit();
    Vuforia.CameraDevice.Instance.Init(newDir);
    Vuforia.CameraDevice.Instance.Start();
}

Categories

Resources