How can I use OpenCV with stored images, without a camera? - android

How can I use OpenCV to process some images saved on a smartphone, without using JavaCameraView?
I want to process an image saved on the SD card and then show the result on the screen. I implemented my app according to the tutorials from the opencv4android libraries: they use the onCameraFrame method to show the image, implement the CvCameraViewListener, and use CameraBridgeViewBase. However, I only want to process a stored image; I don't want to use the camera to capture images, and I think those elements may be unnecessary.
How can I adapt the opencv4android samples to process stored images with OpenCV without using the camera?

If someone is still looking for an answer:
package com.example.ocv4androidwithoutcamera;

import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.InstallCallbackInterface;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Mat;

import android.os.Bundle;
import android.app.Activity;
import android.util.Log;
import android.widget.Toast;

public class MainActivity extends Activity implements LoaderCallbackInterface {

    protected BaseLoaderCallback mOpenCVCallBack = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS:
                {
                    onOpenCVReady();
                } break;
                default:
                {
                    super.onManagerConnected(status);
                } break;
            }
        }
    };

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
    }

    @Override
    protected void onResume() {
        super.onResume();
        Log.i("DEMO", "Trying to load OpenCV library");
        if (!OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_4, this, mOpenCVCallBack)) {
            Log.e("DEMO", "Cannot connect to OpenCV Manager");
        }
    }

    protected void onOpenCVReady() {
        // this should crash if OpenCV is not loaded
        Mat img = new Mat();
        Toast.makeText(getApplicationContext(), "opencv ready", Toast.LENGTH_LONG).show();
    }

    @Override
    public void onManagerConnected(int status) {
        // TODO Auto-generated method stub
    }

    @Override
    public void onPackageInstall(int operation, InstallCallbackInterface callback) {
        // TODO Auto-generated method stub
    }
}
Don't forget to add the OpenCV library under Project Properties => Android => Library.
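
The callback above only proves that OpenCV initialized. As a minimal sketch of what onOpenCVReady() could do instead, here is how to read an image from the SD card and show the processed result (the file path and R.id.image_view are placeholders; Highgui is the OpenCV 2.4.x image-I/O class, renamed Imgcodecs in 3.x):

// Additional imports for this sketch:
// import org.opencv.android.Utils;
// import org.opencv.highgui.Highgui;
// import org.opencv.imgproc.Imgproc;
// import android.graphics.Bitmap;
// import android.os.Environment;
// import android.widget.ImageView;
protected void onOpenCVReady() {
    // Placeholder path: point this at an image that exists on the device.
    String path = Environment.getExternalStorageDirectory().getPath() + "/test.jpg";
    Mat img = Highgui.imread(path); // imread returns BGR channel order
    if (img.empty()) {
        Toast.makeText(this, "could not read " + path, Toast.LENGTH_LONG).show();
        return;
    }
    Imgproc.cvtColor(img, img, Imgproc.COLOR_BGR2GRAY);  // example processing step
    Imgproc.cvtColor(img, img, Imgproc.COLOR_GRAY2RGBA); // matToBitmap expects RGBA (or gray)
    Bitmap bmp = Bitmap.createBitmap(img.cols(), img.rows(), Bitmap.Config.ARGB_8888);
    Utils.matToBitmap(img, bmp);
    ((ImageView) findViewById(R.id.image_view)).setImageBitmap(bmp); // hypothetical ImageView in the layout
}

Note that reading from external storage also requires the READ_EXTERNAL_STORAGE permission on newer Android versions.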

Related

How to change camera exposure on android?

I want to perform image processing with OpenCV on Android. As a first step, I need to change camera properties such as resolution and exposure. Using OpenCV I can only change the resolution (mOpenCvCameraView.setMaxFrameSize(320, 240);); I cannot change the exposure.
When I combine OpenCV with camera2 and run it, it crashes (code: pastebin.com/3XgvKGQN).
How can I change the camera exposure?
package com.williams.drew.opencvtest;

import android.graphics.Paint;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.os.Bundle;
import android.support.annotation.NonNull;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.SurfaceView;
import android.view.WindowManager;

import org.opencv.android.JavaCameraView;
import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;

public class MainActivity extends AppCompatActivity implements CvCameraViewListener2 {

    // Prefix for logging success and failure messages
    private static final String TAG = "OCVSample::Activity";

    // Loads camera view of OpenCV for us to use. This lets us see using OpenCV
    private CameraBridgeViewBase mOpenCvCameraView;

    // Preview builder which changes exposure (I think)
    private CaptureRequest.Builder mPreviewRequestBuilder;
    private CaptureRequest mPreviewRequest;
    private long exposureTime = 1000, frameDuration = 1000;
    private int sensitivity = 200;

    // OpenCV variables
    Mat matRGBA;

    public MainActivity() {
        Log.i(TAG, "Instantiated new " + this.getClass());
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        Log.i(TAG, "called onCreate");
        super.onCreate(savedInstanceState);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        setContentView(R.layout.show_camera);
        mOpenCvCameraView = (JavaCameraView) findViewById(R.id.show_camera_activity_java_surface_view);
        mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
        mOpenCvCameraView.setCvCameraViewListener(this);
    }

    @Override
    public void onPause() {
        super.onPause();
        if (mOpenCvCameraView != null) {
            mOpenCvCameraView.disableView();
        }
    }

    @Override
    public void onResume() {
        super.onResume();
        if (!OpenCVLoader.initDebug()) {
            Log.d(TAG, "Internal OpenCV library not found. Using OpenCV Manager for init");
            OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_1_0, this, mLoaderCallback);
        } else {
            Log.d(TAG, "OpenCV library found inside package. Using it!");
            mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
        }
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
        if (mOpenCvCameraView != null) {
            mOpenCvCameraView.disableView();
        }
    }

    private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS:
                {
                    Log.i(TAG, "OpenCV loaded successfully");
                    mOpenCvCameraView.enableView();
                } break;
                default:
                {
                    super.onManagerConnected(status);
                } break;
            }
        }
    };

    @Override
    public void onCameraViewStarted(int width, int height) {
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);
        mPreviewRequestBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, Long.valueOf(exposureTime));
        mPreviewRequestBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, Integer.valueOf(sensitivity));
        mPreviewRequestBuilder.set(CaptureRequest.SENSOR_FRAME_DURATION, Long.valueOf(frameDuration));
        matRGBA = new Mat(width, height, CvType.CV_8UC4);
    }

    @Override
    public void onCameraViewStopped() {
        matRGBA.release();
    }

    @Override
    public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
        matRGBA = inputFrame.rgba();
        return matRGBA;
    }
}
Thank you for your answers.
Correct, OpenCV does not expose all of the camera parameters. You can modify JavaCameraView and add a function that calls setExposureCompensation().
You would call that function like this:
Camera mCamera = Camera.open();
Camera.Parameters params = mCamera.getParameters();
params.setExposureCompensation(0);
mCamera.setParameters(params); // without this call the new value is never applied
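
A minimal sketch of such a JavaCameraView modification, assuming the protected mCamera field that JavaCameraView has in the OpenCV Android SDK sources (verify the field name against your SDK version):

import android.content.Context;
import android.hardware.Camera;
import android.util.AttributeSet;
import org.opencv.android.JavaCameraView;

public class ExposureCameraView extends JavaCameraView {

    public ExposureCameraView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    // Call after the preview has started, i.e. once mCamera is non-null.
    public void setExposureCompensation(int value) {
        if (mCamera == null) return;
        Camera.Parameters params = mCamera.getParameters();
        // Clamp to the range the device actually supports.
        int min = params.getMinExposureCompensation();
        int max = params.getMaxExposureCompensation();
        params.setExposureCompensation(Math.max(min, Math.min(max, value)));
        mCamera.setParameters(params); // apply the change
    }
}

You would then reference ExposureCameraView instead of JavaCameraView in your layout XML.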

No menu when running openCV samples (Android Studio)

Running the OpenCV-Android samples does not work as expected. I have Android Studio on Windows, extracted OpenCV-Android 2.4.11, and the sample is tutorial-1-camerapreview (the other samples don't seem to work either).
If I use 'Import module' and run Tutorial1CameraView on a device, everything works fine. I get the 3 normal buttons PLUS a 4th one, which opens a menu.
If I use 'New project' and proceed as if I had coded it myself (copy/pasting the code and files from the tutorial), it runs the same app, but there is no 4th button, so I can't open the menu.
Here is exactly what I've done:
1. New project -> blank activity
2. Import the OpenCV libs (I followed the instructions on this page)
3. Copy/paste the source code of Tutorial1CameraView into my MainActivity
4. Merge the res folder from the tutorial with the res folder created by Android Studio. I deleted some files, like the old menu folder (generated when I created the project, and not used by the tutorial), to be sure they don't interfere. (It still doesn't work if I keep them.)
5. Rename some trivial things like packages/classes in the Java and XML files to make it compile
If it helps, my Java code is:
import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Mat;
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;

import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.view.Menu;
import android.view.MenuItem;
import android.view.SurfaceView;
import android.view.WindowManager;
import android.widget.Toast;

public class MainActivity extends Activity implements CvCameraViewListener2 {

    private static final String TAG = "OCVSample::Activity";

    private CameraBridgeViewBase mOpenCvCameraView;
    private boolean mIsJavaCamera = true;
    private MenuItem mItemSwitchCamera = null;

    private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS:
                {
                    Log.i(TAG, "OpenCV loaded successfully");
                    mOpenCvCameraView.enableView();
                } break;
                default:
                {
                    super.onManagerConnected(status);
                } break;
            }
        }
    };

    public MainActivity() {
        Log.i(TAG, "Instantiated new " + this.getClass());
    }

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        Log.i(TAG, "called onCreate");
        super.onCreate(savedInstanceState);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        setContentView(R.layout.tutorial1_surface_view);
        if (mIsJavaCamera)
            mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.tutorial1_activity_java_surface_view);
        else
            mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.tutorial1_activity_native_surface_view);
        mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
        mOpenCvCameraView.setCvCameraViewListener(this);
    }

    @Override
    public void onPause() {
        super.onPause();
        if (mOpenCvCameraView != null)
            mOpenCvCameraView.disableView();
    }

    @Override
    public void onResume() {
        super.onResume();
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_3, this, mLoaderCallback);
    }

    public void onDestroy() {
        super.onDestroy();
        if (mOpenCvCameraView != null)
            mOpenCvCameraView.disableView();
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        Log.i(TAG, "called onCreateOptionsMenu");
        mItemSwitchCamera = menu.add("Toggle Native/Java camera");
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        String toastMesage = new String();
        Log.i(TAG, "called onOptionsItemSelected; selected item: " + item);
        if (item == mItemSwitchCamera) {
            mOpenCvCameraView.setVisibility(SurfaceView.GONE);
            mIsJavaCamera = !mIsJavaCamera;
            if (mIsJavaCamera) {
                mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.tutorial1_activity_java_surface_view);
                toastMesage = "Java Camera";
            } else {
                mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.tutorial1_activity_native_surface_view);
                toastMesage = "Native Camera";
            }
            mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
            mOpenCvCameraView.setCvCameraViewListener(this);
            mOpenCvCameraView.enableView();
            Toast toast = Toast.makeText(this, toastMesage, Toast.LENGTH_LONG);
            toast.show();
        }
        return true;
    }

    public void onCameraViewStarted(int width, int height) {
    }

    public void onCameraViewStopped() {
    }

    public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
        return inputFrame.rgba();
    }
}
I can attach any other files needed (it's probably not related to the Java code...).
What could be the problem? Why does everything in the app work except the menu?
To solve this issue: if the theme attribute for your activity in your AndroidManifest file ends in .Fullscreen, remove the .Fullscreen.
EDIT: Apparently removing the .Fullscreen alone doesn't work; what does work is replacing the theme with one that has neither a .Fullscreen nor a .NoActionBar variant. In this case android:theme="@style/ThemeOverlay.AppCompat.ActionBar" did the trick.
Delete android:theme="@style/Theme" from the manifest file.
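
For reference, the theme attribute in question sits on the <activity> element (or on <application>) in AndroidManifest.xml; the activity name below is illustrative:

<activity
    android:name=".MainActivity"
    android:theme="@style/ThemeOverlay.AppCompat.ActionBar">
    ...
</activity>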

real time video processing with android openCV

I am a first-time Android programmer.
The project I am working on requires me to do (simple?) real-time video processing.
Once finished, the app needs to do this:
When I open the built-in camera application and choose the video-recording option, I can see the surroundings without actually recording. What I am trying to accomplish is to delay that display by a few hundred milliseconds. A colleague of mine did this quite easily with a delay option using a laptop webcam and OpenCV (the desktop version). I am trying to accomplish the same on an Android phone.
Perhaps I am doing a poor job of explaining the situation. Kindly reply at the earliest.
I am working on the code now, and being a first-time programmer it is taking some time.
Excited to start with Android programming!
No idea if this task actually needs OpenCV (it might be a bit of overkill), but if you opt for it, it's fairly easy.
All we do here is record frames continuously into a ring buffer and toggle between realtime and playback mode on some event (onTouch, for simplicity):
package com.berak.echo;

import java.util.ArrayList;
import java.util.List;

import android.os.Bundle;
import android.view.MotionEvent;
import android.view.View;
import android.view.View.OnTouchListener;
import android.app.Activity;

import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.Point;
import org.opencv.core.Scalar;

public class EchoActivity extends Activity implements CvCameraViewListener2, OnTouchListener {

    CameraBridgeViewBase mOpenCvCameraView;

    List<Mat> ring = new ArrayList<Mat>(); // recording buffer
    int delay = 100;                       // delay == length of buffer (in frames)
    boolean delayed = false;               // state

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_echo);
        mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.cam3_surface_view);
        mOpenCvCameraView.setCvCameraViewListener(this);
        mOpenCvCameraView.setOnTouchListener(this); // set up as touch listener
    }

    // lots of boilerplate, ugly, but needed:
    private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS:
                    mOpenCvCameraView.enableView();
                    break;
                default:
                    super.onManagerConnected(status);
                    break;
            }
        }
    };

    @Override
    public void onResume() {
        super.onResume();
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_5, this, mLoaderCallback);
    }

    @Override
    public void onPause() {
        super.onPause();
        if (mOpenCvCameraView != null)
            mOpenCvCameraView.disableView();
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
    }

    @Override
    public void onCameraViewStarted(int width, int height) { }

    @Override
    public void onCameraViewStopped() { }

    // here's the bread & butter stuff:
    @Override
    public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
        Mat mRgba = inputFrame.rgba();
        ring.add(mRgba.clone());    // add one at the end
        if (ring.size() >= delay) { // pop one from the front
            ring.get(0).release();
            ring.remove(0);
        }
        Mat ret;
        String txt;
        if (delayed && ring.size() > 0) { // depending on 'delayed', return either playback
            ret = ring.get(0);            // return the 'oldest'
            txt = "playback";
        } else {
            ret = mRgba;                  // or the realtime frame
            txt = "realtime";
        }
        Core.putText(ret, txt, new Point(20, 20), Core.FONT_HERSHEY_PLAIN, 1.2, new Scalar(200, 0, 0));
        return ret;
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // just toggle between delayed and realtime view:
        delayed = !delayed;
        return false;
    }
}
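
One practical caveat: a ring of 100 cloned RGBA frames is a lot of memory (a 640x480 CV_8UC4 Mat is roughly 1.2 MB, so 100 of them is on the order of 120 MB). On a real device you would shorten the delay, downscale the cloned frames, or cap the preview size with something like mOpenCvCameraView.setMaxFrameSize(320, 240) before enabling the view.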

jWebSocket client on Android not connecting to local server

I'm trying to start a very simple jWebSocket client on Android and connect it to my local server. I'm using the JWC class from the demo together with jWebSocket 1.0 beta 8 and Android 4.0.3; my code looks like this:
import org.jwebsocket.api.WebSocketClientEvent;
import org.jwebsocket.api.WebSocketClientTokenListener;
import org.jwebsocket.api.WebSocketPacket;
import org.jwebsocket.client.token.BaseTokenClient;
import org.jwebsocket.kit.WebSocketException;
import org.jwebsocket.token.Token;

import android.app.Activity;
import android.content.Context;
import android.content.IntentFilter;
import android.os.Bundle;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.Spinner;

import cased.smids.communication.JWC;

public class TasksActivity extends Activity implements WebSocketClientTokenListener {

    Spinner spinner;
    Button btn_Start;

    /** Called when the activity is first created. */
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        this.setContentView(R.layout.main);
        JWC.init();
        spinner = (Spinner) findViewById(R.id.sp_Task);
        btn_Start = (Button) findViewById(R.id.btn_Start);
        ArrayAdapter<CharSequence> adapter =
            ArrayAdapter.createFromResource(
                this,
                R.array.Tasks,
                android.R.layout.simple_spinner_item
            );
        adapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
        spinner.setAdapter(adapter);
        btn_Start.setOnClickListener(new View.OnClickListener() {
            public void onClick(View src) {
                switch (src.getId()) {
                    case R.id.btn_Start:
                        if (spinner.getSelectedItem().toString().equals("Connect")) {
                            try {
                                System.out.println("connecting manually...");
                                JWC.open();
                            } catch (WebSocketException e) {
                                e.printStackTrace();
                            }
                        }
                    default:
                        break;
                }
            }
        });
    }

    @Override
    protected void onResume() {
        super.onResume();
        System.out.println("* opening... ");
        try {
            JWC.addListener(this);
            JWC.open();
        } catch (WebSocketException ex) {
            System.out.println("* exception: " + ex.getMessage());
        }
    }

    @Override
    protected void onPause() {
        System.out.println("* closing... ");
        try {
            JWC.close();
            JWC.removeListener(this);
        } catch (WebSocketException ex) {
            System.out.println("* exception: " + ex.getMessage());
        }
        super.onPause();
    }

    public void processClosed(WebSocketClientEvent arg0) {
        System.out.println("closed");
    }

    public void processOpened(WebSocketClientEvent arg0) {
        System.out.println("opened");
    }

    public void processOpening(WebSocketClientEvent arg0) {
        System.out.println("opening");
    }

    public void processPacket(WebSocketClientEvent arg0, WebSocketPacket arg1) {
        System.out.println("packet");
    }

    public void processReconnecting(WebSocketClientEvent arg0) {
        System.out.println("reconnecting");
    }

    public void processToken(WebSocketClientEvent arg0, Token arg1) {
        System.out.println("token");
    }
}
So basically it's just a spinner and a button. For now, all I want to do is connect to my local jWebSocket server. The demo app (the .apk package from the website; if I import the code, Eclipse tells me to remove many "@Override" annotations before it compiles, and after that the same bug occurs) works with my server, so it has to be my code. Right now all I get is "connecting..." and about 0.1 s later "closed". Every time.
By the way, the app has the INTERNET and ACCESS_NETWORK_STATE permissions, so that shouldn't be the problem.
I will be grateful for any help.
Cheers
Turns out, BaseTokenClient.open() catches all exceptions and does nothing with them (it fails silently). In my case it was a NetworkOnMainThreadException. Mystery solved.
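
A minimal sketch of the fix under that diagnosis: move the connect off the UI thread (JWC here is the demo's helper class wrapping the jWebSocket client):

new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            JWC.open(); // network I/O is now off the main thread
        } catch (WebSocketException ex) {
            Log.e("JWC", "open failed: " + ex.getMessage());
        }
    }
}).start();

The listener callbacks will then arrive on that background thread, so touch UI elements only via runOnUiThread().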

RTMP android aftek

I am trying to implement the library from http://www.aftek.com/afteklab/aftek-RTMP-library.shtml to stream live video from a Red5 server.
On the server I am using the simpleBroadcaster demo, and I want to stream it to the Android phone.
My code:
package com.cu.reader;

import java.nio.channels.FileChannel;
import java.util.Map;

import com.al.rtmp.client.RtmpClient;
import com.al.rtmp.client.RtmpStream;
import com.al.rtmp.client.RtmpStreamFactory;
import com.al.rtmp.client.data.MetaData;
import com.al.rtmp.client.data.RTMPData;
import com.al.rtmp.client.data.VideoCodec;
import com.al.rtmp.message.Metadata;

import android.app.Activity;
import android.os.Bundle;
import android.util.Log;

public class StreamreaderActivity extends Activity implements RtmpClient {

    RtmpStream stream = null;
    Boolean connected = false;
    String server = "rtmp://216.224.181.197/oflaDemo/";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        stream = RtmpStreamFactory.getRtmpStream();
        stream.setClient(this);
        stream.connect(server);
    }

    @Override
    public void streamCreated() {
        Log.i("stream", "Connected!");
        connected = true;
        stream.setPlayName("red5StreamDemo");
        stream.play();
    }

    @Override
    public byte[] getWriteData(int length) {
        // TODO Auto-generated method stub
        return null;
    }

    @Override
    public void invoke(String arg0, Object... arg1) {
        // TODO Auto-generated method stub
    }

    @Override
    public void onDataReceived(RTMPData rtmpData) {
        MetaData metaData = rtmpData.getMetaData();
        VideoCodec vc = metaData.getVideoCodec();
    }

    @Override
    public void onError(Exception ex) {
        Log.e("ClientException", " Some exception occurred." + ex.getMessage());
        ex.printStackTrace();
    }

    @Override
    public void onMetaDataReceived(Map map) {
        Log.i("code", "METADATA:" + map);
    }

    @Override
    public void onResult(String method, Object... arg1) {
        Log.i("result", "METADATA:" + method);
    }

    @Override
    public void onStatus(String code) {
        Log.i("code", code);
    }
}
I always receive NetStream.Play.StreamNotFound in the onStatus function.
Thank you
You get the NetStream.Play.StreamNotFound error because no such stream exists on the Red5 application.
I made a quick AS3 test to check:
package {
    import flash.display.Sprite;
    import flash.events.AsyncErrorEvent;
    import flash.events.IOErrorEvent;
    import flash.events.NetStatusEvent;
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    public class LearnWowzaClient extends Sprite {

        private var nc:NetConnection;
        private var video:Video = new Video();

        public function LearnWowzaClient() {
            nc = new NetConnection();
            nc.client = this;
            nc.addEventListener(NetStatusEvent.NET_STATUS, onNet);
            nc.connect("rtmp://216.224.181.197/oflaDemo/");
        }

        private function onNet(event:NetStatusEvent):void {
            trace(event);
            trace(event.info.code);
            switch (event.info.code) {
                case "NetConnection.Connect.Success":
                    tryPlayStream();
                    break;
            }
        }

        private function tryPlayStream():void {
            trace("playStream");
            var ns:NetStream = new NetStream(nc);
            ns.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
            ns.addEventListener(IOErrorEvent.IO_ERROR, onIOError);
            ns.addEventListener(AsyncErrorEvent.ASYNC_ERROR, onAsyncError);
            ns.play("red5StreamDemo");
            video.attachNetStream(ns);
        }

        public function onBWCheck(parameter:Object = null):void {
            trace("onBWCheck p=" + parameter);
        }

        public function onBWDone(parameter:Object = null):void {
            trace("onBWDone p=" + parameter);
        }

        private function onIOError(event:IOErrorEvent):void {
            trace("onIOError");
        }

        private function onAsyncError(event:AsyncErrorEvent):void {
            trace("onAsyncError");
        }

        private function onNetStatus(event:NetStatusEvent):void {
            trace("onNetStatus ", event.info.code);
        }
    }
}
I also get the NetStream.Play.StreamNotFound error.
Can you show the Red5 application code?
The stream does not exist, correct. But why? Probably for one of two reasons: 1) you have not created a live broadcast stream, or 2) you are using the wrong scope. Unless you configured it differently by hand (which is unlikely), use the 'live' broadcaster scope, which lives at /live.
Thus, publish to rtmp://216.224.181.197/live/red5StreamDemo and subscribe to the exact same URL, in this example rtmp://216.224.181.197/live/red5StreamDemo. NOTE: for this to work, you need to create a 'live' stream and feed it to your Red5 server. You can use avconv (aka ffmpeg) to create an RTMP feed, as sketched below.
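
A hedged example of creating such a feed with ffmpeg (input.mp4 is a placeholder file; exact flags differ slightly between ffmpeg and avconv builds):

ffmpeg -re -i input.mp4 -an -c:v flv -f flv rtmp://216.224.181.197/live/red5StreamDemo

Here -re paces the input at its native frame rate so it behaves like a live source, -an drops the audio, and -f flv wraps the output in the FLV container that RTMP expects.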
