Out of memory - Android

I'm using ffmpeg in my Android application, and sometimes I get an out-of-memory error. I'm calling ffmpeg inside a HandlerThread. Is it OK to catch OutOfMemoryError and exit the thread while the main thread keeps running?
I have read a lot about this being bad practice, but the thing is that I really need it, because I have to update the DB when there is any kind of error.
fc = new FfmpegController(context, fileTmp);
try {
    fc.processVideo(clip_in, clip_out, false,
            new ShellUtils.ShellCallback() {
                @Override
                public void shellOut(String shellLine) {
                }

                @Override
                public void processComplete(int exitValue) {
                    // Update the DB
                }
            });
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
} catch (InterruptedException e) {
    e.printStackTrace();
} catch (Exception e) {
    e.printStackTrace();
} catch (OutOfMemoryError e) {
    // Update the DB
}

No, something is going wrong if you are getting OutOfMemory errors. I would look into buffering your audio; most likely you are running the whole clip through ffmpeg at once, which is going to use up a lot of memory.
Also, keep in mind that many of us doing audio on Android end up using the NDK, primarily because of issues like the one you are experiencing. Audio has to be really high performance, and the NDK lets you write lower-level, more memory-efficient audio handling.
Android's AudioTrack has a write method that lets you push an audio buffer to it. A warning: this is not entry level. It requires some knowledge of audio buffers, and it requires you to read buffers in, send them to ffmpeg, and then pass them to AudioTrack. Not easy to do, and unfortunately more advanced audio on Android is not easy.
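As a rough illustration of that last point, here is a minimal sketch of streaming 16-bit PCM chunks into an AudioTrack. The readDecodedChunk() helper standing in for your ffmpeg output is hypothetical:
int sampleRate = 44100;
int minBuf = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
        minBuf, AudioTrack.MODE_STREAM);
track.play();
byte[] chunk = new byte[minBuf];
int read;
// readDecodedChunk() is a placeholder for pulling decoded PCM out of ffmpeg
while ((read = readDecodedChunk(chunk)) > 0) {
    track.write(chunk, 0, read); // blocks until the buffer is consumed
}
track.stop();
track.release();
By working in small chunks like this, only one buffer's worth of audio is in memory at a time, instead of the whole clip.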

Affdex Android SDK - Save and use CameraDetector

Mostly I want to know whether there is a fundamental conflict in sharing the same camera resource with the library; if so, I will need to take a different approach.
My goal is to save low-quality video with the detector's metadata at the same time, so that I can do some post-processing and slicing without much of a delay.
This is based on the CameraDetectorDemo - camera detector.
I have been initializing a MediaRecorder, but it saves a black screen if I start it before the detector, and it crashes on start (with code -19) if I start it after the detector. The detector is attaching the preview, so maybe it has to do with that.
I added some buttons to control these functions:
protected void cameraInit() {
    String state = Environment.getExternalStorageState();
    if (!Environment.MEDIA_MOUNTED.equals(state)) {
        Log.d(LOG_TAG, "Drive not mounted - cannot write video");
        return;
    }
    File file = new File(getExternalFilesDir(Environment.DIRECTORY_MOVIES), "demo.gp3");
    Log.d(LOG_TAG, String.format("Camera Initializing. Setting output to: %s", file.getAbsolutePath()));
    // Set sources
    recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    // Set profile
    recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_LOW));
    // Set output file
    recorder.setOutputFile(file.getAbsolutePath());
    // Set preview output
    recorder.setPreviewDisplay(cameraPreview.getHolder().getSurface());
    try {
        this.recorder.prepare();
    } catch (IOException e) {
        Log.e(LOG_TAG, "IO exception on camera initialization");
        e.printStackTrace();
    } catch (IllegalStateException e) {
        // Thrown if the preceding calls are not made in the proper order
        Log.e(LOG_TAG, "Failed to initialize things properly :( ");
        e.printStackTrace();
    }
}

protected void cameraStart() {
    Log.d(LOG_TAG, "Camera Start");
    this.recorder.start();
}

protected void cameraStop() {
    Log.d(LOG_TAG, "Camera Stop");
    this.recorder.stop();
}
protected void cameraStart() {
Log.d(LOG_TAG, "Camera Start");
this.recorder.start();
}
protected void cameraStop() {
Log.d(LOG_TAG, "Camera Stop");
this.recorder.stop();
}
The Affdex SDK's CameraDetector needs access to the camera to get its preview frames and process them, so that isn't going to work while the MediaRecorder has control of the camera.
Probably your best bet is to take preview frames from the camera yourself, feed them to an Affdex FrameDetector for processing, and also save them to a video file via a MediaCodec and MediaMuxer, although I haven't tried that.
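A minimal sketch of the first half of that idea, grabbing NV21 preview frames with the old android.hardware.Camera API (surfaceHolder is assumed to be your preview's SurfaceHolder; the detector/encoder hand-off is left as comments, since I haven't tried wiring up the Affdex or MediaCodec side):
Camera camera = Camera.open();
camera.setPreviewDisplay(surfaceHolder); // throws IOException
camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        // data is one NV21 frame at the preview size:
        // 1) wrap it and hand it to an Affdex FrameDetector for processing
        // 2) queue the same bytes into a MediaCodec encoder feeding a MediaMuxer
    }
});
camera.startPreview();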

Is reflection useful for Android?

I learnt a bit about reflection after reading about it in some topics here. From what I understand, it is used to check the availability of a certain class/method/field at runtime. But is it really useful in Android? Android gives us the API version at runtime, and we can know whether a particular class/method/field is available by reading the Android docs (or from the error messages in Android Studio).
I understand how it can be useful in Java in general, but is there any point in using it in Android?
Reflection (in every language) is very powerful.
On Android, most of the time reflection is not needed, and you can run into SecurityExceptions and other problems. It depends on what you do.
If you use undocumented classes or libraries, reflection can be very useful.
Sometimes, to do particular things, like turning 3G on/off on old devices or changing the device language, you need a rooted device to use reflection.
In the end, it always depends on what you are doing.
Sometimes it works, and sometimes it doesn't.
Working example:
You can reflect the method to hang up a phone call (there are plenty of example codes on the Internet, so I won't copy the code).
Non-working example:
If you want to toggle the mobile data connection, reflection works on 4.4 but not on 5.0, because the call goes over a binder connection: the binder's server side checks the permission the app was granted, and that permission is only granted to system apps. So if your app is a third-party app, on 5.0 you can't use reflection to toggle the data connection.
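For reference, the 4.4-era call usually looked something like this (a sketch; setMobileDataEnabled is a hidden method, and as described above it stopped working for third-party apps on 5.0):
ConnectivityManager cm = (ConnectivityManager)
        context.getSystemService(Context.CONNECTIVITY_SERVICE);
try {
    // Hidden method, so it can only be reached via reflection
    Method setMobileDataEnabled = ConnectivityManager.class
            .getDeclaredMethod("setMobileDataEnabled", boolean.class);
    setMobileDataEnabled.setAccessible(true);
    setMobileDataEnabled.invoke(cm, true); // true = enable mobile data
} catch (Exception e) {
    e.printStackTrace(); // NoSuchMethodException / SecurityException on newer releases
}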
Hope that helps
This is a very general question; it really depends on what you're trying to do. Sometimes you have to use reflection, for example if the APIs are hidden. It all depends on your use case, but generally you should avoid reflection, as it complicates your code more than it needs to be and is potentially unsafe on future versions of Android.
In my opinion it's a good way to do particular things.
For example, you can use the methods of the PowerProfile class to build a simple power model for your phone.
With the method getAveragePower(POWER_WIFI_SCAN) you can get the average current in mA consumed by a subsystem (in this case: Wi-Fi during a scan).
So, to use PowerProfile's methods to get your battery capacity, you could use Java reflection in this way:
private Object mPowerProfile_;
private static final String POWER_PROFILE_CLASS = "com.android.internal.os.PowerProfile";
private Double batteryCapacity = Double.valueOf(1);

public Double getBatteryCapacity(Context ctx) {
    try {
        mPowerProfile_ = Class.forName(POWER_PROFILE_CLASS)
                .getConstructor(Context.class).newInstance(ctx);
    } catch (InstantiationException e) {
        e.printStackTrace();
    } catch (IllegalAccessException e) {
        e.printStackTrace();
    } catch (InvocationTargetException e) {
        e.printStackTrace();
    } catch (NoSuchMethodException e) {
        e.printStackTrace();
    } catch (ClassNotFoundException e) {
        e.printStackTrace();
    }
    try {
        batteryCapacity = (Double) Class.forName(POWER_PROFILE_CLASS)
                .getMethod("getAveragePower", String.class)
                .invoke(mPowerProfile_, "battery.capacity");
    } catch (IllegalAccessException e) {
        e.printStackTrace();
    } catch (InvocationTargetException e) {
        e.printStackTrace();
    } catch (NoSuchMethodException e) {
        e.printStackTrace();
    } catch (ClassNotFoundException e) {
        e.printStackTrace();
    }
    return batteryCapacity;
}
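Called with any Context (here, from an Activity), this returns the battery's design capacity in mAh, assuming the device's power profile is populated:
Double capacityMah = getBatteryCapacity(this);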

Android RTSP URL not loading

I'm trying to play an RTSP stream using MediaPlayer on Android, and the application always seems to get stuck on MediaPlayer.prepare();
The URL is valid, as I tested it using VLC on my desktop.
Any ideas why the application is not preparing the stream?
class InitializeService extends Thread {
    @Override
    public void run() {
        try {
            player.prepare();
            Log.d("Play", "Player prepared");
        } catch (IOException e) {
            e.printStackTrace();
            fallback();
        } catch (IllegalStateException e) {
            e.printStackTrace();
            fallback();
        }
    }
}
The log statement is never reached.
Update 1:
Sorry, I forgot to mention that the stream will always be in 3gp format. Here is a URL: rtsp://r2---sn-p5qlsu76.c.youtube.com/CiILENy73wIaGQnTXOVs7Kwo8xMYESARFEgGUgZ2aWRlb3MM/0/0/0/video.3gp
Your stream might not be in a format supported by Android.
Check http://developer.android.com/guide/appendix/media-formats.html to see whether Android supports it.
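Independent of the format question, for network streams the usual pattern is prepareAsync() with listeners rather than a blocking prepare() on your own thread, so a stalled stream at least surfaces an error callback. A minimal sketch (the URL is a placeholder):
MediaPlayer player = new MediaPlayer();
player.setDataSource("rtsp://example.com/stream.3gp"); // throws IOException
player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start();
    }
});
player.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        Log.e("Play", "MediaPlayer error: " + what + ", " + extra);
        return true; // handled
    }
});
player.prepareAsync(); // returns immediately; onPrepared fires when ready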
It turns out it was Android L that wasn't able to play the streams.

Android video encoding with fr and resolution manipulation

I want to be able to take a video recorded with an Android device and encode it to a new resolution and frame rate using my app. The purpose is to upload a much smaller version of the original video (in size), since these will be videos 30 minutes long or more.
So far, I've read people saying FFmpeg is the way to go. However, the documentation seems to be lacking.
I have also considered using OpenCV: http://opencv.org/platforms/android.html
Considering I need to manipulate the video resolution and frame rate, which tool do you think can do such things better? Are there any other technologies to consider?
An important question: since these will be long videos, is it reasonable to do the encoding on an Android device (considering power, time, etc.)?
Thanks in advance!
I decided to use ffmpeg to tackle this project. After much research and many trials, I was not able to build the ffmpeg library myself (using Ubuntu 14.04 LTS).
However, I used this excellent library: https://github.com/guardianproject/android-ffmpeg-java
I just created a project, added that library, and it works like a charm. No need to build your own binaries or mess with the Android NDK. Of course, you would still need to build the library yourself if you want to customize it, but it has everything I need.
Here is an example of how I used it to lower a video's resolution and change its frame rate:
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // Input source
    final Clip clip_in = new Clip("/storage/emulated/0/Developer/test.mp4");
    Activity activity = (Activity) MainActivity.this;
    File fileTmp = activity.getCacheDir();
    File fileAppRoot = new File(activity.getApplicationInfo().dataDir);
    final Clip clip_out = new Clip("/storage/emulated/0/Developer/result2.mp4");
    // Put flags in clip
    clip_out.videoFps = "30";
    clip_out.width = 480;
    clip_out.height = 320;
    clip_out.videoCodec = "libx264";
    clip_out.audioCodec = "copy";
    try {
        FfmpegController fc = new FfmpegController(fileTmp, fileAppRoot);
        fc.processVideo(clip_in, clip_out, false, new ShellUtils.ShellCallback() {
            @Override
            public void shellOut(String shellLine) {
                System.out.println("MIX> " + shellLine);
            }

            @Override
            public void processComplete(int exitValue) {
                if (exitValue != 0) {
                    System.err.println("concat non-zero exit: " + exitValue);
                    Log.d("ffmpeg", "Compilation error. FFmpeg failed");
                    Toast.makeText(MainActivity.this, "result: ffmpeg failed", Toast.LENGTH_LONG).show();
                } else {
                    if (new File("/storage/emulated/0/Developer/result2.mp4").exists()) {
                        Log.d("ffmpeg", "Success file:" + "/storage/emulated/0/Developer/result2.mp4");
                    }
                }
            }
        });
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (InterruptedException e) {
        e.printStackTrace();
    } catch (Exception e) {
        e.printStackTrace();
    }
    setContentView(R.layout.activity_main);
}
The processVideo function produces a command similar to: ffmpeg -i input -s 480x320 -r 30 -vcodec libx264 -acodec copy output
This is a very simple example, but it produced the same kind of conversion as desktop ffmpeg. This code needs lots of work! I hope it helps someone.

How to get specific time frame images from video files in Android?

Actually, I need to get images at specific time frames from a video using MediaMetadataRetriever,
but I'm not interested in using FFMPEG.
If someone has any idea about this, please help me, guys.
Try this code (extracted from Android's android.media.ThumbnailUtils class).
Bitmap bitmap = null;
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
try {
    retriever.setDataSource(filePath);
    bitmap = retriever.getFrameAtTime(time);
} catch (IllegalArgumentException ex) {
    // Assume this is a corrupt video file
} catch (RuntimeException ex) {
    // Assume this is a corrupt video file
} finally {
    try {
        retriever.release();
    } catch (RuntimeException ex) {
        // Ignore failures while cleaning up
    }
}
Be mindful that the time value is in microseconds.
EDIT: There is another overload of getFrameAtTime() where you can pass in an option so that the selected frame is closer to the specified time (though it may not be a key frame). For example, use:
bitmap = retriever.getFrameAtTime(time, MediaMetadataRetriever.OPTION_CLOSEST);
The documentation states that the performance cost of this option is greater, though, so be mindful of that.
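Putting it together, here is a sketch that grabs one frame every five seconds (the path, intervalUs, and frameCount are illustrative values; remember the timestamps are in microseconds):
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
retriever.setDataSource("/sdcard/video.mp4"); // placeholder path
long intervalUs = 5000000L; // 5 seconds, in microseconds
int frameCount = 10;
for (int i = 0; i < frameCount; i++) {
    Bitmap frame = retriever.getFrameAtTime(i * intervalUs,
            MediaMetadataRetriever.OPTION_CLOSEST);
    if (frame != null) {
        // Save or display the bitmap here
    }
}
retriever.release();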
