I need to extract images at specific time positions from a video using MediaMetadataRetriever,
but I'm not interested in using FFmpeg.
If anyone has any idea about this, please help.
Try this code (extracted from Android's android.media.ThumbnailUtils class).
Bitmap bitmap = null;
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
try {
    retriever.setDataSource(filePath);
    bitmap = retriever.getFrameAtTime(time);
} catch (IllegalArgumentException ex) {
    // Assume this is a corrupt video file.
} catch (RuntimeException ex) {
    // Assume this is a corrupt video file.
} finally {
    try {
        retriever.release();
    } catch (RuntimeException ex) {
        // Ignore failures while cleaning up.
    }
}
Be mindful that the time value is in microseconds.
EDIT: There is another overload of getFrameAtTime() that takes an options argument, so that the selected frame is closer to the specified time (though it may not be a key frame). For example, use:
bitmap = retriever.getFrameAtTime(time, MediaMetadataRetriever.OPTION_CLOSEST);
The documentation states that this option has a greater performance cost, so be mindful of that.
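For example, to grab the frame nearest the five-second mark (the millisecond-to-microsecond conversion is the easy thing to get wrong; positionMs is just an illustrative variable, not from the code above):

long positionMs = 5000;            // 5 seconds into the video
long timeUs = positionMs * 1000;   // getFrameAtTime() expects microseconds
Bitmap frame = retriever.getFrameAtTime(timeUs, MediaMetadataRetriever.OPTION_CLOSEST);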
I'm trying to use the SensorDirectChannel added in Android 8. With a memory file as the shared medium the data is all zeros, and with a hardware buffer the returned array has size zero.
I initiate the SensorDirectChannel like this:
if (mSensor.isDirectChannelTypeSupported(SensorDirectChannel.TYPE_HARDWARE_BUFFER)) {
    try {
        hawBuff = HardwareBuffer.create(1040, 1, HardwareBuffer.BLOB, 1,
                HardwareBuffer.USAGE_SENSOR_DIRECT_DATA);
        mSensorDirectChannelBuff = mSensorManager.createDirectChannel(hawBuff);
        mSensorDirectChannelBuff.configure(mSensor, SensorDirectChannel.RATE_FAST);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Then I try to read from the hardware buffer like this:
if (mSensorDirectChannelBuff.isOpen()) {
    Parcel measurement = Parcel.obtain();
    hawBuff.writeToParcel(measurement, Parcelable.PARCELABLE_WRITE_RETURN_VALUE);
    int[] measurementArray = measurement.createIntArray();
    try {
        Log.d("SensorDirectChannel", "HardwareBuffer: " + measurementArray[0]);
    } catch (Exception e) {
        Log.e("HardwareBuffer", "array is empty");
        e.printStackTrace();
    }
}
The array is always of size zero.
I don't know if I missed something in the docs or what I'm doing wrong.
Does someone have an idea what's wrong?
Now, after some software updates, the smartphone no longer supports sensor direct channels: the line mSensor.isDirectChannelTypeSupported(SensorDirectChannel.TYPE_HARDWARE_BUFFER) now returns false, and the same is true for the memory file option.
So I guess it never worked in the first place, which would explain the problems I had trying to use it.
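For anyone comparing notes, here is roughly what reading the MemoryFile variant should look like on a device that does support it. This is only a sketch based on the 104-byte per-event memory layout documented for SensorDirectChannel (the offsets in the comments come from those docs; mSensorManager and mSensor are the fields from my code above):

try {
    MemoryFile memFile = new MemoryFile("sensor_channel", 104 * 100); // room for 100 events
    SensorDirectChannel channel = mSensorManager.createDirectChannel(memFile);
    int token = channel.configure(mSensor, SensorDirectChannel.RATE_FAST);

    // Per the docs, each event occupies 104 bytes: int32 size @0x00,
    // int32 report token @0x04, int32 type @0x08, int32 atomic counter @0x0C,
    // int64 timestamp @0x10, float[16] data @0x18.
    byte[] raw = new byte[104];
    memFile.readBytes(raw, 0, 0, raw.length);     // read the first event slot
    ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.nativeOrder());
    int counter = buf.getInt(0x0C);               // stays 0 until an event lands
    long timestampNs = buf.getLong(0x10);
    float x = buf.getFloat(0x18);                 // e.g. accelerometer x
} catch (IOException e) {
    e.printStackTrace();
}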
Mostly I want to know whether there is a fundamental conflict that prevents me from sharing the same resource with the library; if so, I will need to take a different approach.
My goal is to save low-quality video together with the detector's metadata at the same time, so that I can do some post-processing and slicing without much delay.
Based on the CameraDetectorDemo - camera detector
I have been initializing a MediaRecorder, but it saves a black screen if I start it before the detector, and it crashes on start (with error code -19) if I start it after the detector. The detector attaches the preview, so maybe it has to do with that.
I added some buttons to control these functions:
protected void cameraInit() {
    String state = Environment.getExternalStorageState();
    if (!Environment.MEDIA_MOUNTED.equals(state)) {
        Log.d(LOG_TAG, "Drive not mounted - cannot write video");
        return;
    }
    File file = new File(getExternalFilesDir(Environment.DIRECTORY_MOVIES), "demo.gp3");
    Log.d(LOG_TAG, String.format("Camera Initializing. Setting output to: %s", file.getAbsolutePath()));
    // Set sources
    recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    // Set profile
    recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_LOW));
    // Set output file
    recorder.setOutputFile(file.getAbsolutePath());
    // Set preview output
    recorder.setPreviewDisplay(cameraPreview.getHolder().getSurface());
    try {
        this.recorder.prepare();
    } catch (IOException e) {
        Log.e(LOG_TAG, "IO exception on camera Initialization");
        e.printStackTrace();
    } catch (IllegalStateException e) {
        // Thrown if the preceding calls were made in the wrong order
        Log.e(LOG_TAG, "Failed to initialize things properly :( ");
        e.printStackTrace();
    }
}

protected void cameraStart() {
    Log.d(LOG_TAG, "Camera Start");
    this.recorder.start();
}

protected void cameraStop() {
    Log.d(LOG_TAG, "Camera Stop");
    this.recorder.stop();
}
The Affdex SDK's CameraDetector needs access to the camera to get its preview frames and process them, so that's not going to work if the MediaRecorder has control of the camera.
Probably your best bet is to take preview frames from the camera, feed them to an Affdex FrameDetector for processing, and also save them to a video file via a MediaCodec and MediaMuxer, although I haven't tried that.
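A rough sketch of that fan-out using the old Camera API the demo is built on; the Affdex call is commented out because the exact Frame/FrameDetector signatures depend on your SDK version, so treat those names as assumptions to verify:

camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        // 1) Hand the NV21 preview frame to Affdex, e.g. (hypothetical call):
        // frameDetector.process(new Frame.ByteArrayFrame(data, size.width,
        //         size.height, Frame.COLOR_FORMAT.YUV_NV21));
        // 2) Queue the same bytes into a MediaCodec encoder input buffer and
        //    write its encoded output to a MediaMuxer to produce the file.
    }
});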
I'm using ffmpeg in my Android application and sometimes I get an out-of-memory error. I'm calling ffmpeg inside a HandlerThread; is it OK to catch OutOfMemoryError and exit the thread while the main thread keeps running?
I've read a lot about this not being good practice, but the thing is that I really need it, because I have to update the DB when there is any kind of error.
fc = new FfmpegController(context, fileTmp);
try {
    fc.processVideo(clip_in, clip_out, false,
            new ShellUtils.ShellCallback() {
                @Override
                public void shellOut(String shellLine) {
                }

                @Override
                public void processComplete(int exitValue) {
                    // Update the DB
                }
            });
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
} catch (InterruptedException e) {
} catch (Exception e) {
} catch (OutOfMemoryError e) {
    // Update the DB
}
No, something is going wrong if you are getting OutOfMemory errors. I would look into buffering your audio; likely you are running the whole clip through ffmpeg at once, which is going to use up a lot of memory.
Also, keep in mind that many of us doing audio on Android end up using the NDK, primarily because of issues like the one you are experiencing. Audio has to be really high performance, and the NDK lets you write lower-level, more memory-efficient audio handling.
Android's AudioTrack has a write method that lets you push an audio buffer to it. A warning: this is not entry level, and it requires some knowledge of audio buffers as well as reading buffers in, sending them to ffmpeg, and then passing them to AudioTrack. Not easy to do, and unfortunately more advanced audio on Android is not easy.
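To make the buffering idea concrete, here is a minimal streaming sketch; decodedStream is a hypothetical InputStream of raw 16-bit PCM that you would fill from ffmpeg's output, not something from your code:

int sampleRate = 44100;
int minBuf = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
        minBuf, AudioTrack.MODE_STREAM);
track.play();

byte[] chunk = new byte[minBuf];
int read;
while ((read = decodedStream.read(chunk)) > 0) {
    track.write(chunk, 0, read); // blocks until the hardware drains the buffer
}
track.stop();
track.release();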
I am currently writing an Android app that logs the accelerometer (it's a test app at the moment, so I can prototype an algorithm).
Writing out a list of SensorEventStore objects (which is just a way of storing the data from a SensorEvent) to the SD card from a 30-minute recording locks up the GUI for about 20-30 seconds while the file is written.
I am using the following code to write out the file to the SD card.
@Override
public void onMessage(Messages message, Object param[]) {
    if (message == IDigest.Messages.SaveData) {
        File folder = (File) param[0];
        File accFileAll = new File(folder, startTime + "_all.acc");
        FileWriter accFileWriterAll;
        try {
            accFileWriterAll = new FileWriter(accFileAll);
        } catch (IOException e) {
            accFileWriterAll = null;
        }
        for (Iterator<SensorEventStore> i = eventList.iterator(); i.hasNext();) {
            SensorEventStore e = i.next();
            if (accFileWriterAll != null) {
                try {
                    accFileWriterAll.write(
                            String.format(
                                    "%d,%d,%f,%f,%f\r\n",
                                    e.timestamp,
                                    e.accuracy,
                                    e.values[0],
                                    e.values[1],
                                    e.values[2]
                            )
                    );
                    accFileWriterAll.flush();
                } catch (IOException ex) {
                }
            }
        }
        new SingleMediaScanner(RunBuddyApplication.Context, accFileAll);
    }
}
Can anyone give me pointers on how to stop this locking up the UI, or how to reduce the time it takes to write out the file?
Firstly, you should do this in the background; AsyncTask is fairly well suited for the task.
Other than that, you should remove the flush() call and properly close() your FileWriter. The flush causes the data to be written to disk in rather small portions, which is really slow. If you leave the FileWriter to its own flushing, it will determine a buffer size on its own, and when you properly close the FileWriter, the remaining data is written to disk as well.
Also, you could take a look at try-with-resources for your FileWriter, but that is optional.
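A minimal sketch combining both suggestions, reusing the accFileAll, eventList and SensorEventStore names from your question (the media scan moves to onPostExecute so it runs back on the UI thread):

new AsyncTask<Void, Void, Void>() {
    @Override
    protected Void doInBackground(Void... unused) {
        // try-with-resources closes (and final-flushes) the writer for us;
        // BufferedWriter batches the small writes into larger disk writes.
        try (BufferedWriter out = new BufferedWriter(new FileWriter(accFileAll))) {
            for (SensorEventStore e : eventList) {
                out.write(String.format("%d,%d,%f,%f,%f\r\n",
                        e.timestamp, e.accuracy,
                        e.values[0], e.values[1], e.values[2]));
            }
        } catch (IOException ex) {
            ex.printStackTrace();
        }
        return null;
    }

    @Override
    protected void onPostExecute(Void unused) {
        new SingleMediaScanner(RunBuddyApplication.Context, accFileAll);
    }
}.execute();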
Why doesn't the MediaPlayer show the video as soon as it is available? What I mean is: on the iPhone, when a video is played, it shows up right away, even when returning from pause. But on Android the screen stays black for anywhere from a couple of milliseconds to a second, depending on the device and how many processes are running in the background.
I'm asking because I want to use one of the first frames of my video as a kind of screenshot, and currently I'm using a handler to wait one second before pausing the video.
Can someone tell me a quick way to make the video show up as soon as it is started, or even as soon as it is prepared, instead of my workaround?
EDIT: Here is how I prepare my video player, so it should be prepared correctly.
private void initVideo() {
    Log.i("VideoPlayer", "Initialize Video File" + videoFileName);
    AssetFileDescriptor afd;
    try {
        if (videoFileName != null) {  // note: no stray ';' here, or the check is a no-op
            afd = getAssets().openFd(videoFileName);
            vidplayer = new MediaPlayer();
            vidplayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getDeclaredLength());
            vidplayer.setDisplay(holder);
            vidplayer.prepare();
            vidplayer.setOnCompletionListener(this);
            vidplayer.setOnPreparedListener(this);
            //Log.i("INITVIDEO", Integer.toString(videoPausedAt));
            vidplayer.seekTo(videoPausedAt);
            //Log.i("VideoPlayer", "video Prepared");
            videoDuration = vidplayer.getDuration() / 1000;
            isVideoReady = true;
        }
    } catch (IllegalArgumentException e) {
        e.printStackTrace();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (Exception e) {
        //Log.i("InitPlayer", e.getClass().toString());
        e.printStackTrace();
    }
}
For the background, you can get a thumbnail of the video:
private Bitmap getThumbnail(String path) {
    try {
        return ThumbnailUtils.createVideoThumbnail(path, MediaStore.Images.Thumbnails.MINI_KIND);
    } catch (Exception e) {
        return null;
    }
}
When the video starts, you'll need to set the background back to null or you won't be able to see the video.
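A common variant of that idea overlays an ImageView on top of the video surface and hides it once playback begins; thumbnailView here is a placeholder for whatever view your layout stacks over the SurfaceView:

thumbnailView.setImageBitmap(getThumbnail(videoFileName));
vidplayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start();
        thumbnailView.setVisibility(View.GONE); // reveal the video underneath
    }
});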
As for it not playing right away: it should play as soon as start() is called if you prepared it correctly, but it could be delayed if it has to load data, say, from a stream over the internet.
I have found that it is (mostly) the phone's fault. Videos will show up automatically unless the phone is bogged down with apps, in which case loading the video takes longer (I noticed this after having a VoIP service running).