Endless Video Recording - Android

I am building an app that will hold a 60-second buffer of recorded video. The video recording needs to run for an extended period of time (24-48 hours). I do not seem to have any issues with recording, but it appears that I lose the connection with my Camera or Surface after a few hours.
I am setting the max duration with MediaRecorder.setMaxDuration(10 * 1000) so that recording is broken up into 10-second segments.
Each time I restart my recording I increment the output file in such a way that 6 files are reused. The purpose is to keep a 60-second history at all times.
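Roughly, the rotation logic looks like this (a sketch only: configureRecorder() and segmentPath() are placeholders for the camera/encoder setup and the ring of 6 output files):

    import android.media.MediaRecorder;

    // Sketch of the 10-second segment rotation described above.
    private static final int SEGMENTS = 6;                 // 6 x 10 s = 60 s of history
    private int segmentIndex = 0;
    private MediaRecorder recorder;

    private void startSegment() throws Exception {
        recorder = new MediaRecorder();
        configureRecorder(recorder);                       // placeholder: camera, sources, encoders
        recorder.setMaxDuration(10 * 1000);                // 10-second segments
        recorder.setOutputFile(segmentPath(segmentIndex)); // placeholder: reuses 6 files round-robin
        recorder.setOnInfoListener((mr, what, extra) -> {
            if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
                mr.stop();
                mr.release();
                segmentIndex = (segmentIndex + 1) % SEGMENTS;
                try {
                    startSegment();                        // roll over to the next file slot
                } catch (Exception e) {
                    // recorder setup failed; stop the loop
                }
            }
        });
        recorder.prepare();
        recorder.start();
    }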
If I stop the recording after 1-2 hours the videos play back fine. However, if I let it run for 3-4+ hours and stop the recording, the videos show only black output with audio. In other words, it is as if it lost the connection with the camera or surface but continued recording audio.
Can anyone explain this behavior?
If nothing else, is there a better way to record X second segments of video?
FYI: I am using a wake lock so that the screen/CPU stay on.

I have sort of answered my own question here. I don't have a lot of information, but hopefully this will provide enough information to help others in a similar predicament.
First, I never posted my code because it's way too lengthy. I didn't post snippets of my code because I did not know where the problem was originating. Despite the fact that I have this working now, I still do not know what was wrong with my old code.
My suggestion:
I ended up scrapping my code and starting fresh with the code found in the Development API Demos: CameraPreview -- I found this referenced in at least one thread on this site.
After that I was able to record continually for hours on end without a problem. So if you are in a similar situation, I would suggest using the CameraPreview code as a starting point.

Related

Recorded videos with MediaRecorder play only the first frame in Samsung devices

I have reports from several users on distinct Samsung devices (J6, S6, S7, ...), where recorded videos do not play, so they seem to be corrupted.
The playback seems stuck/frozen at the first frame, while the audio plays correctly.
The issue happens with videos recorded using Android's MediaRecorder API.
The information I could gather is that it happens when a device goes into deep sleep, i.e. when the screen is turned off and the device is not used for several minutes. When the device becomes active again, for some still-unknown reason, a new recording produces an excessively large delta duration between the first and second frames, giving the impression on playback that the video is frozen or has only one frame.
I've found the issue reported on different sites around the internet, yet no proper solution. Has anyone found a workaround? Samsung doesn't seem to acknowledge the problem.
Further investigation has shown that the issue might be caused by a system bug in some Samsung models.
Inspecting the corrupted videos sent by some users, I could confirm that on all the affected devices the first frame has an exaggeratedly large delta duration.
So, with an incorrect delta time, the video gives the impression of being frozen, when it is actually just showing the first frame on screen for its defined delta duration, which for corrupted videos is tremendously long.
To fix these samples, I replaced the delta time of the first frame with the value from the second frame (only the first frame is affected). Then the video plays correctly as expected. I used IsoParser for this task.
But this is not a proper solution, as it means having to check every video and repackaging it if affected, since there is no way to fix it in place. The operation requires creating a new video file, copying the contents of the original with the corrected delta time, and replacing the original file with the fixed one.
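For reference, the repair looks roughly like this, assuming the classic com.googlecode.mp4parser (IsoParser 1.x) API; the 10x ratio used to spot a bogus first delta is only a heuristic:

    import com.coremedia.iso.IsoFile;
    import com.coremedia.iso.boxes.TimeToSampleBox;
    import com.googlecode.mp4parser.util.Path;
    import java.io.FileOutputStream;
    import java.nio.channels.FileChannel;
    import java.util.List;

    // Copies the second sample delta over the bogus first one in each track's
    // stts box, then writes a fixed copy of the file to a new path.
    public static void fixFirstFrameDelta(String inPath, String outPath) throws Exception {
        IsoFile isoFile = new IsoFile(inPath);
        List<TimeToSampleBox> sttsBoxes =
                Path.getPaths(isoFile, "/moov/trak/mdia/minf/stbl/stts");
        for (TimeToSampleBox stts : sttsBoxes) {
            List<TimeToSampleBox.Entry> entries = stts.getEntries();
            // A corrupted file shows a lone first entry with a huge delta;
            // replace it with the delta of the following entry.
            if (entries.size() >= 2 && entries.get(0).getCount() == 1
                    && entries.get(0).getDelta() > 10 * entries.get(1).getDelta()) {
                entries.get(0).setDelta(entries.get(1).getDelta());
            }
        }
        try (FileChannel out = new FileOutputStream(outPath).getChannel()) {
            isoFile.getBox(out); // writes the whole patched structure to the new file
        }
        isoFile.close();
    }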
The proper solution would be to find out how the MediaRecorder API computes delta times, and why in some scenarios the affected devices produce an invalid value for the first frame.
My only guess is that if by any chance the MediaRecorder implementation uses the System.nanoTime clock, then, as I have read in some StackOverflow posts, that clock sometimes gives a bizarre value when coming back from a device deep sleep. If that is the real issue, then the only real solution would be for Samsung to fix their implementation.

Android MediaRecorder delay and method of fixing it

I am developing an app that allows you to record many clips and stitches them back together before uploading. While I am developing for Android 5 and up, I think this is a prevalent problem on all platforms, and I am not sure if anyone has come up with a usable solution. I cannot seem to record a 1-second video, as MediaRecorder takes ~500 ms to start(), and this time varies across devices.
The only method that I have found to be more or less accurate is to set up a file observer on the media file at the moment I call start() and stop observing when I finish; I wait for the first 256 bytes to be written before I start measuring the clip time.
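To illustrate, my observer does roughly the following (the 256-byte threshold is my own heuristic, not a documented guarantee; I create the probe immediately before calling recorder.start()):

    import android.os.FileObserver;
    import android.os.SystemClock;
    import java.io.File;

    // Notes the time just before MediaRecorder.start() and treats the moment
    // the output file reaches 256 bytes as the real start of the clip.
    public class RecorderStartProbe extends FileObserver {
        private final File outFile;
        private final long startCallMs;
        private volatile long firstWriteMs = -1;

        public RecorderStartProbe(String outputPath) {
            super(outputPath, FileObserver.MODIFY);   // watch writes to the output file
            outFile = new File(outputPath);
            startCallMs = SystemClock.elapsedRealtime();
            startWatching();
        }

        @Override
        public void onEvent(int event, String path) {
            if (firstWriteMs < 0 && outFile.length() >= 256) {
                firstWriteMs = SystemClock.elapsedRealtime();
                stopWatching();
            }
        }

        // Measured start-up delay in ms, or -1 if nothing has been written yet.
        public long delayMs() {
            return firstWriteMs < 0 ? -1 : firstWriteMs - startCallMs;
        }
    }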
Now then, CAN ANYONE tell me whether this is the only thing I can do? Or is there a better method of measuring the delay? And if you know of an approach I could take to write a recording app with immediate start and stop, I would greatly appreciate it.
If not, maybe point me in the direction of a different approach? Bear in mind I am an absolute beginner, so answers like "use the MediaCodec API" are really not going to help me a lot. I have searched for days for an appropriate solution and can't seem to find anything useful anywhere. Setting up a thread with a delayedPost is also not an answer.

Measuring Android MediaRecorder delay

Android's MediaRecorder class introduces a significant delay (on the order of a second) when recording video. See e.g. Delay in preparing media recorder on android.
After extensive searching, I can't find a way to get rid of this, only workarounds avoiding MediaRecorder and using onPreviewFrame instead.
In my particular application, MediaRecorder's delay wouldn't be so bad if I could measure it programmatically -- assuming its standard deviation isn't too high under everyday background-load conditions. Does anyone know how to do that?
I thought of using a FileObserver on the recorded file to find out when frames start being written but, since I don't know the pipeline delays, I couldn't draw a firm conclusion from that.
On a related note, does anyone know if the 'recording alert' sound is played before the first frame is recorded? Is there a way of turning that sound off?

How can I retrieve the timestamp of a video frame as it's being recorded?

So I've been trying to figure out a way to get the timestamp of a video frame as it's being recorded.
All of the samples online and in the API documentation tell you to use MediaRecorder to record video from the camera. The problem is that no timestamp data is returned, nor is there a callback called when it records a frame.
I started investigating the Camera app in the Android 4.2 source code, and was able to make significant progress on this. I successfully recorded video from the camera, saving frame timestamps to a file when the SurfaceTexture's onFrameAvailable listener was called, since SurfaceTexture has a timestamp property.
Upon further investigation, though, I figured out that I was receiving these callbacks when the frame was being displayed, not when it was recorded. In hindsight, this makes a lot of sense, and I should have spotted it earlier.
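For reference, the listener approach I tried looks roughly like this (glHandler is a hypothetical Handler bound to the thread that owns the GL context, since updateTexImage() must run there):

    import android.graphics.SurfaceTexture;
    import android.os.Handler;
    import android.util.Log;

    // Logs the SurfaceTexture timestamp of each frame as it arrives. As noted
    // above, these turn out to be display-time stamps, not record-time stamps.
    void logFrameTimestamps(SurfaceTexture surfaceTexture, Handler glHandler) {
        surfaceTexture.setOnFrameAvailableListener(st ->
                glHandler.post(() -> {
                    st.updateTexImage();              // latch the newest frame into the texture
                    long tsNanos = st.getTimestamp(); // timestamp of the latched frame, in ns
                    Log.d("FrameTs", "frame available at " + tsNanos + " ns");
                })
        );
    }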
In any case, I continued further into the Camera app and started looking at the EffectsRecorder class that it has. I found that it uses some undocumented APIs via the filter framework to allow the use of OpenGL shaders. For my use case, this is helpful, so I likely will continue down this path. I'm still trying to make my way through it to get it recording video correctly, but I feel like I will still only get timestamps on display, rather than on recording.
Looking further, I see that it uses a filter that has a MediaRecorder class underneath it, using an undocumented second video source besides the camera, called GRALLOC_BUFFER = 2. This is even more intriguing and useful, but ultimately I still need the timestamp of the frame being recorded.
I'm appealing to Stack Overflow in hopes that someone in the community has encountered this. I simply need a pointer on where to look. I can provide source code, since most of it is just lifted from the AOSP, but I haven't yet, simply because I'm not sure what would be relevant.
Help is greatly appreciated.

Android Audio Analysis in Real-time

I have searched for this online, but am still a bit confused (as I'm sure others will be if they think of something like this). I'd like to preface by saying that this is not for homework and/or profit.
I wanted to create an app that could listen to your microwave as you prepare popcorn. It would work by sounding an alarm when there's a certain time interval between pops (say 5-6 seconds). Again, this is simply a project to keep me occupied - not for a class.
Either way, I'm having trouble trying to figure out how to analyze the audio intake in real-time. That is, I need a way to log the time when a "pop" occurs. So that you guys don't think I didn't do any research into the matter, I've checked out this SO question and have extensively searched the AudioRecord function list.
I'm thinking that I will probably have to do something with one of the versions of read() and then compare the recorded audio every 2 seconds or so to the recorded audio of a "pop" (i.e. if 70% or more of the byte[] audioData array is the same as that of a popping sound, then log the time). Can anyone with Android audio input experience let me know if I'm at least on the right track? This is not a question of me wanting you to code anything for me, but a question as to whether I'm on the correct track, and, if not, which direction I should head instead.
I think I have an easier way.
You could use MediaRecorder's getMaxAmplitude() method.
Anytime your recorder detects a big jump in amplitude, you have detected a corn pop!
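Something along these lines might work; the threshold, polling interval, and gap are guesses you would tune by experiment, and the recorder writes to /dev/null because only the amplitude matters, not the audio itself:

    import android.media.MediaRecorder;
    import android.os.Handler;
    import android.os.Looper;

    // Polls getMaxAmplitude() and sounds the alarm once pops stop arriving.
    public class PopDetector {
        private static final int POLL_MS = 100;          // how often to sample (assumption)
        private static final int POP_THRESHOLD = 15000;  // amplitude that counts as a pop (assumption)
        private static final long DONE_GAP_MS = 5500;    // ~5-6 s without a pop => done

        private final MediaRecorder recorder = new MediaRecorder();
        private final Handler handler = new Handler(Looper.getMainLooper());
        private long lastPopMs;

        public void start() throws Exception {
            recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
            recorder.setOutputFile("/dev/null");         // discard the audio data
            recorder.prepare();
            recorder.start();
            lastPopMs = System.currentTimeMillis();
            handler.postDelayed(this::poll, POLL_MS);
        }

        private void poll() {
            int amplitude = recorder.getMaxAmplitude();  // max amplitude since the previous call
            long now = System.currentTimeMillis();
            if (amplitude > POP_THRESHOLD) {
                lastPopMs = now;                         // heard a pop
            } else if (now - lastPopMs > DONE_GAP_MS) {
                onPopcornDone();                         // hypothetical callback: sound the alarm
                return;
            }
            handler.postDelayed(this::poll, POLL_MS);
        }

        private void onPopcornDone() { /* stop the recorder and alert the user */ }
    }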
Check out this code (ignore the playback part): Playing back sound coming from microphone in real-time
Basically the idea is that you will have to take the value of each 16-bit sample (which corresponds to the value of the wave at that time). Using the sampling rate, you can calculate the time between peaks in volume. I think that might accomplish what you want.
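A rough sketch of that idea (SAMPLE_RATE, THRESHOLD, and the ~100 ms debounce are values you would tune by experiment):

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    // Reads 16-bit PCM samples and converts each loud sample's index into a
    // time offset via the sampling rate.
    public class PeakTimer {
        private static final int SAMPLE_RATE = 44100;
        private static final short THRESHOLD = 20000;    // peak level that counts as a pop
        private volatile boolean running = true;

        public void run() {
            int bufSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, bufSize);
            short[] buffer = new short[bufSize / 2];
            long totalSamples = 0;
            record.startRecording();
            while (running) {
                int n = record.read(buffer, 0, buffer.length);
                for (int i = 0; i < n; i++) {
                    if (Math.abs(buffer[i]) > THRESHOLD) {
                        double tSec = (totalSamples + i) / (double) SAMPLE_RATE;
                        // tSec is the time of this peak since recording began;
                        // log it as a pop, then skip ahead to debounce.
                        i += SAMPLE_RATE / 10;           // ignore the next ~100 ms
                    }
                }
                totalSamples += n;
            }
            record.stop();
            record.release();
        }
    }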
This may be a bit overkill, but there is a framework from the MIT Media Lab called funf: http://code.google.com/p/funf-open-sensing-framework/
They have already created classes for audio input and some analysis (FFT and the like); saving to files and uploading are also implemented, as far as I've seen, and they handle most of the sensors available on the phone.
You can also get inspired by the code they wrote, which I think is pretty good.
