Working on an Android app.
I need to record video continuously.
But the video files can only be N seconds long.
I have it working so far, and the files are being created properly. However, because of the time it takes to stop the MediaRecorder and start it back up again with a new file handle, I lose about 2 seconds between stop and start.
Is there anything, procedure-wise, that I can do to mitigate this delay? I'm recording in MP4 format, so I suspect I can't just force a "move" on a file on the filesystem, since the video needs to be packaged properly in the MP4 container.
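One common way to shrink that gap is to keep two recorder instances and have the next one fully prepared before the current one stops, since prepare() is the slow step (on API 26+, MediaRecorder.setNextOutputFile() avoids the stop/start cycle entirely). Below is a minimal sketch of the alternating pattern; the `Recorder` interface is a hypothetical stand-in for MediaRecorder so the rotation logic is visible without Android dependencies, and the segment file names are made up for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for android.media.MediaRecorder: prepare() is the
// slow step, so the idle instance is prepared ahead of time.
interface Recorder {
    void prepare(String outputFile);
    void start();
    void stop();
}

class SegmentedRecorder {
    private final Recorder[] recorders;
    private int active = 0;   // index of the recorder currently running
    private int segment = 0;  // segment counter used to name output files

    SegmentedRecorder(Recorder a, Recorder b) {
        recorders = new Recorder[] { a, b };
    }

    // Prepare and start the first segment, and pre-prepare the idle
    // instance so rotation later is just stop+start.
    void begin() {
        recorders[active].prepare(fileName(segment));
        recorders[active].start();
        recorders[1 - active].prepare(fileName(segment + 1));
    }

    // Called every N seconds: stop the active recorder and immediately
    // start the already-prepared one, then prepare the next file.
    void rotate() {
        recorders[active].stop();
        active = 1 - active;
        segment++;
        recorders[active].start();  // no prepare() on the hot path
        recorders[1 - active].prepare(fileName(segment + 1));
    }

    private static String fileName(int n) {
        return "segment-" + n + ".mp4";
    }
}
```

With a real MediaRecorder you would also need to reset() and release or reconfigure the stopped instance before reusing it, and drive rotate() from a Handler or timer; the sketch only shows how the expensive preparation is moved off the stop/start boundary.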
Anyone know, roughly, how I might go about this?
Thanks!
Let me rephrase my question; I wrote it in a hurry.
Current situation:
I have set up a digital video recorder to record broadcasts provided via DVB-C. It is running on a Raspberry Pi 3B using TVHeadend and Jetty/Cling to provide UPnP and other ways to access media files. For watching recordings, I wrote an Android player app using IJKPlayer, which runs on smartphones, FireTV and AndroidTV.
One hassle when playing media files that are currently being recorded is that IJKPlayer does not support timeshifting. That means when I start playing a file that is still recording, I can only watch up to the length known to the player at that moment; anything recorded afterwards cannot be played, and I need to exit the player activity and start it again. I have worked around that issue by "simulating" a completed recording using a custom servlet implementation. Since the complete length of the recording is already known, I can use ffmpeg to accomplish this.
Future situation:
I plan to move away from IJKPlayer to ExoPlayer, because it supports hardware playback and is much faster when playing H.264 media. I could of course use the same solution as above, but as far as I have found out so far, ExoPlayer can support media files that are currently being recorded by using the Timeline class. However, I can't seem to find either useful documentation or any good example. Hence, I would appreciate any help with the Timeline object.
Regards
Harry
Looks like my approach won't work; at least, I didn't find a solution. The problem is that the server returns the stream size as it was at player start time. I didn't find a method to update the media duration for "regular" files.
However, I can solve the problem by changing the server side. Instead of accessing a regular file, I convert the file to an m3u8 (HLS) stream in realtime, using ffmpeg. I then hand the m3u8 URI to the player, and it updates the duration of the stream while playing, without the need for any additional code on the client side.
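The reason this works: an HLS media playlist for an in-progress recording omits the #EXT-X-ENDLIST tag, which tells the player to keep re-fetching the playlist and to extend the duration as new segments appear. A minimal sketch of what such a playlist looks like (segment names and durations are made up for illustration):

```java
// Builds a minimal HLS media playlist. While the recording is still in
// progress, #EXT-X-ENDLIST is omitted, so the player keeps polling the
// playlist for new segments (and keeps growing the reported duration).
class HlsPlaylist {
    static String build(int segmentCount, double segmentSeconds, boolean finished) {
        StringBuilder sb = new StringBuilder();
        sb.append("#EXTM3U\n");
        sb.append("#EXT-X-VERSION:3\n");
        sb.append("#EXT-X-TARGETDURATION:")
          .append((int) Math.ceil(segmentSeconds)).append('\n');
        sb.append("#EXT-X-MEDIA-SEQUENCE:0\n");
        for (int i = 0; i < segmentCount; i++) {
            sb.append("#EXTINF:").append(segmentSeconds).append(",\n");
            sb.append("segment").append(i).append(".ts\n");
        }
        if (finished) {
            sb.append("#EXT-X-ENDLIST\n"); // marks the recording as complete
        }
        return sb.toString();
    }
}
```

Serving this playlist (regenerated as ffmpeg emits new segments) is all the client needs; ExoPlayer's HLS support handles the duration updates on its own.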
I am creating a video using an Android app, and after the MP4 video file is created it does not play back properly: the audio plays over a still frame from the video, and then once the timer reaches the end, the video plays several moving frames.
This issue is only occurring when I create the video on a Samsung Galaxy S7 and not on any other phones.
I am not experienced in video file encoding, so I do not even know where to start with debugging what is wrong with the file. If someone could explain what causes something like this, that would be amazing.
The first video sample decode time in your file is 1506981408/90000 - which is giant - about 4.65 hours into the stream.
So the entry is obviously bogus.
Hard to say where the bogus decode time is coming from - maybe uninitialized memory of some sort.
See the 'stts' box at offset 1052223 - first array entry.
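For anyone following along, the arithmetic behind that claim: MP4 sample times are stored in the track's timescale units, here 90000 ticks per second, so dividing the stored value by the timescale gives seconds.

```java
// An MP4 sample time is stored in timescale ticks; dividing by the
// track timescale (90 kHz here) converts it to seconds.
class SttsCheck {
    static double ticksToSeconds(long ticks, long timescale) {
        return (double) ticks / timescale;
    }

    public static void main(String[] args) {
        double seconds = ticksToSeconds(1506981408L, 90000L);
        // ~16744 seconds, i.e. roughly 4.65 hours into the stream
        System.out.printf("%.1f seconds = %.2f hours%n", seconds, seconds / 3600.0);
    }
}
```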
I corrected your video and put a copy here: https://drive.google.com/open?id=0B1K1m-YmE28DMXdFemZKbXg0WFk
I've been exploring the documentation and examples at http://bigflake.com/mediacodec/ by Fadden, and applied patch http://bigflake.com/mediacodec/0001-Record-game-into-.mp4.patch to the breakout game. Unfortunately, after compiling the code, I realized it doesn't work, producing video files that aren't streamable.
I see the following error:
"The mp4 file will not be streamable."
According to Fadden, this should be fixed by checking the mBufferInfo.flags (https://stackoverflow.com/questions/23934087/non-streamable-video-file-created-with-mediamuxer), which is already done in his code, so I'm at a complete loss. Did anyone else get the video recording patch to work?
The warning you're seeing is just a warning, nothing more. MP4 files aren't streamable anyway in most cases, in the sense that you could pass the written MP4 over a pipe and have the other end play it back (unless you resort to a lot of extra trickery, or use fragmented MP4, which the Android MP4 muxer doesn't normally write). What "streamable" means here is that once you have the final MP4 file, you can start playing it back without having to seek to the end of the file first (something playback over HTTP can do, e.g. with HTTP byte range requests).
To write a streamable MP4, the muxer tries to guess how large your file will be, and reserves a correspondingly large area at the start of the file to write the file index into. If the file turns out to be larger, so that the index doesn't fit into the reserved area, it needs to be written at the end of the file instead. See lines 506-519 in https://android.googlesource.com/platform/frameworks/av/+/lollipop-release/media/libstagefright/MPEG4Writer.cpp for more info about this guess. Basically the guess boils down to: "The default MAX_MOOV_BOX_SIZE value is based on about 3 minute video recording with a bit rate about 3 Mbps, because statistics also show that most of the video captured are going to be less than 3 minutes."
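Concretely, "streamable" here means the moov index box ends up before the large mdat media box in the file. A top-level MP4 box is just a 32-bit big-endian size followed by a 4-character type, so checking the order takes only a few lines. A minimal sketch (assuming well-formed, 32-bit-size boxes only; real files may also use 64-bit largesize headers, which this ignores):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

class BoxOrder {
    // Walks top-level MP4 boxes (32-bit size + 4-char type) and returns
    // their types in file order.
    static List<String> topLevelBoxes(byte[] file) {
        List<String> types = new ArrayList<>();
        ByteBuffer buf = ByteBuffer.wrap(file); // big-endian by default
        while (buf.remaining() >= 8) {
            int size = buf.getInt();
            byte[] t = new byte[4];
            buf.get(t);
            types.add(new String(t, StandardCharsets.US_ASCII));
            buf.position(buf.position() + size - 8); // skip the box payload
        }
        return types;
    }

    // "Streamable" in the MediaMuxer sense: moov precedes mdat.
    static boolean isStreamable(byte[] file) {
        List<String> types = topLevelBoxes(file);
        int moov = types.indexOf("moov");
        return moov >= 0 && moov < types.indexOf("mdat");
    }
}
```

This is also exactly what qt-faststart does to fix a non-streamable file: it moves the moov box (adjusting internal offsets) so it lands before mdat.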
If you want to turn such a non-streamable MP4 file into a streamable one, you can use the qt-faststart tool from libav/ffmpeg, which just reorders the blocks in the file.
You can check Intel INDE Media for Mobile; it allows game capturing and streaming to the network:
https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials
simplest capturing:
https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials-video-capturing-for-opengl-applications
youtube streaming:
https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials-video-streaming-from-device-to-youtube
I'm trying to create an app to stream live TV. Currently the problem I'm facing is that after, say, 10 minutes of playing, the video will freeze but the audio will carry on. This is on a 1.3 Mbps stream. I also have lower-bitrate streams, such as a 384 kbps stream, that might last an hour or so but will still do the same. I've tested this with a local video that is high quality (file size is 2.3 GB), and it has no lag and doesn't freeze at all, so it must be something to do with the way HLS is streamed to Android.
Does anyone have any idea on how to solve this problem?
Thanks
I'm creating an app where I need to load OGG audio files into a SoundPool, but it must be compatible with Android 2.1 (which does NOT support setOnLoadCompleteListener). Because of this, there's no way to tell if a sound file is loaded before playing it.
To get around this, I put a Thread.sleep(1000); right after loading the audio file to give it some time to load. But now, without the setOnLoadCompleteListener method, my audio files are extremely static-y and unclear. It sounds horrible. I'm not getting the "sample # not ready!" warning in LogCat, though, so I think the Thread.sleep(1000); is doing its job.
But does Thread.sleep(1000); also stop the loading process? So actually I'm not giving it time to load, and that's why it's static-y? I can't figure this out; I just need my audio to be clearer. Any suggestions would help.
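On the Thread.sleep question: sleeping only pauses the calling thread, not work happening on other threads, and SoundPool decodes samples on its own background thread, so the sleep really is giving the load time to finish (the static is a separate issue). A quick plain-Java demonstration that a background "loader" keeps running while the caller sleeps:

```java
import java.util.concurrent.CountDownLatch;

class SleepDemo {
    // Returns true if the background "loader" thread finished its work
    // while the calling thread was asleep.
    static boolean loaderFinishesDuringSleep() {
        CountDownLatch loaded = new CountDownLatch(1);
        Thread loader = new Thread(() -> {
            // Simulated decode work happening off the calling thread.
            loaded.countDown();
        });
        loader.start();
        try {
            Thread.sleep(200); // caller sleeps; the loader keeps running
        } catch (InterruptedException e) {
            return false;
        }
        return loaded.getCount() == 0; // loader completed during the sleep
    }
}
```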
Thanks!
Solved it myself: I used Audacity to normalize and compress the audio files after learning about it from the answer to another one of my questions: Audio Sound Too Low in Android App
Fixed all the static.