Save parts of streaming videos - Android

I want to create an app that records video from the camera and, while recording, saves pieces of the stream as 30-second segments. When recording stops, I should end up with a set of video files, each 30 seconds long.
Is it possible to do this on Android? If yes, which libraries or methods should I use for this task?
Thank you

Is it possible to do this on Android?
Yes, it is possible.
Which libraries or methods should I use for this task?
1) MediaRecorder - an Android framework API.
2) MediaCodec + MediaMuxer - also framework APIs.
The second option gives you more control, but it is complex to implement; the first is relatively straightforward. With the first option, however, you won't be able to produce clips that are perfectly continuous, because stopping and starting a MediaRecorder isn't instantaneous.
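Here is a minimal sketch of the MediaRecorder approach, using setMaxDuration() together with an OnInfoListener to stop after 30 seconds and immediately roll over to a new file. Camera/preview wiring, permissions and error handling are left out, and the sources, encoders and file names are placeholder assumptions; expect a short gap between clips because of the stop/start cycle.

import android.media.MediaRecorder;
import java.io.File;
import java.io.IOException;

public class SegmentedRecorder {
    private static final int SEGMENT_MS = 30_000; // 30-second clips
    private MediaRecorder recorder;
    private final File outputDir;
    private int segmentIndex = 0;

    public SegmentedRecorder(File outputDir) {
        this.outputDir = outputDir;
    }

    public void startNextSegment() throws IOException {
        recorder = new MediaRecorder();
        // Placeholder sources/encoders - adjust to your camera setup.
        recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile(new File(outputDir, "clip_" + segmentIndex++ + ".mp4").getAbsolutePath());
        recorder.setMaxDuration(SEGMENT_MS);
        recorder.setOnInfoListener((mr, what, extra) -> {
            if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
                stopSegment();
                try {
                    startNextSegment(); // roll over to the next 30-second file
                } catch (IOException ignored) { }
            }
        });
        recorder.prepare();
        recorder.start();
    }

    public void stopSegment() {
        if (recorder != null) {
            recorder.stop();
            recorder.release();
            recorder = null;
        }
    }
}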

ExoPlayer playing currently recording media files

Let me rephrase my question; I wrote it in a hurry.
Current situation:
I have set up a digital video recorder to record broadcasts delivered via DVB-C. It runs on a Raspberry Pi 3B using TVHeadend and jetty/cling to provide UPnP and other ways of accessing the media files. For watching recordings, I wrote an Android player app based on IJKPlayer, which runs on smartphones, FireTV and AndroidTV.
One hassle when playing media files that are currently being recorded is that IJKPlayer does not support timeshifting. That means when I start playing a file that is still being recorded, I can only watch up to the length known to the player at that moment; anything recorded afterwards cannot be played, and I have to exit the player activity and start it again. I resolved that issue by "simulating" a completed recording with a custom servlet implementation. Since the complete length of the recording is already known, I can use ffmpeg to accomplish this.
Future situation:
I plan to move away from IJKPlayer to ExoPlayer, because it supports hardware-accelerated playback and is much faster when playing H.264 media. I could of course use the same solution as above, but as far as I have found out so far, ExoPlayer can handle media files that are currently being recorded by using the Timeline class. However, I can't find useful documentation or a good example for it, so I would appreciate any help with the Timeline object.
Regards
Harry
It looks like my approach won't work; at least, I didn't find a solution. The problem is that the server reports the stream size as it was when the player started, and I found no way to update the media duration for "regular" files.
However, I can solve the problem on the server side. Instead of serving a regular file, I convert the file to m3u8 in real time using ffmpeg. I then hand the m3u8 URI to the player, and it updates the duration of the stream while playing, without any additional code on the client side.
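For reference, a minimal sketch of playing such a live m3u8 stream with ExoPlayer 2, assuming the 2.12+ API where MediaItem exists and the exoplayer-hls module is on the classpath; the URL is a placeholder for the servlet endpoint:

import android.content.Context;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.ui.PlayerView;

public class LiveRecordingPlayer {
    public static SimpleExoPlayer play(Context context, PlayerView playerView) {
        SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
        playerView.setPlayer(player);
        // With the exoplayer-hls module on the classpath, ExoPlayer picks an
        // HLS media source automatically for .m3u8 URIs, and the reported
        // duration keeps growing while the recording is still being written.
        player.setMediaItem(MediaItem.fromUri("http://recorder.local:9981/live/recording.m3u8")); // placeholder URL
        player.prepare();
        player.play();
        return player;
    }
}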

ExoPlayer 2: Play video in reverse

My Android app plays videos with ExoPlayer 2, and now I'd like to play a video backwards.
I searched around a lot and found only the idea of converting it to a GIF, and this approach from WeiChungChang.
Is there a more straightforward solution? Another player or a library that implements this for me is probably too much to ask, but converting the video to a reversed GIF gave me a lot of memory problems, and I don't know what to do with the WeiChungChang idea. Playing only MP4 in reverse would be enough, though.
Videos are frequently encoded such that the encoding of a given frame depends on one or more frames before it, and sometimes also on one or more frames after it.
In other words, to reconstruct a frame correctly you may need to refer to one or more previous and one or more subsequent frames.
This allows a video encoder to reduce file or transmission size by fully encoding only the reference frames, sometimes called I-frames, and storing just the delta to those reference frames for the frames before and/or after them.
Playing a video backwards is not a common player function, and a player would typically have to decode the video as usual (i.e. forwards) to get the frames and then play them in the reverse order.
You could extend ExoPlayer to do this yourself, but it may be easier to manipulate the video on the server side first, if possible - there are tools which will reverse a video so that your players can play it as normal, for example https://www.videoreverser.com, https://www.kapwing.com/tools/reverse-video etc.
If you need to reverse it on the device for your use case, you could use ffmpeg on the device to achieve this - see an example ffmpeg command here:
https://video.stackexchange.com/a/17739
If you are using ffmpeg, it is generally easiest to use it via a wrapper on Android such as this one, which also lets you test the command before you add it to your app (a rough sketch follows below):
https://github.com/WritingMinds/ffmpeg-android-java
Note that video manipulation is time- and processor-hungry, so this may be slow and consume more battery than you want on a mobile device if the video is long.
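As a rough illustration, reversing both video and audio comes down to ffmpeg's reverse and areverse filters. The sketch below only builds the argument list; the wrapper call mentioned in the comment is the ffmpeg-android-java API as I remember it, so treat it as an assumption and check the library's README:

// Builds the ffmpeg arguments for reversing a short clip (video and audio).
// The reverse/areverse filters buffer the whole stream in memory, so this is
// only practical for short clips on a device.
public class ReverseCommand {
    public static String[] build(String inputPath, String outputPath) {
        return new String[] {
                "-i", inputPath,
                "-vf", "reverse",
                "-af", "areverse",
                outputPath
        };
    }
    // With the ffmpeg-android-java wrapper, this array would then be handed to
    // something like FFmpeg.getInstance(context).execute(cmd, handler) -
    // method names from memory, so verify them against the library before relying on them.
}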

Android ExoPlayer: Does it solve the gapless / seamless playback issue that is broken in the Android MediaPlayer?

Has anyone tried using ExoPlayer to achieve this?
I tried looking online with no success.
When I say gapless playback, I am referring to the problem of using the media player to play local videos back to back. After the first video is done playing, there is a noticeable delay of 1 second before the second video starts.
Hoping this question helps in understanding this issue further.
For reference please look at the following question:
Android: MediaPlayer gapless or seamless Video Playing
ExoPlayer 2, which is now officially released, seems to support gapless playback using the ConcatenatingMediaSource class. From its developer guide:
Transitions between sources are seamless. There is no requirement that the sources being concatenated are of the same format (e.g. it’s fine to concatenate a video file containing 480p H264 with one that contains 720p VP9). The sources may even be of different types (e.g. it’s fine to concatenate a video with an audio only stream).
And the example code:
MediaSource firstSource = new ExtractorMediaSource(firstVideoUri, ...);
MediaSource secondSource = new ExtractorMediaSource(secondVideoUri, ...);
// Plays the first video, then the second video.
ConcatenatingMediaSource concatenatedSource =
new ConcatenatingMediaSource(firstSource, secondSource);
EDIT: ExoPlayer 2 supports gapless playback, but as of the time of writing it had not yet been released as a stable version.
You will most likely never be able to achieve perfect gapless playback of multiple tracks with ExoPlayer or Android MediaPlayer. Neither has been written to support starting multiple tracks, and I imagine it will stay out of scope for both of them.
You can achieve gapless playback by using two different player instances: once you have started and played the first, you can load the second and start playback as soon as the first finishes. Using this method you can have a gapless solution, as long as you prepare the second video during playback of the first (a sketch of this approach follows below).
To take it further, you can also use two different surface textures for rendering the videos: once the first video reaches its end, fade out its texture and fade in the new one, resulting in a nice seamless video effect.
Because of the nature of playing multiple videos at once, you will most likely want to create your own timer for tracking the time and deciding when to switch to the next video, rather than relying on the callbacks from ExoPlayer or Android MediaPlayer. This lets you keep track of the time more accurately, without constantly talking to multiple video codecs.
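A minimal sketch of the two-player idea with ExoPlayer 2, assuming the 2.13+ listener and media-item APIs: the second player is prepared while the first one plays, and a listener swaps the output view on STATE_ENDED. The fade between surface textures and the custom timer mentioned above are left out, and the URIs are placeholders.

import android.content.Context;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.Player;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.ui.PlayerView;

public class TwoPlayerGapless {
    private final SimpleExoPlayer first;
    private final SimpleExoPlayer second;
    private final PlayerView view;

    public TwoPlayerGapless(Context context, PlayerView view, String uriA, String uriB) {
        this.view = view;
        first = new SimpleExoPlayer.Builder(context).build();
        second = new SimpleExoPlayer.Builder(context).build();

        // Prepare both up front so the second clip is already buffered
        // when the first one ends.
        first.setMediaItem(MediaItem.fromUri(uriA));
        second.setMediaItem(MediaItem.fromUri(uriB));
        first.prepare();
        second.prepare();

        first.addListener(new Player.Listener() {
            @Override
            public void onPlaybackStateChanged(int state) {
                if (state == Player.STATE_ENDED) {
                    // Swap the output view to the already-prepared player.
                    view.setPlayer(second);
                    second.play();
                    // Release the first player later, once it is no longer needed.
                }
            }
        });
    }

    public void start() {
        view.setPlayer(first);
        first.play();
    }
}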
I know this is not the answer you've been looking for, but it's the only reasonable answer. The sole way to ensure no gaps in playback is to download the entire file first and begin playback when it's done. Otherwise, in the event that you lose connectivity before the file is finished downloading, pausing is inescapable.
I just tried switching to ExoPlayer from the standard MediaPlayer implementation, and the gap is the same, if not worse. However, I have used a very simple method of restarting the player when the playback state changes to ended. I don't know if there's a better, more proper way to do it, perhaps with two different ExoPlayers.

MediaPlayer -- how to separate a narration track?

I'm working on an Android app that plays video (using VideoView). The video is meant to have both music (left and right) and narration, but I want to be able to selectively turn off the narration track in the MediaPlayer.
Is the right way to do this to encode my MP4 video file with three audio tracks (right, left and narration) and then turn off the narration track with deselectTrack()?
It's not clear to me from the documentation whether MediaPlayer can handle more than two audio tracks.
If the audio tracks are limited to two, would it make sense to run two MediaPlayers simultaneously (syncing them up with seekTo()) when I want the narration track to play?
Thanks.
Sorry to burst your bubble, but...
1) You have a misunderstanding about what a "track" denotes. A track can have multiple channels (e.g., a stereo track has left and right channels). As I understand it, stereo is the extent of the Android AudioTrack implementation at present. I haven't yet checked if the OpenSL implementation is more extensive than the Java API.
2) Only 1 audio track can be selected at a time, so you wouldn't be able to have background and narration simultaneously in the way you were thinking.
3) Audio tracks can only be selected in the prepared state (i.e., not after playback has started). The documentation mentions this limitation is not ideal, so it will probably change in the future. If not for this problem, your goal could be accomplished with two audio tracks encoded in the stream, one with both background & narration, the other just background.
You will probably find it difficult to synchronize two MediaPlayers, but I haven't tried. Maybe this approach would be acceptable for your situation, although be forewarned the seekTo method isn't accurate. It depends on the encoding of the files.
Something I would try if I were you is to have two complete encoded videos, one with narration, the other without. Use two MediaPlayers and keep them both prepared. When you want to switch use seekTo to put the correct one at (or near) the desired location. That way you don't have to worry about synchronization. If the video is large, this method could use significantly more resources, though.
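A rough sketch of that last suggestion, assuming two pre-encoded files, one with and one without narration (the paths are placeholders, and attaching the display surface for video is omitted): both MediaPlayers are prepared up front, and switching pauses the active one and seeks the other to the current position. As noted, seekTo lands on the nearest sync frame, so the switch won't be frame-accurate.

import android.media.MediaPlayer;
import java.io.IOException;

public class NarrationSwitcher {
    private final MediaPlayer withNarration = new MediaPlayer();
    private final MediaPlayer withoutNarration = new MediaPlayer();
    private MediaPlayer active;

    public NarrationSwitcher(String narratedPath, String musicOnlyPath) throws IOException {
        withNarration.setDataSource(narratedPath);       // placeholder path
        withoutNarration.setDataSource(musicOnlyPath);   // placeholder path
        withNarration.prepare();
        withoutNarration.prepare();
        active = withoutNarration;
    }

    public void start() {
        active.start();
    }

    // Switch between the narrated and music-only versions at (roughly) the
    // current position.
    public void toggleNarration() {
        MediaPlayer next = (active == withNarration) ? withoutNarration : withNarration;
        int position = active.getCurrentPosition();
        active.pause();
        next.seekTo(position);
        next.start();
        active = next;
    }
}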

Best Way to combine Audio files in Android

I am developing a recording app that includes a pause/play option.
I tried both MediaRecorder and AudioRecord.
In the case of AudioRecord, the recorded audio takes up a lot of space: if I record 1 minute of audio, it consumes 40 to 50 MB, and it is really painful to combine the pieces by converting them to a .raw file and sending them to a PHP server.
So I tried MediaRecorder; it produces smaller files, but I am not able to combine them in the way I handled it with AudioRecord.
Next I tried the Android NDK - even the setup process is really painful.
Now my question is: what is the best way to combine recorded audio files?
1) Using the Android NDK.
2) Reading the byte data from the audio and combining it - if I use this, there is a problem with the headers of the recording formats such as AMR and WAV.
Also, if I try this, I can't get the javax.sound package on Android, so I tried plugins, but with no luck.
Please suggest the best way to do this. I have also tried all of the following links:
Audio Link 1
Audio Link 2
Audio Link 3
Audio Link 4
Please provide a good tutorial, samples or links. Thanks.
For something like this your best bet would be to develop native C++ code using the NDK.
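That said, if the segments are plain PCM WAV files (which is what AudioRecord output becomes once a header is added), the combining can also be done in plain Java without the NDK; this is an alternative to the NDK route suggested above, not the answer's own method. A hedged sketch assuming canonical 44-byte headers and identical sample rate/format across all segments (compressed formats such as AMR or AAC/MP4 cannot be joined this way):

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class WavConcat {
    private static final int HEADER_SIZE = 44; // canonical PCM WAV header

    public static void concat(File[] segments, File output) throws IOException {
        try (FileOutputStream out = new FileOutputStream(output)) {
            for (int i = 0; i < segments.length; i++) {
                try (FileInputStream in = new FileInputStream(segments[i])) {
                    // Keep the header of the first segment, skip it for the rest.
                    if (i > 0) {
                        long skipped = in.skip(HEADER_SIZE);
                        if (skipped != HEADER_SIZE) throw new IOException("Truncated WAV: " + segments[i]);
                    }
                    byte[] buffer = new byte[8192];
                    int read;
                    while ((read = in.read(buffer)) != -1) {
                        out.write(buffer, 0, read);
                    }
                }
            }
        }
        // Patch the RIFF chunk size (offset 4) and the data chunk size (offset 40)
        // so the combined file reports the full duration.
        try (RandomAccessFile raf = new RandomAccessFile(output, "rw")) {
            long total = raf.length();
            raf.seek(4);
            raf.write(intLE((int) (total - 8)));
            raf.seek(40);
            raf.write(intLE((int) (total - HEADER_SIZE)));
        }
    }

    private static byte[] intLE(int value) {
        return ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN).putInt(value).array();
    }
}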
