I have been trying to capture the screen of the Android Emulator and record it to an .mp4 file. I adopted the standard approach of creating a virtual display and routing its frames to an encoder, multiplexing the video track and writing it to external storage. However, the output .mp4 file is just a blank screen when played back. The same code works when run on a real device.
One observation is that BufferInfo.size from onOutputBufferAvailable() always has a constant value of 13 or 2718, which suggests trouble with the MediaCodec encoder. Should I change some parameters when configuring the encoder?
Another observation, from Logcat, is that a SoftAVCEncoder is used when running in the emulator, which indicates software encoding is taking place, but I am still not sure why that would fail.
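For reference, here is roughly how I configure the encoder and virtual display (a minimal sketch; the resolution, bitrate and dpi are placeholder values, and mediaProjection comes from the standard MediaProjectionManager consent flow):

    MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

    MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    Surface inputSurface = encoder.createInputSurface();

    // The virtual display renders the screen into the encoder's input surface.
    VirtualDisplay display = mediaProjection.createVirtualDisplay("capture",
            1280, 720, 320, DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
            inputSurface, null, null);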
We are writing to ask for some help with the Android 10 API and its new functionality. We are trying to build an application that captures only the important parts of videos and audio without saving the whole thing.
To do that, our first objective is to capture playback audio on Android 10. We have tried the following two methods:
1st Method:
We tried to capture audio from YouTube using the AudioPlaybackCapture API introduced in Android 10, but the resulting audio file contained only silence. The documentation says that capturing audio playback from a third-party application involves the android:allowAudioPlaybackCapture="true" manifest attribute. We have already set this in our manifest, but it did not work.
We tried this method on YouTube first. It is very likely that YouTube prevents its audio playback from being captured, so we tried a local audio file as well. The result was the same as before.
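For reference, our capture setup looks roughly like this (a sketch; mediaProjection is assumed to come from an active MediaProjection obtained through the usual consent dialog):

    // Requires API 29+; playback capture only works with a MediaProjection.
    AudioPlaybackCaptureConfiguration config =
            new AudioPlaybackCaptureConfiguration.Builder(mediaProjection)
                    .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
                    .build();

    AudioFormat audioFormat = new AudioFormat.Builder()
            .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
            .setSampleRate(44100)
            .setChannelMask(AudioFormat.CHANNEL_IN_STEREO)
            .build();

    AudioRecord record = new AudioRecord.Builder()
            .setAudioFormat(audioFormat)
            .setAudioPlaybackCaptureConfig(config)
            .build();
    record.startRecording(); // read PCM from 'record' and write it to a file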
2nd Method:
We tried to record internal audio using screen capture with MediaProjection, which is allowed from Android 5.0 to Android 10 (this part works fine). The problem is that it captures internal audio along with external audio, i.e. microphone data. When we tried to mute the external audio capture, it muted the internal capture as well.
Please see the code block below to have a better idea:
https://pastebin.pl/view/b2f4ec78
We would be grateful if you could give us some pointers. We have gone through all the documentation available, but we could not find a solution.
I am creating a video in an Android app, and the resulting .mp4 file does not play back properly: the audio plays over a still frame from the video, and only when the playback timer reaches the end does the video play several moving frames.
This issue is only occurring when I create the video on a Samsung Galaxy S7 and not on any other phones.
I am not experienced in video file encoding, so I do not even know where to start debugging what is wrong with the file. If someone could explain what causes something like this, that would be amazing.
The first video sample decode time in your file is 1506981408/90000 seconds - which is giant, about 4.65 hours into the stream.
So the entry is obviously bogus.
It's hard to say where the bogus decode time is coming from - maybe uninitialized memory of some sort.
See the 'stts' box at offset 1052223 - the first array entry.
I corrected your video and put a copy here: https://drive.google.com/open?id=0B1K1m-YmE28DMXdFemZKbXg0WFk
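If you want to inspect this yourself: the 'stts' box is just a table of (sample count, sample delta) pairs in the track timescale (90000 ticks per second here), and decode times are the running sum of the deltas. A rough sketch for dumping the first entries, assuming the offset above points at the start of the box header:

    import java.io.RandomAccessFile;

    try (RandomAccessFile f = new RandomAccessFile("video.mp4", "r")) {
        long boxOffset = 1052223;   // 'stts' box offset from the post above
        f.seek(boxOffset + 8 + 4);  // skip size+type (8 bytes) and version+flags (4 bytes)
        int entryCount = f.readInt();
        for (int i = 0; i < Math.min(entryCount, 5); i++) {
            int sampleCount = f.readInt();
            int sampleDelta = f.readInt(); // in timescale ticks; 90000 = 1 second
            System.out.println(sampleCount + " sample(s) with delta " + sampleDelta);
        }
    }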
When I watch videos on my mobile phone, I have never noticed any image artefacts. Most of the time I was connected to Wi-Fi and the streaming protocol was HLS. On errors, the video just stopped, crashed, or showed a loading indicator.
That's why I would like to ask: with which kinds of faults can you see image artefacts? Do the streaming protocol or the device detect image artefacts in error cases? And which image artefacts have you seen?
Thanks!
Image artefacts are caused by errors in the video stream. The errors are the result of a bad input stream, errors made by the encoder, errors during transfer and so on. Some errors are fatal; from others the decoder can recover - this depends on the decoder itself. IMHO, Android's video capabilities are very poor.
I've been exploring the documentation and examples at http://bigflake.com/mediacodec/ by Fadden, and applied the patch http://bigflake.com/mediacodec/0001-Record-game-into-.mp4.patch to the breakout game. Unfortunately, after compiling the code, I realized it doesn't work: it produces video files that aren't streamable.
I see the following error:
"The mp4 file will not be streamable."
According to Fadden, this should be fixed by checking mBufferInfo.flags (https://stackoverflow.com/questions/23934087/non-streamable-video-file-created-with-mediamuxer), which his code already does, so I'm at a complete loss. Did anyone else get the video recording patch to work?
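For context, the relevant part of the drain loop looks roughly like this (paraphrased from the patched code; variable names are representative):

    // 'encoderStatus' is the index returned by dequeueOutputBuffer(), and
    // 'outputBuffers' comes from mEncoder.getOutputBuffers().
    if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
        // The codec config data (SPS/PPS) reaches MediaMuxer through the
        // INFO_OUTPUT_FORMAT_CHANGED format, so don't write it as a sample.
        mBufferInfo.size = 0;
    }
    if (mBufferInfo.size != 0) {
        ByteBuffer encodedData = outputBuffers[encoderStatus];
        encodedData.position(mBufferInfo.offset);
        encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
        mMuxer.writeSampleData(mTrackIndex, encodedData, mBufferInfo);
    }
    mEncoder.releaseOutputBuffer(encoderStatus, false);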
The warning you're seeing is just a warning, nothing more. MP4 files aren't streamable anyway in most cases, in the sense that you could pass the MP4 over a pipe as it is being written and have the other end play it back (unless you resort to a lot of extra trickery, or use fragmented MP4, which the Android MP4 muxer doesn't normally write). What streamable means here is that once you have the final MP4 file, you can start playing it back without having to seek to the end of the file first (which playback over HTTP can do, e.g. with HTTP byte range requests).
To write a streamable MP4, the muxer tries to guess how large your file will be and reserves a correspondingly large area at the start of the file to write the file index into. If the file turns out to be larger, so that the index doesn't fit into the reserved area, the index has to be written at the end of the file instead. See lines 506-519 in https://android.googlesource.com/platform/frameworks/av/+/lollipop-release/media/libstagefright/MPEG4Writer.cpp for more about this guess. Basically the guess boils down to: "The default MAX_MOOV_BOX_SIZE value is based on about 3 minute video recording with a bit rate about 3 Mbps, because statistics also show that most of the video captured are going to be less than 3 minutes."
If you want to turn such a non-streamable MP4 file into a streamable one, you can use the qt-faststart tool from libav/ffmpeg, which just reorders the blocks in the file.
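For example, either of these rewrites an existing recording without re-encoding:

    qt-faststart input.mp4 output.mp4
    ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4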
You can check Intel INDE Media for Mobile; it lets you capture gameplay and stream it to the network:
https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials
Simplest capturing:
https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials-video-capturing-for-opengl-applications
YouTube streaming:
https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials-video-streaming-from-device-to-youtube
I have two Samsung Galaxy S2 phones, same provider (H3G Italy), same firmware (2.3.5), bought together in the same shop.
I am developing a call recording app using MediaRecorder with VOICE_CALL as the audio source; this works well on one device and not on the second.
While debugging there are no errors in the log; the app just gets stuck.
MediaRecorder.start() is called on a PhoneStateListener status change. I have tried every audio format available in MediaRecorder, but with no success.
If I use VOICE_UPLINK in MediaRecorder.setAudioSource() instead, the app no longer freezes, but the audio quality is too low.
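For reference, the recording setup is essentially this (a sketch; the output path and the formats are the parts I varied during testing):

    MediaRecorder recorder = new MediaRecorder();
    recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_CALL);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    recorder.setOutputFile("/sdcard/call.3gp");
    recorder.prepare();
    recorder.start(); // called from the PhoneStateListener callback; this is where it gets stuck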
How is it possible that with two identical phones, the same code and the same development PC, one works fantastically and the other does not?
You will get the original .wav file by using AudioRecord, which gives you raw PCM data.
Then you need to encode it into whatever other audio format you want.
For example, this project provides a method that uses the LAME libs to encode the .wav file to MP3.
By setting the output sampling rate and bit rate, you can get an MP3 file of the size you want.
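A minimal sketch of the recording side, assuming the RECORD_AUDIO permission is granted (writing the 44-byte WAV header and the LAME encoding step are separate):

    int sampleRate = 44100;
    int minBuf = AudioRecord.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
            sampleRate, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT, minBuf);

    recorder.startRecording();
    byte[] buffer = new byte[minBuf];
    int read = recorder.read(buffer, 0, buffer.length);
    // 'buffer' now holds raw 16-bit PCM; write it out in a loop and prepend
    // a WAV header to get a playable .wav file.
    recorder.stop();
    recorder.release();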