I hope this message finds you well. We are writing to ask for help with the Android 10 API and its new functionality. We are trying to build an application that captures only the important parts of videos and audio without saving the whole thing.
To do that, our first objective is to capture playback audio on Android 10. We have tried the following two methods:
1st Method:
We tried to capture audio from YouTube using the playback capture API introduced in Android 10, but the resulting audio file contained only silence. The documentation says that when one wants to capture the audio of a third-party application, one can set allowAudioPlaybackCapture="true" in the manifest. We have already used this in the manifest, but it did not work.
We tried this method on YouTube first. It is very likely that YouTube opts out of audio playback capture, so we then tried a local audio file. The result was the same as before.
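For reference, here is a minimal sketch of the playback-capture setup we are using (trimmed for brevity; it assumes RECORD_AUDIO has been granted and mediaProjection was already obtained through the usual MediaProjectionManager consent dialog):

```java
// Sketch only -- requires API 29+, the RECORD_AUDIO permission, and a
// MediaProjection obtained from the MediaProjectionManager consent flow.
AudioPlaybackCaptureConfiguration config =
        new AudioPlaybackCaptureConfiguration.Builder(mediaProjection)
                .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
                .build();

AudioFormat format = new AudioFormat.Builder()
        .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
        .setSampleRate(44100)
        .setChannelMask(AudioFormat.CHANNEL_IN_STEREO)
        .build();

AudioRecord record = new AudioRecord.Builder()
        .setAudioFormat(format)
        .setAudioPlaybackCaptureConfig(config)
        .build();

record.startRecording();
// PCM frames are then pulled with record.read(...) on a worker thread.
```

Note that an app whose audio should be captured must not opt out: capture only works for apps with allowAudioPlaybackCapture enabled (the default for apps targeting API 29+) and for the matched usages.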
2nd Method:
We tried to record internal audio using MediaProjection-based capture, which is allowed from Android 5.0 to Android 10 (it works fine). The problem is that it captures internal audio along with external audio, i.e. microphone input. When we tried to mute the external audio capture, it muted the internal capture as well.
Please see the code block below to have a better idea:
https://pastebin.pl/view/b2f4ec78
We would be grateful if you could give us some pointers. We have gone through every piece of documentation available, but we could not find a solution.
Let me rephrase my question; I wrote it in a hurry.
Current situation:
I have set up a digital video recorder to record broadcasts provided via DVB-C. It runs on a Raspberry Pi 3B using TVHeadend and Jetty/Cling to provide UPnP and other ways to access the media files. For watching recordings, I wrote an Android player app using IJKPlayer, which runs on smartphones, Fire TV and Android TV.
One hassle when playing media files that are currently being recorded is that IJKPlayer does not support timeshifting. That means when I start playing a file that is still recording, I can only watch up to the length known to the player at that moment; anything recorded afterwards cannot be played, and I need to exit the player activity and start it again. I have worked around that issue by "simulating" a completed recording using a custom servlet implementation. Since the complete length of the recording is already known, I can use ffmpeg to accomplish this.
Future situation:
I plan to move away from IJKPlayer to ExoPlayer, because it supports hardware-accelerated playback and is much faster when playing H.264 media. I could of course use the same solution as above, but as far as I have found out so far, ExoPlayer can support media files that are currently being recorded by using the Timeline class. However, I can find neither useful documentation nor a good example, so I would appreciate any help with the Timeline object.
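To frame what I am after: from what I have read, recent ExoPlayer versions report a growing recording through Timeline.Window, where isDynamic stays true while the duration is still being updated. A rough, untested sketch of what I imagine the listener side looks like (method names per recent ExoPlayer releases; older versions use getCurrentWindowIndex instead of getCurrentMediaItemIndex):

```java
player.addListener(new Player.Listener() {
    @Override
    public void onTimelineChanged(Timeline timeline, int reason) {
        if (timeline.isEmpty()) {
            return;
        }
        Timeline.Window window = timeline.getWindow(
                player.getCurrentMediaItemIndex(), new Timeline.Window());
        // window.isDynamic is true while the media is still growing
        // (e.g. a live stream); the duration keeps being updated.
        Log.d("Timeline", "dynamic=" + window.isDynamic
                + " durationMs=" + window.getDurationMs());
    }
});
```

What I cannot find is how to make ExoPlayer treat a plain, still-growing file as such a dynamic window in the first place.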
Regards
Harry
Looks like my approach won't work; at least, I didn't find a solution. The problem is that the server returns the stream size as it was at player start time, and I did not find a method to update the media duration for "regular" files.
However, I can solve the problem by changing the server side. Instead of serving a regular file, I convert the file to an m3u8 (HLS) playlist in real time using ffmpeg. I then hand the m3u8 URI to the player, and it updates the duration of the stream while playing, without the need for any additional code on the client side.
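For completeness, the server-side conversion is essentially a single ffmpeg invocation along these lines (paths and segment settings are illustrative, not my exact setup):

```shell
# Remux a still-growing recording into a live HLS playlist.
# -c copy avoids re-encoding; -hls_list_size 0 keeps every segment in
# the playlist, so the player sees the full (growing) duration.
ffmpeg -i /recordings/current.ts \
       -c copy \
       -f hls \
       -hls_time 4 \
       -hls_list_size 0 \
       -hls_flags append_list \
       /www/stream/current.m3u8
```

As long as ffmpeg keeps appending segments, the player refreshes the playlist and extends the seekable range on its own.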
I want to capture internal audio programmatically. For instance, I want to play an audio file on my device and then capture the audio output, rather than using the microphone with the volume turned up.
Example use case: I want to make a library that other developers can use in their games, allowing users to record in-game audio.
Android says there is Playback Capture functionality; however, I just cannot find an example. I've googled for hours. Whenever I search for 'android capture app sounds' or 'android record audio', I either get links to voice-recording apps or code that records from the microphone (usually nearly ten years old), e.g.
Does anyone have a link to a working example?
You can refer to MediaRecorder: https://developer.android.com/guide/topics/media/mediarecorder
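The basic recording flow from that guide looks roughly like this (the output path is illustrative). Note that this records from the microphone; for capturing another app's audio output you still need the Android 10 playback-capture API the question mentions:

```java
// Minimal MediaRecorder sketch: records microphone audio to an AAC/MP4 file.
// Requires the RECORD_AUDIO permission.
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setOutputFile(outputPath); // e.g. getExternalFilesDir(null) + "/capture.m4a"
recorder.prepare();
recorder.start();

// ...later, when recording should end:
recorder.stop();
recorder.release();
```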
I want to implement screencast feature in my Android app, i.e. Recording the audio and the screen video at the same time and converting it to a .mp4 file.
I learned that there are some media codecs built into the Android SDK, but the issue with them is that I would need to record the video and the audio separately and then stitch the two together to create the complete video.
I want to know if a library exists which can directly record both the Audio and Video without the need to stitch them later on.
Thanks in advance!
This project can help you get started here.
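As an alternative to a third-party library: on Android 5.0+, MediaRecorder itself can mux screen video and microphone audio into a single .mp4 by feeding its input surface to a MediaProjection virtual display, so no stitching is needed. A rough sketch (permission handling omitted; mediaProjection, densityDpi and outputPath are assumed to come from your app):

```java
// Record the screen plus microphone audio into one .mp4 file.
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setVideoSize(1280, 720);
recorder.setVideoFrameRate(30);
recorder.setVideoEncodingBitRate(5_000_000);
recorder.setOutputFile(outputPath);
recorder.prepare();

// Route the screen content into the recorder's input surface.
VirtualDisplay display = mediaProjection.createVirtualDisplay(
        "screencast", 1280, 720, densityDpi,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
        recorder.getSurface(), null, null);

recorder.start();
// ...when done: recorder.stop(); recorder.release(); display.release();
```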
Hello guys, I have a question.
I have to admit, before I ask it, that I have never used the Android SDK before, but I have coded in Java for a couple of years.
I have an FM radio app. It's an internet radio app, and I want to record its output. Is it possible to use an external app to record some other app's output? And if yes: the app also has some pre-recorded shows which you can listen to within the app. They do not get saved to my device when I listen to them; is it possible to download those shows, e.g. by finding the source of the audio and downloading it using my external app?
I'm pretty sure that the recorded shows are downloaded from the internet. I know of audio grabbers as browser extensions on PC, so I'm asking whether such a thing is possible on Android as well.
See below:
https://stackoverflow.com/a/25741006/850347
It seems that there is currently no way to achieve this. I have read this article, and it suggests recompiling the Android source code with some changes.
Or, you can use the Visualizer.
https://stackoverflow.com/a/25816052/850347
The closest API available to you for this purpose is Visualizer, which only captures "partial and low quality audio content".
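A bare-bones Visualizer setup looks like this (audio session 0 attaches to the global output mix, which requires the RECORD_AUDIO permission):

```java
// Capture a low-quality waveform of the device's output mix.
Visualizer visualizer = new Visualizer(0); // session 0 = global output mix
visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
    @Override
    public void onWaveFormDataCapture(Visualizer v, byte[] waveform,
                                      int samplingRate) {
        // 8-bit unsigned mono samples -- the "partial and low quality
        // audio content" mentioned above, not recording-grade PCM.
    }

    @Override
    public void onFftDataCapture(Visualizer v, byte[] fft,
                                 int samplingRate) {
        // FFT magnitudes, if requested below.
    }
}, Visualizer.getMaxCaptureRate() / 2, true, false);
visualizer.setEnabled(true);
```

This is suitable for level meters and visualizations, not for producing a listenable recording.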
Whereas camera2 (available on Android 5.0 and up) allows me to grab the raw data stream from the camera, older devices only allow me to write the encoded stream to a file.
I ran Skype on some older devices (e.g. SDK 10) and was able to make a video call, which means that Skype must somehow be able to grab the unencoded stream before it is encoded and written to a file.
I found some interesting articles on the web:
http://www.hnwatcher.com/r/1170899/Video-recording-and-processing-in-Android/
http://code.google.com/p/ipcamera-for-android/
https://github.com/NanoHttpd/nanohttpd/
but I don't see how this would work reliably and across all devices.
I was able to read the encoded file while Android was still writing the video, but the problem here is that Android writes the MOOV box at the end of the file, and only once recording has stopped. So the information in the MDAT box is useless before the file is closed.
Does anybody know of a library that I could use to grab the data stream from the camera and immediately use it as a live stream? Has anybody tried to find out how Skype does this technically?
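One approach I have seen in projects like the linked ipcamera-for-android is to point MediaRecorder at a pipe instead of a file, so the encoded bytes can be read live rather than waiting for the file to be closed. A rough, untested sketch (the container caveat still applies, since MP4-style formats keep the MOOV box at the end):

```java
// Let MediaRecorder write into a pipe so the encoded stream can be
// read live on another thread instead of from a finished file.
ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
ParcelFileDescriptor readSide = pipe[0];
ParcelFileDescriptor writeSide = pipe[1];

MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
// MPEG_4/3GP place the MOOV box at the end, so a streaming consumer
// has to parse around it or re-packetize the elementary stream.
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setOutputFile(writeSide.getFileDescriptor());
recorder.prepare();
recorder.start();

// Read encoded data as it is produced and hand it to the network layer.
InputStream live = new ParcelFileDescriptor.AutoCloseInputStream(readSide);
```

Whether this behaves consistently across manufacturers is exactly the part I am unsure about.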