I'd like to implement a persistent cache for streamed audio data in my app.
I've scoured the internet and looked at the few existing solutions; most of them require you to create a local proxy that writes the data to the cache while also serving it to Android's built-in MediaPlayer.
I finally came across Google's ExoPlayer, which appears to do exactly what I want! I believe that in order to create the cache I need to use CacheDataSource; however, I can't figure out how to use it!
I've been through the Google documentation and demo app but they don't provide much info about caching at all.
Could anybody help me out and provide an example?
Never used this, but the following tutorial on how to construct an audio renderer seems pretty straightforward to me: http://google.github.io/ExoPlayer/guide.html#datasource
May I ask why you want to implement persistent caching with a media player? When I made use of Google's MediaPlayer framework it seemed pretty useful to me; I never had a problem with streaming as long as I had a data connection.
I had also been searching endlessly for a solution for caching audio for the purpose of offline playback availability.
I finally found this library:
https://github.com/danikula/AndroidVideoCache
Although it is called AndroidVideoCache, it works for caching audio as well.
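For the CacheDataSource question itself, here is a minimal sketch of how the pieces might be wired together. This assumes an ExoPlayer 2.x-era API; class names such as CacheDataSourceFactory, DefaultHttpDataSourceFactory, and ExtractorMediaSource have moved or been renamed between releases, so check the classes in your version. It is untested Android framework code, not a drop-in implementation:

```java
import android.content.Context;
import android.net.Uri;
import com.google.android.exoplayer2.source.ExtractorMediaSource;
import com.google.android.exoplayer2.source.MediaSource;
import com.google.android.exoplayer2.upstream.DataSource;
import com.google.android.exoplayer2.upstream.DefaultHttpDataSourceFactory;
import com.google.android.exoplayer2.upstream.cache.CacheDataSourceFactory;
import com.google.android.exoplayer2.upstream.cache.LeastRecentlyUsedCacheEvictor;
import com.google.android.exoplayer2.upstream.cache.SimpleCache;
import java.io.File;

public final class CachedAudio {
    // SimpleCache must be a singleton per cache directory, so keep one instance.
    private static SimpleCache cache;

    static MediaSource buildSource(Context context, Uri uri) {
        if (cache == null) {
            // Persist up to ~100 MB of streamed audio under the app's cache dir.
            cache = new SimpleCache(new File(context.getCacheDir(), "audio"),
                    new LeastRecentlyUsedCacheEvictor(100 * 1024 * 1024));
        }
        DataSource.Factory upstream = new DefaultHttpDataSourceFactory("my-app");
        // CacheDataSource reads from the cache and falls back to (and fills
        // from) the upstream HTTP source on a cache miss.
        DataSource.Factory cached = new CacheDataSourceFactory(cache, upstream);
        return new ExtractorMediaSource.Factory(cached).createMediaSource(uri);
    }
}
```

You would then pass the result to the player, e.g. `player.prepare(CachedAudio.buildSource(context, uri))`.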
Related
I'm trying to find a way to separate and stream the audio from a YouTube video, given its URL/video ID. I've looked into YouTube extractors, and most of them are pretty old or no longer maintained. I'm working in Kotlin and Jetpack Compose. I found an app, NewPipe, that does exactly what I want, but the codebase is huge and it's way beyond my experience. I'd like to play the audio in the background as well, so I suppose I would use ExoPlayer in combination with a background service. My only obstacle is obtaining the audio from its URL. Is there a quick and dirty way to achieve this within the context of an Android app?
I'm very, very new to Android/Java and am trying to create a basic video player on Android.
I have researched all throughout Stack Overflow, Google, YouTube, etc., but I really can't seem to find a way to do this.
My goal is to have a ListView that shows all the compatible videos on the Android device. Then, after tapping one of the items in the ListView, the video would play.
I did find some very helpful resources, such as http://www.android-trainer.com/playing-with-videos-from-content-providers-part-3-.html , but this code uses the deprecated method managedQuery().
I was then led to this website: http://mobile.tutsplus.com/tutorials/android/android-sdk_loading-data_cursorloader/comment-page-1/#comment-15832 to find a way around the managedQuery() method, but judging from the comments, this tutorial is riddled with bugs... I tried to debug the tutorial, but I was hoping someone could give me a clear explanation of what to do.
Should I focus on the first link and figure out how to work around the deprecated managedQuery() method? If so, how do I go about avoiding it?
Or is there another, perhaps simpler, way of creating a video player that can play all the videos on the device?
Thank you
Where do you want the list to come from? Are you ok with using the default video player?
Where are the videos stored?
managedQuery() (and CursorLoader) are for traversing through a database. You don't need that.
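If you do want to build your own list rather than rely on a file picker, a plain ContentResolver query over MediaStore avoids the deprecated managedQuery(). This is a sketch using only standard Android SDK classes; it runs on-device and is not verified here:

```java
import android.app.Activity;
import android.content.ContentUris;
import android.content.Intent;
import android.database.Cursor;
import android.net.Uri;
import android.provider.MediaStore;

public class VideoPicker {
    // Returns a cursor over (id, title) for every video MediaStore knows about;
    // bind this to the ListView with a CursorAdapter.
    static Cursor queryVideos(Activity activity) {
        return activity.getContentResolver().query(
                MediaStore.Video.Media.EXTERNAL_CONTENT_URI,
                new String[] { MediaStore.Video.Media._ID,
                               MediaStore.Video.Media.TITLE },
                null, null,
                MediaStore.Video.Media.TITLE + " ASC");
    }

    // On a list-item tap, hand the chosen video to the default player.
    static void play(Activity activity, long videoId) {
        Uri uri = ContentUris.withAppendedId(
                MediaStore.Video.Media.EXTERNAL_CONTENT_URI, videoId);
        activity.startActivity(new Intent(Intent.ACTION_VIEW)
                .setDataAndType(uri, "video/*"));
    }
}
```

Using ACTION_VIEW delegates playback to whatever player is installed, which answers the "are you ok with the default video player" question above.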
I am trying to program an Android app that will be able to open the webcam and upload the recording live to another server.
Right now I have only found solutions where Android provides the stream on its own port instead of sending it. So to clarify: I would like to send (upload) the data to the server.
I don't want to use a closed source program, but rather program it myself. I have some medium android programming knowledge, but the theoretical knowledge about how to accomplish this is missing.
Could anybody please point me in the right direction?
Is this even possible?
Regards
Edit:
Maybe some sort of RTP/RTSP setup would be possible. I do not care about compatibility across Android versions, so everything in that direction is welcome too.
Edit2:
Sorry to have been so unclear in the first place. I do have to implement it myself, but I can use existing code. What I cannot do is use existing closed-source implementations.
Using MediaRecorder, you can capture video to a file. Here's a post about it:
Android: Does anyone know how to capture video?
To "stream" it to a server, you could record a never-ending series of short videos, say 10 seconds each, and upload the chunks to the server. If you wanted to get fancy, you could have the server stitch them together.
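The server-side stitching step can be sketched in plain Java as below. One caveat worth flagging: byte-level concatenation only yields a playable result for stream-oriented containers such as MPEG-TS; MP4 chunks each carry their own headers and would need remuxing with a tool like ffmpeg or mp4parser instead.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.List;

public class ChunkStitcher {
    // Concatenate uploaded chunk files, in order, into one output file.
    // Valid only for containers that tolerate simple concatenation (e.g.
    // MPEG-TS); MP4 would need real remuxing on the server.
    public static void stitch(List<Path> chunks, Path out) throws IOException {
        try (OutputStream os = Files.newOutputStream(out)) {
            for (Path chunk : chunks) {
                Files.copy(chunk, os); // append this chunk's bytes
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Demo with two tiny placeholder "chunks".
        Path dir = Files.createTempDirectory("chunks");
        Path a = Files.write(dir.resolve("chunk0"), "part-one;".getBytes());
        Path b = Files.write(dir.resolve("chunk1"), "part-two".getBytes());
        Path out = dir.resolve("stitched");
        stitch(Arrays.asList(a, b), out);
        System.out.println(new String(Files.readAllBytes(out)));
        // prints "part-one;part-two"
    }
}
```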
Install Bambuser. Ask them what intents are available to launch it. Done.
If you really need the video stored on your own server, maybe you could make some sort of arrangement with Bambuser.
I am trying to develop an app/widget for which I need display the currently playing information (metadata) of an audio track.
This would be trivial if I were also writing the media player myself, as I could simply access the MediaStore and bring up the info; however, I do not wish to compete with the plethora of existing apps on this front. I want to be able to pull this information from the built-in audio player or another app such as Songbird or PowerAMP.
I should be able to do this with PowerAMP using their [API][1], but I really want a solution that works for the stock Android player and others too.
I was hoping to be able to grab the information from the AudioManager, but that seems only to allow me to query the current state (music is playing, etc.), and I can send my intent to play music and so on... but there is no access to metadata from someone else's app.
So my thought is this cannot be done easily. Maybe I could access this info from the notification bar at the top, as the now-playing info is printed up there. It might be an ugly hack though...
For a moment I got excited about the RemoteControlClient.MetadataEditor from 4.0, but then I figured out that it was for writing that information to a stream that can be sent to the physical remote, rather than allowing you to create a software remote. Damn!
Does anyone have any ideas?
[1]: http://forum.powerampapp.com/index.php?/topic/1034-updated-for-20-poweramp-api-lib-and-sample-applications/ "PowerAMP API"
I've written a guide for implementing this.
Basically, you need access to the hidden classes of the android.jar library. Then you have to extend the IRemoteControlDisplay$Stub class and implement its methods.
After that, you register your RemoteControlDisplay with the hidden method AudioManager#registerRemoteControlDisplay.
There is just way too much to explain in one answer, so read my guide on XDA-Developers.
Here is the link:
http://forum.xda-developers.com/showpost.php?p=44513199
Also, I'm currently working on a library which will simplify the process of implementing your own remote media controls.
I should be able to do this with PowerAMP using their [API][1], but I really want a solution that works for the stock Android player and others too.
There is no documented and supported API for the AOSP Music app or the Google Play Music app, AFAIK. They certainly are not in the Android SDK.
I am not aware of an Android ecosystem standard for media players exposing this information, let alone a roster of apps that support such a standard. You are welcome to work with the developers of such apps and encourage them to create and adopt a standard.
My thoughts are that I could maybe access this info from the info bar at the top as the now playing info is printed up there.
It is not possible to spy on other applications' Notifications, for obvious privacy and security reasons.
For a moment I got excited about the RemoteControlClient.MetadataEditor from 4.0, but then I figured out that it was for writing that information to a stream that can be sent to the physical remote, rather than allowing you to create a software remote. Damn!
Surely there's a way to access the Remote Control Client metadata on Android 4.0, because the lock screen is able to access it when media is playing.
I'm not a developer at all, but I've tried to do a bit of poking around in the AOKP sources and this is my limited understanding of how it works. At least in AOKP (and presumably AOSP as well, then), it appears that the lockscreen uses core/java/com/android/internal/widget/TransportControlView.java to draw the music control widget on the lockscreen, which in turn uses media/java/android/media/IRemoteControlDisplay.aidl for data retrieval. At the very least, it may be useful to poke around in TransportControlView.java to see if you can figure out how the lockscreen widget works.
When I play certain MP3 files (such as lessons from JapanesePod101.com) on my iPod Touch, lyrics or transcripts that are embedded in the MP3 files are displayed in the media player. The lyrics are, I believe, stored as ID3/ID4 tags in the MP3 metadata.
I find this to be an extremely useful feature, and I believe I'm not alone. Despite that, neither the stock Android media player nor any other media player I've downloaded from the Market seems to support this. I just have not been able to find any way to get this feature on my Nexus One.
This feature is important enough to me that I'm considering learning Android development just so I can write a simple media player that displays embedded lyrics or notes. However, the fact that nobody else seems to have done this makes me wonder - is it even possible? Is there something in the Android architecture or APIs that make it difficult or impossible to read and display lyrics information from MP3 files? I'd hate to get deep into the learning process and find out what I'm aiming for can't easily be done. (I mean, if all else fails I assume I could write my own MP3-decoder, but that's more trouble than I'm willing to go through right now).
I've already asked this question on the Android Enthusiasts Stack Exchange Beta Site, but in retrospect I decided it was more of a programming question and decided it was better to ask here.
Yeah, definitely more of a programming question. Just from my brief experience of reading through the ID3 spec, I think it's probably just that decoding ID3 tags is a complete PITA. I'm sure it can be done, as there are MP3 tag editing apps available for Android (whether any support lyrics or not, I do not know).
ID3v2.3 seems to have support for both synchronized and unsynchronized lyrics through the SYLT and USLT frames. I imagine it's just such an underused feature that it isn't worth the effort to most developers. Purchased MP3s don't carry this information (I've always wondered why not?), so the lyrics would have to be added manually (or automatically via a lyric-service API, but there's a lot more coding involved in that).
Here is the ID3v2.3 spec if you'd like to look into it further...(abandon hope all ye who enter here)
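Decoding a USLT frame is fiddly but tractable; per the ID3v2.3 spec, the frame body after the 10-byte frame header is: text encoding (1 byte), language (3 bytes), a terminated content descriptor, then the lyrics text. A minimal pure-Java sketch, handling only encoding 0x00 (ISO-8859-1); UTF-16 (encoding 0x01) uses a BOM and a two-byte terminator and is left out for brevity:

```java
import java.nio.charset.StandardCharsets;

public class UsltParser {
    // Parse the body of an ID3v2.3 USLT frame (the bytes after the 10-byte
    // frame header) and return the lyrics text.
    // Layout: encoding (1 byte), language (3 bytes),
    //         content descriptor (0x00-terminated), lyrics text.
    public static String parseLyrics(byte[] body) {
        if (body.length < 5 || body[0] != 0x00) {
            throw new IllegalArgumentException("only ISO-8859-1 handled here");
        }
        int i = 4;                                       // skip encoding + language
        while (i < body.length && body[i] != 0x00) i++;  // skip content descriptor
        i++;                                             // skip its terminator
        return new String(body, i, body.length - i, StandardCharsets.ISO_8859_1);
    }
}
```

A real player would also have to walk the ID3v2 tag header to locate the USLT frame, honor the unsynchronization flag, and handle both text encodings, which is presumably the "complete PITA" referred to above.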
The problem may be that most people would use the built-in mp3 playback mechanisms, and this may neither support lyric display nor be very easy to keep synchronized with something else doing lyric display.
So it may be that something needs to be written which does its own MP3 decoding.
Most likely this would want to be done in native code. On the other hand, on Android, audio output (and, unless you use OpenGL, video display) pretty much has to be done from Java. So you are looking at a fair amount of work to decode data with a native library and then dispatch it for playback and display from Java.
So to answer your question: is it possible? Definitely.
Is it made easy by the Android APIs? Not really.
I just added a feature request that would give Android support for reading USLT frames in the ID3 tag. This would enable native and third-party music players to display lyrics. If you want this feature, please star the request below and post your comments.
http://code.google.com/p/android/issues/detail?id=32547