I'm currently fiddling around with an idea, and therefore I'm looking for a potential way to access the audio stream on a TV (regardless of whether it's a SmartTV, Android, webOS, ...), run some audio filters on it, and then output the result.
I've briefly read through some of the APIs, but it seems I'm only able to control the volume, which is not really what I want. Am I missing something, or is this not possible at the moment?
For webOS at the moment you can only:
setMuted
volumeDown
volumeUp
along with a fixed set of audio formats that are supported at the moment.
So you cannot change the audio from the native side; the only way I can imagine this working is to use a JS library (or something similar that is supported), apply the changes there, and present the result to users.
Reference:
http://developer.lge.com/webOSTV/api/webos-service-api/audio/
Related
I am looking to make a mobile app that lets users take X number of videos and combines them into a single video. Users will also be able to choose what goes in between each recording, as well as background music.
I have more experience with Xamarin/C# than with native Java/Obj-C, but the only approach I have found online that might accomplish this is going native with FFMPEG. Is that the case? Will FFMPEG even work for this? Is there a way to use Xamarin to accomplish what I need to do?
Thanks
Have a look at the AVMutableComposition and its related classes.
There's an example here, about halfway down the page: http://www.raywenderlich.com/13418/how-to-play-record-edit-videos-in-ios
It looks like it's covered by Xamarin: http://iosapi.xamarin.com/index.aspx?link=T%3AMonoTouch.AVFoundation.AVMutableComposition
I would like to know if it is possible to detect movement close to the camera and then perform some actions inside an app.
This was possible on Symbian, so it should also be possible on newer Android-powered phones. The problem is I can't find any resources to get started with.
Does anybody have any ideas on where I should start?
Looks like there's a library created over on Google Code: http://code.google.com/p/android-motion-detection/
If you want something a little more powerful, you can also run OpenCV on Android (http://opencv.org/android). I don't think you will be able to detect distance without another camera or a different type of sensor (like sonar), though.
As a bonus, depending on what you are trying to achieve, there's even a nice API that doesn't require anything beyond the Android SDK: http://developer.android.com/reference/android/media/FaceDetector.Face.html
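A minimal sketch of using that FaceDetector API on a still Bitmap (assuming you already have the frame as a Bitmap; the detector requires an RGB_565 bitmap with an even width):

```java
import android.graphics.Bitmap;
import android.graphics.PointF;
import android.media.FaceDetector;

// Sketch only: run the SDK's FaceDetector on a still frame.
public class FaceSketch {
    public static void detect(Bitmap source) {
        // The detector only accepts RGB_565 bitmaps.
        Bitmap bmp = source.copy(Bitmap.Config.RGB_565, false);
        int maxFaces = 4; // arbitrary upper bound for this sketch
        FaceDetector detector = new FaceDetector(bmp.getWidth(), bmp.getHeight(), maxFaces);
        FaceDetector.Face[] faces = new FaceDetector.Face[maxFaces];
        int found = detector.findFaces(bmp, faces);
        for (int i = 0; i < found; i++) {
            PointF mid = new PointF();
            faces[i].getMidPoint(mid);            // centre point between the eyes
            float eyes = faces[i].eyesDistance(); // rough size indicator
            float confidence = faces[i].confidence();
            // ...react to the detected face here...
        }
    }
}
```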
I am trying to develop an app/widget for which I need to display the currently playing information (metadata) of an audio track.
This would be trivial if I were also writing the media player myself, as I could simply access the MediaStore and bring up the info; however, I do not wish to compete with the plethora of existing apps on this front. I want to be able to pull this information from the built-in audio player or other apps such as SongBird or PowerAMP.
I should be able to do this with PowerAMP using their [API][1], but I really want a solution that works for the stock Android player and others too.
I was hoping to be able to grab the information from the AudioManager, but that seems only to allow me to query the current state (music is playing, etc.) and set my intent to play music and so on. There is no access to metadata from someone else's app.
So my feeling is this cannot be done easily. My thought is that I could maybe access this info from the notification bar at the top, since the now-playing info is shown up there. It might be an ugly hack, though...
For a moment I got excited about the RemoteControlClient.MetadataEditor from 4.0, but then I figured out that it was for writing that information to a stream that can be sent to the physical remote, rather than allowing you to create a software remote. Damn!
Does anyone have any ideas?
[1]: http://forum.powerampapp.com/index.php?/topic/1034-updated-for-20-poweramp-api-lib-and-sample-applications/ "PowerAMP API"
I've written a guide for implementing this.
Basically, you need access to the hidden classes of the android.jar library. Then you have to extend the IRemoteControlDisplay$Stub class and implement its methods.
After that, you register your RemoteControlDisplay with the hidden method AudioManager#registerRemoteControlDisplay.
There is just way too much to explain in one answer, so read my guide on XDA-Developers.
Here is the link:
http://forum.xda-developers.com/showpost.php?p=44513199
Also, I'm currently working on a library which will simplify the process of implementing your own remote media controls.
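A rough sketch of the registration step, assuming you build against a framework jar that exposes the hidden classes and are on Android 4.x. The stub's callback methods are defined in IRemoteControlDisplay.aidl and change between releases, so they are deliberately not spelled out here:

```java
import java.lang.reflect.Method;
import android.content.Context;
import android.media.AudioManager;

// Sketch only: register a display through the hidden
// AudioManager#registerRemoteControlDisplay method via reflection.
// "display" is assumed to be an instance of your own subclass of the
// hidden IRemoteControlDisplay.Stub class.
public class RemoteDisplayRegistrar {
    public static void register(Context context, Object display) throws Exception {
        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        Class<?> displayInterface = Class.forName("android.media.IRemoteControlDisplay");
        Method register = AudioManager.class.getMethod("registerRemoteControlDisplay", displayInterface);
        register.invoke(am, display);
    }
}
```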
> I should be able to do this with PowerAMP using their [API][1], but I really want a solution that works for the stock Android player and others too.
There is no documented and supported API for the AOSP Music app or the Google Play Music app, AFAIK. They certainly are not in the Android SDK.
I am not aware of an Android ecosystem standard for media players exposing this information, let alone a roster of apps that support such a standard. You are welcome to work with the developers of such apps and encourage them to create and adopt a standard.
> My thought is that I could maybe access this info from the notification bar at the top, since the now-playing info is shown up there.
It is not possible to spy on other applications' Notifications, for obvious privacy and security reasons.
> For a moment I got excited about the RemoteControlClient.MetadataEditor from 4.0, but then I figured out that it was for writing that information to a stream that can be sent to the physical remote, rather than allowing you to create a software remote. Damn!
Surely there's a way to access the Remote Control Client metadata on Android 4.0, because the lock screen is able to access it when media is playing.
I'm not a developer at all, but I've tried to do a bit of poking around in the AOKP sources and this is my limited understanding of how it works. At least in AOKP (and presumably AOSP as well, then), it appears that the lockscreen uses core/java/com/android/internal/widget/TransportControlView.java to draw the music control widget on the lockscreen, which in turn uses media/java/android/media/IRemoteControlDisplay.aidl for data retrieval. At the very least, it may be useful to poke around in TransportControlView.java to see if you can figure out how the lockscreen widget works.
I googled around and found the regular speech API from Google, but I think this isn't what I need. I need continuous voice recognition and the ability to launch other actions when a specific word is spoken. Is there anything in the Android SDK that I can use?
If not: is it possible to use third-party libraries? (If yes: which ones, and what do I have to consider when integrating a third-party library?)
Edit: I thought about this again. I have to recognize just one 'word' (which probably won't be in Google's speech databases), and I have the chance to record it. That means I'm able to continuously match the incoming audio stream against my recording, which should work without a database. But I'm new to Android development. Do you have suggestions for APIs to use for recording and for matching against the recording? Or is there a better way to continuously wait for a specific 'word' to occur and then trigger further actions?
BTW, in case that wasn't described clearly: the app should continue to record and watch for the word to occur again once the reaction is done.
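For the recording half of that idea, a minimal sketch of a continuous capture loop with the SDK's AudioRecord; the matching itself (e.g. cross-correlation or DTW against the recorded template) is the hard part and isn't shown, and matchesTemplate() is just a placeholder:

```java
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

// Sketch only: requires the RECORD_AUDIO permission in the manifest.
public class KeywordListener {
    private volatile boolean running = true;

    public void listen() {
        int sampleRate = 16000; // assumption: 16 kHz mono is enough for a single keyword
        int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

        short[] buffer = new short[bufferSize / 2];
        recorder.startRecording();
        try {
            while (running) {
                int read = recorder.read(buffer, 0, buffer.length);
                if (read > 0 && matchesTemplate(buffer, read)) {
                    onKeywordDetected();
                    // keep recording afterwards, as described above
                }
            }
        } finally {
            recorder.stop();
            recorder.release();
        }
    }

    private boolean matchesTemplate(short[] samples, int length) {
        return false; // placeholder: compare against the recorded template here
    }

    private void onKeywordDetected() {
        // trigger the app's reaction here
    }
}
```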
> Is there anything in the Android SDK that I can use?
No, sorry.
When I play certain MP3 files (such as lessons from JapanesePod101.com) on my iPod Touch, lyrics or transcripts that are embedded in the MP3 files are displayed in the media player. The lyrics are, I believe, stored as ID3/ID4 tags in the MP3 metadata.
I find this to be an extremely useful feature, and I believe I'm not alone. Despite that, neither the stock Android media player nor any other media player I've downloaded from the Market seems to support this. I just have not been able to find any way to get this feature on my Nexus One.
This feature is important enough to me that I'm considering learning Android development just so I can write a simple media player that displays embedded lyrics or notes. However, the fact that nobody else seems to have done this makes me wonder: is it even possible? Is there something in the Android architecture or APIs that makes it difficult or impossible to read and display lyrics information from MP3 files? I'd hate to get deep into the learning process only to find out that what I'm aiming for can't easily be done. (I mean, if all else fails I assume I could write my own MP3 decoder, but that's more trouble than I'm willing to go through right now.)
I've already asked this question on the Android Enthusiasts Stack Exchange Beta Site, but in retrospect I decided it was more of a programming question and decided it was better to ask here.
Yeah, definitely more of a programming question. Just from my brief experience of reading through the ID3 spec, I think it's probably just that decoding ID3 tags is a complete PITA. I'm sure it can be done, as there are MP3 tag editing apps available for Android (whether any support lyrics or not, I do not know).
ID3v2.3 seems to have support for both synchronized and unsynchronized lyrics through the SYLT and USLT frames of the tag. I imagine it's just such an underused feature that it isn't worth the effort for most developers to support it. Purchased MP3s don't carry this information (I've always wondered why not), so the lyrics would have to be added manually (or automatically via a lyrics service API, but there's a lot more coding involved with that).
Here is the ID3v2.3 spec if you'd like to look into it further...(abandon hope all ye who enter here)
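A minimal sketch of what reading the USLT frame involves, assuming an ID3v2.3 tag and ignoring extended headers, unsynchronisation, compression and the UTF-16 text encodings:

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.charset.Charset;

// Sketch only: pulls the first USLT (unsynchronised lyrics) frame out of an ID3v2.3 tag.
public class UsltReader {

    public static String readLyrics(String path) throws IOException {
        DataInputStream in = new DataInputStream(new FileInputStream(path));
        try {
            byte[] header = new byte[10];
            in.readFully(header);
            if (header[0] != 'I' || header[1] != 'D' || header[2] != '3') {
                return null; // no ID3v2 tag
            }
            // Tag size is a 28-bit "syncsafe" integer (7 bits per byte).
            int tagSize = ((header[6] & 0x7F) << 21) | ((header[7] & 0x7F) << 14)
                        | ((header[8] & 0x7F) << 7)  |  (header[9] & 0x7F);

            int read = 0;
            while (read + 10 <= tagSize) {
                byte[] frameHeader = new byte[10];
                in.readFully(frameHeader);
                read += 10;
                String frameId = new String(frameHeader, 0, 4, Charset.forName("ISO-8859-1"));
                // In v2.3 the frame size is a plain 32-bit big-endian integer.
                int frameSize = ((frameHeader[4] & 0xFF) << 24) | ((frameHeader[5] & 0xFF) << 16)
                              | ((frameHeader[6] & 0xFF) << 8)  |  (frameHeader[7] & 0xFF);
                if (frameSize <= 0) break; // padding reached

                byte[] body = new byte[frameSize];
                in.readFully(body);
                read += frameSize;

                if ("USLT".equals(frameId)) {
                    // Frame layout: text encoding (1 byte), language (3 bytes),
                    // content descriptor (null-terminated), then the lyrics text.
                    if (body[0] != 0) {
                        return null; // only the ISO-8859-1 text encoding is handled here
                    }
                    int offset = 4;
                    while (offset < body.length && body[offset] != 0) offset++;
                    offset++; // skip the descriptor's terminating null
                    if (offset > body.length) return null;
                    return new String(body, offset, body.length - offset, Charset.forName("ISO-8859-1"));
                }
            }
        } finally {
            in.close();
        }
        return null; // no USLT frame found
    }
}
```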
The problem may be that most people would use the built-in MP3 playback mechanisms, which may neither support lyric display nor be very easy to keep synchronized with something else doing the lyric display.
So it may be that something needs to be written which does its own MP3 decoding.
Most likely this would want to be done in native code. On the other hand, on Android, audio output (and, unless you use OpenGL, video display) pretty much has to be done from Java. So you are looking at a fair amount of work to decode data with a native library and then dispatch it for playback and display from Java.
So to answer your question: is it possible? Definitely.
Is it made easy by the Android APIs? Not really.
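For example, the Java-side output could be a plain AudioTrack in streaming mode fed with PCM from the native decoder; nextPcmChunk() here is just a placeholder for whatever hands over the decoded samples:

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

// Sketch only: push externally decoded PCM (e.g. from a native MP3 decoder)
// through an AudioTrack in streaming mode.
public class PcmPlayer {
    public void play(int sampleRate) {
        int bufferSize = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                bufferSize, AudioTrack.MODE_STREAM);
        track.play();
        try {
            short[] pcm;
            while ((pcm = nextPcmChunk()) != null) {
                track.write(pcm, 0, pcm.length); // blocks until the chunk is queued
                // ...update the lyric display here, based on how much audio was written...
            }
        } finally {
            track.stop();
            track.release();
        }
    }

    private short[] nextPcmChunk() {
        return null; // placeholder: fetch the next decoded chunk from the decoder
    }
}
```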
I just added a new feature request that would give Android support for reading USLT in the ID3 tag. This would enable the native and third-party music players to display lyrics. If you want this feature, please star the request below and post your comments.
http://code.google.com/p/android/issues/detail?id=32547