In Android I am trying to develop an application where I can open messages, make a call, etc. through voice. But I want my application to keep running in the background. How can I convert the Google speech-to-text API into a service? I will share my work progress only if you need it, as posting it did not help me in previous questions. I have taken help from this link so far: http://www.androidhive.info/2014/07/android-speech-to-text-tutorial/
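One way to approach this is to host a SpeechRecognizer inside a started Service, so recognition is not tied to any visible Activity. The sketch below is illustrative, not a drop-in solution: the class name is a placeholder, the RECORD_AUDIO permission is required, and on recent Android versions background microphone access is restricted, so a foreground service may be needed in practice.

```java
import android.app.Service;
import android.content.Intent;
import android.os.Bundle;
import android.os.IBinder;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import java.util.ArrayList;

public class VoiceCommandService extends Service implements RecognitionListener {

    private SpeechRecognizer recognizer;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        recognizer = SpeechRecognizer.createSpeechRecognizer(this);
        recognizer.setRecognitionListener(this);

        Intent listen = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        listen.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        recognizer.startListening(listen);
        return START_STICKY; // ask the system to restart the service if it is killed
    }

    @Override
    public void onResults(Bundle results) {
        ArrayList<String> matches =
                results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        // Dispatch the recognized phrase to your command handler here, then
        // call recognizer.startListening(...) again to keep listening.
    }

    @Override
    public void onDestroy() {
        if (recognizer != null) recognizer.destroy();
        super.onDestroy();
    }

    @Override public IBinder onBind(Intent intent) { return null; }

    // Remaining RecognitionListener callbacks, stubbed out:
    @Override public void onReadyForSpeech(Bundle params) { }
    @Override public void onBeginningOfSpeech() { }
    @Override public void onRmsChanged(float rmsdB) { }
    @Override public void onBufferReceived(byte[] buffer) { }
    @Override public void onEndOfSpeech() { }
    @Override public void onError(int error) { }
    @Override public void onPartialResults(Bundle partialResults) { }
    @Override public void onEvent(int eventType, Bundle params) { }
}
```

The service would be started from an Activity or a BroadcastReceiver with `startService(new Intent(context, VoiceCommandService.class))`, and declared in the manifest.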
I have done a lot of research, but couldn't find the answer. Currently I'm using agora.io.
What I mean:
I want to add a voice call to my Android app, but on the same screen.
It has to be possible to write messages during a call without ending it, as in this screenshot.
Is it even possible to do with agora.io, or should I change the lib to another one?
The Agora Video SDK can achieve that. You can use the RTM SDK for the messaging functionality and the RTC SDK for the video call. To put them on the same screen, place the video UI on top of the messaging UI.
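A rough sketch of initializing both SDKs in one Activity follows. This is a hedged outline, not verified against a specific SDK release: "YOUR_APP_ID", the channel name, and the layout ID are placeholders, the listener callback implementations are elided, and Agora's method signatures change between versions, so check the current Agora RTC/RTM docs.

```java
import android.os.Bundle;
import androidx.appcompat.app.AppCompatActivity;
import io.agora.rtc.IRtcEngineEventHandler;
import io.agora.rtc.RtcEngine;
import io.agora.rtm.RtmClient;
import io.agora.rtm.RtmClientListener;

public class CallActivity extends AppCompatActivity {

    private RtcEngine rtcEngine; // voice/video call
    private RtmClient rtmClient; // real-time messaging

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // One layout: video view on top, chat list and input below it.
        setContentView(R.layout.activity_call);

        try {
            rtcEngine = RtcEngine.create(getBaseContext(), "YOUR_APP_ID",
                    new IRtcEngineEventHandler() { });
            rtcEngine.joinChannel(null, "demo-channel", "", 0);

            rtmClient = RtmClient.createInstance(this, "YOUR_APP_ID",
                    new RtmClientListener() {
                        // Implement the connection-state and message-received
                        // callbacks required by the SDK version you use.
                    });
        } catch (Exception e) {
            throw new RuntimeException("Agora SDK initialization failed", e);
        }
    }
}
```

The key point is that the two SDKs are independent; combining them on one screen is purely a layout decision, as the answer above says.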
This is my first question here on Stack Overflow, so sorry if I am doing something wrong.
I have an app on android wear, where the user shall select an item from a list.
I want to enable the user to select the item via voice commands. I managed to do it as suggested in the Google documentation, but I have to implement a button to start speech recognition, and it shows up as a fullscreen activity.
I want the user to be able to see the list while speech recognition is active. Is there any solution for this?
EDIT: I might have found the solution I was looking for. It is the SpeechRecognizer, which seems to be able to do just what I want. I will have to dig into that and will update this post if it is the solution.
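For anyone landing here: SpeechRecognizer does fit this case, because it runs inside your own Activity with a listener and shows no UI of its own, so the list stays visible while it listens. A minimal sketch, assuming the matching-against-list logic is your own:

```java
SpeechRecognizer recognizer = SpeechRecognizer.createSpeechRecognizer(this);
recognizer.setRecognitionListener(new RecognitionListener() {
    @Override
    public void onResults(Bundle results) {
        ArrayList<String> matches =
                results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        // Compare the spoken phrase against your list items and select the match.
    }
    // The remaining callbacks must be implemented, even if empty:
    @Override public void onReadyForSpeech(Bundle params) { }
    @Override public void onBeginningOfSpeech() { }
    @Override public void onRmsChanged(float rmsdB) { }
    @Override public void onBufferReceived(byte[] buffer) { }
    @Override public void onEndOfSpeech() { }
    @Override public void onError(int error) { }
    @Override public void onPartialResults(Bundle partialResults) { }
    @Override public void onEvent(int eventType, Bundle params) { }
});

Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
        RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
recognizer.startListening(intent);
```

The RECORD_AUDIO permission is required, and `recognizer.destroy()` should be called when the Activity is destroyed.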
I have an app where I detect tags, and my question is:
I want my app to perform an action when it detects a new tag, in this case opening a web page. It works without problems, but only when the app is in the foreground; I want it to work even when the app is running in the background. I read about Android Services, but I'm not sure that is what I should use.
Can you give me directions to find a way to solve that problem?
Thank you very much
Please see AsyncTask in the Google documentation on background processing. Here is the link: https://developer.android.com/training/best-background.html
This site has an example on it
http://www.sitepoint.com/scheduling-background-tasks-android/
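Since the answer points at Services for background work, here is a minimal sketch of a started Service that does work while no Activity is visible. The class name and the tag-handling logic are placeholders for the asker's own code:

```java
import android.app.Service;
import android.content.Intent;
import android.os.IBinder;

public class TagWatcherService extends Service {

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // Do the work off the main thread.
        new Thread(() -> {
            // React to a detected tag here, e.g. open a web page:
            // startActivity(new Intent(Intent.ACTION_VIEW, Uri.parse(url))
            //         .addFlags(Intent.FLAG_ACTIVITY_NEW_TASK));
            stopSelf(startId); // stop when this unit of work is done
        }).start();
        return START_NOT_STICKY;
    }

    @Override
    public IBinder onBind(Intent intent) { return null; }
}
```

Note that if the "tags" here are NFC tags, the platform can also deliver tag intents directly to a manifest-registered intent filter, which may remove the need for a long-running service; that is a different mechanism from the one sketched above.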
I have an Android app, and I would like to add the ability for the user to "cast" what is displayed in the app to a Chromecast. It could just be a local JPG, but I would prefer the user to see live content of the app. Does anyone know if this is possible? I know there are apps like AllCast, but I wasn't sure if they were using supported features of the SDK or if it was a hack. I found some mention of the Default Media Receiver but could not find any documentation on how to use it with local content. Any advice or direction would be appreciated.
There is no Cast API to do that directly; you can look into WebRTC or something of that nature.
The way I do it is to use the Presentation class. The only problem is that you do need to use the ChromeCast app to start screen mirroring before you start your app.
I have not yet found a way to start mirroring my app (or, to be more precise, to show the contents of your Presentation class) from a Chromecast UIButton within my app. I have been able to get that cast button working and connecting, just not to start app mirroring using only my in-app Chromecast button.
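For reference, a minimal sketch of the Presentation approach described above, run from inside an Activity once mirroring is already active. The layout ID is a placeholder; one way to find the mirrored display is through DisplayManager's presentation category:

```java
DisplayManager dm = (DisplayManager) getSystemService(Context.DISPLAY_SERVICE);
Display[] displays =
        dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION);

if (displays.length > 0) {
    // Show our own content on the secondary (mirrored) display.
    Presentation presentation = new Presentation(this, displays[0]) {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.cast_content); // placeholder layout id
        }
    };
    presentation.show();
}
```

If no presentation display is found, mirroring has not been started, which matches the limitation described in the answer above.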
I want to use the Facebook ticker to show the name of the currently playing audio file in my Android app, like Spotify does.
I want to continuously update the ticker as the audio file changes.
I searched on Google but couldn't find any tutorial or samples. Please share how I can achieve this ticker in my app. I am new, please guide me. Waiting for a reply.
Thanks
You must implement your application with the Open Graph API. There is a built-in action called "listen".
https://developers.facebook.com/docs/opengraph/actions/builtin/#listen
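A hedged sketch of publishing the built-in "listen" action with the Facebook Android SDK is shown below. The song URL is a placeholder for a page carrying the track's Open Graph tags, and note that Facebook has deprecated Open Graph action publishing in newer Graph API versions, so verify against the current docs before relying on this:

```java
import android.os.Bundle;
import android.util.Log;
import com.facebook.AccessToken;
import com.facebook.GraphRequest;
import com.facebook.HttpMethod;

// Publish a "listen" story for the currently playing track.
Bundle params = new Bundle();
params.putString("song", "https://example.com/songs/current-track");

new GraphRequest(
        AccessToken.getCurrentAccessToken(),
        "me/music.listens",
        params,
        HttpMethod.POST,
        response -> Log.d("OG", "listen action response: " + response)
).executeAsync();
```

To keep the ticker in sync, the app would publish a new action each time the played track changes, e.g. from the media player's track-change callback.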