RecognizerIntent.ACTION_RECOGNIZE_SPEECH on Samsung Galaxy S - Android

I am trying to use speech recognition on a Samsung Galaxy S phone (I know the emulator doesn't support the intent). The Galaxy S runs Android 2.1. For some reason I am told that the recognition package does not exist on the device, which doesn't make sense, because other apps (Google Maps, the voice dialer, etc.) clearly use it.
Does anyone have any ideas on how I can get this to work?
The code is more or less the same as google's example (http://developer.android.com/resources/articles/speech-input.html).
As a further note, I found this thread, which seems to indicate that the srec library is missing from some devices:
http://groups.google.com/group/android-discuss/browse_thread/thread/2a53ec01bdff8e67
Is there a way I can do this manually (i.e. contact Google's SOAP API for speech recognition)? Alternatively, can I just copy the srec source code from somewhere and put it directly into my project?
Thanks.

I'm not a total expert on this, but I do know the actual recognition is performed on remote Google servers (the recorded audio is sent out). A full speech engine requires a significant amount of memory and computing power; on the device itself, only limited-grammar tasks (for example, "call XXX") are feasible.
Maybe the device manufacturers/operators don't have an agreement with Google?
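For what it's worth, Google's own speech-input article recommends checking whether any activity on the device can handle the recognition intent before launching it, which matches the symptom you describe. A minimal sketch (the class and method names here are illustrative, not from your code):

import android.app.Activity;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.speech.RecognizerIntent;
import java.util.List;

public class SpeechCheckActivity extends Activity {
    private static final int VOICE_RECOGNITION_REQUEST = 1234;

    // Returns true if at least one activity on the device can handle
    // ACTION_RECOGNIZE_SPEECH; on devices missing the recognizer this
    // list comes back empty, which is the failure described above.
    private boolean isSpeechRecognitionAvailable() {
        PackageManager pm = getPackageManager();
        List<ResolveInfo> activities = pm.queryIntentActivities(
                new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
        return !activities.isEmpty();
    }

    private void startVoiceRecognition() {
        if (!isSpeechRecognitionAvailable()) {
            // Disable the voice UI instead of crashing with an
            // ActivityNotFoundException.
            return;
        }
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speak now");
        startActivityForResult(intent, VOICE_RECOGNITION_REQUEST);
    }
}

This won't make recognition work on a device whose firmware lacks the recognizer, but it lets your app degrade gracefully instead of crashing.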

Related

Using Amazon Firefly SDK on Android

I have been doing some heavy research into the field of Visual Search, and I tried the technologies from Google (Goggles), Amazon (Firefly), and other vendors.
I can say that Firefly is actually the best, because of its instant identification (no need to snap a photo and send it to some server for processing), plus it's able to identify products accurately without having to scan their barcodes, which is fascinating.
The thing is, Amazon exposed the Firefly SDK, but only for their phone's Fire OS; you can't use it for other Android development.
However, I am pretty sure this is not a hardware limitation, because Amazon has an app called Flow, which runs on Android and iOS and uses the same identification technology, so I am sure any camera can be used, not just the one on the Fire Phone.
Does anyone know if it's possible to use the Firefly SDK somehow on Android? I know this might be impossible without some sort of reverse engineering of Fire OS, but even so, at least it would be technically possible!
Thanks in advance for your responses.

Android Activity Recognition on Wearables

I'm researching ways to do activity recognition using an Android smartwatch. Currently, I'm focusing on detecting whether the user is walking or standing still. My first idea was to use the built-in step counter, but then I came across the Android Activity Recognition API (I'm relatively new to Android^^), which seems to be used in mobile apps only.
I'm now stuck on the following questions:
Is the current API already making use of a connected wearable device?
(e.g. automatically accessing built-in wearable sensors)
Is there a separate API available for Android Wear?
Is there any other best practice on how to use wearables for activity recognition? (especially walking and standing still)
During my research I've already tried the following things:
Reading through the Android Activity Recognition Guide
Reading through this article about Google's Activity Recognition API
Implementing a simple Android Wear app which uses the current Activity Recognition API. I tested the app on my LG G Watch without success. It seems like I can connect to the ActivityRecognitionClient, but I never receive any activity updates. I tried the same code on my Nexus 5 - everything works fine.
Reading through this post about Google Play Services. Here the author says "...We like the Activity Recognition API for Android Wear, as we've always thought the location tracking technology was a great backbone for this type of functionality...". So according to this, there is a separate API, right?
I would be very thankful for any helpful information from you guys. In my opinion, a cool thing (see first question) would be to automatically detect a connected wearable device and use its sensors to improve the accuracy when the mobile phone is unsure about the user's current activity.
You ask
Is the current API already making use of a connected wearable device?
(e.g. automatically accessing built-in wearable sensors)
No, and would this even make sense? The wearable and the handheld are not always carried at the same time; the watch can be moving while the handheld is still, and vice versa. I am not sure what the value of a combined measurement would be.
Is there a separate API available for Android Wear?
Yes. Google provides a different Google Play Services library for wearables; you can see this in the compile dependencies:
compile 'com.google.android.gms:play-services-wearable:6.5.87'
vs
compile 'com.google.android.gms:play-services:6.5.87'
So when you tested the API in your wearable app, you actually imported the Play Services library meant for handhelds instead of the wearable version. The constant ActivityRecognition.API is not included in the wearable version of the client API.
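For reference, a rough sketch of the usual handheld-side pattern with the full play-services artifact; the 3-second detection interval is illustrative, and ActivityUpdateService is a hypothetical IntentService of your own that would receive the updates:

import android.app.Activity;
import android.app.PendingIntent;
import android.content.Intent;
import android.os.Bundle;
import com.google.android.gms.common.ConnectionResult;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.location.ActivityRecognition;

// Requires com.google.android.gms.permission.ACTIVITY_RECOGNITION
// in the manifest.
public class RecognitionActivity extends Activity
        implements GoogleApiClient.ConnectionCallbacks,
                   GoogleApiClient.OnConnectionFailedListener {

    private GoogleApiClient mClient;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mClient = new GoogleApiClient.Builder(this)
                .addApi(ActivityRecognition.API) // missing from play-services-wearable
                .addConnectionCallbacks(this)
                .addOnConnectionFailedListener(this)
                .build();
        mClient.connect();
    }

    @Override
    public void onConnected(Bundle connectionHint) {
        // Deliver DetectedActivity updates to our own IntentService
        // (ActivityUpdateService is hypothetical).
        PendingIntent pi = PendingIntent.getService(this, 0,
                new Intent(this, ActivityUpdateService.class),
                PendingIntent.FLAG_UPDATE_CURRENT);
        ActivityRecognition.ActivityRecognitionApi
                .requestActivityUpdates(mClient, 3000, pi);
    }

    @Override
    public void onConnectionSuspended(int cause) { }

    @Override
    public void onConnectionFailed(ConnectionResult result) { }
}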
Is there any other best practice on how to use wearables for activity
recognition? (especially walking and standing still)
One way would be to use the raw accelerometer data to detect motion. It is fairly easy to detect that the device is not moving at all; detecting anything else is non-trivial.
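To illustrate the easy case, here is a minimal sketch that watches the accelerometer magnitude and flags stillness whenever it stays close to gravity; the 0.5 m/s^2 tolerance is an illustrative value, and a real detector would average over a time window rather than react to single samples:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class StillnessDetector implements SensorEventListener {
    private static final float TOLERANCE = 0.5f; // m/s^2, illustrative
    private boolean still = true;

    public void start(SensorManager sm) {
        Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sm.registerListener(this, accel, SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        // A resting device reads ~9.81 m/s^2 (gravity only); any motion
        // pushes the magnitude away from that value.
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        still = Math.abs(magnitude - SensorManager.GRAVITY_EARTH) < TOLERANCE;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    public boolean isStill() {
        return still;
    }
}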
You could push sensor data from the wearable to the handheld for processing there if you like. Ping me if you'd like some code showing just that. I don't want to post it since it is not relevant to the question.
My guess is that Google will include this API on the handheld device in the future. Spending a lot of time "rolling your own" might be a risk...
Unfortunately, the Activity Recognition API is not yet implemented on Wear devices. When I tested a simple ActivityRecognitionClient example program on my Motorola Moto 360 (running "4.4W 2"), I got a message to that effect in the logcat stream.

Can I use the Android/Google speech recognition software on another platform?

I have just acquired an Android phone... wonderful stuff. I'm starting to look at the OS guts and how to program the thing.
The voice-recognition-for-dictation is good too... given that this is an open-source OS, is there any way of harnessing the Android-Google speech recognition? My current understanding is that the voice trace has to be sent to the Google servers to be processed, i.e. the software is not on the machine. But I may be wrong!
Either way, does anybody have any idea whether such harnessing for one's own apps (on Android or another OS on a full-size 'puter, for example) is possible?
If you are talking about using voice recognition in your code, then you can use it with the help of the SpeechRecognizer class (http://developer.android.com/reference/android/speech/SpeechRecognizer.html) and RecognizerIntent.
But you can only use the currently existing functionality, and only to some extent.
As for the confusion about whether the engine lives on the device or not: try using voice recognition after turning off the internet on your phone. It won't work.
You can also look into the API Demos for an example:
sdk\samples\android-10\ApiDemos\src\com\example\android\apis\app\VoiceRecognition.java
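In the spirit of that demo, a minimal sketch of the SpeechRecognizer approach mentioned above; the class name DictationActivity is illustrative:

import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import java.util.ArrayList;

// Requires android.permission.RECORD_AUDIO in the manifest.
public class DictationActivity extends Activity {
    private SpeechRecognizer mRecognizer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mRecognizer = SpeechRecognizer.createSpeechRecognizer(this);
        mRecognizer.setRecognitionListener(new RecognitionListener() {
            @Override
            public void onResults(Bundle results) {
                // Candidate transcriptions, best match first.
                ArrayList<String> matches = results.getStringArrayList(
                        SpeechRecognizer.RESULTS_RECOGNITION);
                // ... use matches.get(0) as the dictated text
            }
            // Remaining callbacks left empty for brevity.
            @Override public void onReadyForSpeech(Bundle params) { }
            @Override public void onBeginningOfSpeech() { }
            @Override public void onRmsChanged(float rmsdB) { }
            @Override public void onBufferReceived(byte[] buffer) { }
            @Override public void onEndOfSpeech() { }
            @Override public void onError(int error) { }
            @Override public void onPartialResults(Bundle partialResults) { }
            @Override public void onEvent(int eventType, Bundle params) { }
        });

        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        mRecognizer.startListening(intent);
    }
}

Note that this only drives the recognizer already installed on the device; the audio is still processed on Google's servers, so it won't give you a standalone engine for other platforms.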

Analog video capture to Android phone

I am looking for a way of displaying an analog video stream on an Android phone. On a PC/Mac/etc. you can achieve this using a cheap USB analog-to-digital converter such as a Grabby (http://www.terratec.net/en/products/Grabby_82248.html) and then view the stream in VLC, for example.
Would such a thing work (in theory) on Android if the proper drivers were available? (i.e., are there any hardware issues that make this impossible?)
Does anyone know if such a device with Android drivers is available?
Ultimately I want to make an app which interfaces with the Grabby (or a similar device) and allows the user to view video on the Android device, capture short clips, and send them.
First of all, the Android device needs to support USB host mode. This limits your user base significantly.
Then there is the problem of power: some USB devices will be incompatible simply because an Android phone cannot push enough power through the port to run them properly.
I'm not sure about the drivers, but I'm 99% sure it won't work "out of the box".
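To illustrate the first point: on API 12+ you can at least detect host support at runtime, so the app can bail out cleanly on unsupported devices. A quick sketch (the helper class name is illustrative):

import android.content.Context;
import android.content.pm.PackageManager;

public final class UsbHostCheck {
    // Returns true only if the firmware declares USB host support
    // (feature available since Android 3.1 / API level 12).
    public static boolean supportsUsbHost(Context context) {
        return context.getPackageManager()
                .hasSystemFeature(PackageManager.FEATURE_USB_HOST);
    }

    private UsbHostCheck() { }
}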
You should certainly take a look at THIS project. It is pretty similar to what you are trying to do. Maybe you should consider getting in touch with that person.
EDIT:
Based on what it took to get that DVB-T dongle running in the project I mentioned above, the chances of creating an app that everyone will be able to simply download and use are EXTREMELY slim. Getting that dongle running required a modified kernel and special scripts. Of course, I could be wrong. You can continue the research yourself or wait for someone with more experience than me to reply.

How do I output to a monitor from an Android device

I am trying to do a demo on an Android device, but the screen is too small, so it is hard to run the demo in, say, a meeting room with 12 people, although I can pass the device around the table or simply borrow or get more devices for demo purposes.
I understand there are devices for which you can buy a special USB converter to do TV-out, as on the iPhone, and some specific Android devices (e.g. the Motorola Incredible?). But I have to demo on a specific device that runs a standard Android build.
I understand I can do it on the Android emulator, but the screen refresh rate is too slow, and it would send the wrong message to the audience, namely that the app is slow. (Or is there a way to increase the screen refresh rate of the emulator?) Furthermore, the emulator doesn't support multitouch. (Or am I wrong?)
You do not have many options.
You can use Droid#Screen, but the refresh rate on it is maybe 6 fps. I am not aware of any other software projector that is faster.
You failed to mention the "specific device" that you are using, so I cannot comment on whether it has TV-out capability. The HTC DROID Incredible and the Samsung Galaxy S series support composite output -- I use the DROID Incredible for this purpose a fair bit. Most of the devices that have HDMI output only support it for certain built-in apps, such as the video player.
You can rent or purchase a device projector, like an ELMO. These are fairly expensive pieces of equipment purchased new, though I see a handful of used ones on eBay at interesting prices (though watch out -- many seem to lack the AC adapter).
If you can delay the demo several months, you may be able to use a Google TV.
And that's about it, AFAIK.
Or there is a way to increase the screen refresh rate for emulator?
Get a faster computer.
Furthermore the emulator doesn't support multitouch. (Or am I wrong?)
I am not aware of a way to simulate multitouch with an emulator, though I have not gone looking for a solution there.
If you have a Galaxy S3 Android mobile phone, you can use Mobizen. It's free, and the screen refresh rate is relatively good. You can control your phone from your computer using your mouse and keyboard. It works over a USB, 3G, or Wifi connection.
I have used this Android screencast tool: http://code.google.com/p/androidscreencast/ in past demos, but again the downside is the relatively slow refresh rate.
If you have a rooted device, you could try Droid VNC Server (it's on the market). The refresh rate isn't too bad, but I certainly wouldn't want to demo full motion video or an arcade game on it.
You could also get a webcam and rig it up with a tripod. Something like this. The downsides are that your hands will be in the way, and there may be issues with lighting and/or focus. The upside is a decent refresh rate.
