I am developing a Google Glass application.
One of the things I want to do is to stream a video (RTSP) to a Google Glass activity.
I wrote a program that does that based on this guide - http://code.tutsplus.com/tutorials/streaming-video-in-android-apps--cms-19888
And it works!
The only problem I have is that it takes a few seconds for each update of the video to appear on the Glass. What am I doing wrong?
The only idea I have is to use a LiveCard and render the image there. Maybe a LiveCard would update faster? Any thoughts?
Thanks!
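For reference, publishing a direct-rendering LiveCard from the GDK looks roughly like the sketch below; LIVE_CARD_TAG, MenuActivity and the VideoRenderer callback are placeholder names, not code from my app.

    // Sketch only, inside a Service; the renderer callback that draws the
    // video frames onto the card's surface is a placeholder.
    private LiveCard mLiveCard;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        if (mLiveCard == null) {
            mLiveCard = new LiveCard(this, LIVE_CARD_TAG);
            mLiveCard.setDirectRenderingEnabled(true);
            mLiveCard.getSurfaceHolder().addCallback(new VideoRenderer());
            // A menu action is required so the card can be dismissed.
            Intent menuIntent = new Intent(this, MenuActivity.class);
            mLiveCard.setAction(PendingIntent.getActivity(this, 0, menuIntent, 0));
            mLiveCard.publish(LiveCard.PublishMode.REVEAL);
        }
        return START_STICKY;
    }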
Related
I'm trying to make a radio app. Everything is OK except playing the radio streams. I'm getting the play link from a page with Jsoup. When I press play on a real device, it takes 40-50 seconds to start playing, but on a Genymotion device (Google Nexus 4, 4.4.4, API 19) it takes just 3-4 seconds. I couldn't find what is wrong.
I'm new to Android and couldn't find a solution for days. Thanks in advance.
I really recommend using Google's ExoPlayer instead of the default Android player for streaming. It's much faster and you will not have problems like this.
https://github.com/google/ExoPlayer
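A minimal sketch of that, assuming ExoPlayer 2.12 or newer inside an Activity; the stream URL is a placeholder:

    // Sketch only: the stream URL is a placeholder.
    SimpleExoPlayer player = new SimpleExoPlayer.Builder(this).build();
    player.setMediaItem(MediaItem.fromUri("https://example.com/radio/stream.mp3"));
    player.prepare(); // prepares asynchronously, so the UI thread is not blocked
    player.play();    // starts as soon as enough data is buffered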
You should also try it without the emulator, as emulators themselves are slow.
Also give the links below a read.
https://developer.android.com/reference/android/media/AudioManager.html
https://developer.android.com/guide/topics/media/mediaplayer.html
For the best performance, native audio is good:
https://developer.android.com/ndk/guides/audio/index.html
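If you stay with the stock MediaPlayer, the main thing is to prepare the network stream asynchronously. A minimal sketch (the stream URL is a placeholder):

    // Sketch only: the stream URL is a placeholder.
    MediaPlayer player = new MediaPlayer();
    try {
        player.setAudioStreamType(AudioManager.STREAM_MUSIC);
        player.setDataSource("https://example.com/radio/stream.mp3");
        player.setOnPreparedListener(mp -> mp.start()); // start once buffering is done
        player.prepareAsync(); // never call prepare() on the UI thread for a network stream
    } catch (IOException e) {
        Log.e("Radio", "Cannot open stream", e);
    }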
I want to stream a 360 video from YouTube in my Android app in such a way that it is compatible with Google Cardboard, that is, I want the video to be displayed side by side like in a CardboardView.
I have been searching the internet for this and don't seem to have much luck with it, so any help would be greatly appreciated. Thanks in advance!
I am working on my Glass app demo and I used droidAtScreen to project the screencast from MyGlass for the presentation. The problem is that I cannot demonstrate the voice responses from the Glass based on user input. My backup plan is to record a video for demonstration and insert the voice output manually. Does anyone know if there is a better way to do both screen and audio cast for Google Glass app demo? Thanks for the help.
Have you tried Android Screen Monitor? I always use ASM.jar for any demonstration and it works fine with both audio and video demonstrations.
The link to download ASM.jar is here.
A detailed description is here. If you're using Droid@Screen then you probably know how to run Android Screen Monitor (ASM.jar), but here is a link to a reference that explains the process in detail.
This is how I solved the problem initially:

1. Use Screencast-O-Matic to record a video of the screencast on my laptop. The screencast is done using DroidAtScreen with the "highest frame rate possible" option checked; it has a better frame rate than an ASM screencast. During the video recording session my voice was captured (so, in other words, choose a quiet place!).

2. For simulating the text-to-speech engine voice, I used the SitePal demo site, and the voice is Julie (US). It's the closest voice I could find that matches the Google Glass speech engine. To record the voice I used Audacity and exported it to a .wav audio file. The key is to play the video and find the exact time to insert the audio file, using any standard movie-maker software.
UPDATE
Just finished the demo presentation at IT Expo. To my surprise, the simplest solution worked the best.
1. Create the video demo (under 2 minutes) as mentioned above, but insist on asking someone from the audience to try it out.

2. Ask the person to say what he/she heard from the Glass app as a response to the action (e.g. "The item is saved").
I need a web-based video player that provides "the duration for which a user played a video" and "the instant at which the video stopped playing".
It would be great if it could also provide details like the time at which the video was started (e.g. if the user skipped to a certain location when the video started) and the stop time of the video.
Could anyone please suggest a ready-made player available on the market that provides all this information, or some alternative approach to achieve the same?
I want to achieve the same for iOS and Android devices. Is this possible? If yes, please comment on its feasibility with PhoneGap.
Thanks in advance
You won't need a special player for this. The HTML5 media elements audio and video already have a property called played, which is a TimeRanges object describing the parts of the media the user has actually played; summing those ranges gives the total time watched.
I'd like to know how I can load a video from an external server database and play it in Android. Any helpful tutorials?
Another question: when I play my video in the emulator, is it normal that the quality is low and it plays very slowly, like it's moving in slow motion?
Emulators do not respond quickly while you are playing video; I hope you understand that.
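To play a video from a remote URL, a minimal sketch with VideoView could look like this; the URL and the video_view id are placeholders:

    // Sketch only: the URL and the video_view id are placeholders.
    VideoView videoView = (VideoView) findViewById(R.id.video_view);
    videoView.setVideoURI(Uri.parse("https://example.com/videos/sample.mp4"));
    videoView.setMediaController(new MediaController(this));
    videoView.setOnPreparedListener(mp -> videoView.start()); // start once buffered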