We are trying to move some of our media files from our application to a server, but we can't seem to figure it out. I created a video, embedded it in my app, and it played fine. So I sent that same video file to someone to put on the server. I then went to my browser and typed in "http://www.server.com/Android-app/testVideo.mp4" and I get an error saying "Sorry, this video is not valid for streaming to this device." I am using a Samsung Moment. So I was thinking it was something wrong with the server. But the video does work on someone's Droid, so that makes it sound like a phone compatibility issue. But it is the exact same file that worked on the phone earlier.
Is there a setting (on the phone or the server) that needs to be changed? I just can't make sense of it. Thanks for any help.
You haven't really given enough details to give a definitive answer, but the first thing I'd do would be to check the Content-Type in the response header.
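One way to check is to request the file and print the Content-Type header the server actually sends. Here is a minimal, self-contained sketch: the embedded `HttpServer` below is only a stand-in for the real server so the example runs on its own, and the commented-out URL from the question is there to show how you would point the same check at the real host. If the real server reports something other than `video/mp4` for the file, the phone may refuse to stream it.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;

public class ContentTypeCheck {
    public static void main(String[] args) throws Exception {
        // Stand-in for the real server: serves placeholder bytes with the
        // Content-Type header a correctly configured server should send.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/testVideo.mp4", exchange -> {
            byte[] body = new byte[] {0, 0, 0, 0}; // not a real video
            exchange.getResponseHeaders().set("Content-Type", "video/mp4");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
        });
        server.start();

        // The actual check: connect and read back the Content-Type header.
        URL url = new URL("http://127.0.0.1:" + server.getAddress().getPort()
                + "/testVideo.mp4");
        // For the real server you would use instead:
        // URL url = new URL("http://www.server.com/Android-app/testVideo.mp4");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        System.out.println("Content-Type: " + conn.getContentType());
        conn.getInputStream().close();
        server.stop(0);
    }
}
```

If the header comes back as `application/octet-stream`, `text/plain`, or similar, the server's MIME mapping for `.mp4` is the first thing to fix.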
I plan to develop an instrument app: when you shake the Android phone, it produces an "angklung" (a traditional Indonesian bamboo instrument; Google it) sound.
THE PROBLEM:
How can one Android phone share the sound it produces (via the shake
gesture) with other Android phones running my application?
The connection I want to use is mobile data or Wi-Fi.
I think this person has the same problem, but I don't know how to get in touch with him: Stream android to android
There are no answers there either.
I need a solution/example/suggestion for this problem. So far I have succeeded in producing the "angklung" sound when the phone is shaken.
But I have no idea how to start on the networking side of the application. I've searched the internet and found nothing :(
Thanks for your help.
I would suggest streaming the audio data to a server and relaying it to the other Android devices registered with your app. Since the problem you're asking about is far bigger than a couple of lines of code, I'm pointing you to some good resources instead; dig into those, and good luck.
Live-stream video from one android phone to another over WiFi
Stream Live Android Audio to Server
I have written a little app where you can choose different media, like videos, mp3s, and pictures.
For testing I have added some sample media like
Video - http://distribution.bbb3d.renderfarming.net/video/mp4/bbb_sunflower_1080p_60fps_normal.mp4, an mp3 file, and a picture.
These samples work fine on Chromecast.
Also, for clarity: I have not written a custom receiver for Chromecast. I am using the standard receiver from Google.
Now I have a web server running on my Android phone. I have "installed" the correct MIME types ("video/mp4 mp4"), and when I enter my Android phone's URL for the file (192.168.0.12:5555/myvideo.mp4) in Chrome on my laptop, it works.
So Chrome plays the file correctly.
The file myvideo.mp4 is the same video as in the sample link, http://distribution.bbb3d.renderfarming.net/video/mp4/bbb_sunflower_1080p_60fps_normal.mp4. It is not a different video.
But when I take this link (192.168.0.12:5555/myvideo.mp4) and send it to the Chromecast, the Chromecast tries to load the file and then falls back to the default screen, the same screen it shows after my app successfully establishes a connection (the standard Google page).
I am working in Eclipse. It is not a programming problem; there are no errors.
I hope somebody can explain why the Chromecast doesn't play my video, mp3, or jpg from my web server but plays the same files from the web.
Thanks
Paul
//EDIT
In the chromecast debugger I get the information:
[151.045s] [cast.receiver.MediaManager] Load metadata error
I don't know what to do with this information, and Google doesn't seem to have seen this problem before either.
Thanks for any help
//Solution
I had added metadata to the media file, and some of it was written incorrectly. Eclipse didn't show me an error, but Chromecast had a problem with it.
So if you have the same problem, check your file's metadata.
I hope this helps somebody.
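If you want to sanity-check a file programmatically, an MP4 file is a sequence of "boxes", each starting with a 4-byte big-endian size (which includes the 8-byte header) followed by a 4-byte ASCII type; the metadata lives under boxes like moov. A minimal sketch that lists a file's top-level boxes, run here on a handcrafted byte sequence (two empty header-only boxes) rather than a real video so the example is self-contained:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class Mp4Boxes {
    // Walks top-level MP4 boxes and prints each box's type and size.
    static void listBoxes(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);
        while (data.available() >= 8) {
            int size = data.readInt();          // 4-byte big-endian box size
            byte[] type = new byte[4];
            data.readFully(type);               // 4-byte ASCII box type
            System.out.println(new String(type, "US-ASCII") + " (" + size + " bytes)");
            data.skipBytes(size - 8);           // skip the box payload
        }
    }

    public static void main(String[] args) throws IOException {
        // Handcrafted stand-in for a real file: an empty "ftyp" box followed
        // by an empty "moov" box (8 bytes each, header only).
        byte[] fake = {
            0, 0, 0, 8, 'f', 't', 'y', 'p',
            0, 0, 0, 8, 'm', 'o', 'o', 'v',
        };
        listBoxes(new ByteArrayInputStream(fake));
        // To inspect a real file instead:
        // listBoxes(new java.io.FileInputStream("myvideo.mp4"));
    }
}
```

This only lists the top level; real metadata sits in nested boxes (e.g. inside moov), but a file whose top-level boxes already look wrong is a good hint that the metadata writing step damaged it.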
I'm currently working on an app that will stream video from one phone, to another using wifi-direct.
I've already installed and tested the sample wifi-direct app that comes with the SDK. It works great sending images from one Nexus 4 to another.
Some problems I've experienced:
I tried to have the app send a video by changing the "image/*" string to "video/*" wherever it appeared, and ".jpg" to ".mp4".
After running the app with these changes, I can still connect the devices, but one N4 gets stuck at "Opening a server socket", and the video I recorded never gets sent.
Perhaps I went about it the wrong way.
Regardless, my real goal is to stream video from one phone to the other using wifi-direct only (no data connection / wifi).
Could someone help me figure out what steps I should pursue in order to accomplish this? Should I start with the wifi-direct sample app as a base? Or should I try to write it from scratch myself? How should I go about streaming the video using wifi-direct?
Thanks!
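For reference, the file transfer in the Wi-Fi Direct sample boils down to a plain TCP socket: the group owner opens a ServerSocket, the peer connects to the group owner's IP, and the bytes of the file are copied across. This sketch shows that core loop with localhost standing in for the group owner's address and a byte array standing in for the .mp4, so it runs off-device; on Android the receiver would write to a FileOutputStream instead:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class FileTransfer {
    public static void main(String[] args) throws Exception {
        // Receiver: in the Wi-Fi Direct sample this runs on the group owner.
        ServerSocket server = new ServerSocket(0); // any free port
        Thread receiver = new Thread(() -> {
            try (Socket client = server.accept();
                 InputStream in = client.getInputStream();
                 ByteArrayOutputStream out = new ByteArrayOutputStream()) {
                byte[] buf = new byte[1024];
                int n;
                while ((n = in.read(buf)) != -1) out.write(buf, 0, n);
                // On Android this would be a FileOutputStream for the .mp4.
                System.out.println("received " + out.size() + " bytes");
            } catch (IOException e) { e.printStackTrace(); }
        });
        receiver.start();

        // Sender: the peer that picked the video connects and streams it.
        byte[] fakeVideo = new byte[4096]; // stand-in for the .mp4 contents
        try (Socket socket = new Socket("127.0.0.1", server.getLocalPort());
             OutputStream out = socket.getOutputStream()) {
            out.write(fakeVideo);
        }
        receiver.join();
        server.close();
    }
}
```

This transfers a complete file; actual live streaming (playing while receiving) additionally needs a container/codec the player can consume incrementally, which is where the mime-type rename alone falls short.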
Currently I am using a WebView to project the display from one Android device to another, and it works fine, but I cannot hear any audio/voice. I don't know how to make that work on Android. Can we embed VLC in a custom Android app, or would another app like Ustream do? Basically, I want video and audio from one Android phone on another. How do I do that?
I am using this app on the client: https://play.google.com/store/apps/details?id=com.pas.webcam&hl=en
I am unable to hear any audio on the server side.
Please help me out. I am stuck after googling for many days; only the IP stream with video over Wi-Fi works, and on 3G it lags a lot.
I would really appreciate any guidance, inspiration, or motivation toward a direction in which I can implement the above requirements.
Thanks again.
Write to a Socket.
One thing you could do is have the AudioRecord class stream to a LocalSocket. That way, as bytes flow in from AudioRecord, they are pushed straight into the LocalSocket.
Can someone improve upon my answer, I've just seen this done before, never really tried it myself.
PS: The class names here are from memory, might be a little off on that.
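The pattern above can be sketched with plain sockets. This is only an illustration of the read-then-push loop, not Android code: `readFakeAudio` stands in for `AudioRecord.read(buffer, 0, buffer.length)` (which on a device blocks until the mic fills the buffer with PCM samples), and an ordinary TCP socket on localhost stands in for the LocalSocket:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class AudioStreamSketch {
    static final int FRAME = 320; // ~20 ms of 8 kHz 16-bit mono PCM

    // Stand-in for AudioRecord.read(buffer, 0, buffer.length).
    static int readFakeAudio(byte[] buffer, int frameNo) {
        for (int i = 0; i < buffer.length; i++) buffer[i] = (byte) frameNo;
        return buffer.length;
    }

    public static void main(String[] args) throws Exception {
        ServerSocket server = new ServerSocket(0);

        // Listener side: drain the stream and count the audio frames received.
        Thread listener = new Thread(() -> {
            try (Socket s = server.accept();
                 InputStream in = s.getInputStream()) {
                byte[] buf = new byte[FRAME];
                int total = 0, n;
                while ((n = in.read(buf)) != -1) total += n;
                System.out.println("played " + (total / FRAME) + " frames");
            } catch (IOException e) { e.printStackTrace(); }
        });
        listener.start();

        // Recorder side: push each buffer to the socket as soon as it's read.
        try (Socket s = new Socket("127.0.0.1", server.getLocalPort());
             OutputStream out = s.getOutputStream()) {
            byte[] buffer = new byte[FRAME];
            for (int frame = 0; frame < 10; frame++) {
                int n = readFakeAudio(buffer, frame); // AudioRecord.read(...) on Android
                out.write(buffer, 0, n);              // bytes in, bytes out
            }
        }
        listener.join();
        server.close();
    }
}
```

On a real device you would also have to agree on the PCM format (sample rate, channel count, encoding) out of band, since raw PCM bytes carry no header.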
I have been working on an Android app that streams live video to a server using the built-in camera, so that anyone can watch the live stream from my website, which is deployed on that server.
Can anyone help me with how I should start working on my project? At present I have no direction to start in.
A more specific example:
A person goes on a picnic and wants his friends and family to see, live, what is happening on the trip.
There is an open-source project that does a very similar thing:
http://code.google.com/p/ipcamera-for-android/
It basically reads the video from the camera through a LocalSocket and streams it from an embedded web server. You should be able to find lots of information in the source code.
If you want to stream over the internet for everyone to see, I can recommend the service justin.tv, which lets you broadcast your stream to the whole internet. I've tried it, and it works very well!
However, if there's no Wi-Fi you will probably have a very laggy connection, unless you convert the video to a smaller size...