I just want to know whether I can view several videos at once over the network using RTSP. I'm trying to build an Android app similar to a video surveillance system, and I need to show several videos on the screen at the same time. I tried MediaPlayer and then VideoView, but in both cases the videos sometimes appear and sometimes I get an error saying that one or more videos cannot be played. What can I do to make this work reliably?
What Cruceo said is correct. It's better to mux all the streams into a single stream (FFmpeg is really good at this) at a very high resolution but with a low bitrate and framerate. Then create a program to display it and zoom in when you select a viewpoint.
Another option would be to convert the video streams into JPEG files. This way is a lot easier, because you can use your web browser to display them. In the browser you can use JavaScript (ProcessingJS is very easy and well suited to this) to build an image viewer with a lot of functionality.
I am developing an Android app for my IP camera, and the camera has some specific API commands that it can respond to. The problem I am stuck on is that I want to display a list of the videos available on the camera's memory card. I am getting the file list, but I also want to get the thumbnails of those files.
The problem with getting the thumbnails is that I don't have a direct IP address for each video; the camera only provides me two things for accessing a video:
1. The RTSP URL of the video
2. A data stream of the video, so that I can download it in my code.
Can someone tell me how I can get thumbnails of the videos given the two options mentioned above?
Note: there is also one API command available on the camera for providing the thumbnail of a video. When I send that command it should return one frame of the video, but it currently sends me a corrupt frame and this method is not working, which is why I am focusing on getting the thumbnail from the other two available options.
Any help will be highly appreciated.
Thanks
You could open a socket and stream a few seconds from each video, saving the file locally on your Android device.
Once you have it there, so long as it is a recognisable video format, you should be able to create a thumbnail in the usual way (http://developer.android.com/reference/android/provider/MediaStore.Video.Thumbnails.html).
You would need to be careful to make sure that your app does not actually try to play these truncated videos, and instead goes to the proper stream URL if someone wants to view them. You could also delete the file after creating the thumbnail if you wanted to be sure.
Doing this may take a little time initially if your camera has a lot of videos, but you should be able to set it up so that, after the first run, it only creates thumbnails for new videos, which should speed things up.
It is also possible to create thumbnails from streams directly using ffmpeg or VLC (e.g. https://superuser.com/q/592160) but I think you may find the above approach is simpler for your needs and it avoids you having to integrate ffmpeg etc with your app.
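As a rough sketch of the thumbnail step once a short clip has been saved locally, something like the following should work using MediaMetadataRetriever (the file path and the one-second offset are just placeholders for illustration):

import android.graphics.Bitmap;
import android.media.MediaMetadataRetriever;

public class ThumbnailHelper {
    // Extract a single frame from a locally saved clip to use as a thumbnail.
    // Returns null if the file cannot be read as a video.
    public static Bitmap extractThumbnail(String localFilePath) {
        MediaMetadataRetriever retriever = new MediaMetadataRetriever();
        try {
            retriever.setDataSource(localFilePath);
            // Grab a frame roughly one second in (the time unit is microseconds).
            return retriever.getFrameAtTime(1000000, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
        } catch (IllegalArgumentException e) {
            return null; // not a recognisable or complete video file
        } finally {
            retriever.release();
        }
    }
}

The same idea works with ThumbnailUtils.createVideoThumbnail if you prefer; the point is simply that the frame is extracted from the locally saved copy rather than from the live RTSP stream.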
I want to stream a video file (any supported format) from an Android phone and display it directly in a desktop app built with WPF.
I also want to do the same thing, but with live video coming from the camera.
For the camera I found some solutions. One of them is this library: https://github.com/Teaonly/android-eye but I have a problem with it because there is no direct URL to stream. It exposes an address like http://192.168.238.102:8080/ which opens a web page with settings and buttons, and the video is served at http://192.168.238.102:8080/stream/live.jpg?id=58 which means I would have to read images one by one. I don't know if this is a good streaming mechanism.
I also found this article: http://www.androidhive.info/2014/06/android-streaming-live-camera-video-to-web-page/ but it requires a server implementation on the .NET side and we would need to buy a license.
I still have not found anything for playing back a video file, and I am also looking for a simpler way to do the job.
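For what it's worth, the live.jpg endpoint is essentially a snapshot/MJPEG-style mechanism: each request returns the latest camera frame as a JPEG, and fetching frames in a loop approximates video. Below is a rough sketch of consuming it, written in plain Java just to show the mechanism (a WPF client would do the equivalent in C#); the URL and the timeouts are assumptions for illustration only:

import java.awt.image.BufferedImage;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import javax.imageio.ImageIO;

public class SnapshotPoller {
    // Fetch a single JPEG frame from the camera's snapshot URL,
    // e.g. http://192.168.238.102:8080/stream/live.jpg?id=58
    public static BufferedImage fetchFrame(String snapshotUrl) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(snapshotUrl).openConnection();
        conn.setConnectTimeout(3000);
        conn.setReadTimeout(3000);
        InputStream in = conn.getInputStream();
        try {
            return ImageIO.read(in);
        } finally {
            in.close();
            conn.disconnect();
        }
    }
}

Calling fetchFrame a few times per second from a background thread and drawing the result into the UI gives a simple live view; it is not as efficient or as smooth as a proper RTSP/RTP stream, but it is easy to implement on both ends.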
I am trying to design a video website compatible with Android. A good example of what I'm trying to achieve is vimeo.com. They show a thumbnail of a video. When you tap it, the native Android player comes up in full screen:
Currently, I have an anchor to an FLV containing an h.264 encoded video:
click here to watch
When you tap the anchor on Android, it downloads the video rather than plays it. That's not what I want. How do I get it to play full screen in the native player like Vimeo? But unlike Vimeo, I would like the video to expand so that there's not so much black empty space around the actual video.
Ahh, I see what you mean: clicking a Vimeo video opens the Android dialog for selecting which app should respond to that request (in my case just the browser, which downloads the file, or the video player, which opens and plays it as you wanted). This is normal Android behavior: if you have not defined which app should respond to a given request, it will ask you to select from among the supporting applications.
Have you tried embedding a video in the way suggested in the link I gave you? You may find that it has exactly the effect the Vimeo video does: HTML5 <video> element on Android
EDIT: Actually I think your real problem is probably just that the file format you're using (.flv) is not among the core media formats supported by Android. http://developer.android.com/guide/appendix/media-formats.html
If you have the correct codec installed to play the video and it still doesn't work, check and make sure you have the correct MIME types configured and that nothing in the registry or in a configuration file is overriding them.
Use the old standard approach of defining an MP4 source and falling back to Flash.
In Mobile Safari and the Android WebKit browser there are JavaScript methods and events defined on the video object that can help with this. There is another Stack Overflow question dealing with this topic (it is about the iPad, but I have used this on Android phones as well).
Web App - iPad webkitEnterFullscreen - Programatically going full-screen video
Mobile Safari documentation: http://developer.apple.com/library/safari/#documentation/AudioVideo/Reference/HTMLVideoElementClassReference/HTMLVideoElement/HTMLVideoElement.html
I've seen a number of similar questions, but so far I've not been able to get anything working.
I'm trying to playback a video (.MP4) from the res/raw folder and only get audio, no video. The video is short (about 3 seconds), small (350KB) and if I put the video on the device (Motorola Droid) directly, it plays fine. It also plays fine if I modify the app to read it from the SD card rather than the resource folder. The behavior is the same on both the emulator and the actual device.
Unfortunately, I need this video to be an intro shown just prior to the main screen for my app, so it has to be part of the package. Additionally, one of the app requirements is that the app is available offline, so I can't stream from a web server. I've tried a VideoView, SurfaceView and MediaPlayer, none of them work.
Is it possible to playback video from the resource folder? I've read something about compression possibly screwing the video up, is it possible to manually decompress the video and then play it, and if so, how would that work? This seems like it should be a pretty basic operation, am I just missing something?
I was having the same trouble and had tried everything too; I agree it should be easier... and documented. I just fixed it by passing VideoView.setVideoURI a Uri built with the android.resource scheme, as described here.
The first option doesn't work for me, but the second does:
Uri uri = Uri.parse("android.resource://[package]/"+R.raw.[video_resid]);
Hopefully it works for you too.
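For completeness, a minimal sketch of wiring that Uri into a VideoView (the package lookup and the raw resource name, intro here, are placeholders for your own):

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.VideoView;

public class IntroVideoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        VideoView videoView = new VideoView(this);
        setContentView(videoView);

        // Build a Uri that points at res/raw/intro.mp4 inside the APK.
        Uri uri = Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.intro);
        videoView.setVideoURI(uri);
        videoView.start();
    }
}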
I'm researching the development of an Android (2.2) app/service that will enable users to record short (I do emphasize short, < 30 seconds) video on their phones and then upload that video over HTTP to a server that will then transcode the video to other formats. That same user can download videos from other Android users and play them.
Now, I get a bit lost with everyone's recommended approaches to all the issues involved in doing something like this, because I haven't seen anyone ask about it in a cohesive context. Ideally I would like a non-commercial solution (as in no vendor/service being needed for the video hosting/transcoding), but feel free to include those as recommendations (I've marked this as a wiki), as I know many people like to use YouTube and Vimeo for the middle layer in all this.
The questions are:
1. What server technologies do you recommend for hosting and transcoding?
2. What technology do you recommend for streaming the video? (It would be nice to offer a high- and a low-quality encoding depending on the user's network connection.)
3. What video format and software do you recommend for converting the uploaded video on the server so that it can be viewed later by other Android owners?
4. I'm assuming it's bad to do any transcoding on the phone prior to upload (battery/processor issues), but if I'm wrong with that assumption, what do you recommend?
Some things that may help you...
The video will only need to render on an Android device, and in the future in a webkit html5 browser.
Bandwidth isn't cheap (even with numerous 30-second videos), so a good mix of video quality and video file size is important (streaming if needed to ensure quality vs. download).
This is for Android 2.2 devices, with a video camera of course, and a medium- to high-density screen of 800x400 minimum.
Open source solutions (server to receive the uploads, code to do the transcoding, server to do the streaming) are preferred, but not required.
CDN's are an option, but I don't think that really figures in to the picture right now.
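For reference, capturing a clip capped at 30 seconds on Android 2.2 can be sketched with MediaRecorder roughly as follows; the output path, the camera/preview plumbing and the choice of AMR audio (the AAC encoder constant needs a newer API level) are assumptions, not a final design:

import java.io.IOException;
import android.hardware.Camera;
import android.media.MediaRecorder;
import android.view.Surface;

public class ShortClipRecorder {
    private Camera camera;
    private MediaRecorder recorder;

    // Start recording a short H.264 clip into an MP4 file.
    public void start(Surface previewSurface, String outputPath) throws IOException {
        camera = Camera.open();
        camera.unlock(); // hand the camera over to MediaRecorder

        recorder = new MediaRecorder();
        recorder.setCamera(camera);
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setMaxDuration(30 * 1000); // enforce the < 30 second limit
        recorder.setOutputFile(outputPath);
        recorder.setPreviewDisplay(previewSurface);
        recorder.prepare();
        recorder.start();
    }

    public void stop() {
        recorder.stop();
        recorder.release();
        camera.lock();
        camera.release();
    }
}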
Check out this page to see all the video formats that Android supports for encoding and decoding.
http://developer.android.com/guide/appendix/media-formats.html
For encoding use FFmpeg or a service like encoding.com
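To illustrate the transcoding step, here is a hedged sketch of driving the ffmpeg command line from a Java server process; the bitrates, paths and codec options are assumptions (and depending on the ffmpeg build, the AAC encoder may need extra flags such as -strict experimental):

import java.io.IOException;

public class TranscodeJob {
    // Transcode an uploaded clip to an H.264/AAC MP4 at the given video bitrate.
    // Assumes the ffmpeg binary is available on the server's PATH.
    public static void transcode(String inputPath, String outputPath, String videoBitrate)
            throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-y",
                "-i", inputPath,
                "-c:v", "libx264",
                "-b:v", videoBitrate, // e.g. "800k" for high quality, "300k" for low
                "-c:a", "aac",
                "-b:a", "96k",
                outputPath);
        pb.inheritIO(); // let ffmpeg's log go to the server's console
        Process process = pb.start();
        if (process.waitFor() != 0) {
            throw new IOException("ffmpeg failed for " + inputPath);
        }
    }
}

Running this twice per upload, once with a high and once with a low bitrate, is a simple way to cover the two-quality requirement from the question.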