I would like to know whether I can remotely control the VLC video player (play, pause, volume, maybe some video streaming, webcam streaming) between my computer and mobile phone.
Here is my plan:
1. VLC player on Mac OS
2. writing a TCP server (C++)
3. writing a client on the Android phone side (I am considering C++ here so the code can be reused on Android/iOS)
4. writing an Android application with simple buttons that can control the player remotely...
Can this solution work properly?
Some additional questions:
1. Can such a solution work over WAN (the Internet), not only LAN (TCP socket communication)?
2. The VLC player has, in Preferences > Interface > Main Interfaces, options like RC, Lua HTTP, Lua Telnet, etc. (what is the aim of these?)
3. I saw some applications on the Google Play Store that communicate via Lua HTTP?
I would prefer writing my own server/client plus a protocol for communication; this is for a university undergraduate project.
So my question is: if I write such a server, will it be possible to integrate it with VLC somehow, like adding it to Preferences > Interfaces, or should it be a separate program, or can it be written as a plugin or some add-on?
In summary, I need some help directing me to the solution that provides the most seamless interaction with VLC while still having my own server, client, and protocol, so that the project isn't too easy (I saw in the documentation that there are simple commands for VLC over the HTTP protocol, which I assume would allow for easy interaction with VLC).
I am also thinking about extending this project with remote mouse-move control on Mac OS / Windows. What would I need for that?
The last part is to enable streaming video to the phone, and maybe in the opposite direction, from the phone to the VLC player. Webcam capture streaming from the phone to VLC, and the other way from the MacBook to the phone, would also be an interesting addition.
Thanks for any help.
PLEASE, if this question is too long, concentrate on answering whether it is possible to do at all, and whether it can be integrated so seamlessly that the end user doesn't have to spend many hours on configuration...
The best solution from my point of view:
- a preference screen for my plugin embedded in the VLC player settings
- entering a TCP port/host there (maybe using the current host's IP on the local network)
- on the mobile side, detecting and connecting to this host:port with the client, and it just works...
1. VLC player on Mac OS 2. writing a TCP server (C++) 3. writing a client on the Android phone side (considering C++ so it can be reused on Android/iOS) 4. writing an Android application with simple buttons that can control the player remotely... Can this solution work properly?
Yes, it is possible, and it works well. VLC has a built-in server, so you do not need another server app to control it; you just write a client-side app for Android, Windows, or iOS. If you still want to write a server app, you can do so (I don't recommend it), but the communication delay between the client app and VLC will obviously be higher than usual.
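To give an idea of how little the client side needs, here is a minimal desktop-Java sketch that drives VLC's built-in Lua HTTP interface. It assumes the web interface is enabled (Preferences > Interface > Main Interfaces) on the default port 8080 and that a password has been set there; the host IP and password below are placeholders.

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;

// Minimal sketch: toggling play/pause through VLC's Lua HTTP interface.
// Assumes the web interface is enabled on port 8080 and a password is set;
// host IP and password are placeholders.
public class VlcHttpClient {
    public static void main(String[] args) throws IOException {
        String host = "192.168.1.10";   // IP of the machine running VLC
        String password = "secret";     // VLC web interface password
        URL url = new URL("http://" + host + ":8080/requests/status.xml?command=pl_pause");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // VLC uses HTTP basic auth with an empty user name
        String auth = Base64.getEncoder().encodeToString((":" + password).getBytes());
        conn.setRequestProperty("Authorization", "Basic " + auth);
        try (InputStream in = conn.getInputStream()) {
            System.out.println("VLC replied with HTTP " + conn.getResponseCode());
        }
    }
}
```

The same request sent from Android's HttpURLConnection works identically; pl_play and pl_stop are other commands the same status.xml endpoint accepts.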
1. Can such a solution work over WAN (the Internet), not only LAN (TCP socket communication)? 2. The VLC player has, in Preferences > Interface > Main Interfaces, options like RC, Lua HTTP, Lua Telnet, etc. (what is the aim of these?) 3. I saw some applications on the Google Play Store that communicate via Lua HTTP?
Yes, it should be possible; over the Internet you would typically need to forward the chosen port on your router to the machine running VLC. I haven't tried it myself, though.
I would prefer writing my own server/client plus a protocol for communication; this is for a university undergraduate project. So my question is: if I write such a server, will it be possible to integrate it with VLC somehow, like adding it to Preferences > Interfaces, or should it be a separate program, or can it be written as a plugin or some add-on?
As I said, you can write your own server app, and you can integrate that server with VLC's built-in server (the web interface). Again, this method is not recommended.
If you still want to write a server app, then instead of integrating with VLC's web interface, map keyboard shortcuts: for example, receive a stop request from your client app and raise a keyboard 'S' key event from your server app. The 'S' key is the shortcut for VLC's stop command; for more VLC shortcut keys, refer here.
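For what it's worth, the key-event trick is only a few lines in desktop Java using java.awt.Robot. This is my illustration of the idea, not Android code, and it assumes the VLC window has keyboard focus:

```java
import java.awt.AWTException;
import java.awt.Robot;
import java.awt.event.KeyEvent;

// Minimal sketch: a server that has just received a "stop" request from
// the client could fake VLC's 'S' shortcut like this. Requires the VLC
// window to have keyboard focus; desktop Java only, not Android.
public class KeyInjector {
    public static void sendStop() throws AWTException {
        Robot robot = new Robot();
        robot.keyPress(KeyEvent.VK_S);
        robot.keyRelease(KeyEvent.VK_S);
    }
}
```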
VLC supports both transcoding and streaming, so I suggest you write only the client app and integrate it with the VLC web interface; that is the best method. (For more info, there are many such apps on the Play Store; try any one of them, or refer to the VLC forum.)
Related
I would like to stream a video between two Android devices (android-android). There wouldn't be any server, so the streaming has to be direct between the devices. The devices would be on the same network, so they could communicate via WiFi.
I've tried using MediaRecorder with MediaPlayer via sockets, but I received many exceptions.
I also looked for a library, but I just want to stream a video between two devices directly.
Any solutions?
If your video is for real-time communication, e.g. a web chat or sharing some CCTV in real time with minimal delay, then a real-time video communication approach like WebRTC would be one additional possibility: this type of approach prioritises low latency over quality to ensure minimum delay. See here for the Android WebRTC documentation:
https://webrtc.org/native-code/android/
If the requirement is just to let one device act as a server for non-real-time videos, then the easiest approach may be to use one of the available HTTP server libraries or apps, so that one device acts as a server the other can simply connect to via a browser or player. An example Android HTTP server that seems to get good reviews is:
https://play.google.com/store/apps/details?id=jp.ubi.common.http.server&hl=en
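If you would rather embed the server inside your own app than install a separate one, an HTTP server library keeps this short. Here is a sketch using the NanoHTTPD library (my suggestion, not something from the links above); the file path is a placeholder, and a real implementation should also honour HTTP Range headers so players can seek:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import fi.iki.elonen.NanoHTTPD;

// Minimal sketch using NanoHTTPD: one device serves a local video file over
// HTTP so the other device can open http://<server-ip>:8080/ in any player.
// The video path is a placeholder; range/seek support is omitted.
public class VideoServer extends NanoHTTPD {
    private final File video = new File("/sdcard/Movies/clip.mp4"); // placeholder

    public VideoServer() throws IOException {
        super(8080);
        start(NanoHTTPD.SOCKET_READ_TIMEOUT, false);
    }

    @Override
    public Response serve(IHTTPSession session) {
        try {
            return newFixedLengthResponse(Response.Status.OK, "video/mp4",
                    new FileInputStream(video), video.length());
        } catch (IOException e) {
            return newFixedLengthResponse(Response.Status.INTERNAL_ERROR,
                    "text/plain", "Cannot open video: " + e.getMessage());
        }
    }
}
```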
I am studying computer programming, but I feel that at school we only do things that work on a single computer, without any connection to the outside world, and it isn't all that interesting, so I gave myself a challenge: I want to make a program that streams footage from a surveillance camera to a phone. It seems a bit of a long shot, but I want to try. The thing is that, as I said, at school we do pretty basic stuff, and with a project like this I have no idea where to even begin. A simple Google search didn't help at all.
I know how mobile apps are made, but I have no idea how to connect such devices.
Any ideas?
I have experience only with RTMP streaming, so I am going to explain what I did then. As far as I know, there are three components required for streaming over RTMP:
The source: a device with the webcam that is going to generate the video streams. You might need to learn the Flash scripting language ActionScript 3, plus Flex, to write an application that sends the stream to a server.
The media server: the source device sends its stream to the media server (in some cases, the media server is the source). You can easily do this with ActionScript after you set up your server. If you want a local media server, you can install the Red5 media server, which is Java-based and open source.
The client application: this can be a mobile application or a Flash-based web application that connects to your media server for the stream and displays it on the client's device. Even this can be written in ActionScript/Flex.
Here is a tutorial on how you can download and setup Red5 media server:
http://www.technogumbo.com/tutorials/Red5-Media-Server-Development-Setup-Tutorial/Red5-Media-Server-Development-Setup-Tutorial.php
You'll need to learn ActionScript/Flex for the client-side application and a bit of Java for the server side; a rough server-side sketch follows.
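For a feel of the server side, here is a hedged sketch of a minimal Red5 application in Java. I am assuming the Red5 1.x package layout (class names have moved between versions), and the class only logs startup; once the app is deployed, stream publishing and playback against rtmp://<server>/<appName> are handled by Red5 itself:

```java
import org.red5.server.adapter.ApplicationAdapter;
import org.red5.server.api.scope.IScope;

// Minimal sketch of the media-server component as a Red5 application.
// Red5 instantiates this class based on the app's red5-web.xml config;
// package names here follow the Red5 1.x layout and may differ in older
// versions.
public class CameraApp extends ApplicationAdapter {
    @Override
    public boolean appStart(IScope app) {
        System.out.println("Camera streaming application started");
        return super.appStart(app);
    }
}
```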
Updated
I may have found the solution: I can probably make a mobile app with PhoneGap that talks to my Rails app, and build a lighter mobile version of the Rails app.
I may be getting an educational Rails application to work on soon. The client is an educational consultant; she wants to build a Rails application, and one of the features is that a teacher can record feedback that belongs to a particular student for an assignment. The app needs to be able to record and play audio.
I have been researching this feature on both Stack Overflow and Google, but I don't find the answers very complete.
I have a couple of options:
1) I found this blog, where a developer built an audio recorder/player with Flash: http://cykod.com/blog/archive/December2010
Basically it teaches me to grab the audio from a browser's microphone with Flash (needs user permission) and then send it to the server, so you can avoid using Red5 (a media server).
She wants it to be mobile friendly. I don't think the recording works on an Android phone because of Flash, and I am sure the Android phone lacks a microphone. I don't know much about iOS either, because I don't specialize in mobile. I got my Android phone to download Flash so it can play the sound.
2) using a Red5 server, but I think the player/recorder will still be Flash and it won't work on the phone
3) http://www.sajithmr.me/jrecorder-jquery and jrecorder
I have some solutions for Rails, but is there a mobile-friendly solution?
I am sure the android phone lacks a microphone
A phone without a microphone would be pretty useless! :)
With Android you can use MediaRecorder to record and save an audio file. I'm sure it works; I've done it before.
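A minimal sketch of what that looks like (standard Android API; it assumes the RECORD_AUDIO permission is declared in the manifest, and the output path is a placeholder):

```java
import android.media.MediaRecorder;
import java.io.IOException;

// Minimal sketch of audio capture with Android's MediaRecorder.
// Requires the RECORD_AUDIO permission in the manifest.
public class AudioCapture {
    private MediaRecorder recorder;

    public void startRecording(String outputPath) throws IOException {
        recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setOutputFile(outputPath);   // e.g. getFilesDir() + "/feedback.3gp"
        recorder.prepare();
        recorder.start();
    }

    public void stopRecording() {
        recorder.stop();
        recorder.release();
        recorder = null;
    }
}
```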
Although I have never tried it, I think you can do it in PhoneGap with the Capture options, and on iOS with the AVFoundation framework.
That's for the mobile part. You don't need Flash there; there are much better options.
On your web application, you can surely use Flash to record audio from the computer's microphone.
However, if you're lucky, your users will use recent browsers that support HTML5 audio recording and playback. Check out the capturing-audio tutorial and libraries such as audio.js.
HTH
I'd like to add NAS support to one of my applications, and it is critical that the application can stream the content rather than having to download it to the device. The application will be streaming video content, so once the video is over, there shouldn't be any large video files left on the device.
What I've tried so far:
jCIFS - works beautifully, but isn't capable of streaming (to my knowledge). I've successfully created video files on the device using jCIFS, but they're still there when the video playback stops.
Temporary files - I know that Android is supposed to support temporary files, but I'm not sure how that works, or whether it's any good in this situation. Just a thought, basically.
My application must be able to launch a video intent with a video on the NAS device, and it should be playable in any video player. I know that some applications on the Market support NAS devices (and SMB/CIFS connections), but I don't know how they work.
Any suggestions or ideas would be much appreciated.
I think you have to create an in-between HTTP server, as discussed in the following question: Android ServerSocket programming with jCIFS streaming files
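To make the idea concrete, here is a rough sketch of such an in-between server using the jCIFS API: a local socket accepts the player's connection and pipes the SMB file straight through, so nothing is written to disk. A real implementation would parse the request and honour HTTP Range headers so the player can seek; the share URL is a placeholder (jCIFS also accepts smb://user:pass@host/... URLs for authentication):

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import jcifs.smb.SmbFile;

// Rough sketch of the in-between server idea: pipe an SMB file to whatever
// player connects to localhost:8080. No Range support, so seeking won't
// work; the share URL is a placeholder.
public class SmbProxy {
    public static void main(String[] args) throws Exception {
        SmbFile file = new SmbFile("smb://nas/share/video.mp4");
        try (ServerSocket server = new ServerSocket(8080)) {
            Socket client = server.accept();          // wait for the player
            try (InputStream in = file.getInputStream();
                 OutputStream out = client.getOutputStream()) {
                out.write(("HTTP/1.1 200 OK\r\n"
                        + "Content-Type: video/mp4\r\n"
                        + "Content-Length: " + file.length() + "\r\n\r\n").getBytes());
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);             // stream, never touch disk
                }
            }
            client.close();
        }
    }
}
```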
I am trying to do the same thing; I already tried VPNC, with no luck.
Now I am trying NeoRouter; my Android can connect to the server but isn't able to browse any files. It might work for you. Search for "VPN setup (your router name here) android"; it will give you some things to try.
I've been contemplating (re)building an app on the iPad for some time, where I would use Objective-C and DSMI to send MIDI signals to a host computer. This is not bad (I mean, except for actually having to write the app).
Now I'm contemplating perhaps developing the app for Android tablets (TBA) instead.
In Java, what options are available for MIDI message communication? I'm quite familiar with javax.sound.midi, but then I would need a virtual MIDI port to send messages to the host.
On the other hand, if the app were done in Adobe AIR, what options would I have available for communicating with MIDI?
Obviously another option is to send/receive messages over a TCP/IP socket to a Java host, and talk that way, but it sounds a tad cumbersome... or perhaps not? DSMI does use a host program, after all.
javax.sound.midi is not available on Android.
The only access to MIDI functionality on Android is through the JetPlayer class, and it is very much designed for use in games. Android will play a MIDI file, but it's unclear what code path it uses; there's no way to write to the MIDI hardware from memory.
In one app I made, I needed to play notes dynamically based on GUI/user interaction, and I ended up having to use samples and pitch filters to create the notes.
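A sketch of that samples-plus-pitch approach using Android's SoundPool, where one recorded sample is replayed at different rates (SoundPool allows 0.5x to 2.0x) to approximate different notes; R.raw.note_c4 is a placeholder resource name:

```java
import android.content.Context;
import android.media.AudioManager;
import android.media.SoundPool;

// Sketch of the samples-plus-pitch trick: replay one sample at different
// rates to fake different notes. R.raw.note_c4 is a placeholder resource;
// SoundPool clamps rates to the 0.5-2.0 range (one octave either way).
public class NotePlayer {
    private final SoundPool pool = new SoundPool(4, AudioManager.STREAM_MUSIC, 0);
    private final int sampleId;

    public NotePlayer(Context context) {
        sampleId = pool.load(context, R.raw.note_c4, 1);
    }

    // semitones above/below the recorded sample
    public void playNote(int semitones) {
        float rate = (float) Math.pow(2.0, semitones / 12.0);
        pool.play(sampleId, 1f, 1f, 1, 0, rate);
    }
}
```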
It sounds to me like what you need to do is port DSMI to Android. It's open source, and the iPhone library looks pretty simple, so it shouldn't be difficult to port over.
EDIT:
After thinking about this for a second: you wouldn't gain anything by using javax.sound.midi, or whatever MIDI functionality exists in AIR, anyway. All you need to do is pass MIDI messages over a network link to another device that is responsible for communicating with the actual MIDI synth. This is exactly what DSMI does. Porting the iPhone libdsmi to Android is what you need to do; it's only 2 source files and 2 headers. One handles the MIDI message format, which is very simple and can pretty much be converted line by line to Java, and the other handles the network connection to the DSMI server, which will need to be rewritten to use Android semantics for creating network connections.
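To show how small the message side is, here is an illustrative Java sketch that pushes a raw three-byte MIDI note-on over a plain TCP socket. This is not the actual DSMI wire format or port (take those from libdsmi); it only demonstrates that a MIDI message is trivial to assemble and send:

```java
import java.io.OutputStream;
import java.net.Socket;

// Illustrative sketch only: a MIDI note-on is three bytes
// (status | channel, key, velocity). The real wire format and port must
// come from libdsmi; host and port here are placeholders.
public class MidiSender {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("192.168.1.20", 9000);
             OutputStream out = socket.getOutputStream()) {
            int channel = 0, key = 60, velocity = 100;   // middle C
            out.write(new byte[] {
                (byte) (0x90 | channel),                 // note-on, channel 0
                (byte) key,
                (byte) velocity
            });
            out.flush();
        }
    }
}
```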