Please help me get started.
How do I achieve live streaming of video from one device to another using Wi-Fi Direct?
I have already established a connection between the two devices. What is the next step to follow?
Take a look at the Wi-Fi Direct documentation.
First you need to establish a connection; there is enough code in the documentation to do that.
Then you send the file as a stream and receive it on the other side. Based on the Android MediaPlayer documentation, you need to pass the received stream to the MediaPlayer. There are different ways to approach this, like saving the stream to a file and then passing the file to the MediaPlayer. But a better way is implementing a local HTTP server and passing its local URI to the MediaPlayer (see the sketch after the links below).
I gathered this info from the links below:
Write to file and stream from it
Modifying FileInputStream for mediaPlayer setDataSource
Android ServerSocket programming with jCIFS streaming files
Create mediaplayer with inputstream in android
Sample Local HTTP server
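To make the local HTTP server idea concrete, here is a minimal sketch in Java. It assumes your Wi-Fi Direct code already hands you the peer socket's InputStream; the class name, the /stream path, and the video/mp4 content type are all illustrative. A naive server like this works best with progressively streamable content (e.g., MP3 or MPEG-TS), since it cannot honor range requests.

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class LocalStreamServer extends Thread {
    private final InputStream peerStream;    // bytes received over Wi-Fi Direct
    private final ServerSocket serverSocket;

    public LocalStreamServer(InputStream peerStream) throws IOException {
        this.peerStream = peerStream;
        this.serverSocket = new ServerSocket(0); // any free localhost port
    }

    // Point MediaPlayer.setDataSource() at this URL.
    public String getUrl() {
        return "http://127.0.0.1:" + serverSocket.getLocalPort() + "/stream";
    }

    @Override
    public void run() {
        try (Socket client = serverSocket.accept()) {
            OutputStream out = client.getOutputStream();
            // Minimal HTTP response header so MediaPlayer accepts the body.
            out.write(("HTTP/1.1 200 OK\r\n"
                    + "Content-Type: video/mp4\r\n"
                    + "Connection: close\r\n\r\n").getBytes(StandardCharsets.US_ASCII));
            byte[] buf = new byte[8192];
            int n;
            while ((n = peerStream.read(buf)) != -1) {
                out.write(buf, 0, n); // proxy peer bytes straight to MediaPlayer
            }
            out.flush();
        } catch (IOException e) {
            // client disconnected or peer stream ended
        }
    }
}

Usage would be along the lines of: server.start(); mediaPlayer.setDataSource(server.getUrl()); mediaPlayer.prepareAsync();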
Related
How would I go about streaming audio from one device to another over the internet? I'm aware of sending basic data using Java sockets, but I'm wondering how to:
Start streaming midway through a file (say, during the middle of a song)
What format is needed for the data being sent? MediaPlayer can take a URL as a data source, so how should the audio be represented when being sent from the server side?
Thanks
Having implemented a music streaming app, I can share a little with you.
If you want to stream and use the Android MediaPlayer class, MP3 or OGG is your best bet for a format.
If your architecture is client-server, i.e. a real server on the Internet serving streams to Android devices, then just stream MP3 or OGG bytes over HTTP. Just point MediaPlayer to a URL on your server.
If your architecture is peer-to-peer with your own custom socket code, you can create a "proxy HTTP" server that listens on localhost on a dedicated thread. You point your MediaPlayer instance to your local in-process socket server (e.g. http://localhost:54321/MyStream.mp3). Then you have to implement code to parse the HTTP GET request from MediaPlayer, then proxy the stream bytes between your custom P2P socket protocol and the listeners connected to your local HTTP server. A lot of radio streaming apps do exactly this so as to parse the ICECAST metadata from the MP3 stream. Here's the code I use for my radio streaming app that does this.
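A sketch of the request-parsing step just described, assuming the same example port; this would run inside the proxy thread (imports: java.io.*, java.net.*, java.nio.charset.StandardCharsets):

ServerSocket server = new ServerSocket(54321);
Socket client = server.accept(); // MediaPlayer connects here
BufferedReader req = new BufferedReader(
        new InputStreamReader(client.getInputStream(), StandardCharsets.US_ASCII));
String requestLine = req.readLine(); // e.g. "GET /MyStream.mp3 HTTP/1.1"
String header;
while ((header = req.readLine()) != null && !header.isEmpty()) {
    // consume and discard the remaining request headers
}
// ...then write an "HTTP/1.1 200 OK" response header and proxy MP3 bytes
// from your P2P socket into client.getOutputStream(), as described above.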
For the "start midway through the file" scenario, you might find my MP3 Stream Reader class useful. It wraps an InputStream (file, socket stream, etc..) and syncs to the next valid frame from where ever you started from. Just call read_next_chunk to get the next block of audio and its format. MediaPlayer might do most of this heavy lifting for you, so this might not be needed.
I want to play music and video files from an FTP server. I don't want to download a file and then play it afterwards; I want to play it without downloading, using a URL address with the MediaPlayer class, like this:
mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDataSource("ftp://ip"); // MediaPlayer has no FTP support; it handles HTTP(S), RTSP, and local sources
mediaPlayer.prepare(); // blocks until ready and throws IOException on failure
mediaPlayer.start();
So what you are talking about is streaming. Streaming involves the data being sent to you as you are listening to it: it comes to your device a little bit at a time, then is discarded so it doesn't take up storage on the playing device. This uses data; if you are on a limited data plan, you will use it up fast. Also, if you plan to listen to something more than once, it doesn't make sense not to have a local copy; why waste the data twice?
If you are still interested in streaming from a server, FTP isn't how you would accomplish this. FTP is mainly used for transferring files to a server; while you can download from a server over FTP, you wouldn't want to stream this way. If you are looking to set up a home media streaming server that you can access from any device, so you only have to maintain the content in one location, I would recommend checking out this article. Hope this helps!
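To make the HTTP route concrete, a minimal sketch (the URL is a placeholder for wherever you host the file; MediaPlayer and AudioManager come from android.media):

mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDataSource("http://your-server/music.mp3"); // hypothetical URL
mediaPlayer.setOnPreparedListener(mp -> mp.start());
mediaPlayer.prepareAsync(); // buffer off the UI thread instead of blocking in prepare()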
I'm working on a project where I stream video over Wi-Fi Direct from one Android device to another using RTP over UDP.
Essentially, the first Android device hosts an RTSP server and listens for connections. Once the client Android device connects to it (over Wi-Fi Direct) and starts listening for packets, the first device begins to stream the video content.
I'm aware that RTP packet headers carry a 32-bit timestamp at bit offsets 32-63 (bytes 4-7). But I do not know how to access the contents of the packet and subsequently access just that segment of the header.
Currently, I am using libvlc to play the streamed video on the device. But I would like to be able to measure the latency between the two devices, either by extracting the timestamps from the packets on arrival or by some other way (maybe VLC can help?).
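For what it's worth, a sketch of pulling that field off the wire with plain Java, per the RFC 3550 fixed-header layout (the port is an example, and this belongs in a try block since receive() throws IOException):

DatagramSocket socket = new DatagramSocket(5004);
byte[] buf = new byte[2048];
DatagramPacket packet = new DatagramPacket(buf, buf.length);
socket.receive(packet);

byte[] d = packet.getData();
// RTP timestamp: big-endian 32-bit value in bytes 4-7 of the header.
long rtpTimestamp = ((d[4] & 0xFFL) << 24)
                  | ((d[5] & 0xFFL) << 16)
                  | ((d[6] & 0xFFL) << 8)
                  |  (d[7] & 0xFFL);
// Note: this counts media clock ticks (90 kHz for video), not wall time,
// so on its own it gives relative timing; measuring absolute latency also
// needs the NTP mapping carried in RTCP sender reports.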
Edit: I am going to try to learn from the code posted here, but I still await any replies.
Edit 2: So I'm trying to go simpler. I've instead made a client activity that connects to the remote host and reads packets using Android's DatagramSocket API. However, my server activity doesn't start serving even after the client activity says it is connected. I'm not sure what needs to be done to let the server know there is a client ready to be served. The MediaPlayer and VLC APIs were both able to start streaming video once they connected. What am I missing? Do I need to do more than DatagramSocket.connect(ipaddress, port)?
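One likely culprit, offered as a guess: UDP has no handshake, so DatagramSocket.connect() only fixes the remote address on your side; the server never hears about it (the RTSP-based players worked because RTSP performs an explicit SETUP/PLAY exchange first). A common workaround is to send a hello datagram so the server learns where to stream; the "HELLO" payload and port here are made up for illustration (imports: java.net.*, java.nio.charset.StandardCharsets):

DatagramSocket socket = new DatagramSocket();
socket.connect(InetAddress.getByName(hostAddress), 5004); // hostAddress from your Wi-Fi Direct group info
byte[] hello = "HELLO".getBytes(StandardCharsets.US_ASCII);
socket.send(new DatagramPacket(hello, hello.length)); // goes to the connected address
// Server side: receive(packet) on this datagram, then use
// packet.getAddress() and packet.getPort() as the streaming destination.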
I am curious to know if there is a way to make my Android device act as an RTSP server and stream video to any other capable device.
I have no understanding of RTSP servers or the other protocols that need to be followed to make this happen.
Any help is appreciated.
Thanks,
SKU
There is nothing to worry about with this problem. I saw a solution here:
http://techsplurge.com/5080/get-vlc-media-player-for-android-with-unofficial/ Take a read and get your answer.
AFAIK, the architecture you should adopt is to put a remote machine in place.
On this machine, you must have a server installed: Flash Media Server, Red5, etc. It will host the application that receives the incoming stream and outputs it.
Then, you can stream from a device (the one you called the "server") to the remotely hosted application (e.g. www.remoteserver.com:1234/myApp). To play the stream, just put that URL in the input of a media container (http://developer.android.com/reference/android/widget/VideoView.html). Fortunately, Android natively supports the RTSP protocol via that VideoView container.
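In code, that last step would look something like this sketch inside an Activity (imports android.net.Uri and android.widget.VideoView; the layout id is hypothetical, and the RTSP URL is a placeholder built from the example above):

VideoView videoView = (VideoView) findViewById(R.id.video_view); // hypothetical layout id
videoView.setVideoURI(Uri.parse("rtsp://www.remoteserver.com:1234/myApp"));
videoView.setOnPreparedListener(mp -> videoView.start()); // start once buffered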
Hope it helps.
I am trying to implement something similar. One suggestion I received was this: when you make your device an RTSP server and someone wants to stream video from it, that's called peer-to-peer streaming. In this case, it's better to involve a server in between, which acts as an encoder-decoder. When another client sends an RTSP SETUP request, it still comes to your device. But when you are ready to stream video, you send the payload (using RTP) to the server, and the server in turn sends it to the requesting device. This way you can avoid the firewall problems possible with LANs or routers (in the case of Wi-Fi networks).
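The forwarding half of that in-between server could be as simple as the sketch below; the ports and client address are placeholders that a real relay would learn from the RTSP SETUP exchange.

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class RtpRelay {
    public static void main(String[] args) throws Exception {
        DatagramSocket in = new DatagramSocket(5004);              // RTP from the streaming device
        DatagramSocket out = new DatagramSocket();
        InetAddress client = InetAddress.getByName("203.0.113.7"); // requesting device (placeholder)
        int clientPort = 5006;                                     // placeholder

        byte[] buf = new byte[2048];
        DatagramPacket packet = new DatagramPacket(buf, buf.length);
        while (true) {
            in.receive(packet);           // payload arrives from your device
            packet.setAddress(client);    // re-aim the datagram
            packet.setPort(clientPort);
            out.send(packet);             // and forward it on
            packet.setLength(buf.length); // reset capacity for the next receive
        }
    }
}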
I am new to socket programming. I have to make an Android app which should stream video to the web. I have read about this on various websites, and they all say to use sockets.
Are sockets required to do this? I mean, is there any other way of doing this without sockets?
Through Google, I found various methods, some of which use sockets and some of which do not. There are also various protocols available which are specially meant for streaming, such as RTMP, RTSP, etc. One can also use HTTP to do the same.
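To make the raw-socket option concrete, here is a bare-bones sketch of pushing a recorded file over TCP (imports: java.io.*, java.net.Socket). The host, port, and path are placeholders, this must run off the Android main thread, and the protocols named above (RTMP, RTSP, HTTP) all add framing and control on top of exactly this kind of byte pipe:

try (Socket socket = new Socket("example.com", 9000);                 // placeholder server
     FileInputStream video = new FileInputStream("/sdcard/clip.mp4"); // placeholder path
     OutputStream out = socket.getOutputStream()) {
    byte[] buf = new byte[8192];
    int n;
    while ((n = video.read(buf)) != -1) {
        out.write(buf, 0, n); // the receiver reassembles the stream on its side
    }
}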