I want to play music and video files from an FTP server. I don't want to download a file first and then play it; I want to play it directly, without downloading, the way a URL can be used with the MediaPlayer class.
// What I tried ("ftp://ip" stands for the FTP server's address):
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDataSource("ftp://ip");
mediaPlayer.prepare();
mediaPlayer.start();
So what you are talking about is streaming. Streaming means the data is sent to you as you listen to it: it arrives on your device a little at a time and is then discarded, so it doesn't take up storage on the device that is playing it. It does consume data, though, so if you are on a limited data plan you will use it up fast, and if you plan to listen to something more than once it doesn't make sense not to keep a local copy; why waste the data twice?
If you are still interested in streaming from a server, FTP isn't how you would accomplish it. FTP is mainly used for transferring files to a server; while you can download from the server over FTP, you wouldn't want to stream this way. If you are looking to set up a home media streaming server that you can access from any device, so you only have to maintain the content in one location, I would recommend checking out this article. Hope this helps!
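For what it's worth, if the same files are exposed over HTTP instead of FTP, MediaPlayer can stream them directly. Here is a minimal sketch under that assumption; the URL is a placeholder, and prepareAsync() is used so buffering doesn't block the UI thread:

import java.io.IOException;
import android.media.AudioManager;
import android.media.MediaPlayer;

// Minimal sketch: stream over HTTP instead of FTP.
// "http://yourserver.example.com/song.mp3" is a placeholder, not a real server.
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
try {
    mediaPlayer.setDataSource("http://yourserver.example.com/song.mp3");
    mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start(); // playback begins once enough data has buffered
        }
    });
    mediaPlayer.prepareAsync(); // buffers in the background instead of blocking
} catch (IOException e) {
    e.printStackTrace();
}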
How would I go about streaming audio from one device to another over the internet? I'm aware of how to send basic data using Java sockets, but I'm wondering how to:
Start streaming midway through a file (say, during the middle of a song)
What format is needed for the data being sent? MediaPlayer can take a URL as a data source, so how should the audio be represented when being sent from the server side?
Thanks
Having implemented a music streaming app, I can share a little with you.
If you want to stream and use the Android MediaPlayer class, MP3 or OGG is your best bet for a format.
If your architecture is client-server, i.e. a real server on the Internet serving streams to Android devices, then just stream MP3 or OGG bytes over HTTP. Just point MediaPlayer to a URL on your server.
If your architecture is peer-to-peer with your own custom socket code, you can create an HTTP "proxy" server that listens on localhost on a dedicated thread. You point your MediaPlayer instance at your local in-process socket server (e.g. http://localhost:54321/MyStream.mp3). Then you implement code that parses the HTTP GET request from MediaPlayer and proxies the stream bytes between your custom P2P socket protocol and the listeners connected to your local HTTP server. A lot of radio streaming apps do exactly this so they can parse the ICECAST metadata out of the MP3 stream. Here's the code I use for my radio streaming app that does this.
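To make the proxy idea a bit more concrete, here is a stripped-down sketch (not the actual code from my app) of a localhost server that answers the single GET request MediaPlayer sends and then relays bytes from whatever input your P2P code produces. The class name and MIME type are assumptions; a real implementation would loop on accept(), handle Range requests and deal with ICECAST metadata.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.InetAddress;
import java.net.ServerSocket;
import java.net.Socket;

public class LocalHttpProxy extends Thread {
    private final InputStream source;   // your P2P / radio stream bytes (placeholder)
    private final ServerSocket server;

    public LocalHttpProxy(InputStream source) throws IOException {
        this.source = source;
        // bind to loopback only; port 0 lets the OS pick a free port
        this.server = new ServerSocket(0, 1, InetAddress.getByName("127.0.0.1"));
    }

    public String getUrl() {
        // hand this URL to MediaPlayer.setDataSource()
        return "http://127.0.0.1:" + server.getLocalPort() + "/MyStream.mp3";
    }

    @Override
    public void run() {
        try (Socket client = server.accept()) {
            // consume MediaPlayer's GET request line and headers
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(client.getInputStream()));
            String line;
            while ((line = in.readLine()) != null && line.length() > 0) { /* ignore */ }

            OutputStream out = client.getOutputStream();
            // minimal response: no Content-Length, so the stream is open-ended
            out.write(("HTTP/1.1 200 OK\r\n"
                     + "Content-Type: audio/mpeg\r\n"
                     + "Connection: close\r\n\r\n").getBytes("US-ASCII"));

            byte[] buf = new byte[16 * 1024];
            int n;
            while ((n = source.read(buf)) != -1) {
                out.write(buf, 0, n);   // relay raw MP3 bytes to MediaPlayer
            }
            out.flush();
        } catch (IOException ignored) {
            // MediaPlayer dropped the connection or the source ended
        }
    }
}

You would start the thread, then call mediaPlayer.setDataSource(proxy.getUrl()) and prepareAsync() as usual.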
For the "start midway through the file" scenario, you might find my MP3 Stream Reader class useful. It wraps an InputStream (file, socket stream, etc..) and syncs to the next valid frame from where ever you started from. Just call read_next_chunk to get the next block of audio and its format. MediaPlayer might do most of this heavy lifting for you, so this might not be needed.
I can't find anything online about this: how can I use the Web Audio API in a Chrome tab from an Android app so I can play sound during a phone call?
I went to this site, but when I play the sound during a phone call the far end doesn't hear anything. I thought one feature of Web Audio was that it could change the sound of someone's voice during a phone call, so I assumed it had access to the call's audio stream.
Even here the tech says it's ready for Android, but I can't even get the audio recorder demo to work on Android.
While you do (with the user's permission) have access to the device's input, you only have access to the device's main output (internal speakers or headphones), which is represented as AudioContext.destination. The audio path used by a call is (probably) a different output that you simply don't have access to in Web Audio (and that's probably a good thing; imagine the security issues we'd have if apps were allowed to hijack calls!).
Please help me get started.
How do I achieve live streaming of video from one device to another using Wi-Fi Direct?
I have already established a connection between the two devices. What is the next step to follow?
Take a look at the Wi-Fi Direct documentation.
First you need to establish a connection; there is enough code in the documentation to do that.
Then you send the file as a stream and receive it on the other side. Based on the Android MediaPlayer documentation, you need to feed the received stream to the MediaPlayer. There are different ways to approach this, such as saving the stream to a file and then passing the file to the MediaPlayer, but a better way is implementing a local HTTP server and passing the local URI to the MediaPlayer. A sketch of the simpler file-based approach follows the links below.
I gathered this info from the links below:
Write to file and stream from it
Modifying FileInputStream for mediaPlayer setDataSource
Android ServerSocket programming with jCIFS streaming files
Create mediaplayer with inputstream in android
Sample Local HTTP server
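As mentioned above, here is a rough sketch of the simpler save-to-file variant, assuming the sender writes the video bytes into a TCP socket once the Wi-Fi Direct group is formed. The port number and file name are arbitrary, error handling is omitted, and this runs on the receiving device (e.g. inside an Activity, off the main thread); for genuinely live playback you would replace the file with a local HTTP server instead.

import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import android.media.MediaPlayer;

// Receiver side: accept one connection, save the bytes, then play the file.
ServerSocket server = new ServerSocket(8988);              // arbitrary port
Socket client = server.accept();                           // sender connects here
File outFile = new File(getExternalFilesDir(null), "received.mp4");

try (InputStream in = client.getInputStream();
     OutputStream out = new FileOutputStream(outFile)) {
    byte[] buf = new byte[8192];
    int n;
    while ((n = in.read(buf)) != -1) {
        out.write(buf, 0, n);                              // copy stream to disk
    }
}
server.close();

MediaPlayer player = new MediaPlayer();
player.setDataSource(outFile.getAbsolutePath());
player.prepare();
player.start();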
I am curious to know whether there is a way to make my Android device act as an RTSP server and stream video to any other capable device.
I have no understanding of RTSP servers or of the other protocols that need to be followed to make this happen.
Any help is appreciated.
Thanks,
SKU
There is nothing to worry about with this problem. I saw a solution here:
http://techsplurge.com/5080/get-vlc-media-player-for-android-with-unofficial/ Take a read and get your answer.
AFAIK, the architecture you should adopt is to put a remote machine in place.
On this machine you must have a media server installed (Flash Media Server, Red5, etc.). It will host the application that receives the incoming stream and outputs it.
Then you can stream from a device (the one you called the "server") to the remotely hosted application (e.g. www.remoteserver.com:1234/myApp). To play the stream, just pass that URL to a media container such as VideoView (http://developer.android.com/reference/android/widget/VideoView.html). Conveniently, Android natively supports the RTSP protocol through the VideoView container.
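For illustration, the VideoView side might look like the snippet below, inside an Activity. The RTSP URL and the R.id.video_view layout id are placeholders for whatever your server and layout actually provide.

import android.net.Uri;
import android.widget.MediaController;
import android.widget.VideoView;

// Point a VideoView at the RTSP stream exposed by the remote application.
VideoView videoView = (VideoView) findViewById(R.id.video_view); // placeholder id
videoView.setVideoURI(Uri.parse("rtsp://www.remoteserver.com:1234/myApp/myStream"));
videoView.setMediaController(new MediaController(this));
videoView.start();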
Hope it helps.
I am trying to implement something similar. One suggestion I received was this: when you make your device an RTSP server and someone wants to stream video from it, that is peer-to-peer streaming. In that case it is better to involve a server in between, which acts like an encoder-decoder. When another client sends an RTSP SETUP request, it still comes to your device, but when you are ready to stream the video, you send the payload (using RTP) to the server and the server in turn sends it to the requesting device. That way you avoid the firewall problems you can hit with LANs or routers (in the case of a Wi-Fi network).
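To make the "send the payload (using RTP) to the server" step a little more concrete, here is a bare-bones sketch of wrapping a payload in a 12-byte RTP header and pushing it over UDP. The host, port and payload type are placeholders, and in practice you would more likely use an existing streaming library than hand-roll the packetization.

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Minimal RTP packetizer sketch: 12-byte header + payload, sent to a relay server.
public class RtpSender {
    private final DatagramSocket socket;
    private final InetAddress server;
    private final int port;
    private int seq = 0;
    private final int ssrc = (int) (Math.random() * Integer.MAX_VALUE); // stream id

    public RtpSender(String host, int port) throws Exception {
        this.socket = new DatagramSocket();
        this.server = InetAddress.getByName(host);   // relay server address (placeholder)
        this.port = port;
    }

    public void send(byte[] payload, int payloadType, long timestamp) throws Exception {
        byte[] packet = new byte[12 + payload.length];
        packet[0] = (byte) 0x80;                     // version 2, no padding/extension/CSRC
        packet[1] = (byte) (payloadType & 0x7F);     // payload type, marker bit clear
        packet[2] = (byte) (seq >> 8);               // 16-bit sequence number
        packet[3] = (byte) seq;
        packet[4] = (byte) (timestamp >> 24);        // 32-bit media timestamp
        packet[5] = (byte) (timestamp >> 16);
        packet[6] = (byte) (timestamp >> 8);
        packet[7] = (byte) timestamp;
        packet[8] = (byte) (ssrc >> 24);             // 32-bit SSRC identifier
        packet[9] = (byte) (ssrc >> 16);
        packet[10] = (byte) (ssrc >> 8);
        packet[11] = (byte) ssrc;
        System.arraycopy(payload, 0, packet, 12, payload.length);
        socket.send(new DatagramPacket(packet, packet.length, server, port));
        seq = (seq + 1) & 0xFFFF;
    }
}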
I'm developing an AIR for Android application, and am currently sending audio to FMS servers via the standard NetStream/Microphone options. I (ignorantly) assumed that attaching a Bluetooth device would be pretty simple, and that connecting it would make it show up as a native "Microphone". Unfortunately, it does not.
I don't think it is even possible to use NetStream.publish and publish raw bytes, so the only hope is that there's a way to use NativeProcess + Java to create a native microphone "handle" that AIR can pick up on.
Has anyone run into this issue?
I think one possible solution would be to use NetConnection.send() instead of NetStream.publish().
You need to get the sound data from your BT microphone. I am not sure if you can get it using AIR; you may need an Android service that captures the sound data and feeds your AIR app via a file, a UDP port, an invoke, etc. (a rough sketch of this follows at the end of this answer).
When you get some sound data, encode it so Flash can play it (Speex, Nellymoser, etc.). You can do the encoding in your Android service as well.
Whenever your AIR app receives sound data, send it to your streaming server via NetConnection.send().
Extend your streaming server to process the sound data it receives. You can embed it into an FLV stream, or send it to other Flash clients if it is a chat app.
Other than that, I can't find a way to have a "microphone handle" for your BT microphone. I once thought of creating a virtual device on Android, but I couldn't find any solution.
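For what it's worth, the Android-service side of that idea might look roughly like the sketch below: start Bluetooth SCO so the headset microphone becomes the voice input, capture PCM with AudioRecord, and push it to a local UDP port the AIR app listens on. The port number and buffer sizes are arbitrary, the RECORD_AUDIO and MODIFY_AUDIO_SETTINGS permissions are required, and you would still need to encode the PCM to Speex/Nellymoser before handing it to NetConnection.send().

import android.content.Context;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Sketch: capture audio from a Bluetooth headset and forward raw PCM over UDP.
public void captureBluetoothMic(Context context) throws Exception {
    AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
    am.startBluetoothSco();               // a real service should wait for the
    am.setBluetoothScoOn(true);           // SCO audio state broadcast before recording

    int sampleRate = 8000;                // SCO links are narrowband
    int bufSize = AudioRecord.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.VOICE_COMMUNICATION,
            sampleRate, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT, bufSize);

    DatagramSocket socket = new DatagramSocket();
    InetAddress local = InetAddress.getByName("127.0.0.1");
    byte[] buf = new byte[bufSize];

    rec.startRecording();
    while (!Thread.currentThread().isInterrupted()) {
        int n = rec.read(buf, 0, buf.length);
        if (n > 0) {
            // raw 16-bit PCM; encode (Speex/Nellymoser) before NetConnection.send()
            socket.send(new DatagramPacket(buf, n, local, 50005)); // arbitrary port
        }
    }
    rec.stop();
    rec.release();
    am.stopBluetoothSco();
    socket.close();
}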