I'm trying to stream audio across several devices using the Nearby Connections API, but I'm not sure whether this is possible or advisable.
What I want to do is broadcast audio (both songs stored on the phone and audio from apps such as Google Music, Spotify, ...) to the other connected devices so they can start playing the songs while they are still receiving the data chunks.
I believe the Nearby Connections API only lets us send 4KB payload chunks per call to Nearby.Connections.sendReliableMessage(), so what I'm doing so far is calling that function as many times as required, sending a 4KB chunk each time, until I've delivered the entire file. In the onMessageReceived() listener I store all the chunks I receive in a byte array, so once all the chunks have been transferred I can play the song back from that array.
With this approach I should be able to play the song once I've transferred it in its entirety, but I'd like to play the songs while I'm actually receiving the data chunks, and in a manner synchronized across all the devices.
Does this make sense to you guys? Is it the right approach? Is there any other, more effective way of doing this? (I already know about the option of streaming audio over Wi-Fi Direct, but I'd like to use Nearby.)
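For reference, here is a minimal sketch of the chunking and reassembly logic described above, in plain Java. The actual Nearby calls (sendReliableMessage, onMessageReceived) are only referenced in comments, since they need a connected Play Services client:

```java
import java.io.ByteArrayOutputStream;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ChunkTransfer {
    static final int MAX_CHUNK = 4 * 1024; // Nearby reliable-message payload limit

    // Sender side: split the file's bytes into <= 4KB chunks, each of
    // which would be passed to Nearby.Connections.sendReliableMessage().
    static List<byte[]> splitIntoChunks(byte[] data) {
        List<byte[]> chunks = new ArrayList<>();
        for (int off = 0; off < data.length; off += MAX_CHUNK) {
            int end = Math.min(off + MAX_CHUNK, data.length);
            chunks.add(Arrays.copyOfRange(data, off, end));
        }
        return chunks;
    }

    // Receiver side: append each chunk as it arrives (in onMessageReceived),
    // then take the full byte array when the transfer is complete.
    static byte[] reassemble(List<byte[]> received) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (byte[] chunk : received) {
            out.write(chunk, 0, chunk.length);
        }
        return out.toByteArray();
    }
}
```

Note that this relies on Nearby's reliable messages arriving in order; if you interleave multiple files you would need to add a small header (file ID, sequence number) to each chunk.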
The author of this tutorial had a similar problem with chunks of audio.
He shows how to play the song while the bytes are still being downloaded and the audio file is being built.
Maybe you can make use of the "Incremental Media Download" part of the tutorial.
Quote:
This is where the magic happens, as we download media content from the URL stream until we have enough content buffered to start the MediaPlayer. We then let the MediaPlayer play in the background while we download the remaining audio. If the MediaPlayer reaches the end of the buffered audio, we transfer any newly downloaded audio to the MediaPlayer and let it start playing again.
Things get a little tricky here because:
(a) The MediaPlayer seems to lock the file so we can’t simply append our content to the existing file.
...
If this doesn't work, I would just use the Nearby connection to exchange IP addresses and go for a Wi-Fi Direct solution.
I hope this helps and I'd love to hear what your final solution looks like!
I implemented this several years ago for live audio/video packets sent in a serial stream to an Android 4.0 device. It would work the same for audio (or video) packets being streamed over the Nearby connections API.
The solution was to run an HTTP streaming server from within the Android app, then consume it using the Android media player API with its HTTP streaming capabilities (or you can embed ExoPlayer in your app if you prefer, as it also supports HTTP streaming).
This was achieved by piping the data stream directly into an FFSERVER process running on the device. The Android NDK was used to create and manage the named pipe required as input to FFSERVER.
As this was done a few years ago I have not tested this on versions of Android 4.1+. Anyone who does this will need to adhere to the FFmpeg GPL/LGPL license when building and distributing FFSERVER.
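The "in-app HTTP server feeding the platform player" architecture can be sketched with the JDK's built-in `com.sun.net.httpserver` package. This is an illustrative stand-in only: a production Android app would more likely embed a small server library such as NanoHTTPD, and would stream from the incoming pipe rather than from a fixed byte buffer as done here:

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

// Minimal localhost HTTP server the media player can point at, e.g.
// player.setDataSource("http://127.0.0.1:" + port() + "/stream").
public class LocalStreamServer {
    private final HttpServer server;

    public LocalStreamServer(int port, byte[] mediaBytes) throws IOException {
        server = HttpServer.create(new InetSocketAddress("127.0.0.1", port), 0);
        server.createContext("/stream", (HttpExchange ex) -> {
            // In a real app these bytes would come from the incoming
            // data pipe (e.g. FFSERVER output) instead of a fixed buffer.
            ex.getResponseHeaders().set("Content-Type", "audio/mpeg");
            ex.sendResponseHeaders(200, mediaBytes.length);
            try (OutputStream os = ex.getResponseBody()) {
                os.write(mediaBytes);
            }
        });
    }

    public int port() { return server.getAddress().getPort(); }
    public void start() { server.start(); }
    public void stop()  { server.stop(0); }
}
```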
Related
I'm trying to develop an Android app that plays a video coming either from a server or from other peers that have the video. For this use case I have to split my videos into smaller pieces to optimize transfer times, and each piece can be provided by either the central server or another peer.
I would like to know whether ExoPlayer is able to play a sequence of video pieces without interruption.
I am free to do whatever I want in the splitting process, e.g. split the video with the Linux command `split`.
Most adaptive bitrate streaming formats work something like your description: they split the video into multiple chunks and the video player requests them one at a time. Examples of adaptive-rate streaming protocols are HLS, MPEG-DASH, and SmoothStreaming.
It is possible to have the url for the next 'chunk' of video route to a 'central' server which could proxy the request to another 'peer', if this would meet your needs.
It's worth noting that many videos are delivered via CDNs, which might interfere with your desired approach (or, alternatively, might actually match what you want, depending on your underlying requirements), so you may want to check this as well.
Update
Assuming you mean that some chunks will come from the server and some chunks from peer devices on the network, then the above server proxy method would not work.
One way you could do this would be to have all the chunks delivered to the device from whatever source is best for each chunk, and then put them together as you require on the device and stream the result from 'localhost' on your device to the player.
This sounds like a huge amount of overhead and something that would never work, but I believe it is actually a technique used in some apps to convert from one streaming format to another (I can't provide an example, sorry...).
One example of a 'localhost' server on Android that might be useful to look at is:
http://www.laptopmag.com/articles/android-web-server
An alternative: if you were to use HTML5 inside a web page on the device, you could use the Media Source Extensions (MSE) mechanism to load the video chunks from the different sources before passing them to the player. This currently requires Chrome rather than the standard Android browser, as the latter does not support MSE at the time of writing.
In all these approaches you obviously need to make sure you load enough in advance to keep the video pipeline and buffer full, to avoid pauses.
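The "assemble the chunks on the device" step can be sketched in plain Java: hold the downloaded pieces in playback order, whichever source each came from, and expose them as one continuous stream. A 'localhost' server would then serve this stream to the player (SequenceInputStream is used here purely for illustration):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.io.SequenceInputStream;
import java.util.List;
import java.util.Vector;

public class ChunkJoiner {
    // Each entry is one video chunk, already downloaded from the server
    // or from a peer, held in playback order.
    static InputStream joinChunks(List<byte[]> chunksInOrder) {
        Vector<InputStream> streams = new Vector<>();
        for (byte[] chunk : chunksInOrder) {
            streams.add(new ByteArrayInputStream(chunk));
        }
        // The player (or the localhost proxy serving it) reads this
        // as a single uninterrupted stream.
        return new SequenceInputStream(streams.elements());
    }
}
```

In a real app you would back each entry with a future that completes when that chunk's download finishes, so the joined stream blocks (buffers) rather than ends when a chunk is late.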
I have never worked on a video-related project before, but now we have to.
1. What we tried to do
Build an Android application which can capture a real-time stream of video and audio.
Send the captured stream to a server.
Other clients (Android, iOS, or HTML5) can view these streams.
All of the above three steps should work at the same time.
Video streamed to the server should be cached for future playback.
2. What I know at the moment
I have searched on Google and SF to see whether someone has had the same requirement.
From that, I now know a little about video transmission:
Protocols:
RTSP/RTP/RTCP
RTSP: controls the state of the transmission, e.g. PLAY, PAUSE, STOP...
RTP: does the actual transport job
RTCP: works in conjunction with RTP (synchronizes the streams)
HTTP:
1) Download small pieces of the video file and play them, using a `range-request` to control the download (play) position.
2) HLS by Apple. Even though it is called live streaming, it is based on a `.m3u8` playlist file whose index is continually updated to achieve the live behavior.
RTMP by Adobe.
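The range-request option under HTTP boils down to two small header manipulations: ask for one slice of the file at a time, and read the total size back from the response. A tiny sketch (helper names are mine, and the actual HttpURLConnection call is omitted):

```java
// Hypothetical helpers for the HTTP range-request approach: request one
// piece of the file at a time with a Range header, and read the total
// size back from the Content-Range response header.
public class RangeRequests {
    // Header value requesting bytes [start, end] inclusive,
    // e.g. "bytes=0-4095" for the first 4KB piece.
    static String rangeHeader(long start, long end) {
        return "bytes=" + start + "-" + end;
    }

    // Parse a Content-Range response value like "bytes 0-4095/1048576"
    // and return the total file size (the part after the '/').
    static long totalLength(String contentRange) {
        int slash = contentRange.lastIndexOf('/');
        return Long.parseLong(contentRange.substring(slash + 1));
    }
}
```

You would set the first value with `connection.setRequestProperty("Range", ...)` and expect a 206 Partial Content response carrying the Content-Range header.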
Encoding:
Nothing I know yet.
It seems that RTSP/RTP/RTCP can be used both for uploading to the server and for playing at the client, so it suits applications that need low latency. However, since RTSP/RTP/RTCP are based directly on TCP/UDP, getting through routers (NAT traversal) can be a problem.
HTTP, on the other hand, can only be used for playing at the client (technically you could upload small pieces of a file over HTTP, but I don't think it is a good idea), so it serves existing video streams, whether from a file or something else. And you don't have to worry about routers, which means it works in complex network environments.
For our application, since we do not have a strict latency requirement for playback, we plan to stream the video from the Android client to the server with RTSP/RTP/RTCP, and serve those streams over HTTP.
3. Questions:
Is anything wrong in all of the above?
Is my idea possible: streaming by RTSP/RTP/RTCP and serving by HTTP?
If yes, it seems that the server should do something to cache the video in a proper format for further serving. I am not sure whether this job can be done by an off-the-shelf video server, or whether I have to do it myself.
What more should I know about streaming development (at least for my current project)? Any tutorials are welcome.
We have to capture real-time video using the Android camera and send it to the server, so that other users can watch it through a browser or something else.
I have Googled and searched on SO, and there are some examples of video streaming apps, like:
1 Android-eye: https://github.com/Teaonly/android-eye
2 Spydroid-ipcamera: https://code.google.com/p/spydroid-ipcamera/
However, it seems they target a different environment: most of these apps start an HTTP server for stream requests, and the client then visits the page over the local network to see the video.
Then the video stream source and the server are both the device like this:
But we need the internet support like this:
So I wonder if there are any alternative ideas.
I can see you have designed the three stages correctly, in your second diagram.
So what you need is to determine how to choose among these protocols and how to interface them.
No one can give you a complete solution, but having completed an enterprise project on Android video streaming, I will try to point you toward your goal.
There are three parts in your picture, I'll elaborate from left to right:
1. Android Streamer Device
Based on my experience, I can say Android does well sending camera streams over RTP, due to native support, while converting your video to FLV gives you headaches. (This matters in many cases, e.g. if you later want to deliver the stream to other Android devices.)
So I would suggest building up on something like spyDroid.
2. Streaming Server
There are tools like Wowza Server which can take a source stream and put it on the server's output for other clients. I guess VLC can do this too, via the File --> Stream menu, and then entering the RTSP video stream address from your spyDroid-based app. But I have not tried it personally.
Also, it is not hard to implement your own streaming server.
I'll give you an example:
For an implementation of an HLS server, you just need three things:
Video files, segmented into 10-second MPEG-2 TS chunks (i.e. `.ts` files).
An `.m3u8` playlist of the chunks.
A web server with a simple web service that delivers the playlist to the clients (PC, Android, iPhone, nearly every device) over HTTP. The clients will then look up the playlist file and ask for the appropriate chunks at the right times, because nearly all players have built-in HLS support.
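To make the playlist part concrete, here is a sketch that generates a minimal VOD-style `.m3u8` for a list of segment files (the segment names in the usage are hypothetical, and the fixed duration matches the 10-second chunks mentioned above):

```java
import java.util.List;

public class HlsPlaylist {
    // Build a minimal .m3u8 for fixed-duration segments.
    static String build(List<String> segmentUrls, int segmentSeconds) {
        StringBuilder sb = new StringBuilder();
        sb.append("#EXTM3U\n");
        sb.append("#EXT-X-VERSION:3\n");
        sb.append("#EXT-X-TARGETDURATION:").append(segmentSeconds).append("\n");
        sb.append("#EXT-X-MEDIA-SEQUENCE:0\n");
        for (String url : segmentUrls) {
            // Per-segment duration line, then the segment URI.
            sb.append("#EXTINF:").append(segmentSeconds).append(".0,\n");
            sb.append(url).append("\n");
        }
        sb.append("#EXT-X-ENDLIST\n"); // VOD playlist; omit this line for live
        return sb.toString();
    }
}
```

For a live stream you would instead keep a sliding window of recent segments, increment `#EXT-X-MEDIA-SEQUENCE`, and leave out `#EXT-X-ENDLIST`.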
3. The Client-Side
Based on our comments, I suggest you might want to dig deeper into Android Video Streaming.
To complete a project this big, you need much more research. For example you should be able to distinguish RTP from RTSP and understand how they are related to each other.
Read my answer here to get a sense of state-of-the-art Video Streaming and please feel free to ask for more.
Hope you got the big picture of the journey ahead,
Good Luck and Have Fun
Quite a general question, but I will try to give you a direction for research:
First of all, you will need to answer several questions:
1) What is the nature and purpose of the video stream? Is it a security application, where detail in stills is vital (then you will have to use something like an MJPEG codec), or will it be viewed only in motion?
2) Are the stream source, server, and clients on the same network, so that RTSP might be used for more exact timing, or will a WAN be involved, in which case something more stable like HTTP should be used?
3) What is the number of simultaneous output connections? In other words, is it worth paying for something like Wowza with a transcoding add-on (and maybe nDVR too) or Flussonic, or will a simple solution like ffserver suffice?
To cut a long story short: for a cheap and dirty solution for a couple of viewers, you may use something like IP Webcam -> ffserver -> VLC for Android and avoid writing your own software.
You can handle it this way:
Prepare the camera preview as described here. The Camera object has a setPreviewCallback method with which you register the preview callback. This callback provides a data buffer (byte array) in YUV format that you can stream to your server.
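One practical detail when working with those YUV buffers: for NV21, the default Android preview format, each frame is width × height luma bytes plus half that again for the interleaved chroma plane. A small helper (plain Java, no Android dependencies) for sizing the buffers you hand to setPreviewCallbackWithBuffer or to your network layer:

```java
public class PreviewBuffers {
    // Bytes per frame for NV21 (12 bits per pixel): a full-resolution
    // Y plane plus a half-resolution interleaved VU plane.
    static int nv21FrameSize(int width, int height) {
        return width * height + (width * height) / 2;
    }
}
```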
I'm working on a radio Android app in which I'd like to have options to rewind/fast-forward/back to live the audio stream.
It seems that it's not possible with MediaPlayer (I can't find any method to do that), so how can I do it?
The developer of the iOS version of the app is using the RadioKit SDK. Is there anything similar for Android?
I found this link that goes over some of the reasons why HTTP streaming isn't well supported on Android. You can write your own HTTP streaming client and insert it as a proxy between the MediaPlayer and the media source, but as far as I'm aware that is the only way. As for trick modes, there is no real fast-forward or rewind protocol built into HTTP streaming. You simply have to request the correct byte from the server (see here for a little more info). The good news is that estimating the byte to request for a given time position should be much easier for audio than for video (I've seen some pretty ridiculous algorithms for video).
I have some design questions that I want to discuss with people interested in helping me. I am planning to develop a simple VoIP program that allows two Android phones in the same network to use VoIP. My goal is simply to capture sound, send the data with UDP, receive UDP data and play sound.
My current design is to have 2 threads: one captures the microphone and sends the data; the other one receives bytes and plays them.
I started implementing this using MediaPlayer and MediaRecorder. The issue that came up is how to record and play the sound: do I need to go through a file, which seems slow, or is there any way to have the recording sent directly to my UDP socket?
Basically, I wonder whether I have to record to a file to be able to play it back, or whether I could just pass a socket (for both recording and playing).
Does anyone have any suggestions?
Thank you very much
MediaRecorder needs an FD, so you can use sockets as well. I don't see any issues with that. It all depends on how you design your system.
Don't use those classes for streaming audio - use AudioTrack and AudioRecord instead.
They provide the functionality you need for playing and recording raw audio data, without dealing with an FD.
When you record a frame (either byte[] or short[]), wrap it with a UDP packet.
When you receive a UDP packet, unpack the relevant byte[] or short[] and play it.
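A minimal sketch of that wrap/unwrap step in plain Java UDP; the AudioRecord.read() and AudioTrack.write() calls are left as comments since they need an Android device:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class AudioUdp {
    // Capture thread: wrap each recorded frame in a UDP packet.
    static void sendFrame(DatagramSocket socket, byte[] frame, int len,
                          InetAddress dest, int port) throws Exception {
        // 'frame' would come from audioRecord.read(frame, 0, frame.length)
        socket.send(new DatagramPacket(frame, len, dest, port));
    }

    // Playback thread: unpack the frame and hand it to the player.
    static byte[] receiveFrame(DatagramSocket socket, int maxLen) throws Exception {
        byte[] buf = new byte[maxLen];
        DatagramPacket packet = new DatagramPacket(buf, buf.length);
        socket.receive(packet);
        byte[] frame = new byte[packet.getLength()];
        System.arraycopy(buf, 0, frame, 0, frame.length);
        // 'frame' would go to audioTrack.write(frame, 0, frame.length)
        return frame;
    }
}
```

Since UDP gives no ordering or delivery guarantees, a real VoIP app would prepend a sequence number to each frame and drop or conceal late packets rather than replay them out of order.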