RTSP stack implementation - Android

I am working on getting an RTSP URL to play on Android.
I have tried both MediaPlayer and VideoView to play the RTSP URL, but both fail on the G1 device and on the emulator. The error is PVMFFailure in PLAYER_INIT.
So I decided to implement an RTSP client from scratch. I have developed a small application that exchanges all the messages (DESCRIBE, SETUP, PLAY, TEARDOWN). The problem is that after PLAY I should receive RTP packets on the client_port given in the Transport header, but tcpdump shows the following. The UDP packets appear to arrive, but they are never delivered to the application.
11:38:50.213394 IP ew-in-f177.google.com.6970 > 192.168.1.2.6970: UDP, length 444
11:38:50.213451 IP 192.168.1.2 > ew-in-f177.google.com: ICMP 192.168.1.2 udp port 6970 unreachable, length 36
Please let me know how to solve this issue.

OpenCORE, the multimedia engine in Android, is rather stringent about sticking to various standards. Some things that may work in the wild (e.g., MP3 over RTSP) are not supported because they are not in the respective spec. Leastways, that's what we have been told.
Hence, it may be that the media you are trying to stream is slightly out of spec, or the server is slightly out of spec. You may wish to try to find some RTSP stream known to work on Android, get your player working right using it, then focus on getting it to consume your own streams.

Some handsets (I don't know whether Android falls into this category or not) include a firewall on the device. In order to receive packets you may have to perform a hole punch, i.e. send a dummy packet to the server. This creates an allow rule in any firewall running on the local device, and it also creates the appropriate NAT mapping if one is needed.
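A rough sketch of such a hole punch, assuming the client_port 6970 from the dump above and using the server host from the dump as a placeholder (bind the socket first, then send the dummy packet around the time you issue PLAY). The ICMP "port unreachable" in the capture also suggests that nothing is currently bound on 6970:

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class RtpHolePunch {
    public static void main(String[] args) throws Exception {
        // Bind to the client_port negotiated in the Transport header (assumed 6970 here).
        // The ICMP "port unreachable" in the dump indicates nothing was bound on this port.
        DatagramSocket rtpSocket = new DatagramSocket(6970);

        // Dummy packet towards the server's RTP port so a local firewall adds an allow
        // rule and any NAT in the path creates a mapping (the "hole punch").
        InetAddress server = InetAddress.getByName("ew-in-f177.google.com"); // host from the dump
        byte[] dummy = new byte[] {0};
        rtpSocket.send(new DatagramPacket(dummy, dummy.length, server, 6970));

        // RTP packets arriving on 6970 can now actually be received by the application.
        byte[] buf = new byte[2048];
        DatagramPacket packet = new DatagramPacket(buf, buf.length);
        rtpSocket.receive(packet);
        System.out.println("Received RTP packet, length " + packet.getLength());
        rtpSocket.close();
    }
}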
Is the device on Wi-Fi or on the carrier's network (and if so, which carrier)?

Related

Server software for handling multiple mp3/ogg streams

I'm looking for advice regarding an aspect of a project I'm working on.
I'm developing a demo Android app for a not-for-profit that specialises in services for the vision impaired. The plan is that, among other things, the app will enable users to stream this organisation's specialised audiobooks.
For the sake of demoing/development I need to set up some sort of server which, on request from a device running the app, will:
directly transfer certain xml/html index files to the Android phone (no streaming necessary)
stream .ogg and .mp3 audio files to the device
serve more than one client device at a time
start a stream from a specific point within an mp3/ogg file, pending a request from the phone app
I've had a look at Icecast as an mp3/ogg streaming solution, but my knowledge of servers is a bit limited (I've only ever done some basic work in Flask). Would I need to run this in tandem with something that can generically serve files / handle requests?
I'm basically just looking for a good solution/tool to implement this with. The server side doesn't need to be completely fleshed out, just fit the bill above, as my focus is on developing the phone-app side for now. For the sake of a demo, something straightforward and well documented would suit best.
You don't need a special server for this. Any HTTP server that supports range requests will be fine; this includes Nginx, Apache, etc. There is no need to spoon-feed clients media data from the server. The streaming and buffering aspect is handled automatically on the client, through the TCP window size and by outright closing the connection and reconnecting later as needed. Range requests also cover the "start from a specific point" requirement: when the user seeks, the client simply asks for the file from the corresponding byte offset.
Icecast is meant for radio-style streams where everyone listening hears the same thing at roughly the same time. Since that's not an aspect you want, stick to any normal HTTP server.
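To make this concrete, here is a minimal, hypothetical client-side sketch using Android's MediaPlayer; the URL is a placeholder and any range-capable HTTP server behind it will do:

import android.media.MediaPlayer;
import java.io.IOException;

public class AudiobookPlayback {
    // Placeholder URL -- any plain HTTP server with range-request support works.
    private static final String BOOK_URL = "http://yourserver.example/audiobooks/chapter1.mp3";

    public static MediaPlayer playFromOffset(int offsetMs) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(BOOK_URL);
        player.setOnPreparedListener(mp -> {
            // Seeking works because the client issues an HTTP range request for the
            // corresponding byte offset; no special server-side logic is needed.
            mp.seekTo(offsetMs);
            mp.start();
        });
        player.prepareAsync();
        return player;
    }
}

The index files in your first requirement can be served the same way, as static files from the same HTTP server.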

Real-time JPG-frame streaming through a network

I am developing an Android application with Unity3D (C#) that captures frames from the camera (~30 fps) and sends them to a computer on the network. The frames received by the computer are then processed with an OpenCV-based program (C++). I managed to implement this using a UDP socket (I also tried TCP, but some frames sometimes got lost). Later, I found out that some networks block UDP packets for security reasons (as is the case inside my company), so I would like to generalise the communication by creating a different interface, e.g. via HTTP POST. Does this make sense? I don't have much experience with HTTP requests, and I was wondering whether this approach would end up similar to the TCP-socket case, which wasn't successful.
Are there other communication means that can guarantee performance like UDP, but by making the communication at a higher level?
There are two common transport protocols for sockets: TCP and UDP. HTTP is an application protocol that runs on top of a TCP connection, so you cannot replace TCP or UDP with HTTP; using HTTP still means using a TCP socket underneath.
Now there is something troubling in your post: TCP sockets are by far the more reliable of the two. It does not make sense that frames get lost when using TCP, since TCP guarantees that data arrives complete and in order; if frames appear to go missing over TCP, the loss is happening in your application code rather than on the network.
With UDP, on the other hand, you will often find that many packets get dropped. For video streaming that is not an entirely bad thing, because you want to keep latency as low as possible. TCP's acknowledgements and retransmissions add delay to ensure nothing is dropped, so it tends to be slower than UDP for this purpose. Hence, for video streaming you will typically use UDP.
In any event, there are far more efficient libraries for streaming video than rolling your own with UDP and JPEG frames. I would recommend searching for one that meets your platform and language needs.
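If you still want to prototype the HTTP POST interface mentioned in the question before adopting a streaming library, a minimal sketch of sending one JPEG frame with HttpURLConnection could look like this; the endpoint address is a placeholder for the processing machine:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class FramePoster {
    // Placeholder endpoint on the machine running the OpenCV program.
    private static final String ENDPOINT = "http://192.168.1.10:8080/frame";

    public static int postFrame(byte[] jpegBytes) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(ENDPOINT).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "image/jpeg");
        conn.setFixedLengthStreamingMode(jpegBytes.length);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(jpegBytes);   // body of the POST is the raw JPEG frame
        }
        int code = conn.getResponseCode();   // forces the request to complete
        conn.disconnect();
        return code;
    }
}

Bear in mind that one request per frame adds header and round-trip overhead at ~30 fps, which is part of why the dedicated streaming libraries mentioned above are usually the better route.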

Android live video streaming from one device to another

I am building an Android application that will stream a video call from one device to another Android device. For that I am using the Wowza video streaming API (media engine). With it I am able to stream video from the Android app to the web, but is device-to-device video streaming possible?
If you are planning to develop all the infrastructure yourself, these are the points to evaluate and decide on.
What technology is used
WebRTC is the technology typically used to support video calling. WebRTC is a free, open project that provides browsers and mobile applications with Real-Time Communications (RTC) capabilities via APIs. It was open-sourced by Google in 2011 and allows real-time communication between two browsers or mobile devices.
Concepts involved
1. Data streams and Hardware
WebRTC takes care of setting up and identifying the hardware (microphone, camera, and speakers) and of discovering the network path with the help of a STUN server. On mobiles this hardware comes built in.
2. Audio Video CODECS
Google has open-sourced the audio/video codecs required for these features. Audio is generally G.711 for phones (this still varies in specific cases), and video is VP8 or VP9.
3. Peer Discovery
To make a call, the other party's address is required. On the internet most IP addresses are dynamic, so a signalling server needs to keep track of who is online. This can be done using XMPP, SIP, or some custom protocol. For anyone to receive a call, the caller has to check with the server (or the other way around).
4. STUN Server
Once signalling (peer discovery) is done, a STUN server is required. This server helps each device determine its external IP address and whether the two (or more) devices can talk to each other directly.
5. TURN Server
If a peer-to-peer session is not possible, then a TURN server is required. The TURN server basically relays the bits for you through open holes in the firewall between the two clients. This is needed because of asymmetric firewalls and the varying ability to punch holes on different ports.
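If you go the build-it-yourself route, the STUN/TURN pieces above end up as ICE server entries in the peer connection configuration. A minimal sketch, assuming the org.webrtc Android library; the server URIs and credentials are placeholders for servers you would run yourself:

import org.webrtc.PeerConnection;
import java.util.ArrayList;
import java.util.List;

public class IceConfig {
    // Hypothetical STUN/TURN endpoints and credentials -- replace with your own servers.
    public static List<PeerConnection.IceServer> buildIceServers() {
        List<PeerConnection.IceServer> servers = new ArrayList<>();
        // STUN: lets each peer learn its external address (point 4 above).
        servers.add(PeerConnection.IceServer.builder("stun:stun.example.org:3478")
                .createIceServer());
        // TURN: relays media when no direct path can be punched (point 5 above).
        servers.add(PeerConnection.IceServer.builder("turn:turn.example.org:3478")
                .setUsername("demo-user")
                .setPassword("demo-secret")
                .createIceServer());
        return servers;
    }
}

This list is then passed into the PeerConnection configuration when the call is set up; signalling (point 3) still has to be handled by your own server.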
Alternatively, you can use a provider like Sinch, which already handles and configures these basic requirements, so you only need to concentrate on the mobile front end.
Check out the Sinch Android sample as well.

How to extract the RTP packet timestamp on Android

I'm working on a project where I stream video over Wi-Fi Direct from one Android device to another using RTP over UDP.
Essentially, the first Android device hosts an RTSP server and listens for connections. Once the client Android device connects to it (over Wi-Fi Direct) and starts listening for packets, the first device begins to stream the video content.
I'm aware that RTP packet headers carry a 32-bit timestamp at bit offsets 32-63 (bytes 4-7). But I do not know how to access the contents of the packet and read just that segment of the header.
Currently I am using libVLC to play the streamed video on the device, but I would like to measure the latency between the two devices, either by extracting the timestamps from the packets on arrival or by some other means (maybe VLC can help?).
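For reference, a minimal sketch of reading that field from a received UDP datagram, assuming the standard 12-byte fixed RTP header with no header extension and a placeholder port:

import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class RtpTimestampReader {
    public static void main(String[] args) throws Exception {
        // Placeholder port -- use the client_port negotiated during RTSP SETUP.
        DatagramSocket socket = new DatagramSocket(6970);
        byte[] buf = new byte[2048];
        DatagramPacket packet = new DatagramPacket(buf, buf.length);
        socket.receive(packet);

        byte[] d = packet.getData();
        // The RTP timestamp occupies bytes 4..7 of the fixed 12-byte header (bits 32-63),
        // in network (big-endian) byte order.
        long timestamp = ((d[4] & 0xFFL) << 24)
                       | ((d[5] & 0xFFL) << 16)
                       | ((d[6] & 0xFFL) << 8)
                       |  (d[7] & 0xFFL);
        System.out.println("RTP timestamp: " + timestamp);
        socket.close();
    }
}

Note that this value is in media-clock units (e.g. a 90 kHz clock for video), not wall-clock time, so on its own it is not enough to measure inter-device latency; RTCP sender reports are what map it to an absolute time.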
Edit: I am going to try to learn from the code posted here, but still await any replies.
Edit 2: So I'm trying to go simpler. I've instead made a client activity that connects to the remote host and reads packets using Android's DatagramSocket API. However, my server activity doesn't start serving even after the client activity says it is connected. I'm not sure what needs to be done to let the server know there is a client ready to be served. The MediaPlayer and VLC APIs were both able to start streaming video once they connected. What am I missing? Do I need to do more than DatagramSocket.connect(ipaddress, port)?

How to make an Android device act as an RTSP server

I am curious to know if there is a way I can make my Android device act as an RTSP server and stream video to any other capable device.
I have no understanding of RTSP servers or the other protocols that need to be followed to make this happen.
Any help is appreciated.
Thanks,
SKU
There is nothing to worry about with this problem. I saw a solution here:
http://techsplurge.com/5080/get-vlc-media-player-for-android-with-unofficial/ Take a read and you'll get your answer.
AFAIK, the architecture you should adopt is to put in place a remote machine.
On this machine you must have a media server installed: Flash Media Server, Red5, etc. It will host the application that receives the incoming stream and outputs it again.
Then you can stream from a device (the one you called the "server") to the remotely hosted application (e.g. www.remoteserver.com:1234/myApp). To play it back, just pass that URL to a media view (http://developer.android.com/reference/android/widget/VideoView.html); Android natively supports the RTSP protocol through that VideoView container.
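A minimal sketch of that playback step on the receiving device, assuming a layout containing a VideoView (the layout/view ids and the RTSP URL are placeholders):

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.VideoView;

public class PlaybackActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_playback);           // assumed layout with a VideoView

        VideoView videoView = findViewById(R.id.video_view);   // assumed view id
        // Placeholder RTSP URL pointing at the remotely hosted application.
        videoView.setVideoURI(Uri.parse("rtsp://www.remoteserver.com:1234/myApp"));
        videoView.setOnPreparedListener(mp -> videoView.start());
    }
}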
Hope it helps.
I am trying to implement something similar. One suggestion I received was that when you make your device the RTSP server and someone wants to stream video from it, that is peer-to-peer streaming. In that case it is better to involve a server in between, which acts as an encoder/decoder relay. When another client sends the RTSP SETUP request, it still reaches your device, but when you are ready to stream video, you send the payload (using RTP) to the server, and the server in turn sends it to the requesting device. This way you can avoid the firewall problems possible in a LAN or at routers (in the case of a Wi-Fi network).
