How to stream audio from a server to an Android app

I own a website with a bunch of MP3 files - here.
I would like to create a radio (audio stream) on the server and an Android app that will play the stream on many Android devices.
Can anyone help me with the architecture (server side and android side)? Are there any ready-made open source solutions that can facilitate the development/implementation?
My website runs on the LAMP stack (Debian).
I would also like the users of the Android app to be able to influence the playlist: the app should display about 3-5 randomly chosen songs and let users vote on which of those will play next (a poll).

You need two things:
i) A streaming server, which is not difficult to find.
ii) A player in the Android app capable of playing streaming audio (and video, if applicable).
The VideoLAN (VLC) player is widely used for this purpose. You can import its SDK and use it as per your requirements.
See the VLC Git wiki page for SDK information.
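If you only need audio playback and don't want to pull in the VLC SDK, Android's built-in MediaPlayer can also play an HTTP audio stream directly. A minimal sketch, where the stream URL is a placeholder for whatever mount point your server exposes (e.g. an Icecast/SHOUTcast-style stream):

```java
import android.media.AudioManager;
import android.media.MediaPlayer;

import java.io.IOException;

public class RadioPlayer {

    private MediaPlayer player;

    // Start playback of a remote audio stream; the URL is a placeholder.
    public void play(String streamUrl) throws IOException {
        player = new MediaPlayer();
        player.setAudioStreamType(AudioManager.STREAM_MUSIC); // deprecated on newer APIs, but still works
        player.setDataSource(streamUrl);                       // e.g. "http://example.com:8000/radio"
        player.setOnPreparedListener(mp -> mp.start());        // start as soon as buffering allows
        player.prepareAsync();                                 // don't block the UI thread
    }

    public void stop() {
        if (player != null) {
            player.release();
            player = null;
        }
    }
}
```

The voting/playlist part would then live on the LAMP side (a script that picks the next track from the poll results and feeds the stream), with the app only consuming the single stream URL.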

Related

How does the Playit app prevent videos from playing in other media players? What technology does it use?

There is a website, pdisk.net: whatever video we upload to the site from our computer and share the link for, it opens only in the Playit Android app. Only the first 15 seconds can be viewed elsewhere; after that a screen appears saying "to play the video, install Playit app from playstore". What is happening to the videos in the backend when we upload them to pdisk.net? I think the site is owned by the Playit app itself. I noticed that the uploaded videos use a final URL of a6.hentai.com... etc. to stream the videos, which can be streamed fully only in the app.
Can someone tell whether the videos are encoded or encrypted in the backend, with the app made to decrypt them? Is such a thing possible?
(No code required.)
On their FAQ page, they mention:
Video downloaded by Apps uses Smart Muxer technology. Smart Muxer is a unique technology developed by PLAYit, can merge the video and audio within seconds without any extra recoding and storage. It’s really workable when there are some videos have no build-in audio and need to be merged in the devices with low configurations. Due to the unique technology, the video can be only played by PLAYit and the other main-stream players can’t support. And videos shared to social apps can also be opened in PLAYit.
I found a discussion on Reddit where one of the users mentioned:
They encrypt the normal mp4 video in some kind of way which enables them to limit the playback to their app.
As for documentation, there is not much available online, but I found this feature request on the VLC forum.

How to buffer and play video on Android and iOS like Netflix and Iflix

I have a requirement to develop Android and iOS mobile apps that allow subscribers to view movies the way Netflix and Iflix do.
I would like to know whether this can be achieved with the built-in video-playing classes or widgets on the Android and iOS platforms, or whether we will need a library or SDK for this.
I came across this URL on how to stream video in Android apps. Would this approach suffice for this requirement?
https://code.tutsplus.com/tutorials/streaming-video-in-android-apps--cms-19888
Netflix and similar systems use adaptive bit rate (ABR) streaming to deliver video to mobile devices. ABR allows the client device or player to download the video in chunks, e.g. 10-second chunks, and to select the next chunk from the bit rate most appropriate to the current network conditions. See here for an example:
https://stackoverflow.com/a/42365034/334402
There are several ABR protocols, but the two most common at this time are HLS and DASH. HLS must be used to deliver video streams to iOS devices due to the Apple guidelines (at this time, and for video over 10 minutes that may be accessed over a mobile network - the guidelines can change over time), while DASH is probably more common on Android devices, although HLS can be supported on Android also.
Most Android players can now handle ABR - ExoPlayer is a good example: it is very widely used and supports this natively:
https://github.com/google/ExoPlayer
Take a look at the Developers Guide (included in the link above at the time of writing) which shows how to include ExoPlayer in your app.
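For illustration, a minimal sketch of playing an ABR stream with ExoPlayer. This assumes the MediaItem-style API introduced around ExoPlayer 2.12; the manifest URL, layout, and view id are placeholders. ExoPlayer infers HLS vs DASH handling from the URL extension (.m3u8 for HLS, .mpd for DASH):

```java
import android.os.Bundle;
import androidx.appcompat.app.AppCompatActivity;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.ui.PlayerView;

public class PlaybackActivity extends AppCompatActivity {

    private SimpleExoPlayer player;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_playback);        // layout containing a PlayerView (placeholder)

        PlayerView playerView = findViewById(R.id.player_view);

        player = new SimpleExoPlayer.Builder(this).build();
        playerView.setPlayer(player);

        // The URL below is a placeholder for your HLS (.m3u8) or DASH (.mpd) manifest.
        player.setMediaItem(MediaItem.fromUri("https://example.com/video/master.m3u8"));
        player.prepare();
        player.play();
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        player.release();
    }
}
```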
On iOS the native player supports ABR using HLS.

I would like to understand how to play a live stream / VOD in a native Android app

We want to create a live-broadcasting/streaming platform where anyone can go live at the click of a button using his/her mobile camera. The same live feed should be viewable from the native mobile app. To start with, we would like to support broadcasting and viewing the live feed on both iOS and Android. We are using Wowza as the media streaming server.
Use case: Let's say, sitting at home, I want to show my new home to all my friends. I download the mobile app on Android and start a live stream at the click of a button. My friends, who have also downloaded the same app, can watch my live stream on their mobiles. They can also see some VOD content.
How do I play the live stream / VOD in a native Android app?
Thanks in Advance :)
OpenMAX for Android will give you the best flexibility and control; however, it's a low-level API that mandates C++ and NDK usage. You can also use FFmpeg static libs in the same manner.

Sending live video stream to wowza streaming engine with Android devices

I want to send a live video stream from my Android device to Wowza Streaming Engine. I am using the sample in this blog, but I cannot see the result on the Test Players page.
Do I need to have a web server serving a page with a video player pointed at this video/app on Wowza?
I found this little (but very useful) library with three examples: libstreaming
It works like a charm! Easy to install and develop.
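For reference, a condensed sketch of publishing to Wowza with libstreaming's RtspClient, adapted from the library's bundled examples. The Wowza host, port, application name, stream name, credentials, layout, and view id are all placeholders:

```java
import android.app.Activity;
import android.os.Bundle;

import net.majorkernelpanic.streaming.Session;
import net.majorkernelpanic.streaming.SessionBuilder;
import net.majorkernelpanic.streaming.gl.SurfaceView;
import net.majorkernelpanic.streaming.rtsp.RtspClient;

public class PublishActivity extends Activity {

    private Session session;
    private RtspClient client;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_publish);                  // layout with a libstreaming SurfaceView

        SurfaceView surfaceView = (SurfaceView) findViewById(R.id.surface);

        // Describe what to capture and encode.
        session = SessionBuilder.getInstance()
                .setContext(getApplicationContext())
                .setSurfaceView(surfaceView)
                .setAudioEncoder(SessionBuilder.AUDIO_AAC)
                .setVideoEncoder(SessionBuilder.VIDEO_H264)
                .build();

        // Publish the session to Wowza over RTSP.
        client = new RtspClient();
        client.setSession(session);
        client.setServerAddress("wowza.example.com", 1935);         // Wowza streaming host/port (placeholders)
        client.setCredentials("publisherUser", "publisherPass");    // source authentication, if enabled
        client.setStreamPath("/live/myStream");                     // /<application>/<streamName>
        client.startStream();                                       // connect and start publishing
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        client.stopStream();
        session.release();
    }
}
```

Once this is running, the application, application instance, and stream name should show up in the Wowza logs, which is how you confirm the publish actually succeeded.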
The main point is to look at the Wowza logs to understand whether the stream was successfully published or not.
From the logs you will know which application, application instance, and stream name were used for publishing.
You'll then be able to set up any player (VLC, for example) with those values and check whether the stream is viewable.
The accepted answer is OK. libstreaming works (kind of), but it did not meet my expectations for something that could be pushed into a production app. Since this question is quite old, I will share my up-to-date solution (Android Studio 2.1.2, Marshmallow), which uses JavaCV. I've built a boilerplate for Android so it can be used in no time.
Here is the URL:
How to stream live video from android to Wowza via RTMP
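The gist of the JavaCV approach is an FFmpegFrameRecorder opened against the Wowza RTMP ingest URL; camera preview frames are converted to JavaCV Frame objects and pushed into it. A stripped-down sketch - the RTMP URL, resolution, and bitrate are placeholders, and the camera-to-Frame conversion is left out:

```java
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.FrameRecorder;

public class RtmpPublisher {

    private FFmpegFrameRecorder recorder;

    // Open an RTMP connection to Wowza.
    // URL pattern: rtmp://<server>:1935/<application>/<streamName>
    public void start() throws FrameRecorder.Exception {
        recorder = new FFmpegFrameRecorder("rtmp://wowza.example.com:1935/live/myStream", 640, 480, 1);
        recorder.setFormat("flv");           // RTMP carries an FLV container
        recorder.setFrameRate(30);
        recorder.setVideoBitrate(1000000);   // ~1 Mbps
        recorder.start();                    // connects to the server
    }

    // Call this for every Frame converted from the camera preview callback
    // (the NV21 -> Frame conversion is omitted here).
    public void pushFrame(Frame frame) throws FrameRecorder.Exception {
        recorder.record(frame);
    }

    public void stop() throws FrameRecorder.Exception {
        recorder.stop();
        recorder.release();
    }
}
```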

Need some advice when making a streaming application

I'm developing an app on Android that can play videos streamed from a server (my PC), like YouTube. I'm new to this and I need some advice on the right approach.
First, my Android app should have the ability to seek back to a position that has already been played. So, on the server side (my PC), what streaming server should I use?
Do I need to combine the streamed packets into a temporary file and play from that file in order to get the seek-back behaviour?
