How to make a better video server? - Android

I have an app where I have to play lots of videos.
When I tried to load videos from a dedicated server into my Android and iOS apps, it took too much time.
I uploaded the videos to some cloud services, but the result was the same.
I also uploaded a video to video hosting services (Teachable, Thinkific); when the video is played in the player provided by the host it plays well, but when I try to load the video in my app it takes too much time.
I think I am missing something on the server end.
Now I need suggestions:
What kind of service should I use?
When a video is played on YouTube, the address does not include .flv or any other format extension, so I think there is some logic for loading videos more efficiently. Please guide me if you have any idea what protocol or logic is used for better video playback.

For building video applications, there are some points to consider first.
(Usually, videos uploaded to YouTube and similar services are played in chunks, via an m3u8 playlist for example.)
Videos should be hosted on a CDN. (Use an AWS-like service.)
Videos should be delivered via HLS or MSS. (AWS also provides this service.)
The app's player must know how to play HLS streams (AVPlayer on iOS; for Android, see the sketch below).
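For Android, here is a minimal sketch of HLS playback with ExoPlayer (Media3), assuming the media3-exoplayer, media3-exoplayer-hls, and media3-ui dependencies are added; the playlist URL and the playerView field are placeholders:

import androidx.media3.common.MediaItem;
import androidx.media3.exoplayer.ExoPlayer;

// Inside an Activity, once the layout (containing a PlayerView) is inflated:
ExoPlayer player = new ExoPlayer.Builder(this).build();
playerView.setPlayer(player); // playerView: androidx.media3.ui.PlayerView from the layout
// The .m3u8 extension makes ExoPlayer pick its HLS media source automatically.
player.setMediaItem(MediaItem.fromUri("https://cdn.example.com/video/master.m3u8")); // placeholder CDN URL
player.prepare();
player.play();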
Hope this answer gives you a hint about your use case.

Related

Get video/audio tracks or a MediaStream from a WebView and send it to the server via WebRTC

I have to get video/audio tracks, or if possible a MediaStream object, from an Android WebView which plays an HLS stream (".m3u8").
I load the HLS stream in the WebView using loadUrl("...m3u8"). It runs without issues, but I can't figure out how to get the live video and audio tracks from the HLS stream. I have read a lot of articles and was not able to find any solution. So my question is: is it possible to get the audio and video tracks from an HLS stream running in a WebView? I need the audio and video tracks because I have to send them via a PeerConnection (WebRTC), which accepts a MediaStream or audio tracks and video tracks. Any ideas and examples will be helpful. Thank you in advance.
HLS works by defining a playlist (your .m3u8 file) which points to chunks of the video (say segment00000.ts, segment00001.ts, ...). It can contain different streams for different resolutions, but the idea (very simplified) is that your player can download a chunk, play it right away, and download the next one in the meantime.
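For illustration, a minimal media playlist could look like this (segment names and durations are made up):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment00000.ts
#EXTINF:10.0,
segment00001.ts
#EXT-X-ENDLIST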
A WebRTC video track takes RTP packets. I don't know exactly what the Android API exposes, but it feels like either you can pass it an m3u8 as a source (though that may be a bit weird for a generic API), or you can't, and you have to read the HLS stream yourself, extract RTP frames, and pass them to WebRTC.
For instance, gstreamer's hlsdemux seems to be able to read from an HLS source. From there you could extract RTP frames and feed them to your WebRTC track.
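As a rough sketch, assuming GStreamer and its HLS plugin are installed (the playlist URL is a placeholder), a command-line pipeline reading an HLS source looks like:

gst-launch-1.0 souphttpsrc location=https://example.com/playlist.m3u8 ! hlsdemux ! decodebin ! autovideosink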
Probably that's a bit lower-level than you hoped, but if the Android API doesn't do it for you, you have to do it yourself (or find a library that does it).

How does the Playit app prevent videos from playing in other media players? What technology does it use?

There is a website, pdisk.net; whatever video we upload from our computer to the site and share the link for, the video opens only in the Playit Android app. Only the first 15 seconds can be viewed; after that a screen appears saying "to play the video, install Playit app from playstore". What is happening to the videos in the backend when we upload to the pdisk.net website? I think the site is owned by the Playit app itself. I noticed that the uploaded videos use a final URL a6.hentai.com...etc to stream the videos, which can be streamed fully using the app only.
Can someone tell me if the videos are encoded or encrypted in the backend and the app is made to decrypt them? Is such a thing possible?
(no code required)
On their FAQ page, they mention:
Video downloaded by Apps uses Smart Muxer technology. Smart Muxer is a unique technology developed by PLAYit, can merge the video and audio within seconds without any extra recoding and storage. It's really workable when there are some videos have no build-in audio and need to be merged in the devices with low configurations. Due to the unique technology, the video can be only played by PLAYit and the other main-stream players can't support. And videos shared to social apps can also be opened in PLAYit.
I found a discussion on Reddit where, as one of the users put it:
They encrypt the normal mp4 video in some kind of way which enables them to limit the playback to their app.
As for documentation, there is not much available online, but I found this feature request on the VLC forum.

Buffer and play videos faster in VideoView

Using Picasso I was able to download and display my images very quickly in my Android app. Now I want to stream videos from my S3 server and play them in my app faster than with my code here:
try {
    // Media controller provides play/pause/seek UI for this activity.
    MediaController videoController = new MediaController(VideoPlayerActivity.this);
    videoController.setAnchorView(AdVideoView);      // Anchor the controls over the video view.
    AdVideoView.setMediaController(videoController); // Attach the controls to the video view.
    AdVideoView.setVideoURI(Uri.parse(VideoURL));    // Point the video view at the remote URL.
    AdVideoView.start();                             // Begin buffering and playback.
} catch (Exception e) {
    Log.e("Video Stream Error", e.getMessage(), e);  // Log the error with its stack trace.
}
Is there a faster way to display videos, through a GitHub library or better code?
Thanks in advance!
The things that usually slow down streamed video playback are server and network related rather than client side; unless you have a very slow or very busy device, it will almost always be able to play the video back at the rate it is received over the network.
Taking this as given, and assuming you are seeing delays in your streamed videos, there are a couple of common things to look for.
First, MP4 videos in their normal format have the metadata at the end of the video file, which is not good for streaming. There is a technique called faststart, which moves the metadata to the start, and which you definitely want to use. More info here:
http://multimedia.cx/eggs/improving-qt-faststart/
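For example, ffmpeg can move the metadata to the front without re-encoding (file names are placeholders):

ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4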
Secondly, network connections can obviously vary, and slow networks make streaming high-quality video files a problem. A technique called adaptive bitrate streaming (ABR) allows the client to request lower-quality video 'chunks' when network quality is bad and then change to higher quality when it improves.
ABR also helps startup time, as it allows the player to start the stream quickly using a lower quality level, and hence a smaller chunk, and then increase the quality as the video progresses. You can see this effect when you start up most online video services, such as Netflix, today (July 2016).
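In HLS, for instance, ABR shows up as a master playlist that lists the available quality levels; the bandwidths, resolutions, and paths below are illustrative:

#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
720p/playlist.m3u8

The player measures its download speed and switches between the variant playlists accordingly.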
One thing to note is that video hosting and streaming is a specialist area, so it is generally easier to leverage existing streaming technologies and services rather than build them yourself. Some good places to look to get a feel for open-source solutions:
https://gstreamer.freedesktop.org
http://www.videolan.org/vlc/streaming.html

How to get started with video streaming

I have never worked on a video-related project before; however, we have to now.
1. What we tried to do
Build an Android application which can capture a real-time stream of video and audio.
Send the captured stream to a server.
Other clients (Android, iOS, or HTML5) can view these streams.
All of the above three steps should work at the same time.
Video streamed to the server should be cached for future playback.
2. What I know at the moment
I have searched Google and SF to see if someone has had the same requirement.
From that, I now know a little about video transmission:
Protocol:
RTSP/RTP/RTCP
RTSP: controls the state of the session (PLAY, PAUSE, STOP, ...)
RTP: does the actual transport job
RTCP: works in conjunction with RTP (synchronizes the stream)
HTTP:
1) Download small pieces of the video file and play them, using a `range-request` to control the download (play) position; see the sketch after this list.
2) HLS by Apple. Even though it is called live streaming, it is based on a `.m3u8` playlist file, and the live behavior comes from continually updating the playlist's index.
RTMP by Adobe.
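A minimal sketch of the range-request mechanism described in 1), using plain HttpURLConnection against a placeholder URL:

import java.net.HttpURLConnection;
import java.net.URL;

public class RangeRequestDemo {
    public static void main(String[] args) throws Exception {
        // Ask the server for only the first 1 KiB of the video file.
        URL url = new URL("https://example.com/video.mp4"); // placeholder URL
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Range", "bytes=0-1023");
        // A server that supports range requests answers 206 Partial Content.
        System.out.println("Status: " + conn.getResponseCode());
        System.out.println("Content-Range: " + conn.getHeaderField("Content-Range"));
        conn.disconnect();
    }
}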
Encoding:
Nothing that I know about yet.
It seems that RTSP/RTP/RTCP can be used both for uploading to the server and for playing at the client, so it suits applications that need low latency. However, since RTSP/RTP/RTCP sit directly on TCP/UDP, getting through routers (NAT) can be a problem.
HTTP, on the other hand, can only be used for playing at the client (technically you could upload small pieces of a file over HTTP, but I don't think that is a good idea), so it can be used to play an existing video stream, whether from a file or from something else. And you don't have to worry about routers, which means it can be used in complex network environments.
For our application, since we do not have a strict latency requirement during playback, we plan to stream the video from the Android client to the server with RTSP/RTP/RTCP, and serve these streams over HTTP.
3. Questions:
Is anything wrong in all of the above?
Is my idea possible: streaming with RTSP/RTP/RTCP and serving over HTTP?
If yes, it seems that the server should do something to cache the video in a proper format for further serving. I am not sure whether this job can be done by a video server out of the box, or whether I have to do it myself.
What more should I know about streaming development (at least for my current project)? Any tutorials are welcome.

Is MP4 a streaming protocol or a file format?

I am currently using Wowza to stream videos, and I am trying to integrate Wowza, Android, and a ChromeCast Device (CCD). According to this document, https://developers.google.com/cast/docs/media, Google Cast supports the "MP4 protocol".
So my question is this: is MP4 a streaming protocol, a file format, or both?
In the ChromeCast Android demo applications, they simply pass a URL like http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4 as metadata to the CCD.
To me, this implies that no server is required to stream the MP4 file, meaning I won't even need Wowza as an intermediary party to stream.
Is this understanding correct?
It seems that the client player would then be responsible for interacting with the MP4 file directly (e.g. seek, pause, stop, play, etc.).
While you've already accepted an answer, and gotten your app to work (which was likely your ultimate goal), I thought it might be helpful to also answer your question about what MP4 really is.
MP4 is a video container format; inside the MP4 container are video stream data (generally encoded in the H.264 format) and audio stream data (often encoded in the AAC format). The client player can interact with it directly because the Chromecast's browser has HTML5 video support for interpreting the MP4 container format and playing back the H.264 video and AAC audio.
However, it isn't "streaming" in the way that term is often used ... it's just downloading the file from your web server in chunks and playing it back. There's nothing wrong with this if it's performing as you'd like; in fact, this is one of the big benefits of HTML5 video, that it doesn't need a streaming server backend. But if you actually want true media streaming (to leverage things such as adaptive bitrate switching, licensing, and so forth), you would serve the MP4 file via Wowza rather than via your web server.
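To see the container/stream distinction concretely, a tool such as ffprobe (assuming it is installed; the file name is a placeholder) can list the streams inside an MP4:

ffprobe -v error -show_entries stream=codec_type,codec_name input.mp4

For a typical MP4 this reports one video stream (e.g. h264) and one audio stream (e.g. aac) inside the container.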
If you simply have an MP4 file, just pass its URL and it should work fine, just like the sample (CastVideos) projects that we have on GitHub.
