How to convert a normal URI to a DASH URI? - Android

I am currently building a streaming video player app, and for that reason I want to use DASH streaming. I have a normal URI for a video in my Firebase Storage, but for DASH streaming I think I need a file that ends with .mpd.
ExoPlayer player = new ExoPlayer.Builder(context).build();
player.setMediaItem(MediaItem.fromUri(dashUri));
player.prepare();
What do I have to do to convert a normal URI into a URI that ends with .mpd?
So, how can I do that?

You actually have to convert the video file to a fragmented format, and you will typically want to make it available in multiple bit rates, which means transcoding it as well.
The reason for this is that DASH is an ABR (adaptive bit rate) protocol: it breaks multiple renditions of a video into chunks of equal duration, and the player can then request chunk by chunk, choosing the best bit rate version of each chunk depending on the current network conditions and the device type.
See here for more info: https://stackoverflow.com/a/42365034/334402
Open source tools exist to create DASH files from mp4 - see some examples here (links correct at time of writing):
https://github.com/gpac/gpac/wiki/DASH-Support-in-MP4Box
https://www.ffmpeg.org/ffmpeg-formats.html#dash-2
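Once the content is packaged and you have a .mpd manifest URL, playback with ExoPlayer looks roughly like the snippet in the question. A minimal sketch, assuming the ExoPlayer 2.x MediaItem API; the manifest URL below is just a placeholder:
// Assumes com.google.android.exoplayer2.* (ExoPlayer, MediaItem, util.MimeTypes).
MediaItem dashItem = new MediaItem.Builder()
        .setUri("https://example.com/video/manifest.mpd") // placeholder manifest URL
        .setMimeType(MimeTypes.APPLICATION_MPD)           // hints that this is DASH
        .build();
ExoPlayer player = new ExoPlayer.Builder(context).build();
player.setMediaItem(dashItem);
player.prepare();
player.play();
Setting the MIME type explicitly only matters when the URI does not end in .mpd; with a proper .mpd URL, MediaItem.fromUri() as in the question also works.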

Related

How do streaming apps change video quality based on changes in network speed?

How do streaming apps like YouTube, Hotstar, or any other video player app programmatically detect that the network is getting slow at run time, and then change the video quality based on those changes in network speed?
Many streaming services nowadays use HTTP-based streaming protocols. But there are exceptions, especially with low-latency streaming, e.g. WebRTC or WebSocket-based solutions.
Assuming that you're using an HTTP-based protocol like HLS or MPEG-DASH, the "stream" is a long chain of video segments that are downloaded one after another. A video segment is a file in "TS" or "MP4" format (in some MP4 cases, video and audio are split into separate files); typically a segment contains 2, 6, or 10 seconds of audio and/or video.
Based on the playlist or manifest (or sometimes simply from decoding the segment), the player knows how many seconds a single segment contains. It also knows how long it took to download that segment. You can measure the available bandwidth by dividing the (average) size of a video segment file by the (average) time it took to download it; a rough sketch of this calculation follows this answer.
At the moment that it takes more time to download a segment than to play it, you know that the player will stall as soon as the buffer is empty; stalling is generally referred to as "buffering". Adaptive Bitrate (aka ABR) is a technique that tries to prevent buffering; see https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming (or Google the expression). When the player notices that the available bandwidth is lower than the bit rate of the video stream, it can switch to another version of the same stream that has a lower bit rate (typically achieved through higher compression and/or a lower resolution). That means less quality, but it's better than buffering.
PS #1: WebRTC and WebSocket-based streaming solutions cannot use this measuring trick and must implement other solutions.
PS #2: New/upcoming variants of HLS (e.g. LL-HLS and LHLS) and MPEG-DASH use other HTTP technologies (such as chunked transfer or HTTP push) to achieve lower latency. These typically do not work well with the measuring technique described above and use different approaches, which I consider outside the scope here.
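A minimal sketch of that size-over-time estimate and the resulting rendition choice; the rendition table, the 20% safety margin, and all names below are made up for illustration:
// Estimate throughput from the last downloaded segment, then pick the
// highest rendition that still fits under it (with some headroom).
public class AbrSketch {
    static double estimateBitsPerSecond(long segmentBytes, long downloadMillis) {
        return (segmentBytes * 8.0) / (downloadMillis / 1000.0);
    }

    // renditionBps must be sorted from lowest to highest bit rate.
    static int pickRendition(double estimatedBps, long[] renditionBps) {
        double budget = estimatedBps * 0.8; // keep 20% headroom to avoid flip-flopping
        int choice = 0;                     // fall back to the lowest rendition
        for (int i = 0; i < renditionBps.length; i++) {
            if (renditionBps[i] <= budget) choice = i;
        }
        return choice;
    }

    public static void main(String[] args) {
        long[] renditions = {700_000L, 1_500_000L, 3_000_000L, 6_000_000L};
        double bps = estimateBitsPerSecond(1_500_000, 4_000); // 1.5 MB in 4 s = 3 Mbps
        System.out.println("~" + (long) bps + " bps -> rendition index "
                + pickRendition(bps, renditions));
    }
}
Real players smooth this estimate over several segments and also take the current buffer level into account, but the core idea is the same.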
You have to use a streaming server in order to do that. Wowza Server is one of them (not free). The client and server will exchange information about the connection and distribute chunks of the video depending on the network speed.

Getting unstreamable error with AndroidBreakout

I've been exploring the documentation and examples at http://bigflake.com/mediacodec/ by Fadden, and applied the patch http://bigflake.com/mediacodec/0001-Record-game-into-.mp4.patch to the breakout game. Unfortunately, after compiling the code, I realized it doesn't work: it produces video files that aren't streamable.
I see the following error:
"The mp4 file will not be streamable."
According to Fadden, this should be fixed by checking the mBufferInfo.flags (https://stackoverflow.com/questions/23934087/non-streamable-video-file-created-with-mediamuxer), which is already done in his code, so I'm at a complete loss. Did anyone else get the video recording patch to work?
The warning you're seeing is just a warning, nothing more. MP4 files aren't streamable anyway in most cases, in the sense of being able to pipe the written MP4 to another process and have the other end play it back as it arrives (unless you resort to a lot of extra trickery, or use fragmented MP4, which the Android MP4 muxer doesn't normally write). What streamable means here is that once you have the final MP4 file, you can start playing it back without having to seek to the end of the file (which playback over HTTP can do, e.g. with HTTP byte range requests).
To write a streamable MP4, the muxer tries to guess how large your file will be and reserves a correspondingly large area at the start of the file to write the file index into. If the file turns out to be larger, so that the index doesn't fit into the reserved area, the index needs to be written at the end of the file instead. See lines 506-519 in https://android.googlesource.com/platform/frameworks/av/+/lollipop-release/media/libstagefright/MPEG4Writer.cpp for more info about this guess. Basically the guess boils down to: "The default MAX_MOOV_BOX_SIZE value is based on about 3 minute video recording with a bit rate about 3 Mbps, because statistics also show that most of the video captured are going to be less than 3 minutes."
If you want to turn such a non-streamable MP4 file into a streamable one, you can use the qt-faststart tool from libav/ffmpeg, which just reorders the blocks in the file.
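For reference, the mBufferInfo.flags check mentioned in the question usually sits in the encoder drain loop and looks something like this; a sketch using the standard MediaCodec/MediaMuxer APIs, where mEncoder, mMuxer and mTrackIndex are placeholder names:
// The codec-config buffer (SPS/PPS) is already carried in the MediaFormat passed
// to MediaMuxer.addTrack(), so it must not be written out as a media sample.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int index = mEncoder.dequeueOutputBuffer(info, 10000 /* timeout in microseconds */);
if (index >= 0) {
    ByteBuffer encoded = mEncoder.getOutputBuffer(index);
    if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
        info.size = 0; // codec config data, not media data; skip it
    }
    if (info.size > 0) {
        encoded.position(info.offset);
        encoded.limit(info.offset + info.size);
        mMuxer.writeSampleData(mTrackIndex, encoded, info);
    }
    mEncoder.releaseOutputBuffer(index, false);
}
This is orthogonal to the streamability warning discussed above, which is purely about where the moov box ends up in the file.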
You can check Intel INDE Media for Mobile; it lets you do game capturing and streaming to the network:
https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials
simplest capturing:
https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials-video-capturing-for-opengl-applications
YouTube streaming:
https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials-video-streaming-from-device-to-youtube

video encoder for a picture + voice (supported by mobile phones)

I wonder what the best option is to store a single picture and a short voice memo in one file. It needs to be openable by mobile phone browsers (iOS, Android), and preferably it should be shown as a single full-screen photo with the sound playing in the background.
Effectively, I'm looking for the most size-efficient combination of something like MP3 + JPG.
If I do it as a single .mov, I guess I lose a lot of space due to compressing the same frame over and over at 24 frames/second.
A rough list of options which comes to mind:
.mov
Mpeg4
H.264
QuickTime
HTML5 video format (Theora?)
store it in Flash (but this is not supported by iOS)
EDIT 1:
I tried storing H.264 in an .mp4 container; the files are small enough (around 1 MB), but somehow it does not work on my friend's Android phone. I probably need to do more testing, but it seems the Android OS does not like proprietary codecs...
My most intuitive solution for this would be to store a JPEG and an MP3 separately on the server. To download one entity as a single unit, download a bit of JSON or XML data that contains pointers to the picture and the audio file.
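A tiny sketch of that manifest idea, with made-up field names and URLs (org.json is bundled with Android; JSONException handling is omitted here):
// Hypothetical manifest: { "image": "...jpg", "audio": "...mp3" }
String json = "{\"image\":\"https://example.com/memo.jpg\","
            + "\"audio\":\"https://example.com/memo.mp3\"}";
JSONObject manifest = new JSONObject(json);
String imageUrl = manifest.getString("image"); // show this full screen
String audioUrl = manifest.getString("audio"); // play this in the background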
If you are set on having one file do the job, you might try embedding the JPEG inside the ID3 metadata of an MP3 file (this type of metadata functionality exists to, e.g., store album art with a music file). You would want to make sure that the ID3 tag is near the start of the file. JavaScript within the mobile browser could fetch the file, and a third-party library could do the ID3 parsing (some Googling reveals that such libraries exist; I don't know if they all support extracting a JPEG image). Then the JS would need to be able to feed the file into an audio tag for playback, which I'm not sure is possible.
Another thing to experiment with is an .MP4 that encodes the audio track along with a single video frame with a really, reeeaaallly long duration. You would have to experiment to determine whether the mobile browsers handle that gracefully while also allowing smooth audio seeking. If they don't, then perhaps re-encode the frame every 1-5 seconds to keep the bit rate minimal.

H.264 Real-time Streaming, Timestamp in NAL Units?

I'm trying to build a system that live-streams video and audio captured by Android phones. Video and audio are captured on the Android side using MediaRecorder and then pushed directly to a server written in Python. Clients should access this live feed using their browser, so I implemented the streaming part of the system in Flash. Right now both video and audio content appear on the client side, but the problem is that they are out of sync. I'm sure this is caused by wrong timestamp values in Flash (currently I increment the timestamp by 60 ms per video frame, but clearly this value should be variable).
The audio is encoded as AMR on the Android phone, so I know that each AMR frame is exactly 20 ms. However, this is not the case with the video, which is encoded as H.264. To synchronize them, I would have to know exactly how many milliseconds each H.264 frame lasts, so that I can timestamp it later when delivering the content using Flash. My question is: is this kind of information available in the NAL units of H.264? I tried to find the answer in the H.264 standard, but the information there is just overwhelming.
Can someone please point me in the right direction? Thanks.
Timestamps are not in NAL units, but are typically part of RTP. RTP/RTCP also takes care of media synchronisation.
The RTP payload format for H.264 might also be of interest to you.
If you are not using RTP, are you just sending raw data units over the network?
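For what it's worth, if you end up assigning timestamps yourself, the usual RTP convention is an 8 kHz clock for AMR and a 90 kHz clock for H.264 (per RFC 6184). A rough sketch, assuming the video frame rate is known from the capture side:
// Each 20 ms AMR frame advances the 8 kHz audio clock by 160 units;
// each video frame advances the 90 kHz clock by 90000 / fps units.
long audioTs = 0;
long videoTs = 0;
double videoFps = 25.0;                              // assumed; must come from the capture side
long audioIncrement = 8000 / 50;                     // 20 ms at 8 kHz = 160
long videoIncrement = Math.round(90000 / videoFps);  // 3600 at 25 fps

// per AMR frame:   audioTs += audioIncrement;
// per video frame: videoTs += videoIncrement;
If the capture frame rate is variable, you need the actual capture time of each frame instead of a fixed increment, which is exactly why carrying timestamps in the transport (RTP) rather than guessing them later is the cleaner solution.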

Accessing an mpeg-4 encoded RTSP live stream (Axis server) in Android with the .amp extension

I have a Samsung Galaxy Tab (Android 2.2 Froyo, etc) that I am developing on. I need the application to access a stream via IP, and the stream is originating from an Axis 241Q. The Axis is set to use .mp4 encoding, and I see that Android supports .mp4 natively. The Axis server also provides an RTSP URI to access the stream from media players via the local network.
Let me lead in to this by saying that I know less than nothing about video encoding standards and containers, so I apologize if this is a "no duh" issue.
My question is, how do I get to this stream using an Android VideoView? The Cliff's Notes version of the code I would use to start up the view in my Activity's onCreate():
VideoView v = (VideoView) findViewById(R.id.feed);
Uri video = Uri.parse("rtsp://local/path/to/feed.amp");
v.setVideoURI(video);
v.start();
I've used this with some test .3gp stream URIs that I've found on the internet and it seems to work fine, but all of the test streams I found were served over HTTP rather than RTSP, so maybe I have to do a little more magic to get RTSP going; I can't imagine why that'd be, though. I do know that Android supports RTSP in URI string resources for its MediaPlayers. Then again, I know nothing about streaming video, so I may be wrong in assuming that it works the exact same way.
Regardless, when I attempt to access the Axis feed locally, the feed will not load; my assumption is that this results from the use of the .amp extension instead of the ones listed in the Android docs, but I have absolutely no idea. I can pass this URI to QuickTime and other media players with positive results, so I'm also assuming that the .amp file extension isn't THAT bizarre. I've had a hard time really finding out, because Googling ".amp" with anything else, even using quotes and whatnot, yields a tedious set of results because of "amp" showing up in HTML escape characters.
The first question is, am I missing something obvious? I'm thinking not but there's a good chance that it's so.
The second question: is there a simple way to access this RTSP stream without having to brew up an insane solution on my own? Any existing and FREE libraries that are already in the wild and could make this easier on me would be a huge help. I was initially going to try out the gstreamer java bindings but after looking at the project page I saw that gstreamer relies on Swing and I don't believe Swing is included in the Android Java jars.
Can you provide the MPEG4 configuration of the stream?
The .amp extension can be replaced with .3gp on all Axis products, so try Uri.parse("rtsp://local/path/to/feed.3gp");. That said, the extension shouldn't make any difference in RTSP, because the media stream is determined by the SDP and not by the "extension": the URI could end in media.jpg and the server would still stream H.264 video, not a JPEG image.
If that doesn't work, try to configure your MPEG4 stream and make sure you check "ISMA compliant" and set the video object type to SIMPLE (not Advanced Simple). That stream can then be played on all media players that decode MPEG4.
If you have difficulties, comment here, and I will update my answer to add new stuff.
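As a quick way to test this, you could attach an error listener to the VideoView so failures show up in logcat instead of failing silently; a sketch reusing the question's placeholder URI with the extension swapped as suggested above:
VideoView v = (VideoView) findViewById(R.id.feed);
v.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // Error codes are MediaPlayer.MEDIA_ERROR_* constants.
        Log.e("RtspTest", "Playback error: what=" + what + " extra=" + extra);
        return true; // suppress the generic "Can't play this video" dialog
    }
});
v.setVideoURI(Uri.parse("rtsp://local/path/to/feed.3gp")); // .amp replaced with .3gp
v.start();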
