I'm using the yasea library to live-stream to YouTube. Whenever I publish the stream, the YouTube Live Control Room shows that the stream is LIVE.
However, when I try to play/watch the live video, playback fails.
I have also enabled DVR and RecordFromStart, but when I try to play the video after I finish the stream, playback fails again.
I don't understand what the problem is, since the streaming status is good. Why can't the video be played? In Logcat I'm receiving a lot of logs saying how much audio is packed and sent, but there aren't any logs for video. Is that normal?
This question has been viewed 51 times and no one has answered it. Since I have solved the problem myself, I'm answering it in the hope that it will be helpful for someone.
In Logcat I'm receiving a lot of logs saying how much audio is packed and sent, but there aren't any logs for video. Is that normal?
Yes, this is normal; there is no logging for video encoding.
I wasn't able to play the video for two reasons:
1. I had set the stream's privacy mode to Private.
2. My connection speed was too slow; it couldn't send the encoded video chunks to the YouTube server fast enough.
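The second cause can be checked with simple arithmetic: the uplink must carry the combined audio and video bitrate plus protocol overhead. A minimal sketch, assuming illustrative bitrates and a 1.5x headroom factor (neither value comes from yasea or YouTube):

```java
// Sketch: decide whether an upload link can sustain a live stream.
// The bitrates and the 1.5x headroom factor are illustrative assumptions.
public class BandwidthCheck {
    /** Returns true if uploadKbps leaves enough headroom for the stream. */
    static boolean canSustain(int videoKbps, int audioKbps, int uploadKbps) {
        double required = (videoKbps + audioKbps) * 1.5; // headroom for overhead and jitter
        return uploadKbps >= required;
    }

    public static void main(String[] args) {
        // A 1200 kbps video + 128 kbps audio stream over a 1000 kbps uplink:
        System.out.println(canSustain(1200, 128, 1000)); // too slow: audio gets through, video backs up
        System.out.println(canSustain(1200, 128, 4000)); // fast enough
    }
}
```

This matches the symptom above: the small audio packets keep flowing, while the much larger video chunks queue up and never reach YouTube.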
Related
I need to stream RTSP video from an IP camera on the local network to my Android app. It's very easy to use a VideoView and play it as a URL, or a SurfaceView and play the stream on it with the native MediaPlayer. But when I stream that way, I get a 6-second delay while my phone buffers the video. From what I've read, there is no way to change the buffer size of MediaPlayer. But I have seen several apps that stream video from my camera in almost real time. I've read a lot about this, since I'm not the first one to encounter this problem, but didn't find any useful info.
Many thanks for any help!
I'm using vlc-android; it works well for playing my cameras' RTSP links:
https://github.com/mrmaffen/vlc-android-sdk#get-it-via-maven-central
The delay is about 1 second.
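Much of that latency reduction comes from shrinking libVLC's network cache. A minimal sketch of building the initialization options, using libVLC's `--network-caching` (buffer length in milliseconds) and `--rtsp-tcp` switches; the 150 ms value is an assumption to tune per network:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: build a libVLC option list that keeps RTSP latency low.
// The 150 ms cache used in practice is an assumption, not a vlc-android default.
public class VlcOptions {
    static List<String> lowLatencyOptions(int networkCachingMs) {
        List<String> opts = new ArrayList<>();
        opts.add("--rtsp-tcp");                            // RTSP over TCP: more robust through NAT
        opts.add("--network-caching=" + networkCachingMs); // smaller cache = lower delay
        return opts;
    }
}
```

On Android these options would typically be passed to the `LibVLC` constructor when creating the player.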
I have never worked on a video-related project before, but now we have to.
1. What we tried to do
Build an Android application that can capture a real-time stream of video and audio.
Send the captured stream to a server.
Let other clients (Android, iOS, or HTML5) view these streams.
All three of the above steps should work at the same time.
Video streamed to the server should be cached for future playback.
2. What I know at the moment
I have searched Google and Stack Overflow to see whether someone has had the same requirement.
From that, I now know a little about video transmission:
Protocol:
RTSP/RTP/RTCP
RTSP: controls the state of the transmission (PLAY, PAUSE, STOP, ...)
RTP: does the actual transport job
RTCP: works in conjunction with RTP (synchronizes the stream)
HTTP:
1) Download small pieces of the video file and play them, using `Range` requests to control the download (playback) position.
2) HLS by Apple. Even though it is called live streaming, it is based on an `.m3u8` playlist file; the live behavior comes from continually updating the playlist.
RTMP by Adobe.
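To make the HLS point above concrete: a player keeps re-fetching the `.m3u8` playlist and downloads the media segments it lists. A minimal sketch of extracting segment URIs, assuming a simplified playlist (real playlists carry many more tags):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: the core of HLS playback is re-reading the .m3u8 playlist and
// fetching the media segments it lists. Non-tag, non-blank lines are URIs.
public class M3u8 {
    static List<String> segmentUris(String playlist) {
        List<String> uris = new ArrayList<>();
        for (String line : playlist.split("\n")) {
            line = line.trim();
            if (!line.isEmpty() && !line.startsWith("#")) {
                uris.add(line);
            }
        }
        return uris;
    }

    public static void main(String[] args) {
        String playlist = "#EXTM3U\n#EXT-X-TARGETDURATION:10\n"
                + "#EXTINF:9.0,\nseg001.ts\n#EXTINF:9.0,\nseg002.ts\n";
        System.out.println(segmentUris(playlist)); // [seg001.ts, seg002.ts]
    }
}
```

For a live stream the server appends new segments and the client polls the playlist again, which is what makes HLS "live" on top of plain HTTP.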
Encoding:
I know nothing about this yet.
It seems that RTSP/RTP/RTCP can be used both for uploading to the server and for playing on the client, so it suits applications that need low latency. However, since RTSP/RTP/RTCP runs over TCP/UDP, getting through routers (NAT) can be a problem.
HTTP, on the other hand, can only be used for playback on the client (technically you could upload small pieces of a file over HTTP, but I don't think that is a good idea), so it can be used to serve an existing video stream, from a file or otherwise. You also don't have to worry about routers, which means it works in complex network environments.
For our application, since we do not have a strict low-latency requirement for playback, we plan to stream the video from the Android client to the server via RTSP/RTP/RTCP and serve those streams via HTTP.
3. Questions:
Is anything wrong in any of the above?
Is my idea feasible: streaming via RTSP/RTP/RTCP and serving via HTTP?
If yes, it seems that the server should do some work to cache and convert the video into a proper format for serving. I am not sure whether an off-the-shelf video server can do this job, or whether I have to do it myself.
What more should I know about streaming development (at least for my current project)? Any tutorials are welcome.
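On the caching/serving question: one common off-the-shelf approach is to let ffmpeg ingest the incoming RTSP feed and repackage it as HLS segments that any plain web server can serve over HTTP. A sketch, where the RTSP URL and the output path are placeholders:

```shell
# Ingest an RTSP stream and repackage it as HLS for plain HTTP serving.
# rtsp://example.com/live and the output path are placeholders.
# -c copy avoids re-encoding; -hls_time sets segment length in seconds.
ffmpeg -i rtsp://example.com/live \
       -c copy \
       -f hls -hls_time 6 -hls_list_size 10 \
       /var/www/html/stream.m3u8
```

Because `-c copy` only remuxes, the server does no transcoding; if the clients need different bitrates, re-encoding would have to be added.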
First of all, sorry for my poor English.
I am developing an application to monitor and gather statistics on YouTube videos as they are played. I have developed an Android application with an embedded YouTube video player, and I want to capture all events that happen while the video is playing. I know how to get some events, such as when it has been stopped, when it is buffering, or when it stops buffering.
The problem is that it would be very interesting for me to get statistics about the bytes that are loaded into the buffer every time I check it, but I don't know how to do that. I have checked the YouTube Player API, and I know it is possible with JavaScript, but the API they offer for Android doesn't have this option.
Do you have any idea how I can monitor the buffer while a video is playing?
Thank you!
I'm trying to create an app to stream live TV. Currently the problem I'm facing is that after, say, 10 minutes of playing, the video freezes but the audio carries on. This is on a 1.3 Mbps stream. I also have lower-bitrate streams, such as a 384 kbps stream, that might last an hour or so but will still do the same. I've tested this with a high-quality local video (file size 2.3 GB) that has no lag and doesn't freeze at all, so it must be something to do with the way HLS is streamed to Android.
Does anyone have any idea on how to solve this problem?
Thanks
I am using YouTube's RTSP link to demo some capability to a client. Occasionally the MediaPlayer fails to display any video (blank screen), although I am still able to hear the sound. Can someone help me understand what is going wrong?
Thanks for the help in advance
I don't know much about YouTube's RTSP streams, but I have had similar problems with YouTube not loading the video (while the sound is OK) when the playback quality is changed. I bet that is where your problem lies. I would make sure you specify the playback quality, and request that feed before loading it into your player, if at all possible.
The Android emulator doesn't work with streaming RTSP video, so I suggest you test your streaming application on a real device.
Hope this is useful for you.