I already have an HTTP video stream coming from an IP camera on my Android hotspot. I want to publish that stream to a Red5 Pro server or another media service. How can I do that?
The Red5 Pro SDK examples only use the Android device's own camera. I want to use an HTTP stream coming from an IP camera connected to my Android hotspot.
Using a player like VideoJS, set the tech order to prefer HTML5 over Flash and then ensure that you're using the appropriate URL for the stream.
An example URL pattern would look something like this:
http://server:5080/live/hls/streamName.m3u8
Note that it ends in .m3u8, uses the correct port, and includes the context and app names.
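To avoid typos in hand-built URLs, the pieces can be assembled programmatically. A minimal sketch (the host, app, and stream names below are placeholders; the path segments follow the pattern above):

```java
// Builds an HLS playback URL following the pattern
// http://<host>:<port>/<app>/hls/<streamName>.m3u8
// Host, port, app, and stream name here are placeholder values.
public class HlsUrl {
    static String hlsUrl(String host, int port, String app, String streamName) {
        return String.format("http://%s:%d/%s/hls/%s.m3u8", host, port, app, streamName);
    }

    public static void main(String[] args) {
        // Prints: http://server:5080/live/hls/streamName.m3u8
        System.out.println(hlsUrl("server", 5080, "live", "streamName"));
    }
}
```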
For more information, please see our documentation page:
https://www.red5pro.com/docs/
Please let me know if this helps you or if anything isn't clear.
I just want to create an app that displays live CCTV through an NVR using a P2P connection, but I can't find any details about what data I will receive, so I don't know how to handle it. This is the first time I am using a P2P connection and dealing with CCTV cameras.
My question is: what should this data look like? I won't name a specific company, but for the majority of them, how does it look?
I am using Flutter/Dart. I tried searching for docs, but all I find are apps that do this; I want to understand the mechanism.
Thanks in advance.
You will need to decode the video stream and display the images on the screen. You can try video_player, or a lower-level library such as flutter_ffmpeg, to decode the stream and render it.
I'm developing an Android app where a user can start live streaming using his/her Android camera. I have AWS and GCP resources at hand. What I understand after some reading is:
1. I have to stream/publish whatever the Android camera is capturing to some server over some protocol (RTMP or HLS, etc.).
2. I have to set up a server that pulls this input source, then packages and stores it in a form that can be streamed/consumed in a mobile or web browser (basically, a URL), and I believe AWS's MediaLive, MediaPackage, etc. should do that.
3. I can use this URL as a MediaSource for players on Android (like ExoPlayer).
My problem is that I couldn't find good documentation on the 1st part. I found https://github.com/bytedeco/javacv, which doesn't appear to be production-level work. While trying out the 2nd part, creating a MediaLive channel on AWS, I was asked to point the channel to two destinations (I don't know what that means), which made me doubt my understanding of this process. I'm looking for a skeleton procedure with official documentation on how to achieve this.
EDIT 1:
For the input/publishing part, I'm experimenting with this answer: https://stackoverflow.com/a/29061628/3881561
EDIT 2:
I've used https://github.com/ant-media/LiveVideoBroadcaster to send the video source to an RTMP server. I've created an RTMP push input source in MediaLive and a channel with an Archive output (which stores .ts files in S3). Now that the flow is working, how can I modify this architecture to allow multiple users to create live streams?
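One common pattern for multiple publishers is to give each user a unique stream key and build a per-user RTMP ingest URL from it. A minimal sketch (the host and application name below are placeholders, not real MediaLive endpoint values):

```java
import java.util.UUID;

// Sketch: per-user RTMP ingest URLs built from unique stream keys.
// The host and application name are placeholders, not real MediaLive endpoints.
public class IngestUrls {
    // Each publisher gets a random, hard-to-guess stream key.
    static String newStreamKey() {
        return UUID.randomUUID().toString();
    }

    // Standard RTMP URL shape: rtmp://<host>:1935/<app>/<streamKey>
    static String ingestUrl(String host, String app, String streamKey) {
        return String.format("rtmp://%s:1935/%s/%s", host, app, streamKey);
    }

    public static void main(String[] args) {
        String keyA = newStreamKey();
        String keyB = newStreamKey();
        System.out.println(ingestUrl("ingest.example.com", "live", keyA));
        System.out.println(ingestUrl("ingest.example.com", "live", keyB));
    }
}
```

On the AWS side, each concurrent stream then needs its own ingest endpoint (or a shared input distinguished by stream key), depending on how the inputs and channels are provisioned.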
I need to implement a multi-track audio support feature in the sender application. This means that when a user casts a video/movie to the TV using Chromecast, the sender application should show all available audio tracks and let the user choose the desired track to cast.
On the developers website, I saw the Tracks API and its MediaTrack object, which can be used to configure a track:
https://developers.google.com/cast/docs/android_sender#tracks
What I don't understand is how to fetch all available audio tracks from the selected mp4 file. Can anyone point me in the right direction?
What will be the role of the receiver application in this?
I looked at the given reference sample app (https://github.com/googlecast/CastVideos-android). From that app, I understand how to set MediaTracks.
What I don't understand is how to extract the audio tracks from the mp4 file so that we can set them as MediaTracks.
Does this part need to be done in the receiver app?
Any help will be highly appreciated.
Thanks.
Currently, the Cast SDK has no support for multiple embedded audio tracks in an mp4 file.
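For completeness: even though the SDK won't switch embedded tracks, you can still enumerate them yourself. On Android, `MediaExtractor` (`getTrackCount()` / `getTrackFormat()`) is the usual way; conceptually, it comes down to walking the MP4 box tree and reading each track's `hdlr` handler type (`soun` for audio, `vide` for video). A minimal plain-Java sketch of that walk, for illustration only:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Sketch: walk an MP4 box tree and collect each track's handler type
// ("soun" = audio, "vide" = video). A real app would use MediaExtractor;
// this just illustrates where the track information lives in the file.
public class Mp4Tracks {
    private static final List<String> CONTAINERS = List.of("moov", "trak", "mdia");

    static List<String> handlerTypes(ByteBuffer buf) {
        List<String> out = new ArrayList<>();
        scan(buf, 0, buf.limit(), out);
        return out;
    }

    private static void scan(ByteBuffer buf, int start, int end, List<String> out) {
        int pos = start;
        while (pos + 8 <= end) {
            int size = buf.getInt(pos);              // 32-bit big-endian box size
            String type = fourCc(buf, pos + 4);      // 4-character box type
            if (size < 8 || pos + size > end) break; // malformed box, stop
            if (CONTAINERS.contains(type)) {
                scan(buf, pos + 8, pos + size, out); // recurse into container boxes
            } else if (type.equals("hdlr") && size >= 20) {
                // hdlr payload: version/flags (4) + pre_defined (4) + handler_type (4)
                out.add(fourCc(buf, pos + 16));
            }
            pos += size;
        }
    }

    private static String fourCc(ByteBuffer buf, int off) {
        byte[] b = new byte[4];
        for (int i = 0; i < 4; i++) b[i] = buf.get(off + i);
        return new String(b, StandardCharsets.US_ASCII);
    }

    // Helpers to build synthetic boxes for demonstration.
    static byte[] box(String type, byte[] payload) {
        ByteBuffer b = ByteBuffer.allocate(8 + payload.length);
        b.putInt(8 + payload.length);
        b.put(type.getBytes(StandardCharsets.US_ASCII));
        b.put(payload);
        return b.array();
    }

    static byte[] concat(byte[]... parts) {
        int len = 0;
        for (byte[] p : parts) len += p.length;
        ByteBuffer b = ByteBuffer.allocate(len);
        for (byte[] p : parts) b.put(p);
        return b.array();
    }
}
```

The matching `MediaTrack` objects would then be built on the sender from these results, one per `soun` entry.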
I am currently developing a media application for Android that can cast its media to a Google Chromecast.
This application is not meant to be published, so some of the unorthodox choices I have made (web server, local files, etc.) aren't part of the discussion.
The problem is that when I load a media stream to the Chromecast, or seek in the stream, nothing changes. The callback says it succeeds, but the movie keeps playing from the beginning (when I load with an initial position) and doesn't change when I perform remoteMediaPlayer.seek(XX).
The movie that is playing locally is a file on the SD card. At the same time, the phone runs a web server (KWS), and the Chromecast is fed with this URL. This works perfectly otherwise: I can play, pause, and change the movie playing. No problem there. But I do have the seek problem.
The reason I'm mentioning the web server is that when I use the URL "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4" from Google's samples, seeking works.
Could anyone help me figure out what is different when I host the file myself? I saw posts mentioning that if the stream doesn't have a duration, remoteMediaPlayer cannot seek in it. But it does: a call to remoteMediaPlayer.getStreamDuration() returns the actual media duration.
There seems to be something missing in how I host the files...
All help is appreciated.
I have now solved my problem. If anyone else has this or a similar problem, use a web server that supports HTTP progressive download. In my Android-specific case, this was an app called Boa Web Server.
I solved a similar problem with NanoHttpd; the solution was to add this response header:
Accept-Ranges: bytes
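The reason this header matters is that the Chromecast player seeks by issuing HTTP range requests; a server that ignores the `Range` header always serves the file from byte 0. A rough sketch of the byte-range handling a server needs (a simplified parser: it handles a single `start-end` or open-ended `start-` range only, not suffix or multi-part ranges):

```java
// Sketch: parse a "Range: bytes=start-end" request header and build the
// matching 206 Content-Range response header. Simplified: handles a single
// "start-end" or open-ended "start-" range only.
public class ByteRange {
    // Returns {firstByte, lastByte} (inclusive), or null if not a simple,
    // satisfiable range.
    static long[] parse(String rangeHeader, long fileLength) {
        if (rangeHeader == null || !rangeHeader.startsWith("bytes=")) return null;
        String spec = rangeHeader.substring("bytes=".length());
        int dash = spec.indexOf('-');
        if (dash <= 0) return null;                  // no suffix-range support
        long start = Long.parseLong(spec.substring(0, dash));
        long end = dash == spec.length() - 1
                ? fileLength - 1                     // open-ended: "start-"
                : Long.parseLong(spec.substring(dash + 1));
        if (start > end || end >= fileLength) return null;
        return new long[] {start, end};
    }

    static String contentRange(long[] range, long fileLength) {
        return String.format("bytes %d-%d/%d", range[0], range[1], fileLength);
    }
}
```

With NanoHttpd, the server would then answer with a 206 Partial Content response carrying that slice of the file, plus `Accept-Ranges: bytes` and the `Content-Range` header built above.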
I want to receive data over a serial connection via the audio jack, but I don't have any idea how to do this.
I found that the Square app reads data through this port, so does anyone know how to get the data, or are there any similar projects?
Romotive, a project that helps you make robots out of smartphones, uses the audio jack for data transfer, and I think the software is open source.
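The general mechanism these projects use is to treat the headphone/microphone lines as a low-rate modem: bits are encoded as audio tones on one side and demodulated from the recorded waveform on the other. A toy illustration of the idea (a naive FSK scheme with made-up parameters; real hardware also deals with noise, synchronization, and framing):

```java
// Toy sketch of audio-jack data transfer: encode bits as audio tones
// (a naive FSK scheme). 0 -> 1 kHz, 2 -> 2 kHz, one bit per 10 ms window
// at 44.1 kHz; decoding counts zero crossings per window. All parameters
// are illustrative, not taken from any real device.
public class AudioFsk {
    static final int SAMPLE_RATE = 44100;
    static final int SAMPLES_PER_BIT = 441;   // 10 ms per bit
    static final double F0 = 1000, F1 = 2000; // tone frequencies for 0 and 1

    static double[] encode(int[] bits) {
        double[] out = new double[bits.length * SAMPLES_PER_BIT];
        for (int i = 0; i < bits.length; i++) {
            double f = bits[i] == 0 ? F0 : F1;
            for (int j = 0; j < SAMPLES_PER_BIT; j++) {
                out[i * SAMPLES_PER_BIT + j] =
                        Math.sin(2 * Math.PI * f * j / SAMPLE_RATE);
            }
        }
        return out;
    }

    static int[] decode(double[] samples) {
        int nBits = samples.length / SAMPLES_PER_BIT;
        int[] bits = new int[nBits];
        for (int i = 0; i < nBits; i++) {
            int crossings = 0;
            for (int j = 1; j < SAMPLES_PER_BIT; j++) {
                double a = samples[i * SAMPLES_PER_BIT + j - 1];
                double b = samples[i * SAMPLES_PER_BIT + j];
                if ((a < 0) != (b < 0)) crossings++;
            }
            // 1 kHz -> ~20 crossings per 10 ms window, 2 kHz -> ~40; split at 30
            bits[i] = crossings > 30 ? 1 : 0;
        }
        return bits;
    }
}
```

On Android, the encoded samples would be played through AudioTrack and the incoming waveform captured with AudioRecord; the modulation/demodulation itself is the part sketched here.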