I just want to create an app that displays live CCTV footage from an NVR over a P2P connection, but I can't find any details about what data I will actually receive, so I don't know how to handle it. This is my first time using a P2P connection and working with CCTV cameras.
My question is: what does this data typically look like? I'm not going to name a specific company, but for the majority of them, what should it be?
I am using Flutter/Dart. I tried searching for docs, but all I find is apps that already do this; I want to understand the mechanism.
Thanks in advance.
You will need to decode the video stream and display the images on the screen. You can try video_player, or a lower-level library such as flutter_ffmpeg, in order to decode the video stream and render it.
I want to stream images from one video-capture Android application to another Android application in real time on the same phone. The former video application records and processes images. I would like to stream these images as they are created directly into the second Android application. I have control over the implementation of both applications and can make the video-capture Android app run as a service.
I have found a lot of support online for sending video or images between applications via Intents or broadcasts. However, these methods don't work in real time and require the complete video file. Any direction at all would be greatly appreciated.
You can make a content provider (in the video-capture app) that allows other apps to open the content as an InputStream. The other app will read that InputStream. This should work more or less in real-time.
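The content-provider approach boils down to one app writing bytes into a stream while the other reads them as they arrive. On Android the plumbing is a ContentProvider handing out a ParcelFileDescriptor, but the streaming behavior itself can be sketched in plain Java with a pipe (the frame bytes here are fake placeholders, not real JPEG data):

```java
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

public class FrameStreamDemo {
    // Pushes three fake "frames" through a pipe and returns the total
    // bytes the consumer read. The producer thread stands in for the
    // video-capture app; the reading side stands in for the second app
    // consuming the InputStream it got from the content provider.
    static int streamFrames() throws Exception {
        PipedInputStream in = new PipedInputStream(64 * 1024);
        PipedOutputStream out = new PipedOutputStream(in);

        Thread producer = new Thread(() -> {
            try {
                for (int frame = 0; frame < 3; frame++) {
                    byte[] fakeJpeg = ("frame-" + frame).getBytes();
                    out.write(fakeJpeg);   // becomes readable immediately
                    out.flush();
                    Thread.sleep(10);      // simulate the capture interval
                }
                out.close();
            } catch (Exception ignored) { }
        });
        producer.start();

        // The consumer reads bytes as soon as they arrive --
        // no complete file is ever needed.
        byte[] buf = new byte[1024];
        int n, total = 0;
        while ((n = in.read(buf)) != -1) {
            total += n;
        }
        producer.join();
        return total;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("received " + streamFrames() + " bytes");
    }
}
```

Because the reader blocks until data is available, latency is bounded by how often the producer writes, which is why this works "more or less in real time".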
I already have an HTTP video stream coming from an IP camera connected to my Android hotspot. I want to publish that stream to a Red5 Pro server or another media service. How can I do that?
The red5pro SDK examples only use the Android device's own camera. I want to use an HTTP stream coming from an IP camera connected to my Android hotspot.
Using a player like VideoJS, set the tech order to prefer HTML5 over Flash and then ensure that you're using the appropriate URL for the stream.
An example URL pattern would look something like this:
http://server:5080/live/hls/streamName.m3u8
Note that it ends in .m3u8, has the right port and uses the context and app names.
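Assembling the pieces the answer lists (host, port, context/app name, stream name) can be sketched as a small helper — the host and stream name here are just the placeholders from the example above:

```java
public class HlsUrl {
    // Builds the HLS playlist URL from its parts: server host,
    // port (5080 in the example), the app/context name ("live"),
    // and the stream name, ending in .m3u8.
    static String playlistUrl(String host, int port, String app, String streamName) {
        return "http://" + host + ":" + port + "/" + app + "/hls/" + streamName + ".m3u8";
    }

    public static void main(String[] args) {
        System.out.println(playlistUrl("server", 5080, "live", "streamName"));
        // -> http://server:5080/live/hls/streamName.m3u8
    }
}
```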
For more information please see our documentation page. Please let me know if this helps you or if anything isn't clear.
https://www.red5pro.com/docs/
I am writing a custom camera application because I need to do real-time streaming and I need to access the byte array raw video data from a camera.
To simplify my work, I was wondering if I could get this raw data from the user's native camera application. I am guessing not because you must invoke it through an intent, and then you get a result. But, I need the real-time raw data, not a delayed result.
The reason for this request is that I just need to get the raw data and don't actually desire to do anything fancier than what the native camera can do.
If this is possible I'd be very grateful for any assistance!
Have you tried using setPreviewCallback()? It will continuously give you still images of the camera preview. The only caveat is that the frame rate will probably be very low, but I don't know your use case so it might be good enough.
I was wondering if I could get this raw data from the user's native camera application
Not in general. There are thousands of "native camera applications" for Android. None are required to publish any sort of real-time raw data feed, to the extent that is even possible given the additional layer of IPC overhead. It's entirely possible that somebody happens to have written a camera app that does this, and you're welcome to hunt around for it, but that may take a fair bit of time.
I want to develop an app, much like Snapchat, for sending video files to friends. Is there any third-party API to do this? What are the possible ways?
One way is using sockets. You could basically send how many bytes the file has and then transfer it to the other socket, where you will read it. You should take a look at this question.
Edit:
This solution doesn't need any third party; it uses only the Java API.
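The "send the byte count first, then the bytes" idea can be sketched like this with plain Java sockets. The loopback demo and the tiny payload are placeholders for a real connection and a real video file:

```java
import java.io.*;
import java.net.*;

public class FileTransfer {

    // Sender: write the byte count first, then the payload itself.
    static void send(OutputStream rawOut, byte[] payload) throws IOException {
        DataOutputStream out = new DataOutputStream(rawOut);
        out.writeLong(payload.length);   // tell the receiver how many bytes follow
        out.write(payload);
        out.flush();
    }

    // Receiver: read the count, then read exactly that many bytes.
    static byte[] receive(InputStream rawIn) throws IOException {
        DataInputStream in = new DataInputStream(rawIn);
        long length = in.readLong();
        byte[] payload = new byte[(int) length];
        in.readFully(payload);           // blocks until all bytes arrive
        return payload;
    }

    public static void main(String[] args) throws Exception {
        // Loopback demo: sender and receiver on the same machine.
        try (ServerSocket server = new ServerSocket(0)) {
            int port = server.getLocalPort();
            Thread sender = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", port)) {
                    send(s.getOutputStream(), "fake video bytes".getBytes());
                } catch (IOException ignored) { }
            });
            sender.start();
            try (Socket client = server.accept()) {
                byte[] data = receive(client.getInputStream());
                System.out.println("received " + data.length + " bytes");
            }
            sender.join();
        }
    }
}
```

For a real video file you would stream it in chunks rather than load it fully into memory, but the length-prefix framing stays the same.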
You can use sockets, but you can also use the PubNub API if you want to focus on your app and leave the networking to a third party.
I hope this helps.
Sockets are good for transferring files, but I think you cannot transfer large files that way. Also, your application would allow a lot of unwanted/unauthorized video to be transferred between users.
You should learn about the XMPP protocol and the Openfire server, which are widely used by applications like WhatsApp (not sure what they currently use).
This may help you, I think. Here is an example of transferring files using aSmack and an Openfire server: http://harryjoy.me/2012/08/18/file-transfer-in-android-with-asmack-and-openfire/
I'm creating an app where users can upload data and pictures, which are stored in my server, and those data can be accessed by other users.
How do I send/receive data and pictures to/from my server?
I am planning to provide a camera in the app; if so, how do I manage the size of the pictures for easy access in both directions?
You should be more specific about the technology you plan to use on the server side. Whatever you choose, try JSON as a simple protocol for your data transfer between server and device, and search for which framework to use on the Android side.
Here is an example with REST.
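One simple shape for such a JSON-over-REST upload is to Base64-encode the picture bytes into the request body. This is only an illustration — the field names ("user", "caption", "image") and the payload are made up, not any particular server's API:

```java
import java.util.Base64;

public class UploadPayload {
    // Builds a minimal JSON body for an upload request.
    // Real code would use a JSON library (e.g. Gson) instead of
    // string concatenation, and would escape the string fields.
    static String toJson(String user, String caption, byte[] imageBytes) {
        String encoded = Base64.getEncoder().encodeToString(imageBytes);
        return "{\"user\":\"" + user + "\","
             + "\"caption\":\"" + caption + "\","
             + "\"image\":\"" + encoded + "\"}";
    }

    public static void main(String[] args) {
        String body = toJson("alice", "sunset", new byte[] {1, 2, 3});
        System.out.println(body);
        // The body would then be POSTed to the server, e.g. with
        // HttpURLConnection and Content-Type: application/json.
    }
}
```

Note that Base64 inflates the image by about a third; for large pictures a multipart/form-data upload of the raw bytes is usually the better choice.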
The Android Ion library can help with sending images in both directions.
If you plan to use Camera Intent, you will have no control over the images you get back, so you will need to resize them on your side. If you use the Camera API (a.k.a. custom camera) in your app, you can choose any picture size supported by your device.
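The resize-on-your-side step might look like this. On Android you would use Bitmap.createScaledBitmap; this desktop-Java sketch uses BufferedImage instead, but the sizing logic (cap the longest side, keep the aspect ratio) is the same, and the 4000x3000 input and 1280 cap are arbitrary example values:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class Resizer {
    // Scales an image down so its longest side is at most maxSide,
    // preserving the aspect ratio. Images already small enough are
    // returned unchanged.
    static BufferedImage shrink(BufferedImage src, int maxSide) {
        int w = src.getWidth(), h = src.getHeight();
        int longest = Math.max(w, h);
        if (longest <= maxSide) {
            return src;
        }
        double scale = (double) maxSide / longest;
        int nw = (int) Math.round(w * scale);
        int nh = (int) Math.round(h * scale);
        BufferedImage dst = new BufferedImage(nw, nh, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = dst.createGraphics();
        g.drawImage(src, 0, 0, nw, nh, null); // scale while drawing
        g.dispose();
        return dst;
    }

    public static void main(String[] args) {
        BufferedImage big = new BufferedImage(4000, 3000, BufferedImage.TYPE_INT_RGB);
        BufferedImage small = shrink(big, 1280);
        System.out.println(small.getWidth() + "x" + small.getHeight());
        // -> 1280x960
    }
}
```

Doing this before upload keeps both the upload time and the server-side storage predictable for pictures coming from the Camera Intent.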