I'm a beginner with Android, but I really need your help.
I have a project that involves broadcasting streaming video. I can't find a good sample that shows how to record video from the camera, upload the stream to a server, and download (play back) the stream from the server in a player.
Please help me with these questions.
Thanks.
You can try the sipdroid open-source project:
https://code.google.com/p/sipdroid/source/checkout
I have a source that transmits video in H264 format (in this case it is Colasoft Packet Player, which transmits a video stream to IP:PORT), and I want to be able to listen for and receive the video stream in my Android app.
I've read a lot across the internet about Socket, DatagramSocket and AudioManager, but I'm totally confused about what exactly I need and how to implement it.
What I need is to be able to capture the video frame by frame.
I would love to get some help.
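If you do end up reading the raw bytes from the socket yourself, note that an H.264 elementary stream in Annex-B framing delimits NAL units with `00 00 01` / `00 00 00 01` start codes, and splitting on those is the first step toward frame-by-frame handling. Here is a minimal plain-Java sketch (`AnnexBSplitter` is an illustrative name, and whether your source actually uses Annex-B framing is an assumption you'd need to verify):

```java
import java.util.ArrayList;
import java.util.List;

class AnnexBSplitter {
    // Splits a raw H.264 Annex-B byte stream into NAL units by scanning for
    // 00 00 01 / 00 00 00 01 start codes; the start codes themselves are stripped.
    static List<byte[]> splitNalUnits(byte[] stream) {
        List<Integer> codeStarts = new ArrayList<>();
        List<Integer> payloadStarts = new ArrayList<>();
        int i = 0;
        while (i + 3 <= stream.length) {
            if (stream[i] == 0 && stream[i + 1] == 0 && stream[i + 2] == 1) {
                codeStarts.add(i);
                payloadStarts.add(i + 3);
                i += 3;                       // skip the 3-byte start code
            } else if (i + 4 <= stream.length
                    && stream[i] == 0 && stream[i + 1] == 0
                    && stream[i + 2] == 0 && stream[i + 3] == 1) {
                codeStarts.add(i);
                payloadStarts.add(i + 4);
                i += 4;                       // skip the 4-byte start code
            } else {
                i++;
            }
        }
        List<byte[]> nals = new ArrayList<>();
        for (int n = 0; n < codeStarts.size(); n++) {
            int from = payloadStarts.get(n);
            int to = (n + 1 < codeStarts.size()) ? codeStarts.get(n + 1) : stream.length;
            byte[] nal = new byte[to - from];
            System.arraycopy(stream, from, nal, 0, nal.length);
            nals.add(nal);
        }
        return nals;
    }

    public static void main(String[] args) {
        // Two NAL units: an SPS-like (0x67 ...) and a PPS-like (0x68 ...) payload.
        byte[] stream = {0, 0, 0, 1, 0x67, 0x42, 0, 0, 1, 0x68, (byte) 0xCE};
        System.out.println(splitNalUnits(stream).size()); // prints 2
    }
}
```

Decoding the resulting NAL units into displayable frames is a separate step (e.g. feeding them to a hardware decoder), which is where the library suggestions below come in.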
You can use the VLC Android library
And here is an explanation of how to embed it into your app.
You can let ffmpeg do this job.
Have a look at:
https://github.com/alphons/SegmentationStreaming
I have successfully streamed RTSP live video from a Hikvision IP camera to my Android app. Now I want to record the streamed video in the mobile app itself. How would I do it? Can I have some guidance?
I solved the issue myself. I was able to stream the RTSP feed in my app using pedroSG94/vlc-example-streamplayer, but I was unable to record the stream in my Android app. After a week of trial and error with different techniques, I finally found the official vlc-android compiled SDK posted by VLC themselves. This library has both streaming and recording features, which helped me accomplish my goal. Thank you, VLC.
The VLC library for Android:
'org.videolan.android:libvlc-all:3.3.14'
Use this library for streaming and recording RTSP video.
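For reference, that dependency goes in the app module's Gradle build file (version 3.3.14 as quoted in the answer; newer releases may exist):

```groovy
// app/build.gradle
dependencies {
    implementation 'org.videolan.android:libvlc-all:3.3.14'
}
```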
I have used the javacv library to stream live video from a phone to a Wowza media server, and it works fine. I then tried to record the live video on the media server in order to broadcast it to multiple devices, but I can hear only audio in the recorded video file.
Please let me know where I have gone wrong, and the steps for re-streaming the video to multiple devices. I'm new to the media server platform.
Thanks.
Hey, I've been struggling with a really hard problem for a few days now; I hope you guys can help me.
My Android app records videos and uploads them to my WCF RESTful server, and the server streams the files back. Everything works well in Firefox and Chrome: the video streams without any problems.
On the same Android app that recorded the video there is a problem: short videos (under a minute) stream without any trouble, yet when the video gets longer, MediaPlayer doesn't seem to load it.
I have tried changing the MediaRecorder profile's fileFormat and videoCodec with no change. I've read that MediaPlayer only supports MPEG-4 and 3GP video streamed over HTTP, and I've tried all the combinations:
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
CamcorderProfile profile = CamcorderProfile.get(CamcorderProfile.QUALITY_LOW);
profile.fileFormat = MediaRecorder.OutputFormat.MPEG_4;    // == 2
profile.videoCodec = MediaRecorder.VideoEncoder.MPEG_4_SP; // == 3
mediaRecorder.setProfile(profile);
I'm pretty sure my server handles the streaming well, since it works in browsers; other formats like 3GP audio and MP3 stream fine in the app. Only the video isn't streaming.
Can somebody put me on the right track here?
Thanks!
Well, after tampering around I finally found the cause of all this.
It seems the Android MediaRecorder appends the moov atom to the end of the recorded video, yet to stream the file with MediaPlayer over HTTP, the moov atom must be at the beginning.
So there you go, for anybody struggling with recorded videos not streaming back.
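To check whether a given MP4 suffers from this, you can walk the file's top-level boxes (each box starts with a 4-byte big-endian size followed by a 4-byte type tag) and see whether `moov` or `mdat` comes first. A minimal plain-Java sketch (`MoovCheck` and its helpers are illustrative names; it ignores 64-bit extended box sizes), and note that remuxing a finished file with ffmpeg's `-movflags +faststart` option is one common way to move the moov atom to the front:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

class MoovCheck {
    // Walks top-level MP4 boxes; true if 'moov' appears before 'mdat'.
    static boolean moovBeforeMdat(byte[] mp4) {
        int pos = 0;
        while (pos + 8 <= mp4.length) {
            long size = ByteBuffer.wrap(mp4, pos, 4).getInt() & 0xFFFFFFFFL;
            String type = new String(mp4, pos + 4, 4, StandardCharsets.US_ASCII);
            if (type.equals("moov")) return true;
            if (type.equals("mdat")) return false;
            if (size < 8) break;          // malformed or 64-bit extended size: stop
            pos += (int) size;
        }
        return false;
    }

    // Builds a box of the given four-char type followed by `payload` zero bytes.
    static byte[] box(String type, int payload) {
        ByteBuffer b = ByteBuffer.allocate(8 + payload);
        b.putInt(8 + payload).put(type.getBytes(StandardCharsets.US_ASCII));
        return b.array();
    }

    public static void main(String[] args) {
        // MediaRecorder typically writes: ftyp, mdat (the frames), then moov last.
        byte[] file = new byte[48];
        System.arraycopy(box("ftyp", 8), 0, file, 0, 16);
        System.arraycopy(box("mdat", 8), 0, file, 16, 16);
        System.arraycopy(box("moov", 8), 0, file, 32, 16);
        System.out.println(moovBeforeMdat(file)); // prints false: not HTTP-streamable
    }
}
```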
In my current project, we want to preview an external camera on an Android device.
We have connected the camera through USB, and we are receiving a video stream in our Android app.
How can I play the stream with the MediaPlayer class?
The camera is able to send any video format we want.
Thanks, guys!
PS: If you have any other idea for previewing video from an external camera, I would be very happy :)
If it helps someone, we found a solution:
We couldn't make the video work over USB, but we send the video stream over Wi-Fi using the RTSP protocol.
Have fun !