I'm trying to stream audio and video from Google Glass to a browser. The browser just has to receive the video and audio.
I compiled the WebRTC source code following the instructions here: http://www.webrtc.org/native-code/android.
So far, it works. But I'm having an issue with the video: it displays in grayscale, and I'm not sure what changes I should make to the source code to fix this.
Here is a screenshot of the problem:
I found two related questions on stackoverflow.com, but neither gave me the solution:
VP8 Encoding results in grayscale image on Google Glass
VP8 encode/decode on android results in black and white image with red, green and blue squares
Thanks very much for any help that you can provide!
Per the first link you gave, you likely need to compensate for a bug in the camera code for Glass. The image capture code probably thinks it's getting YV12 and is actually getting NV21, so the simplest thing to do is to convert NV21 to something else (like I420, the common internal video representation). Alternatively, change the frame objects to say they're NV21 and let the rest of the code handle it.
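If you take the conversion route, a minimal sketch of the NV21-to-I420 shuffle might look like this (assuming even width and height; the method name is mine):

    // NV21: Y plane followed by interleaved V/U samples.
    // I420: Y plane, then the U plane, then the V plane.
    public static byte[] nv21ToI420(byte[] nv21, int width, int height) {
        int ySize = width * height;
        int chromaSize = ySize / 4;
        byte[] i420 = new byte[ySize + 2 * chromaSize];
        System.arraycopy(nv21, 0, i420, 0, ySize);                  // Y plane is identical
        for (int i = 0; i < chromaSize; i++) {
            i420[ySize + i]              = nv21[ySize + 2 * i + 1]; // U (NV21 stores V first)
            i420[ySize + chromaSize + i] = nv21[ySize + 2 * i];     // V
        }
        return i420;
    }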
In my Android application, I need to display H.264 streams from a GrandStream IP camera. I saw some topics about decoding H.264 frames with MediaCodec in Android, but I really don't know where to start.
Before researching this topic, I thought that there were plenty of open source libraries for that purpose, but it seems there are not!
Can you show me where to start? Should I use Android's MediaCodec, or is there an open source Java library for that?
You can refer to this site; it has a very thorough discussion and samples about Android's MediaCodec.
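For orientation, the basic MediaCodec decode loop tends to look like the sketch below. This is a minimal sketch assuming API 21+, that surface comes from your SurfaceView, and that nalUnit/ptsUs are complete H.264 access units you have already extracted:

    // Configure a hardware H.264 decoder that renders straight to a Surface.
    MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
    decoder.configure(format, surface, null, 0);
    decoder.start();

    // Feed one complete access unit per input buffer.
    int inIndex = decoder.dequeueInputBuffer(10000);
    if (inIndex >= 0) {
        ByteBuffer in = decoder.getInputBuffer(inIndex);
        in.clear();
        in.put(nalUnit);
        decoder.queueInputBuffer(inIndex, 0, nalUnit.length, ptsUs, 0);
    }

    // Drain decoded frames; 'true' releases the frame to the Surface for display.
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int outIndex = decoder.dequeueOutputBuffer(info, 10000);
    if (outIndex >= 0) {
        decoder.releaseOutputBuffer(outIndex, true);
    }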
I am trying to develop a simple application that shows the video stream from an IP camera in a SurfaceView.
I am totally new to video decoding/encoding. In the last few days I have read a lot about the MediaCodec API and how to implement it, but I cannot find the right way. I still have to fully understand how the buffers work, how to depacketize the RTP packets from UDP, and how to pass each frame to MediaCodec.
I have a couple of Sony EP521 IP cameras. From the CGI command manual I gather that the cameras support an MPEG-4/H.264 HTTP bit stream ("GET /h264...", to which the camera responds with raw H.264 data) or an RTP (UDP) bit stream.
My problem is that I do not know where to start:
Which is the "best" way to implement this? (with best I mean the most reliable/correct but still easy way)
Should I use HTTP bit stream or RTP?
Is MediaCodec strictly needed, or can I implement this another way? (E.g., does the android.media.MediaPlayer class already support raw H.264 over RTP? I don't know whether it actually does.)
How can I extract the video data from an HTTP bit stream?
I know that there are a lot of similar questions, but none seems to fully answer my doubts.
The cameras also support MJPEG. That would be easier to implement, but for the moment I do not want to use MJPEG encoding.
Here the Camera CGI manual: http://wikisend.com/download/740040/G5%20Camera%20CGI%20manual.pdf
Thank you, and sorry if this has already been discussed.
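For what it's worth, the RTP depacketization this question asks about mostly comes down to reassembling FU-A fragments per RFC 3984. A minimal sketch, with feedDecoder() as a hypothetical hand-off to MediaCodec:

    // 'payload' is an RTP payload with the RTP header already stripped.
    // Only types 1-23 (single NAL unit) and 28 (FU-A) are handled here.
    ByteArrayOutputStream nal = new ByteArrayOutputStream();

    void onRtpPayload(byte[] payload) throws IOException {
        int type = payload[0] & 0x1F;
        if (type == 28) {                                  // FU-A fragment
            boolean start = (payload[1] & 0x80) != 0;
            boolean end   = (payload[1] & 0x40) != 0;
            if (start) {
                nal.reset();
                nal.write(new byte[] {0, 0, 0, 1});        // Annex-B start code
                // Rebuild the original NAL header from FU indicator + FU header.
                nal.write((payload[0] & 0xE0) | (payload[1] & 0x1F));
            }
            nal.write(payload, 2, payload.length - 2);     // append fragment body
            if (end) feedDecoder(nal.toByteArray());       // complete NAL unit
        } else {                                           // single NAL unit packet
            byte[] unit = new byte[payload.length + 4];
            unit[3] = 1;                                   // 00 00 00 01 start code
            System.arraycopy(payload, 0, unit, 4, payload.length);
            feedDecoder(unit);
        }
    }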
I am streaming live video from my camera on my android phone to my computer using the MediaRecorder class.
    recorder.setCamera(mCamera);
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    recorder.setOutputFile(uav_UDP_Client.pfd.getFileDescriptor()); // write to the socket's fd, not a file
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.prepare();
    recorder.start();
That's the basic idea. I would like to show this stream in real time. My plan is to use FFmpeg to turn the latest frame into a .bmp and show the .bmp in my C# program every time there is a new frame.
The problem is that there is no header until I stop the recording: the 3GP/MP4 container only writes its MOOV header when recording stops, so FFmpeg cannot parse the stream while it is still live. I've looked at spydroid and at using RTP, but I do not want to use that method for various reasons.
Any ideas on how I can do this easily?
You can consider streaming an MPEG-2 TS and playing it back on your screen, or you can stream the H.264 data over RTP and use a client to decode and display it.
In Android, there is a sample executable that performs RTP packetization of an H.264 stream and sends it over the network. You can find more details about MyTransmitter in this file, which could serve as a good reference for your solution.
Additional Information
From Android 4.2 onwards, the framework supports a similar feature called Miracast or Wi-Fi Display, standardized by the Wi-Fi Alliance; that is a slightly more complex use case.
I've seen plenty of info about how to stream video from a server to an Android device, but not much about the other way, à la Qik. Could someone point me in the right direction here, or give me some advice on how to approach this?
I have hosted an open-source project that turns an Android phone into an IP camera:
http://code.google.com/p/ipcamera-for-android
Raw video data is fetched from a LocalSocket, and the MP4 MDAT/MOOV boxes are checked before streaming. The live video is packed in FLV format and can be played in a Flash video player via the built-in web server :)
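For context, the LocalSocket trick is to hand MediaRecorder a socket file descriptor instead of a file, then skip past the MP4 box headers to reach the raw stream. A rough sketch (the socket name is arbitrary, error handling omitted):

    // Pipe MediaRecorder's output through a LocalSocket pair.
    LocalServerSocket server = new LocalServerSocket("camera_stream");
    LocalSocket sender = new LocalSocket();
    sender.connect(new LocalSocketAddress("camera_stream"));
    LocalSocket receiver = server.accept();

    recorder.setOutputFile(sender.getFileDescriptor());
    // ... configure, prepare() and start() as usual ...

    // Scan the incoming bytes for the 'mdat' box; what follows is the raw stream.
    InputStream in = receiver.getInputStream();
    byte[] mdat = {'m', 'd', 'a', 't'};
    int c, matched = 0;
    while (matched < 4 && (c = in.read()) != -1) {
        matched = (c == mdat[matched]) ? matched + 1 : (c == 'm' ? 1 : 0);
    }
    // From here, in.read(...) yields the raw video data to repackage (e.g. into FLV).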
Took me some time, but I finally managed to make an app that does just that. Check out the Google Code page if you're interested: http://code.google.com/p/spydroid-ipcamera/
I added loads of comments in my code (mainly, look at CameraStreamer.java), so it should be pretty self-explanatory.
The hard part was actually understanding RFC 3984 and implementing a proper algorithm for the packetization process. (That algorithm turns the MPEG-4/H.264 stream produced by MediaRecorder into a proper RTP stream, according to the RFC; see the sketch below.)
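As an illustration of that packetization step, FU-A fragmentation of one large NAL unit might be sketched like this (maxPayload and the method name are my own; the RTP headers themselves are added elsewhere):

    void packetize(byte[] nal, int maxPayload, List<byte[]> out) {
        if (nal.length <= maxPayload) {        // fits in a single NAL unit packet
            out.add(nal);
            return;
        }
        byte nalHeader = nal[0];
        byte fuIndicator = (byte) ((nalHeader & 0xE0) | 28);     // type 28 = FU-A
        int offset = 1;                        // the original NAL header is not copied
        while (offset < nal.length) {
            int chunk = Math.min(maxPayload - 2, nal.length - offset);
            byte fuHeader = (byte) (nalHeader & 0x1F);
            if (offset == 1)                  fuHeader |= 0x80;  // S (start) bit
            if (offset + chunk == nal.length) fuHeader |= 0x40;  // E (end) bit
            byte[] pkt = new byte[chunk + 2];
            pkt[0] = fuIndicator;
            pkt[1] = fuHeader;
            System.arraycopy(nal, offset, pkt, 2, chunk);
            out.add(pkt);
            offset += chunk;
        }
    }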
Bye
I'm looking into this as well, and while I don't have a good solution for you, I did manage to dig up SIPDroid's video code:
http://code.google.com/p/sipdroid/source/browse/trunk/src/org/sipdroid/sipua/ui/VideoCamera.java
I've built an open-source SDK called Kickflip to make streaming video from Android a painless experience.
The SDK demonstrates use of Android 4.3's MediaCodec API to direct the device hardware encoder's packets directly to FFmpeg for RTMP (with librtmp) or HLS streaming of H.264 / AAC. It also demonstrates realtime OpenGL Effects (titling, chroma key, fades) and background recording.
Thanks SO, and especially, fadden.
Here is a complete article about streaming Android camera video to a web page:
Android Streaming Live Camera Video to Web Page
libstreaming is used in the Android app.
On the server side, Wowza Media Engine is used to decode the video stream.
Finally, JW Player is used to play the video on the web page.
I am able to send live camera video from my mobile to my server using this link. There is a sample application at that link; you just need to set your service URL in RecordActivity.class. For example:
ffmpeg_link="rtmp://yourserveripaddress:1935/live/venkat";
You can send both H.263 and H.264 video using that approach.
Check out the Yasea library:
Yasea is an Android streaming client. It encodes YUV and PCM data from the camera and microphone to H.264/AAC, encapsulates it in FLV, and transmits over RTMP.
Features:
Android min API 16.
H.264/AAC hard encoding.
H.264 soft encoding.
RTMP streaming with state callback handler.
Portrait and landscape dynamic orientation.
Front and back cameras hot switch.
Recording to MP4 while streaming.
Mux (my company) has an open source Android app that streams RTMP to a server, including setting up the camera and handling user interactions. It's built to stream to Mux's live streaming API but can easily stream to any RTMP entry point.
Depending on your budget, you can use a Raspberry Pi camera that sends images to a server. Here are two tutorials with many more details:
This tutorial shows you how to use a Raspberry Pi camera and display the images on an Android device.
This is the second tutorial, where you can find a series of tutorials about real-time video streaming between a camera and an Android device.