I am developing an Android application in which a specific video is played when the poster of a specific movie is shown in front of the camera. I have found many AR tutorials that just show a 3D object when a pattern is detected. I need some advice on making an application that can play a video in an AR view using the Android camera and the QCAR SDK.
I don't know QCAR, but of course you can put a SurfaceView on top of the existing SurfaceView of your camera preview and play a video in it.
What I would do is implement your poster-recognition logic first, since that will be tricky enough, and then, if it works reliably, add another SurfaceView in which the video is played.
You will find out that the camera preview surface actually has to be on top of the video surface.
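A minimal sketch of that layering, assuming a layout with two stacked SurfaceViews and a bundled clip (the Activity name and all resource ids are placeholders, and playback would really be started by the poster-recognition callback rather than immediately):

```java
import android.app.Activity;
import android.hardware.Camera;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

import java.io.IOException;

public class PosterPlayerActivity extends Activity {

    private Camera camera;        // legacy camera API, as used at the time of the question
    private MediaPlayer player;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.poster_player);  // assumed: FrameLayout with two SurfaceViews

        SurfaceView cameraView = (SurfaceView) findViewById(R.id.camera_surface);
        SurfaceView videoView = (SurfaceView) findViewById(R.id.video_surface);

        // The camera preview is drawn into the first surface.
        cameraView.getHolder().addCallback(new SurfaceHolder.Callback() {
            @Override
            public void surfaceCreated(SurfaceHolder holder) {
                try {
                    camera = Camera.open();
                    camera.setPreviewDisplay(holder);
                    camera.startPreview();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            }

            @Override
            public void surfaceDestroyed(SurfaceHolder holder) {
                if (camera != null) {
                    camera.release();
                }
            }
        });

        // The movie plays into the second surface; player.start() would be
        // triggered by the recognition callback, not here.
        videoView.getHolder().addCallback(new SurfaceHolder.Callback() {
            @Override
            public void surfaceCreated(SurfaceHolder holder) {
                player = MediaPlayer.create(PosterPlayerActivity.this, R.raw.trailer);
                player.setDisplay(holder);
            }

            @Override
            public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            }

            @Override
            public void surfaceDestroyed(SurfaceHolder holder) {
                if (player != null) {
                    player.release();
                }
            }
        });
    }
}
```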
I assume you are trying to play the video on a GL surface. It is possible by using a SurfaceTexture as the video output, but only ICS (4.0) and above devices support it. Vuforia has a very good sample of this here; you can decide what to do on lower devices, such as playing the video fullscreen.
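For reference, a minimal sketch of that SurfaceTexture path, assuming it runs inside an existing GL renderer on an API 14+ device (the class, method names, and R.raw.trailer are placeholders, not part of the Vuforia sample):

```java
import android.content.Context;
import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.view.Surface;

/** Sketch of the SurfaceTexture path: the decoder writes frames into a
 *  GL_TEXTURE_EXTERNAL_OES texture that the AR renderer can map onto the target. */
public class VideoTextureSketch {

    private SurfaceTexture surfaceTexture;
    private int textureId;

    /** Call on the GL thread once the GL context exists. */
    public MediaPlayer prepare(Context context) {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        textureId = tex[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);

        surfaceTexture = new SurfaceTexture(textureId);
        Surface surface = new Surface(surfaceTexture);

        MediaPlayer player = MediaPlayer.create(context, R.raw.trailer);
        player.setSurface(surface);        // requires API 14 (ICS)
        surface.release();
        return player;
    }

    /** Call once per frame, before drawing the quad registered on the poster target. */
    public void updateFrame() {
        surfaceTexture.updateTexImage();
        // the quad's fragment shader must declare:
        //   #extension GL_OES_EGL_image_external : require
        //   uniform samplerExternalOES videoTexture;
    }
}
```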
I am struggling with recording and editing videos. For editing, I found a useful Android library named ffmpeg4Android. However, I am still stuck on recording video. Here is what I want:
1/ Add text or images to the video while recording
2/ Add a filter while recording video
I found the GPUImage library for Android, but it only has examples related to taking a photo, not recording video.
Please let me know if you know of any approaches or libraries that can do this.
Thank you in advance!
Links which I read when researching:
Add overlay while record video on Android
How to Record video with GPUImage?
FFMpeg add text to actual video file after recording in Android
You can add an overlay image on the video using a blend filter.
You cannot add text directly, but you can draw the text onto the overlay image with a Canvas.
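For example, with the filters that ship in android-gpuimage, a hedged sketch of the text-on-bitmap-plus-blend idea could look like this (the helper class, caption, and sizes are placeholders, and the exact package of GPUImageAlphaBlendFilter depends on the library version):

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;

import jp.co.cyberagent.android.gpuimage.GPUImageAlphaBlendFilter; // package differs in newer GPUImage versions

/** Hypothetical helper: draws a caption onto a transparent bitmap and blends it over the frame. */
public final class TextOverlayFilterFactory {

    public static GPUImageAlphaBlendFilter create(String caption, int width, int height) {
        // Draw the text onto a transparent bitmap with a Canvas...
        Bitmap overlay = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        Canvas canvas = new Canvas(overlay);
        Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
        paint.setColor(Color.WHITE);
        paint.setTextSize(64f);
        canvas.drawText(caption, 40f, 120f, paint);

        // ...then use it as the second input of a blend filter.
        GPUImageAlphaBlendFilter blend = new GPUImageAlphaBlendFilter(1.0f); // overlay shown wherever it is opaque
        blend.setBitmap(overlay);
        return blend;
    }

    private TextOverlayFilterFactory() {}
}
```

You would then call gpuImage.setFilter(TextOverlayFilterFactory.create("REC", 720, 1280)) on the GPUImage instance that already drives the camera preview.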
The android-gpuimage library does not natively support video recording, but you can try the android-gpuimage-videorecording library. It is a fork of GPUImage for Android that also provides video recording functionality:
android-gpuimage-videorecording
see the GPUImageMovieWriter class
It should point you in the right direction for developing your own video writer on top of GPUImage.
The idea is to:
draw on current screen surface
switch to encoder input surface and draw previous frame buffer again on it
switch back to screen surface
other useful links: EGL surface helper, Media encoder
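To make those three steps concrete, here is a hedged sketch of the surface switching with plain EGL14, assuming the on-screen and encoder EGLSurfaces have already been created (for example with helpers like the linked EGL surface helper), and using drawFrame() as a stand-in for the GPUImage render pass:

```java
import android.opengl.EGL14;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLExt;
import android.opengl.EGLSurface;

/** Hypothetical helper illustrating the surface-switching idea above. */
public class DualSurfaceRenderer {

    /**
     * Renders one frame to the screen and mirrors it into the encoder surface.
     * windowSurface wraps the on-screen SurfaceHolder; encoderSurface wraps
     * the Surface returned by MediaCodec.createInputSurface().
     */
    public void renderFrame(EGLDisplay display, EGLContext context,
                            EGLSurface windowSurface, EGLSurface encoderSurface,
                            long timestampNs) {
        // 1. draw the filtered frame to the screen
        EGL14.eglMakeCurrent(display, windowSurface, windowSurface, context);
        drawFrame();
        EGL14.eglSwapBuffers(display, windowSurface);

        // 2. switch to the encoder's input surface and draw the same frame again
        EGL14.eglMakeCurrent(display, encoderSurface, encoderSurface, context);
        drawFrame();
        EGLExt.eglPresentationTimeANDROID(display, encoderSurface, timestampNs);
        EGL14.eglSwapBuffers(display, encoderSurface);   // pushes the frame into MediaCodec

        // 3. switch back to the screen surface for the next preview frame
        EGL14.eglMakeCurrent(display, windowSurface, windowSurface, context);
    }

    /** Stand-in: re-issue the GPUImage filter's draw call for the current frame texture. */
    private void drawFrame() { /* run the filter's onDraw(...) here */ }
}
```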
YouTube already has such a function, "Play in Cardboard", which reformats the footage to make it feel like you are watching in an IMAX theater.
How can I do this with the Android VR SDK? I am taking a look at VrVideoView. When playing a normal video, it generates a very strange viewpoint for the normal 2D video and plays it as a 3D video.
The VrVideoView only renders the video in a 360° way, and it gives an awful result if the video is not a 360° video.
So you have to use a GvrView and manage the rendering yourself in order to get a good result.
Here's a quick and dirty example of how to display a stereoscopic video with the Google VR SDK:
https://github.com/Zanfas/Cardboard_VideoPlayer
You have to know that the Google VR SDK only applies distortion to the rendering of a 3D scene managed with OpenGL ES.
The idea is to make a virtual screen in a 3D scene that displays the video. You render the 3D scene with the camera looking at that screen and there you go.
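As a rough sketch of that idea, assuming the Google VR SDK (com.google.vr.sdk.base) is on the classpath and leaving the quad geometry and shader setup as comments, a StereoRenderer for the virtual screen could look like this (all names are placeholders, not code from the linked project):

```java
import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.Matrix;
import android.view.Surface;

import com.google.vr.sdk.base.Eye;
import com.google.vr.sdk.base.GvrView;
import com.google.vr.sdk.base.HeadTransform;
import com.google.vr.sdk.base.Viewport;

import javax.microedition.khronos.egl.EGLConfig;

/** Hypothetical StereoRenderer: decodes the video into an OES texture and
 *  draws it on a quad (the "virtual screen") in front of the viewer for each eye. */
public class VideoScreenRenderer implements GvrView.StereoRenderer {

    private final MediaPlayer player;
    private SurfaceTexture videoTexture;
    private int oesTextureId;
    private final float[] model = new float[16];
    private final float[] modelView = new float[16];
    private final float[] mvp = new float[16];

    public VideoScreenRenderer(MediaPlayer player) {
        this.player = player;
    }

    @Override
    public void onSurfaceCreated(EGLConfig config) {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        oesTextureId = tex[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, oesTextureId);

        videoTexture = new SurfaceTexture(oesTextureId);
        Surface surface = new Surface(videoTexture);
        player.setSurface(surface);      // video frames now arrive as an external GL texture
        surface.release();
        player.start();
        // compile a samplerExternalOES shader and build the quad geometry here
    }

    @Override
    public void onNewFrame(HeadTransform headTransform) {
        videoTexture.updateTexImage();   // latch the most recent decoded frame
    }

    @Override
    public void onDrawEye(Eye eye) {
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

        float[] perspective = eye.getPerspective(0.1f, 100f);

        // Model matrix: the virtual screen sits a few meters in front of the viewer.
        Matrix.setIdentityM(model, 0);
        Matrix.translateM(model, 0, 0f, 0f, -3f);

        Matrix.multiplyMM(modelView, 0, eye.getEyeView(), 0, model, 0);
        Matrix.multiplyMM(mvp, 0, perspective, 0, modelView, 0);

        // drawQuad(mvp, oesTextureId): render the textured quad with this MVP matrix
    }

    @Override public void onFinishFrame(Viewport viewport) {}
    @Override public void onSurfaceChanged(int width, int height) {}
    @Override public void onRendererShutdown() { player.release(); }
}
```

The Activity would then install it with gvrView.setRenderer(new VideoScreenRenderer(player)), and the SDK applies the lens distortion to whatever the renderer draws.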
Regards
I want to use GPUImage for Android to process video in real time. I have seen examples of creating pictures with different filters, but I didn't find any example of recording video with filters. Is this possible with GPUImage for Android?
The android-gpuimage library does not support video recording, but you can try the android-gpuimage-videorecording library. It is a fork of GPUImage for Android that also provides video recording functionality:
android-gpuimage-videorecording
see the GPUImageMovieWriter class
It should point you in the right direction for developing your own video writer on top of GPUImage.
The idea is to:
draw on current screen surface
switch to encoder input surface and draw previous frame buffer again on it
switch back to screen surface
other useful links: EGL surface helper, Media encoder
GPUVideo-android
I know of GPUVideo-android. This library applies video filters when generating an MP4, during ExoPlayer video playback, and while recording video with Camera2.
The Android MediaCodec API is used by this library.
Try it for camera recording, video preview, and video generation.
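For context, the core of what such libraries do with MediaCodec is to configure an encoder with a Surface input and render the filtered frames into that surface; a minimal, hedged sketch of that setup (independent of GPUVideo-android's own API):

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

import java.io.IOException;

/** Minimal sketch of the MediaCodec surface-input encoder such libraries build on. */
public class SurfaceEncoderSketch {

    public static MediaCodec createEncoder(int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface); // frames arrive via a GL surface
        format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

        // Everything rendered into this Surface (e.g. via EGL) gets encoded to H.264;
        // the caller must keep the surface and later mux the output into an MP4 with MediaMuxer.
        Surface inputSurface = encoder.createInputSurface();

        encoder.start();
        return encoder;
    }
}
```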
I want to record video with the camera on my Android device. I need to add an overlay image over the recorded movie. On iOS I would use GPUImage; for Android I found Android GPUImage. I tried to use it, but I didn't find any way to add a filter while recording video. In the provided example I could add filters only when taking photos. Is there any way to record video with filters using Android GPUImage? Is there any other way to add an image overlay over video while it is being recorded, in real time? If not, is there any way to add an image overlay over the recorded video in postprocessing?
You can add an overlay image on the video using a blend filter.
About video recording: the android-gpuimage library does not support it, but you can try the android-gpuimage-videorecording library. It is a fork of GPUImage for Android that also provides video recording functionality:
android-gpuimage-videorecording
see the GPUImageMovieWriter class
It should point you in the right direction for developing your own video writer on top of GPUImage.
The idea is to:
draw on current screen surface
switch to encoder input surface and draw previous frame buffer again on it
switch back to screen surface
other useful links: EGL surface helper, Media encoder
This project (MagicCamera) has many multiple-input filters. You can write your own fragment shader to overlay an image on the camera texture (similar to MagicN1977Filter). It also includes video recording.
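As an illustration of such a two-input shader (not taken from MagicCamera itself; the class and uniform names are placeholders), the fragment shader could composite an overlay bitmap texture over the camera's external texture using its alpha channel:

```java
/** Hypothetical two-input fragment shader: blends a watermark texture over the camera frame. */
public final class OverlayShader {

    public static final String FRAGMENT_SHADER =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "varying vec2 textureCoordinate;\n" +
            "uniform samplerExternalOES inputImageTexture;\n" +   // camera frame
            "uniform sampler2D overlayTexture;\n" +               // watermark / logo bitmap
            "void main() {\n" +
            "    vec4 camera  = texture2D(inputImageTexture, textureCoordinate);\n" +
            "    vec4 overlay = texture2D(overlayTexture, textureCoordinate);\n" +
            "    gl_FragColor = mix(camera, overlay, overlay.a);\n" +  // alpha composite
            "}\n";

    private OverlayShader() {}
}
```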
How can I play video using OpenGL ES in Android?
Your question is a bit vague. If all you want to do is play a video in a GL surface, it's possible. See the MediaPlayer sample code that Dave was referring to in his answer. All you have to do there is replace the SurfaceView with a GLSurfaceView in both the MediaPlayerDemo_Video.java file and the corresponding layout file (mediaplayer_2.xml).
You also need to create a custom Renderer class (one that implements the GLSurfaceView.Renderer interface) and set it on your GLSurfaceView.
After you do all this, you will have your video playing on a GL surface, but that is all.
If, on the other hand, you want to manipulate the video, i.e. draw the video frames into a GL texture and add effects to it or transform it (for example scale, rotate, etc.), then I'm afraid this can't be done. The reason is that you don't have access to the raw video frames in your application.
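For the Renderer step described above, a bare-bones GLSurfaceView.Renderer is enough (the class name is a placeholder); it only exists so the GLSurfaceView has something to draw:

```java
import android.opengl.GLSurfaceView;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

/** Minimal renderer that just clears the surface; it does not touch the video. */
public class ClearRenderer implements GLSurfaceView.Renderer {

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        gl.glClearColor(0f, 0f, 0f, 1f);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        gl.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
    }
}
```

You would set it with glSurfaceView.setRenderer(new ClearRenderer()) before handing the view's holder to the MediaPlayer, as in the modified sample.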
I'm not sure why you're mentioning OpenGL here, but probably the easiest way to play video is with the VideoView class. You'll want to have a look at the MediaPlayer class too.
You might find the Audio and Video page in the Android Developer Documentation helpful.
There's VideoView sample code and MediaPlayer sample code in the API demos provided with the Android SDK.
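A minimal VideoView-based player, in the spirit of those samples (the Activity name, layout id, and raw resource are placeholders):

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

/** Plays a bundled clip in a VideoView declared in the layout (id video_view). */
public class SimplePlaybackActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.simple_playback);

        VideoView videoView = (VideoView) findViewById(R.id.video_view);
        videoView.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.sample));
        videoView.setMediaController(new MediaController(this));  // play/pause/seek controls
        videoView.start();
    }
}
```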