android/ios: Play a video via OpenGL

I need to play a video on an OpenGL surface. I think I will need to render each frame of the video to a texture in a loop and then render it via OpenGL. Is this possible under iOS and/or Android?

It is possible on iOS, but it's pretty tricky business to get it to run fast enough to keep up with a video stream.
There is an old demo app from Apple called ChromaKey that takes a CVPixelBuffer from Core Video and maps it directly into an OpenGL texture without having to copy the data. That makes performance MUCH better, and is the approach I would suggest.
I don't know if there is more current sample code available that shows how it's done. That code is back from the days of iOS 6, and was written in Objective-C. (I would suggest doing new iOS development in Swift, since that's where Apple is putting its emphasis.)
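In the meantime, here is a minimal sketch of that zero-copy approach, using the C-level Core Video texture cache API (callable from Objective-C or Objective-C++). The eaglContext and pixelBuffer variables are assumptions; in a real app they would come from your GL setup and from an AVPlayerItemVideoOutput or capture callback.

#include <CoreVideo/CoreVideo.h>
#include <OpenGLES/ES2/gl.h>
#include <OpenGLES/ES2/glext.h>

// One cache per GL context, created once at startup.
static CVOpenGLESTextureCacheRef textureCache = NULL;

void SetupTextureCache(CVEAGLContext eaglContext) {
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                 eaglContext, NULL, &textureCache);
}

// Map a BGRA CVPixelBuffer into a GL texture without copying the pixels.
GLuint TextureFromPixelBuffer(CVPixelBufferRef pixelBuffer) {
    CVOpenGLESTextureRef texture = NULL;
    CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, textureCache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_RGBA,
        (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
        (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
        GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);

    GLuint name = CVOpenGLESTextureGetName(texture);
    glBindTexture(CVOpenGLESTextureGetTarget(texture), name);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    // After drawing the frame, CFRelease(texture) and call
    // CVOpenGLESTextureCacheFlush(textureCache, 0) to recycle it.
    return name;
}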

Related

Rendering a video from GStreamer in VR (Oculus Quest 2)

I'm working on a robot that is controlled via a VR headset and sends a real-time video feed to the headset.
I've chosen to go the native way on Android and now have everything I need to receive and decode the video stream (using GStreamer), and also to send the control data to the robot via UDP.
The last thing to do (and the one I struggle with most, as I have no prior experience with computer graphics) is to draw the image (the decoded camera feed) to the screen. In the last few days I've been reading up on how Vulkan and OpenGL work, and I've also gone through the examples provided in the Oculus Mobile SDK (mainly VRCubeWorld_SurfaceView), but that's way too complex for what I need. I've tried to simplify it so I could just draw two images, but then I thought:
Do I even need any of that? And this question might sound stupid, but I really don't have any prior experience doing this.
I mean, the example uses OpenGL to compute all the layers of the 3D scene, apply colors, and then fuse them together into a final frame that is passed to the VrApi via the function:
vrapi_SubmitFrame2(appState.Ovr, &frameDesc);
Can I just take those images and somehow force them into the frameDesc structure to skip the whole OpenGL pipeline? If so, can anyone knowledgeable enough point me to a working solution?
I don't need any kind of panning over the images, just to render them. Later I'll be using head sensor data, but it won't actually do anything with the "scene".
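For what it's worth, you cannot bypass OpenGL entirely with VrApi (its compositor consumes GL swapchain textures), but you can skip building a 3D scene: upload each decoded frame into an ovrTextureSwapChain and submit it as a single cylinder layer, letting the compositor do the distortion. A rough sketch against the Oculus Mobile SDK, assuming a current GL context, a decoded RGBA frame in pixels, and the appState/frameIndex variables from the VRCubeWorld example:

#include <VrApi.h>
#include <VrApi_Helpers.h>
#include <GLES3/gl3.h>

// Created once, sized to the decoded video frames.
ovrTextureSwapChain* chain = vrapi_CreateTextureSwapChain3(
    VRAPI_TEXTURE_TYPE_2D, GL_RGBA8, frameWidth, frameHeight,
    /*levels*/ 1, /*bufferCount*/ 3);

// Per decoded frame: copy the pixels into the next swapchain image.
int index = (int)(frameIndex % 3);
GLuint tex = vrapi_GetTextureSwapChainHandle(chain, index);
glBindTexture(GL_TEXTURE_2D, tex);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, frameWidth, frameHeight,
                GL_RGBA, GL_UNSIGNED_BYTE, pixels);

// Submit the image as one cylinder layer; no scene rendering involved.
// (A real app also fills in the layer's HeadPose and TexCoordsFromTanAngles
// from the predicted tracking state.)
ovrLayerCylinder2 layer = vrapi_DefaultLayerCylinder2();
for (int eye = 0; eye < VRAPI_FRAME_LAYER_EYE_MAX; eye++) {
    layer.Textures[eye].ColorSwapChain = chain;
    layer.Textures[eye].SwapChainIndex = index;
}

const ovrLayerHeader2* layers[] = { &layer.Header };
ovrSubmitFrameDescription2 frameDesc = {};
frameDesc.FrameIndex = frameIndex;
frameDesc.DisplayTime = vrapi_GetPredictedDisplayTime(appState.Ovr, frameIndex);
frameDesc.SwapInterval = 1;
frameDesc.LayerCount = 1;
frameDesc.Layers = layers;
vrapi_SubmitFrame2(appState.Ovr, &frameDesc);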

How to use Qt Multimedia and C++ to save an .mp4 video out of OpenGL textures

I have a Qt 5.9 based app which runs on embedded Linux and Android. The application processes real-time data using OpenGL ES 3.0 and displays OpenGL textures in real time, at 30+ frames per second, so it effectively appears as a video.
I need to save an mp4 from the 30 to 40 frames that are displayed as OpenGL textures. As I understand it, I can leverage Qt Multimedia to do this, but I lack the knowledge of how. I am trying to read and understand the how from links like here & here.
Once the mp4 is saved, playback can be done using QMediaPlayer as explained here. That looks darn simple. But I am struggling to figure out how to get my OpenGL textures saved into an .mp4 when I need them to be.
So, how do I save a .mp4 video out of the OpenGL textures that are displayed on a QML item?
Pointing to any basic example that exists would also help.
I don't think Qt will do you any favors when it comes to content creation; Qt's multimedia facilities are purely for content consumption. You can play media, not make it.
You will have to explicitly use one of the many available multimedia libraries out there: VLC, FFmpeg, GStreamer, or libav, to name a few.
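That said, a workable pattern is to read the rendered frames back and pipe raw RGBA into an external ffmpeg process. A minimal sketch, assuming the frames land in a QOpenGLFramebufferObject named fbo, a known width/height, and an ffmpeg binary present on the target (Qt 5.9 APIs):

#include <QProcess>
#include <QImage>
#include <QOpenGLFramebufferObject>

QProcess encoder;
// Raw RGBA frames go in on stdin; ffmpeg packs them into an H.264 mp4.
encoder.start("ffmpeg",
              { "-y",
                "-f", "rawvideo", "-pixel_format", "rgba",
                "-video_size", QString("%1x%2").arg(width).arg(height),
                "-framerate", "30",
                "-i", "-",
                "-c:v", "libx264", "-pix_fmt", "yuv420p",
                "out.mp4" });

// For each of the 30-40 frames you want in the clip:
QImage frame = fbo.toImage().convertToFormat(QImage::Format_RGBA8888);
encoder.write(reinterpret_cast<const char*>(frame.constBits()),
              frame.byteCount());

// When the clip is complete:
encoder.closeWriteChannel();
encoder.waitForFinished();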

Video capture and implementation for Google Cardboard (like the Paul McCartney app)

Regards, community.
I just want to build an app similar to this one, with my own content of course. My questions:
1. How do I capture 360 degree video (cameras, format, ingest, audio)?
2. Implementation:
   2.1 Which Cardboard SDK works best for my interests (Android or Unity)?
   2.2 Do you know any blogs, websites, tutorials, or samples I can lean on?
Thank you
MovieTextures are a great way to do this in Unity, but unfortunately MovieTextures are not implemented on Android (maybe this will change in Unity 5). See the Unity docs for details.
For a simple wrap-a-texture-on-a-sphere app, the Cardboard Java SDK should work. But if you would rather use Unity due to other aspects of the app, the next best way to do this would be to allocate a RenderTexture and then grab the GL id and pass it to a native plugin that you would write.
This native code would be decoding the video stream, and for each frame it would fill the texture. Then Unity can handle the rest of the rendering, as detailed by the previous answer.
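As a rough illustration of that native side (the names here are hypothetical; the texture id would come from Unity's RenderTexture.GetNativeTexturePtr(), and decodeNextFrame() stands in for whatever decoder you use):

#include <GLES2/gl2.h>
#include <cstdint>

const uint8_t* decodeNextFrame();  // hypothetical: your video decoder

static GLuint videoTexture = 0;
static int videoWidth = 0, videoHeight = 0;

// Called once from C#, passing the id from RenderTexture.GetNativeTexturePtr().
extern "C" void SetVideoTexture(uint32_t textureId, int width, int height) {
    videoTexture = (GLuint)textureId;
    videoWidth = width;
    videoHeight = height;
}

// Called once per frame on Unity's render thread (e.g. via GL.IssuePluginEvent)
// so that the GL context is current when we touch the texture.
extern "C" void UpdateVideoTexture() {
    const uint8_t* pixels = decodeNextFrame();
    if (pixels == NULL || videoTexture == 0)
        return;
    glBindTexture(GL_TEXTURE_2D, videoTexture);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, videoWidth, videoHeight,
                    GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}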
First of all, you need content, and to record stereo 360 video you'll need a rig of at least 12 cameras. Such rigs can be purchased for GoPro cams. That's going to be expensive.
The recently released Unity 5 is a great option and I strongly suggest using it. The most basic way of doing 360 stereo video in Unity is to create two spheres with MovieTextures showing your 360 video. Then you turn them "inside out" so that they display their back faces instead of their front faces. This can be done with a simple shader that turns on front-face culling and removes the mirror effect. Then you place your cameras inside the spheres. If you are using the Google Cardboard SDK camera rig, put the spheres on different culling layers and make each camera see only the appropriate sphere. Remember to put the spheres in the proper positions relative to the cameras.
There may be other ways to do this, leading to better results, but they won't be as simple. You can also look for some paid scripts/plugins/assets to do 360 video in VR.

What game engine/framework could I use with AIR mobile without FPS problems?

I hear that most Flash engines have problems on mobile devices, except the Starling Framework. But Starling doesn't cover all my needs, such as the pathfinding and tilemap tools that I can use perfectly well with Flixel. What engine/framework could I use with AIR or the Starling Framework without losing much FPS? I need collision, pathfinding, and tilemap support.
As of this date (May 22, 2012) you will have to wait for that. There is not a single engine that rolls all of the above into one easy-to-use package like Flixel. The team behind Flixel has announced its intent to build such a framework (http://www.photonstorm.com/archives/2524/looking-for-developers-to-help-build-a-new-game-framework), but to date it does not exist.
That said, you could build a game using Box2D and some implementation of pathfinding (like A*) and just use Starling to render the game. You'll sink some time into setup, but it's not impossible. Before Stage3D, I took a similar approach for this game:
http://www.candystand.com/play/cougar-town
It uses Box2D to manipulate the graphics (minus the pathfinding, of course).
I released the source code of a similar game on my blog here:
It uses the old display list, but you can see how it works and switch it to Starling.
http://plasticsturgeon.com/2011/05/box-2d-2-1a-cannon-game-part-3-the-complete-game-source/

Does Android support a better method than glReadPixels?

I'm making an Android game (using AndEngine).
I need to record the gameplay screen.
This is not for making a promotional video; it is for players to review their gameplay.
My app should record video by itself, so I can't solve this problem with one of the recording apps available on the market.
I already checked the code below:
http://code.google.com/p/andengineexamples/source/browse/src/org/anddev/andengine/examples/ScreenCaptureExample.java?spec=svn66d3057f672175a8a21f9d06e7a045942331f65c&r=66d3057f672175a8a21f9d06e7a045942331f65c
It works very well, but I want to record a gameplay video, not a single screenshot.
I need at least 24 fps for a smooth replay, but if I use glReadPixels I only get 5 fps on my Xoom device.
I searched various websites for a solution to this optimization problem; most people say glReadPixels is too slow for recording video.
http://www.gamedev.net/topic/473794-glreadpixel-takes-tooooo-much-time/
They recommend glCopyTexImage2D instead of glReadPixels, because glCopyTexImage2D is much faster.
But I can't find out how to use glCopyTexImage2D in AndEngine, and some even say that OpenGL ES on Android does not support glCopyTexImage2D.
Maybe another method exists to record smooth video: reading the framebuffer of the Android device.
Most recording apps on the market use this method, but those apps need root permission to grab the framebuffer.
I've read news that Android will support screen capture from SurfaceFlinger after Gingerbread, but I can't find out how to use the framebuffer without root permission. T_T
These are my guessed solutions:
1. Use another OpenGL API that is faster than glReadPixels.
2. Find an Android API that can access the framebuffer without root permission (maybe I can access SURFACE_FLINGER?).
3. Draw to another offscreen texture to record the video.
But I don't know how to implement these methods.
Which approach is correct? Do you have any example code for recording video on Android?
Please help me solve this problem; if you know any other method, that would be helpful too.
Any help will be appreciated.
Does the GPU of your device support ES 3.0? If it does, you can try using a PBO (pixel buffer object).
Here is a topic you can refer to: Low readback performance with PBO, help!
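For reference, here is a minimal ES 3.0 sketch of that double-buffered PBO readback: glReadPixels into a bound GL_PIXEL_PACK_BUFFER returns without stalling, and each frame you map the PBO that was filled on the previous frame. The width/height and encodeFrame() are assumptions standing in for your capture size and encoder.

#include <GLES3/gl3.h>

void encodeFrame(const void* pixels);  // hypothetical: hand off to your encoder

static GLuint pbo[2];
static const int width = 1280, height = 720;  // your capture size
static const int frameBytes = width * height * 4;

void InitPbos() {
    glGenBuffers(2, pbo);
    for (int i = 0; i < 2; ++i) {
        glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[i]);
        glBufferData(GL_PIXEL_PACK_BUFFER, frameBytes, NULL, GL_STREAM_READ);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}

// Call after rendering each frame.
void ReadbackFrame(int frameIndex) {
    int writeIdx = frameIndex % 2;        // PBO receiving this frame
    int readIdx  = (frameIndex + 1) % 2;  // PBO filled last frame

    // Starts an asynchronous GPU-to-PBO copy and returns immediately.
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[writeIdx]);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0);

    // Map last frame's PBO; by now its copy has normally completed.
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[readIdx]);
    void* data = glMapBufferRange(GL_PIXEL_PACK_BUFFER, 0, frameBytes,
                                  GL_MAP_READ_BIT);
    if (data) {
        encodeFrame(data);
        glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}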
