I don't know if this is the right place, but how can I capture the screen while a video is playing on an Android device, specifically Android 2.3.1?
When I capture the screen, I just see the play/pause buttons and a black screen instead of the video content.
Thanks
It can be achieved if you play your video on a SurfaceTexture. Playing video that way means working with a bitmap (updated on a background thread); you can then grab the bitmap in the surface's touch event and save it as a JPEG in your desired folder.
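For what it's worth, here is a minimal sketch of the capture step, assuming the video is already playing on a TextureView named videoTexture (a hypothetical name). Note that TextureView requires API 14+, so this particular route will not work on 2.3.1 itself:

    // Sketch only: grab the current video frame on touch and save it as a JPEG.
    videoTexture.setOnTouchListener(new View.OnTouchListener() {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            if (event.getAction() != MotionEvent.ACTION_DOWN) {
                return false;
            }
            Bitmap frame = videoTexture.getBitmap();   // copy of the frame on screen
            File out = new File(getExternalFilesDir(null), "capture.jpg");
            FileOutputStream fos = null;
            try {
                fos = new FileOutputStream(out);
                frame.compress(Bitmap.CompressFormat.JPEG, 90, fos);
            } catch (IOException e) {
                Log.e("Capture", "could not save frame", e);
            } finally {
                if (fos != null) {
                    try { fos.close(); } catch (IOException ignored) {}
                }
            }
            return true;
        }
    });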
According to this thread, it doesn't seem to be possible (I had the same problem).
The only solution I found was to play the video on a Galaxy Tab 10.1 and use the tablet's screenshot function, but that requires having a Galaxy Tab 10.1 :/
I have tried setting the video size (width, height) on the MediaRecorder, but this has device compatibility issues: on a few devices it crashes at mediaRecorder.start();
If the device happens to support a square video size, you are welcome to use it. Most will not.
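As a rough sketch (old Camera API, as in the question; mediaRecorder is assumed to be an already-configured MediaRecorder), you could check whether the device actually reports a square size before calling setVideoSize:

    // Look for a square size among the camera's reported video sizes.
    Camera camera = Camera.open();
    Camera.Parameters params = camera.getParameters();
    List<Camera.Size> sizes = params.getSupportedVideoSizes();   // API 11+
    if (sizes == null) {
        // Some devices return null here; the preview sizes apply instead.
        sizes = params.getSupportedPreviewSizes();
    }
    Camera.Size square = null;
    for (Camera.Size s : sizes) {
        if (s.width == s.height) {
            square = s;
            break;
        }
    }
    if (square != null) {
        mediaRecorder.setVideoSize(square.width, square.height);   // safe to request
    } else {
        // No square size reported: record a supported size and crop afterwards.
    }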
Vine, based on the last reports that I heard, does not use MediaRecorder. Instead, they use preview frames from the camera, which they crop to be square and assemble into a video. I do not know what Instagram does.
You are also welcome to record a non-square video, then post-process the video yourself to crop it to be square.
When playing back videos on the Samsung Galaxy S4, the player does not respect the orientation metadata; the video always plays in landscape.
The app also records the video, and it plays correctly on all other devices while streaming. The S4 plays it fine only if the video is stored on the device, but it won't work for streaming.
I'm using MediaPlayer and a SurfaceView inside a Fragment, not a VideoView.
I have tried disabling Air View, Air Gesture, Smart Stay, Smart Scroll and Auto Rotate, without luck.
I also stored the orientation hint along with the video so I could rotate the element in the layout manually, but rotating the SurfaceView via lockCanvas doesn't work, and when I rotate its parent element it goes black and only the audio plays.
Any suggestions I can try to get this bug fixed? Have you experienced the same on the S4? Any help on this will be greatly appreciated. Thanks!
I asked this question a few months back and haven't heard anything. I would love to know as well.
https://stackoverflow.com/questions/17950072/galaxy-s4-media-player-ignores-rotation-metadata
Edit: This also occurs on the Galaxy Note 3.
Found a workaround for this issue. Try using a TextureView instead of a SurfaceView: before playing the video, get the rotation info with MediaMetadataRetriever and adjust the TextureView as needed. It worked on this side.
See the details here:
Streaming Video Playback Orientation Issue # Samsung Developers Forum
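A rough sketch of that workaround, assuming a TextureView named textureView, a local videoPath, and API 17+ for METADATA_KEY_VIDEO_ROTATION:

    // Read the rotation flag and rotate the TextureView manually.
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    retriever.setDataSource(videoPath);          // or (url, headers) for a stream
    String rotation = retriever.extractMetadata(
            MediaMetadataRetriever.METADATA_KEY_VIDEO_ROTATION);   // API 17+
    retriever.release();

    int degrees = (rotation != null) ? Integer.parseInt(rotation) : 0;
    textureView.setRotation(degrees);            // compensate where the player ignores the flag
    if (degrees == 90 || degrees == 270) {
        // Width and height are effectively swapped; adjust the layout params accordingly.
    }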
I have an app that uses a NativeActivity, as it is mostly C++ code (cross-platform).
I also need to play video and audio, so I have another Activity that uses MediaPlayer to play video and/or audio. This is used in conjunction with the NativeActivity to play video in front of the OpenGL-rendered application.
The MediaPlayer activity is based on the one in the ApiDemos sample, and works fine on MOST devices (Nexus 7/10, Galaxy S3/Tab 2, even some low-cost devices like the Fusion5).
BUT running it on a low-cost Scroll Plus 7" tablet, the code functions but no video is displayed.
The audio plays, and the video controls show and are usable.
NOTE: I have actually been able to hack the code to get the video showing. Though this is not a solution, it may help someone identify the problem.
In my native code the OpenGL render calls eglSwapBuffers; if I comment this line out, the video shows. (I do the rendering in a loop inside android_main.)
Obviously this means my OpenGL rendering, which needs to go on in the background, stops working, so I cannot fix it this way.
Device:
Scroll Plus
7" TABLET by Storage Options
Jelly Bean 4.1
From LogCat, I believe it is running a CedarX-based media decoder/renderer.
After 4 days on this issue, I finally found out that the video was being drawn BEHIND my OpenGL. Why only on this device, I don't know. All the other aspects of the video Activity (the controls etc.) are in front, where they should be.
The eglSwapBuffers call was a red herring; in that case I simply was not drawing anything to the screen.
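I can't say this is what finally fixed it, but SurfaceView's Z-order flags are the usual knob for this kind of layering problem; a sketch with a hypothetical view id:

    // Must be called before the surface is created, e.g. right after inflating the view.
    SurfaceView videoSurface = (SurfaceView) findViewById(R.id.video_surface);   // hypothetical id

    // Requests that the video surface be composited above other media surfaces
    // while staying below the window's own widgets (so the MediaController stays visible).
    videoSurface.setZOrderMediaOverlay(true);

    // More aggressive alternative: setZOrderOnTop(true) puts the surface above the
    // whole window, at the cost of covering anything else drawn in that window.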
With the SOUND AND SHOT: LISTEN TO YOUR PHOTOS feature:
Every picture you take on the Samsung Galaxy S4 can come with sound. So now you can remember what was said, played and heard, not just what it looked like.
Now I want to add this feature to my Android application, in which I already capture images from the camera, but I want to capture them with sound as the above feature does. And if I can do this, in what form will that file be saved: will it be an image, Flash, or a video, and what will the file's extension be?
Any help would be highly appreciated.
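One generic way to approximate this (not necessarily what Samsung's feature does; camera is the already-opened Camera, and the paths are placeholders) is to record a short audio clip around the takePicture() call and save it next to the JPEG, which gives you two files (a .jpg plus an .m4a) rather than a single container:

    // Generic approximation only: a separate audio clip saved next to the photo.
    final MediaRecorder audio = new MediaRecorder();
    audio.setAudioSource(MediaRecorder.AudioSource.MIC);
    audio.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    audio.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
    audio.setOutputFile("/sdcard/shot_sound.m4a");      // placeholder path
    try {
        audio.prepare();
        audio.start();
    } catch (IOException e) {
        Log.e("SoundShot", "audio recording failed", e);
    }

    camera.takePicture(null, null, new Camera.PictureCallback() {
        @Override
        public void onPictureTaken(byte[] data, Camera cam) {
            // Write 'data' to /sdcard/shot.jpg here, keep recording a few more
            // seconds if desired, then stop the audio recorder.
            audio.stop();
            audio.release();
        }
    });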
I am developing an Android application in which a specific video is played when the poster of a specific movie is shown in front of the camera. I have found many AR tutorials that just show a 3D object when a pattern is detected; I need some advice on building an application that can play video in an AR application using the Android camera and the QCAR SDK.
I don't know QCAR, but of course you can put a SurfaceView on top of the existing SurfaceView of your camera preview and play a video in it.
What I would do is implement your poster-recognition logic first; this will be tricky enough. Then, if it works reliably, I would add another SurfaceView in which the video is played.
You will find out that the camera preview surface actually has to be on top of the video surface.
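A minimal sketch of that setup, with hypothetical view ids and a placeholder URL; setZOrderMediaOverlay() is the call that asks for the camera preview surface to be composited above other surfaces such as the video one:

    final SurfaceView cameraPreview = (SurfaceView) findViewById(R.id.camera_preview);
    final SurfaceView videoSurface = (SurfaceView) findViewById(R.id.video_surface);

    // As noted above, the camera preview surface ends up having to sit on top of
    // the video surface; marking it as a media overlay requests exactly that.
    cameraPreview.setZOrderMediaOverlay(true);

    final MediaPlayer player = new MediaPlayer();
    videoSurface.getHolder().addCallback(new SurfaceHolder.Callback() {
        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            try {
                player.setDataSource("https://example.com/trailer.mp4");   // placeholder
                player.setDisplay(holder);
                player.prepare();
                player.start();
            } catch (IOException e) {
                Log.e("ARVideo", "video playback failed", e);
            }
        }

        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            player.release();
        }
    });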
I assume you are trying to play video on a GL surface. It is possible by using a video SurfaceTexture, but only ICS (4.0) and above devices support SurfaceTexture. Vuforia has a very good sample of this here; you can decide what to do for lower devices, such as playing the video fullscreen, etc.
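For reference, a bare-bones sketch of that ICS+ path (placeholder file path, minimal error handling; the GL calls must run on the GL thread with a current EGL context), along the lines of what that sample demonstrates:

    // Create an external GL texture and feed MediaPlayer frames into it.
    int[] tex = new int[1];
    GLES20.glGenTextures(1, tex, 0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

    SurfaceTexture videoTexture = new SurfaceTexture(tex[0]);
    Surface surface = new Surface(videoTexture);

    MediaPlayer player = new MediaPlayer();
    try {
        player.setDataSource("/sdcard/trailer.mp4");   // placeholder path
        player.setSurface(surface);                    // decoded frames land in the GL texture
        surface.release();
        player.prepare();
        player.start();
    } catch (IOException e) {
        Log.e("ARVideo", "could not start video", e);
    }

    // Then, each frame in the render loop, latch the newest frame before drawing
    // the textured quad over the detected poster:
    videoTexture.updateTexImage();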