I am developing an Android application that streams a set of JPEG images over the internet. I need to decode each JPEG image into YUV format, perform some image-related operations, and then stream it after encoding. Currently I am performing this conversion [JPEG -> YUV] in software. To speed it up, I would like to use hardware decoders. Is there any Android API that can do this conversion? Can it be done using OMXCodec?
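For reference, the software path being replaced here amounts to a per-pixel color-space transform. The sketch below shows the full-range BT.601 RGB-to-YUV math that JPEG uses; on-device, the RGB pixels would come from something like BitmapFactory.decodeByteArray, which is not shown. The class name is illustrative, not an Android API.

```java
// Sketch of the per-pixel software RGB -> YUV conversion (full-range
// BT.601, the JPEG convention). Class/method names are illustrative.
public class RgbToYuv {
    /** Converts one RGB pixel (0..255 per channel) to full-range YUV. */
    public static int[] rgbToYuv(int r, int g, int b) {
        int y = (int) Math.round( 0.299    * r + 0.587    * g + 0.114    * b);
        int u = (int) Math.round(-0.168736 * r - 0.331264 * g + 0.5      * b) + 128;
        int v = (int) Math.round( 0.5      * r - 0.418688 * g - 0.081312 * b) + 128;
        return new int[] { clamp(y), clamp(u), clamp(v) };
    }

    private static int clamp(int x) {
        return Math.max(0, Math.min(255, x));
    }
}
```

Running this over every pixel of every frame is exactly the kind of cost that makes a hardware decoder attractive.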
Regards,
John
I am working on a native app (no Java API calls) in which I need to encode the camera's live feed and dump it, while avoiding any memcpy into the encoder's input buffer.
Previously I was able to capture YUV data from the camera using AImageReader, save it, and encode it by passing the saved YUV to the encoder's input buffer. Now I want to avoid saving it and then passing it along.
Is there any way to achieve this using only the AMediaCodec API available in the Android NDK?
There is no guarantee that a built-in encoder will accept one of the image formats that you receive from the camera, but if it does, there is no need to 'save' the frame. The tricky part is that both the camera and the encoder work asynchronously, so the frames that you receive from AImageReader must be queued to be consumed by AMediaCodec. If you don't memcpy these frames into the queue, your camera may stall when there are no free buffers left.
But it may be easier and more efficient to wire the encoder to a surface via AMediaCodec_createInputSurface() instead of relying on buffers.
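The asynchronous handoff described above is essentially a bounded producer/consumer queue sitting between the camera callback and the encoder callback. Here is a language-neutral sketch in plain Java; every name in it is illustrative (this is not an NDK API), but the native version between AImageReader and AMediaCodec follows the same pattern.

```java
import java.util.concurrent.ArrayBlockingQueue;

// Illustrative sketch of the camera -> encoder frame handoff described
// above. FrameQueue stands in for the queue between the camera's
// image-available callback and the encoder's input-buffer callback;
// all names here are hypothetical, not NDK APIs.
public class FrameQueue {
    private final ArrayBlockingQueue<byte[]> queue;

    public FrameQueue(int capacity) {
        queue = new ArrayBlockingQueue<>(capacity);
    }

    /** Camera side: offer a frame; returns false (frame dropped) if the encoder is behind. */
    public boolean onFrameFromCamera(byte[] frame) {
        return queue.offer(frame); // non-blocking, so the camera is never stalled
    }

    /** Encoder side: take the next frame to consume, or null if none is ready. */
    public byte[] nextFrameForEncoder() {
        return queue.poll();
    }
}
```

Bounding the queue is the important design choice: an unbounded queue hides encoder backpressure until you run out of memory, while a bounded one makes the "camera may stall or drop frames" case explicit.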
I am working on a MediaPipe real-time hand-tracking application for Android. The provided demo uses camera input from a SurfaceTexture's ExternalOES texture. I want to use a network stream coming from WebRTC instead. The network stream is in YUV_I420 format, so I am converting it to RGB and creating an RGB packet using AndroidPacketCreator like this:
Packet imagePacket = packetCreator.createRgbaImageFrame(yuv_converted_bimap);
and then passing it to the MediaPipe graph in the FrameProcessor class:
mediapipeGraph.addConsumablePacketToInputStream(
videoInputStream, imagePacket, custom_timestamp);
This works, but performance degrades: where the camera stream processes 4-5 FPS, this YUV-to-RGB approach processes only 2-3 FPS. I want to find another approach where I can send the YUV stream directly to the MediaPipe graph. I did some research but could not find anything. Does anyone have any idea how to do that?
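The conversion step described above is the expensive part. For concreteness, here is a plain-Java sketch of what an I420-to-RGB pass involves (full-range BT.601 assumed); in the app this result would back the bitmap handed to createRgbaImageFrame. A production path would more likely use libyuv or a GPU shader, and the class name here is illustrative.

```java
// Sketch of the I420 -> RGB conversion step described above (full-range
// BT.601). Names are illustrative; a real app would use libyuv or a shader.
public class I420ToRgb {
    /** width and height must be even; u and v are quarter-size planes. */
    public static int[] toArgb(byte[] y, byte[] u, byte[] v, int width, int height) {
        int[] argb = new int[width * height];
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int yy = y[row * width + col] & 0xFF;
                // One chroma sample covers a 2x2 block of luma samples.
                int c = (row / 2) * (width / 2) + col / 2;
                int uu = (u[c] & 0xFF) - 128;
                int vv = (v[c] & 0xFF) - 128;
                int r = clamp((int) Math.round(yy + 1.402 * vv));
                int g = clamp((int) Math.round(yy - 0.344136 * uu - 0.714136 * vv));
                int b = clamp((int) Math.round(yy + 1.772 * uu));
                argb[row * width + col] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }

    private static int clamp(int x) {
        return Math.max(0, Math.min(255, x));
    }
}
```

A per-pixel CPU loop like this touching every frame explains the drop from 4-5 FPS to 2-3 FPS.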
The application I am working on is developed for Google Glass but runs on Android tablets as well. It uses VP8 encoding to transfer camera images to a remote application.
The preview format parameter on the camera is set to ImageFormat.YV12.
The VP8 encoder is initialized with VPX_IMG_FMT_YV12 parameter.
When the application .apk file is installed and run from the Glass, the image is displayed in gray scale on the remote application.
When the same .apk file is installed on a tablet or a phone, the image is displayed in proper colors.
I am wondering if anyone has any idea where the problem could lie. Regards.
I finally figured out what is happening.
There is a bug in Google Glass camera module. Although it gladly accepts the requested image format of YV12, the preview buffer actually contains data in NV21 format.
I had to dump the camera preview buffer into a file and examine each byte just to figure this out :-(
If you intend to use YV12 format, you may be better off using NV21 format for now until this bug is fixed.
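If you do need to hand the encoder a genuine YV12 buffer, the NV21 preview data can be repacked cheaply: the Y plane is identical, and only the interleaved VU chroma has to be split into separate V and U planes. A minimal sketch (it ignores Android's YV12 stride-alignment requirement, which a real implementation must handle):

```java
// Workaround sketch for the bug described above: repack an NV21 preview
// buffer (Y plane + interleaved VU) into YV12 (Y plane + V plane + U plane).
// Ignores YV12 stride alignment; names are illustrative.
public class Nv21ToYv12 {
    public static byte[] convert(byte[] nv21, int width, int height) {
        int ySize = width * height;
        int cSize = ySize / 4;                       // size of each chroma plane
        byte[] yv12 = new byte[nv21.length];
        System.arraycopy(nv21, 0, yv12, 0, ySize);   // Y plane is identical
        for (int i = 0; i < cSize; i++) {
            yv12[ySize + i]         = nv21[ySize + 2 * i];     // V plane
            yv12[ySize + cSize + i] = nv21[ySize + 2 * i + 1]; // U plane
        }
        return yv12;
    }
}
```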
I would like to convert multiple images (frames) to a video (MP4) on an Android device. Also, I would like to convert a video (MP4) into multiple images (one per frame). I have limited knowledge of FFmpeg, and building FFmpeg for Android may consume more time. I would like to ask experienced engineers to suggest a better strategy that can take less time to complete this task. Please point me to some open-source code which I may modify to complete this task quickly.
First you need to convert each image to YUV format using an image decoder.
Then you can feed each YUV image as video input to the media recorder engine.
Go through the MediaRecorder source code to get more info.
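Whichever encoder path you end up feeding the YUV frames into, each still image needs a presentation timestamp so the frames play back at a steady rate. A minimal sketch of that bookkeeping (the frame rate is an arbitrary assumption, and the class name is illustrative):

```java
// When feeding still images to a video encoder, each frame needs a
// presentation timestamp in microseconds, derived from the target
// frame rate. Names here are illustrative.
public class FrameTimestamps {
    public static long presentationTimeUs(int frameIndex, int fps) {
        return frameIndex * 1_000_000L / fps;
    }
}
```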
I have a Bluetooth application on Android that sends JPEGs to a PC. I want to convert the JPEGs to H.263 or any other video format and send the video stream via Bluetooth. Is it possible to convert the streaming JPEG images to a video format on Android or on the PC?
I found a Windows application that converts JPEG images into a video format. Check out this software; it may help you.
JPGVideo Download