How to create a video from an array of images in Android?

I want to call a function that builds a video out of a list of images and then saves it locally on the device:
public void CreateAndSaveVideoFile(List<Bitmap> MyBitmapArray)
{
// ..
}
What I have tried:
Following java/xuggle - encode array of images into a movie, the link in the answer is dead.
Following How to encode images into a video file in Java through programming?, the suggested library in the accepted answer does not support Android.
The next answer there has an approach for Android users, however the input and output of that function are not clear to me (where are the images supplied, and where does the video come out?). I left a comment asking for clarification.
The answer after that provides a whole class, however the required library has a corrupted file (when I try to download it from the provided link). I left a comment there as well.
Following Java: How do I create a movie from an array of images?, the suggested library in the top answer uses commands that I am not familiar with and don't know how to use. For example:
Creating an MPEG-4 file from all the JPEG files in the current
directory:
mencoder mf://*.jpg -mf w=800:h=600:fps=25:type=jpg -ovc lavc \
-lavcopts vcodec=mpeg4:mbd=2:trell -oac copy -o output.avi
I don't know how I can use the above in a Java/Android project.
Can anyone guide me or provide an approach for this task? Thanks in advance.

You can use jcodec's SequenceEncoder to convert a sequence of images to an MP4 file.
Sample code:
import org.jcodec.api.awt.SequenceEncoder;
...
SequenceEncoder enc = new SequenceEncoder(new File("filename"));
// GOP size will be supported in 0.2
// enc.getEncoder().setKeyInterval(25);
for(...) {
BufferedImage image = ... // Obtain an image to encode
enc.encodeImage(image);
}
enc.finish();
It's a pure Java library, so it's easy to include in an Android project; unlike ffmpeg, you don't need the NDK.
Refer to http://jcodec.org/ for sample code and downloads.

Using JCodec as demonstrated by Stanislav Vitvitskyy here.
public static void main(String[] args) throws IOException {
    SequenceEncoder encoder = new SequenceEncoder(new File("video.mp4"));
    for (int i = 1; i < 100; i++) {
        BufferedImage bi = ImageIO.read(new File(String.format("img%08d.png", i)));
        encoder.encodeImage(bi);
    }
    encoder.finish();
}
Now, to convert your Bitmap to a BufferedImage, you can use this class:
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import java.awt.image.DataBufferInt;
import java.io.IOException;
import java.io.InputStream;
/**
* Utility class for loading windows bitmap files
* <p>
* Based on code from author Abdul Bezrati and Pepijn Van Eeckhoudt
*/
public class BitmapLoader {
/**
* Static method to load a bitmap file based on the filename passed in.
* Based on the bit count, this method will either call the 8 or 24 bit
* bitmap reader methods
*
* @param file The name of the bitmap file to read
* @throws IOException
* @return A BufferedImage of the bitmap
*/
public static BufferedImage loadBitmap(String file) throws IOException {
BufferedImage image;
InputStream input = null;
try {
input = ResourceRetriever.getResourceAsStream(file);
int bitmapFileHeaderLength = 14;
int bitmapInfoHeaderLength = 40;
byte bitmapFileHeader[] = new byte[bitmapFileHeaderLength];
byte bitmapInfoHeader[] = new byte[bitmapInfoHeaderLength];
input.read(bitmapFileHeader, 0, bitmapFileHeaderLength);
input.read(bitmapInfoHeader, 0, bitmapInfoHeaderLength);
int nSize = bytesToInt(bitmapFileHeader, 2);
int nWidth = bytesToInt(bitmapInfoHeader, 4);
int nHeight = bytesToInt(bitmapInfoHeader, 8);
int nBiSize = bytesToInt(bitmapInfoHeader, 0);
int nPlanes = bytesToShort(bitmapInfoHeader, 12);
int nBitCount = bytesToShort(bitmapInfoHeader, 14);
int nSizeImage = bytesToInt(bitmapInfoHeader, 20);
int nCompression = bytesToInt(bitmapInfoHeader, 16);
int nColoursUsed = bytesToInt(bitmapInfoHeader, 32);
int nXPixelsMeter = bytesToInt(bitmapInfoHeader, 24);
int nYPixelsMeter = bytesToInt(bitmapInfoHeader, 28);
int nImportantColours = bytesToInt(bitmapInfoHeader, 36);
if (nBitCount == 24) {
image = read24BitBitmap(nSizeImage, nHeight, nWidth, input);
} else if (nBitCount == 8) {
image = read8BitBitmap(nColoursUsed, nBitCount, nSizeImage, nWidth, nHeight, input);
} else {
System.out.println("Not a 24-bit or 8-bit Windows Bitmap, aborting...");
image = null;
}
} finally {
try {
if (input != null)
input.close();
} catch (IOException e) {
}
}
return image;
}
/**
* Static method to read a 8 bit bitmap
*
* @param nColoursUsed Number of colors used
* @param nBitCount The bit count
* @param nSizeImage The size of the image in bytes
* @param nWidth The width of the image
* @param input The input stream corresponding to the image
* @throws IOException
* @return A BufferedImage of the bitmap
*/
private static BufferedImage read8BitBitmap(int nColoursUsed, int nBitCount, int nSizeImage, int nWidth, int nHeight, InputStream input) throws IOException {
int nNumColors = (nColoursUsed > 0) ? nColoursUsed : (1 & 0xff) << nBitCount;
if (nSizeImage == 0) {
nSizeImage = ((((nWidth * nBitCount) + 31) & ~31) >> 3);
nSizeImage *= nHeight;
}
int npalette[] = new int[nNumColors];
byte bpalette[] = new byte[nNumColors * 4];
readBuffer(input, bpalette);
int nindex8 = 0;
for (int n = 0; n < nNumColors; n++) {
npalette[n] = (255 & 0xff) << 24 |
(bpalette[nindex8 + 2] & 0xff) << 16 |
(bpalette[nindex8 + 1] & 0xff) << 8 |
(bpalette[nindex8 + 0] & 0xff);
nindex8 += 4;
}
int npad8 = (nSizeImage / nHeight) - nWidth;
BufferedImage bufferedImage = new BufferedImage(nWidth, nHeight, BufferedImage.TYPE_INT_ARGB);
DataBufferInt dataBufferByte = ((DataBufferInt) bufferedImage.getRaster().getDataBuffer());
int[][] bankData = dataBufferByte.getBankData();
byte bdata[] = new byte[(nWidth + npad8) * nHeight];
readBuffer(input, bdata);
nindex8 = 0;
for (int j8 = nHeight - 1; j8 >= 0; j8--) {
for (int i8 = 0; i8 < nWidth; i8++) {
bankData[0][j8 * nWidth + i8] = npalette[((int) bdata[nindex8] & 0xff)];
nindex8++;
}
nindex8 += npad8;
}
return bufferedImage;
}
/**
* Static method to read a 24 bit bitmap
*
* @param nSizeImage size of the image in bytes
* @param nHeight The height of the image
* @param nWidth The width of the image
* @param input The input stream corresponding to the image
* @throws IOException
* @return A BufferedImage of the bitmap
*/
private static BufferedImage read24BitBitmap(int nSizeImage, int nHeight, int nWidth, InputStream input) throws IOException {
int npad = (nSizeImage / nHeight) - nWidth * 3;
if (npad == 4 || npad < 0)
npad = 0;
int nindex = 0;
BufferedImage bufferedImage = new BufferedImage(nWidth, nHeight, BufferedImage.TYPE_4BYTE_ABGR);
DataBufferByte dataBufferByte = ((DataBufferByte) bufferedImage.getRaster().getDataBuffer());
byte[][] bankData = dataBufferByte.getBankData();
byte brgb[] = new byte[(nWidth + npad) * 3 * nHeight];
readBuffer(input, brgb);
for (int j = nHeight - 1; j >= 0; j--) {
for (int i = 0; i < nWidth; i++) {
int base = (j * nWidth + i) * 4;
bankData[0][base] = (byte) 255;
bankData[0][base + 1] = brgb[nindex];
bankData[0][base + 2] = brgb[nindex + 1];
bankData[0][base + 3] = brgb[nindex + 2];
nindex += 3;
}
nindex += npad;
}
return bufferedImage;
}
/**
* Converts bytes to an int
*
* @param bytes An array of bytes
* @param index
* @return An int representation of the bytes
*/
private static int bytesToInt(byte[] bytes, int index) {
return (bytes[index + 3] & 0xff) << 24 |
(bytes[index + 2] & 0xff) << 16 |
(bytes[index + 1] & 0xff) << 8 |
bytes[index + 0] & 0xff;
}
/**
* Converts bytes to a short
*
* @param bytes An array of bytes
* @param index
* @return A short representation of the bytes
*/
private static short bytesToShort(byte[] bytes, int index) {
return (short) (((bytes[index + 1] & 0xff) << 8) |
(bytes[index + 0] & 0xff));
}
/**
* Reads the buffer
*
* @param in An InputStream
* @param buffer An array of bytes
* @throws IOException
*/
private static void readBuffer(InputStream in, byte[] buffer) throws IOException {
int bytesRead = 0;
int bytesToRead = buffer.length;
while (bytesToRead > 0) {
int read = in.read(buffer, bytesRead, bytesToRead);
bytesRead += read;
bytesToRead -= read;
}
}
}

If your application's minimum SDK version is 16 (Android 4.1) or higher, the best way to encode video is the Android MediaCodec API.
From the Android 4.3 API notes:
When encoding video, Android 4.1 (SDK 16) required that you provide
the media with a ByteBuffer array, but Android 4.3 (SDK 18) now allows
you to use a Surface as the input to an encoder. For instance, this
allows you to encode input from an existing video file or using frames
generated from OpenGL ES.
MediaMuxer was added in Android 4.3 (SDK 18), so for a convenient way of writing the MP4 file with MediaMuxer you need SDK >= 18.
With the MediaCodec API you get hardware-accelerated encoding, and you can easily encode at up to 60 FPS.
You can start from 1) How to encode Bitmaps into a video using MediaCodec?, or use 2) Google Grafika or 3) Bigflake.
Start from Grafika's RecordFBOActivity.java: replace the Choreographer event with your own event carrying the bitmap to encode, remove the on-screen drawing, load your bitmap as an OpenGL texture, and draw it on the MediaCodec input Surface.
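To make the Surface-input path concrete, here is a minimal sketch of configuring such an encoder (my own illustration, not code from Grafika or Bigflake; the width, height and fps values are placeholders you supply):
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

public class SurfaceEncoderSketch {
    private MediaCodec encoder;
    private Surface inputSurface; // draw each bitmap onto this Surface, e.g. with OpenGL ES as Grafika does

    // Configure an H.264 encoder that takes its input from a Surface (requires API 18+).
    public void prepare(int width, int height, int fps) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, fps);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        inputSurface = encoder.createInputSurface(); // must be called between configure() and start()
        encoder.start();
        // After drawing each frame to inputSurface, drain the encoder's output buffers
        // and write them to a MediaMuxer (also API 18+) to get a playable MP4 file.
    }
}
Grafika's RecordFBOActivity and the Bigflake examples show the full drain/mux loop that goes with this setup.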

jCodec has added Android support.
You need to add these to your Gradle dependencies...
implementation 'org.jcodec:jcodec:0.2.3'
implementation 'org.jcodec:jcodec-android:0.2.3'
...and
android {
    ...
    configurations.all {
        resolutionStrategy.force 'com.google.code.findbugs:jsr305:3.0.2'
    }
}
I can confirm this works as expected, but with caveats. First, I tried some full-size images and the file was written, but it gave an error on playback. When I scaled down, I would get an error if the width or height of the image was not even, because the YUV420J colorspace requires dimensions that are multiples of 2 (see the sketch after the code below).
Also worth noting: this makes your package heavy. My small project went over the dex method limit after adding it and required enabling multidex.
FileChannelWrapper out = null;
File dir = ...; // whatever directory you use
File file = new File(dir, "test.mp4");
try {
    out = NIOUtils.writableFileChannel(file.getAbsolutePath());
    AndroidSequenceEncoder encoder = new AndroidSequenceEncoder(out, Rational.R(15, 1));
    for (Bitmap bitmap : bitmaps) {
        encoder.encodeImage(bitmap);
    }
    encoder.finish();
} finally {
    NIOUtils.closeQuietly(out);
}

You can use Bitmp4 to convert a sequence of images to an MP4 file.
Sample code:
...
val encoder = MP4Encoder()
encoder.setFrameDelay(50)
encoder.setOutputFilePath(exportedFile.path)
encoder.setOutputSize(width, width)
startExport()
addFrame(bitmap) // called repeatedly, once per frame
stopExport()
It's a Java library, so it's easy to include in an Android project; unlike ffmpeg, you don't need the NDK.
Refer to https://github.com/dbof10/Bitmp4 for sample code and downloads.

I created a project that should be able to handle this. The code is light and fairly straightforward.
https://github.com/dburckh/bitmap2video

Abhishek V was right; for more information about jcodec's SequenceEncoder, see Android make animated video from list of images.
Recently I built a real-time video system using a Raspberry Pi and Android devices and ran into the same problem. Instead of saving a list of image files, I used real-time streaming protocols such as RTP/RTCP to transfer the data stream to the user. If your requirement is something like this, maybe you could change your strategy.
Another suggestion: you could explore some C/C++ libraries, using the NDK/JNI to get around the limitations of Java.
Hope the suggestions make sense to you :)

Related

Android Creating Video from Screen Scraping: Why is output Image wonky?

Update #6 Discovered I was accessing RGB values improperly. I assumed I was accessing data from an Int[], but was instead accessing byte information from a Byte[]. Changed to accessing from an Int[] and got the following image:
Update #5 Adding code used to get RGBA ByteBuffer for reference
private void screenScrape() {
Log.d(TAG, "In screenScrape");
//read pixels from frame buffer into PBO (GL_PIXEL_PACK_BUFFER)
mSurface.queueEvent(new Runnable() {
@Override
public void run() {
Log.d(TAG, "In Screen Scrape 1");
//generate and bind buffer ID
GLES30.glGenBuffers(1, pboIds);
checkGlError("Gen Buffers");
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pboIds.get(0));
checkGlError("Bind Buffers");
//creates and initializes data store for PBO. Any pre-existing data store is deleted
GLES30.glBufferData(GLES30.GL_PIXEL_PACK_BUFFER, (mWidth * mHeight * 4), null, GLES30.GL_STATIC_READ);
checkGlError("Buffer Data");
//glReadPixelsPBO(0,0,w,h,GLES30.GL_RGB,GLES30.GL_UNSIGNED_SHORT_5_6_5,0);
glReadPixelsPBO(0, 0, mWidth, mHeight, GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, 0);
checkGlError("Read Pixels");
//GLES30.glReadPixels(0,0,w,h,GLES30.GL_RGBA,GLES30.GL_UNSIGNED_BYTE,intBuffer);
}
});
//map PBO data into client address space
mSurface.queueEvent(new Runnable() {
@Override
public void run() {
Log.d(TAG, "In Screen Scrape 2");
//read pixels from PBO into a byte buffer for processing. Unmap buffer for use in next pass
mapBuffer = ((ByteBuffer) GLES30.glMapBufferRange(GLES30.GL_PIXEL_PACK_BUFFER, 0, 4 * mWidth * mHeight, GLES30.GL_MAP_READ_BIT)).order(ByteOrder.nativeOrder());
checkGlError("Map Buffer");
GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER);
checkGlError("Unmap Buffer");
isByteBufferEmpty(mapBuffer, "MAP BUFFER");
convertColorSpaceByteArray(mapBuffer);
mapBuffer.clear();
}
});
}
Update #4 For reference, here is the original image to compare against.
Update #3 This is the output image after interleaving all U/V data into a single array and passing it to the Image object at inputImagePlanes[1]; inputImagePlanes[2] is unused.
The next image is the same interleaved UV data, but loaded into inputImagePlanes[2] instead of inputImagePlanes[1].
Update #2 This is the output image after padding the U/V buffers with a zero in between each byte of 'real' data. uArray[uvByteIndex] = (byte) 0;
Update #1 As suggested by a comment, here are the row and pixel strides I get from calling getPixelStride and getRowStride
Y Plane Pixel Stride = 1, Row Stride = 960
U Plane Pixel Stride = 2, Row Stride = 960
V Plane Pixel Stride = 2, Row Stride = 960
The goal of my application is to read pixels out from the screen, compress them, and then send that h264 stream over WiFi to be played by a receiver.
Currently I'm using the MediaMuxer class to convert the raw h264 stream to an MP4, and then save it to a file. However the end result video is messed up and I can't figure out why. Let's walk through some of the processing and see if we can find anything that jumps out.
Step 1 Set up the encoder. I'm currently taking screen images once every 2 seconds, and using "video/avc" for MIME_TYPE.
//create codec for compression
try {
mCodec = MediaCodec.createEncoderByType(MIME_TYPE);
} catch (IOException e) {
Log.d(TAG, "FAILED: Initializing Media Codec");
}
//set up format for codec
MediaFormat mFormat = MediaFormat.createVideoFormat(MIME_TYPE, mWidth, mHeight);
mFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
mFormat.setInteger(MediaFormat.KEY_BIT_RATE, 16000000);
mFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 1/2);
mFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
Step 2 Read pixels out from the screen. This is done using OpenGL ES, and the pixels are read out in RGBA format. (I've confirmed this part to be working.)
Step 3 Convert the RGBA pixels to YUV420 (IYUV) format. This is done using the following method. Note that I have 2 methods for encoding, called at the end of this method.
private void convertColorSpaceByteArray(ByteBuffer rgbBuffer) {
long startTime = System.currentTimeMillis();
Log.d(TAG, "In convertColorspace");
final int frameSize = mWidth * mHeight;
final int chromaSize = frameSize / 4;
byte[] rgbByteArray = new byte[rgbBuffer.remaining()];
rgbBuffer.get(rgbByteArray);
byte[] yuvByteArray = new byte[inputBufferSize];
Log.d(TAG, "Input Buffer size = " + inputBufferSize);
byte[] yArray = new byte[frameSize];
byte[] uArray = new byte[(frameSize / 4)];
byte[] vArray = new byte[(frameSize / 4)];
isByteBufferEmpty(rgbBuffer, "RGB BUFFER");
int yIndex = 0;
int uIndex = frameSize;
int vIndex = frameSize + chromaSize;
int yByteIndex = 0;
int uvByteIndex = 0;
int R, G, B, Y, U, V;
int index = 0;
//this loop controls the rows
for (int i = 0; i < mHeight; i++) {
//this loop controls the columns
for (int j = 0; j < mWidth; j++) {
R = (rgbByteArray[index] & 0xff0000) >> 16;
G = (rgbByteArray[index] & 0xff00) >> 8;
B = (rgbByteArray[index] & 0xff);
Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128;
//clamp and load in the Y data
yuvByteArray[yIndex++] = (byte) ((Y < 16) ? 16 : ((Y > 235) ? 235 : Y));
yArray[yByteIndex] = (byte) ((Y < 16) ? 16 : ((Y > 235) ? 235 : Y));
yByteIndex++;
if (i % 2 == 0 && index % 2 == 0) {
//clamp and load in the U & V data
yuvByteArray[uIndex++] = (byte) ((U < 16) ? 16 : ((U > 239) ? 239 : U));
yuvByteArray[vIndex++] = (byte) ((V < 16) ? 16 : ((V > 239) ? 239 : V));
uArray[uvByteIndex] = (byte) ((U < 16) ? 16 : ((U > 239) ? 239 : U));
vArray[uvByteIndex] = (byte) ((V < 16) ? 16 : ((V > 239) ? 239 : V));
uvByteIndex++;
}
index++;
}
}
encodeVideoFromImage(yArray, uArray, vArray);
encodeVideoFromBuffer(yuvByteArray);
}
Step 4 Encode the data! I currently have two different ways of doing this, and each has a different output. One uses a ByteBuffer returned from MediaCodec.getInputBuffer(), the other uses an Image returned from MediaCodec.getInputImage().
Encoding using ByteBuffer
private void encodeVideoFromBuffer(byte[] yuvData) {
Log.d(TAG, "In encodeVideo");
int inputSize = 0;
//create index for input buffer
inputBufferIndex = mCodec.dequeueInputBuffer(0);
//create the input buffer for submission to encoder
ByteBuffer inputBuffer = mCodec.getInputBuffer(inputBufferIndex);
//clear, then copy yuv buffer into the input buffer
inputBuffer.clear();
inputBuffer.put(yuvData);
//flip buffer before reading data out of it
inputBuffer.flip();
mCodec.queueInputBuffer(inputBufferIndex, 0, inputBuffer.remaining(), presentationTime, 0);
presentationTime += MICROSECONDS_BETWEEN_FRAMES;
sendToWifi();
}
And the associated output image (note: I took a screenshot of the MP4)
Encoding using Image
private void encodeVideoFromImage(byte[] yToEncode, byte[] uToEncode, byte[]vToEncode) {
Log.d(TAG, "In encodeVideo");
int inputSize = 0;
//create index for input buffer
inputBufferIndex = mCodec.dequeueInputBuffer(0);
//create the input buffer for submission to encoder
Image inputImage = mCodec.getInputImage(inputBufferIndex);
Image.Plane[] inputImagePlanes = inputImage.getPlanes();
ByteBuffer yPlaneBuffer = inputImagePlanes[0].getBuffer();
ByteBuffer uPlaneBuffer = inputImagePlanes[1].getBuffer();
ByteBuffer vPlaneBuffer = inputImagePlanes[2].getBuffer();
yPlaneBuffer.put(yToEncode);
uPlaneBuffer.put(uToEncode);
vPlaneBuffer.put(vToEncode);
yPlaneBuffer.flip();
uPlaneBuffer.flip();
vPlaneBuffer.flip();
mCodec.queueInputBuffer(inputBufferIndex, 0, inputBufferSize, presentationTime, 0);
presentationTime += MICROSECONDS_BETWEEN_FRAMES;
sendToWifi();
}
And the associated output image (note: I took a screenshot of the MP4)
Step 5 Convert H264 Stream to MP4. Finally I grab the output buffer from the codec, and use MediaMuxer to convert the raw h264 stream to an MP4 that I can play and test for correctness
private void sendToWifi() {
Log.d(TAG, "In sendToWifi");
MediaCodec.BufferInfo mBufferInfo = new MediaCodec.BufferInfo();
//Check to see if encoder has output before proceeding
boolean waitingForOutput = true;
boolean outputHasChanged = false;
int outputBufferIndex = 0;
while (waitingForOutput) {
//access the output buffer from the codec
outputBufferIndex = mCodec.dequeueOutputBuffer(mBufferInfo, -1);
if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
outputFormat = mCodec.getOutputFormat();
outputHasChanged = true;
Log.d(TAG, "OUTPUT FORMAT HAS CHANGED");
}
if (outputBufferIndex >= 0) {
waitingForOutput = false;
}
}
//this buffer now contains the compressed YUV data, ready to be sent over WiFi
ByteBuffer outputBuffer = mCodec.getOutputBuffer(outputBufferIndex);
//adjust output buffer position and limit. As of API 19, this is not automatic
if(mBufferInfo.size != 0) {
outputBuffer.position(mBufferInfo.offset);
outputBuffer.limit(mBufferInfo.offset + mBufferInfo.size);
}
////////////////////////////////FOR DEBUG/////////////////////////////
if (muxerNotStarted && outputHasChanged) {
//set up track
mTrackIndex = mMuxer.addTrack(outputFormat);
mMuxer.start();
muxerNotStarted = false;
}
if (!muxerNotStarted) {
mMuxer.writeSampleData(mTrackIndex, outputBuffer, mBufferInfo);
}
////////////////////////////END DEBUG//////////////////////////////////
//release the buffer
mCodec.releaseOutputBuffer(outputBufferIndex, false);
muxerPasses++;
}
If you've made it this far you're a gentleman (or lady!) and a scholar! Basically I'm stumped as to why my image is not coming out properly. I'm relatively new to video processing so I'm sure I'm just missing something.
If you're on API 19+, you might as well stick with encoding method #2, getImage()/encodeVideoFromImage(), since that is more modern.
Focusing on that method: One problem was, you had an unexpected image format. With COLOR_FormatYUV420Flexible, you know you're going to have 8-bit U and V components, but you won't know in advance where they go. That's why you have to query the Image.Plane formats. Could be different on every device.
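For reference, the plane layout can be inspected at runtime with the standard Image.Plane getters (a small sketch; dumpPlaneLayout is just an illustrative helper):
import android.media.Image;
import android.util.Log;

final class PlaneLayoutDump {
    // Log each plane's strides; a pixelStride of 2 on the chroma planes means the
    // U and V samples are interleaved, which is the situation described here.
    static void dumpPlaneLayout(Image image, String tag) {
        for (Image.Plane plane : image.getPlanes()) {
            Log.d(tag, "pixelStride=" + plane.getPixelStride()
                    + " rowStride=" + plane.getRowStride());
        }
    }
}
Calling this with the Image from mCodec.getInputImage(...) produces strides like those listed in Update #1 of the question.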
In this case, the UV format turned out to be interleaved (very common on Android devices). If you're using Java, and you supply each array (U/V) separately, with the "stride" requested ("spacer" byte in-between each sample), I believe one array ends up clobbering the other, because these are actually "direct" ByteBuffers, and they were intended to be used from native code, like in this answer. The solution I explained was to copy an interleaved array into the third (V) plane, and ignore the U plane. On the native side, these two planes actually overlap each other in memory (except for the first and last byte), so filling one causes the implementation to fill both.
If you use the second (U) plane instead, you'll find things work, but the colors look funny. That's also because of the overlapping arrangement of these two planes; what that does, effectively, is shift every array element by one byte (which puts U's where V's should be, and vice versa.)
...In other words, this solution is actually a bit of a hack. Probably the only way to do this correctly, and have it work on all devices, is to use native code (as in the answer I linked above).
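A rough sketch of that workaround, reusing the fields from the question's encodeVideoFromImage() (my illustration, not the exact code from the linked answer; yData and uvInterleaved are hypothetical arrays, and the interleave order that works can differ per device):
// Fill only the Y plane and the V plane; on devices where the chroma planes are
// interleaved and overlap in memory, writing the interleaved U/V bytes into plane 2
// effectively fills both. This is a hack and will not work on every device.
Image inputImage = mCodec.getInputImage(inputBufferIndex);
Image.Plane[] planes = inputImage.getPlanes();
planes[0].getBuffer().put(yData);
ByteBuffer chroma = planes[2].getBuffer();
chroma.put(uvInterleaved, 0, Math.min(uvInterleaved.length, chroma.remaining()));
mCodec.queueInputBuffer(inputBufferIndex, 0, inputBufferSize, presentationTime, 0);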
Once the color plane problem is fixed, that leaves all the funny overlapping text and vertical striations. These were actually caused by your interpretation of the RGB data, which had the wrong stride.
And, once that is fixed, you have a decent-looking picture. It's been mirrored vertically; I don't know the root cause of that, but I suspect it's an OpenGL issue.

Show waveform of audio

I am making a music application in Android. The music list comes from the server side. I don't know how to show a waveform of the audio in Android, like on the SoundCloud website. I have attached an image below.
Perhaps you can implement this feature without libraries, if all you want is a visualisation of the audio sample.
For example:
public class PlayerVisualizerView extends View {
/**
* constant value for Height of the bar
*/
public static final int VISUALIZER_HEIGHT = 28;
/**
* bytes array converted from file.
*/
private byte[] bytes;
/**
* Percentage of audio sample scale
* Should be updated dynamically while the audio is playing
*/
private float denseness;
/**
* Canvas painting for sample scale, filling played part of audio sample
*/
private Paint playedStatePainting = new Paint();
/**
* Canvas painting for sample scale, filling not played part of audio sample
*/
private Paint notPlayedStatePainting = new Paint();
private int width;
private int height;
public PlayerVisualizerView(Context context) {
super(context);
init();
}
public PlayerVisualizerView(Context context, @Nullable AttributeSet attrs) {
super(context, attrs);
init();
}
private void init() {
bytes = null;
playedStatePainting.setStrokeWidth(1f);
playedStatePainting.setAntiAlias(true);
playedStatePainting.setColor(ContextCompat.getColor(getContext(), R.color.gray));
notPlayedStatePainting.setStrokeWidth(1f);
notPlayedStatePainting.setAntiAlias(true);
notPlayedStatePainting.setColor(ContextCompat.getColor(getContext(), R.color.colorAccent));
}
/**
* update and redraw Visualizer view
*/
public void updateVisualizer(byte[] bytes) {
this.bytes = bytes;
invalidate();
}
/**
* Update player percent. 0 - file not played, 1 - full played
*
* @param percent
*/
public void updatePlayerPercent(float percent) {
denseness = (int) Math.ceil(width * percent);
if (denseness < 0) {
denseness = 0;
} else if (denseness > width) {
denseness = width;
}
invalidate();
}
@Override
protected void onLayout(boolean changed, int left, int top, int right, int bottom) {
super.onLayout(changed, left, top, right, bottom);
width = getMeasuredWidth();
height = getMeasuredHeight();
}
@Override
protected void onDraw(Canvas canvas) {
super.onDraw(canvas);
if (bytes == null || width == 0) {
return;
}
float totalBarsCount = width / dp(3);
if (totalBarsCount <= 0.1f) {
return;
}
byte value;
int samplesCount = (bytes.length * 8 / 5);
float samplesPerBar = samplesCount / totalBarsCount;
float barCounter = 0;
int nextBarNum = 0;
int y = (height - dp(VISUALIZER_HEIGHT)) / 2;
int barNum = 0;
int lastBarNum;
int drawBarCount;
for (int a = 0; a < samplesCount; a++) {
if (a != nextBarNum) {
continue;
}
drawBarCount = 0;
lastBarNum = nextBarNum;
while (lastBarNum == nextBarNum) {
barCounter += samplesPerBar;
nextBarNum = (int) barCounter;
drawBarCount++;
}
int bitPointer = a * 5;
int byteNum = bitPointer / Byte.SIZE;
int byteBitOffset = bitPointer - byteNum * Byte.SIZE;
int currentByteCount = Byte.SIZE - byteBitOffset;
int nextByteRest = 5 - currentByteCount;
value = (byte) ((bytes[byteNum] >> byteBitOffset) & ((2 << (Math.min(5, currentByteCount) - 1)) - 1));
if (nextByteRest > 0) {
value <<= nextByteRest;
value |= bytes[byteNum + 1] & ((2 << (nextByteRest - 1)) - 1);
}
for (int b = 0; b < drawBarCount; b++) {
int x = barNum * dp(3);
float left = x;
float top = y + dp(VISUALIZER_HEIGHT - Math.max(1, VISUALIZER_HEIGHT * value / 31.0f));
float right = x + dp(2);
float bottom = y + dp(VISUALIZER_HEIGHT);
if (x < denseness && x + dp(2) < denseness) {
canvas.drawRect(left, top, right, bottom, notPlayedStatePainting);
} else {
canvas.drawRect(left, top, right, bottom, playedStatePainting);
if (x < denseness) {
canvas.drawRect(left, top, right, bottom, notPlayedStatePainting);
}
}
barNum++;
}
}
}
public int dp(float value) {
if (value == 0) {
return 0;
}
return (int) Math.ceil(getContext().getResources().getDisplayMetrics().density * value);
}
}
Sorry, the code has only a few comments, but it is a working visualizer. You can attach it to any player you want.
How you can use it: add this view to your XML layout, then update the visualizer state with these methods:
public void updateVisualizer(byte[] bytes) {
playerVisualizerView.updateVisualizer(bytes);
}
public void updatePlayerProgress(float percent) {
playerVisualizerView.updatePlayerPercent(percent);
}
In updateVisualizer you pass the byte array with your audio sample, and in updatePlayerProgress you dynamically pass the percentage while the audio sample is playing.
For converting a file to bytes you can use this helper method:
public static byte[] fileToBytes(File file) {
int size = (int) file.length();
byte[] bytes = new byte[size];
try {
BufferedInputStream buf = new BufferedInputStream(new FileInputStream(file));
buf.read(bytes, 0, bytes.length);
buf.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
return bytes;
}
And, very briefly, here is how it looks with the Mosby library:
public class AudioRecorderPresenter extends MvpBasePresenter<AudioRecorderView> {
public void onStopRecord() {
// stopped and released MediaPlayer
// ...
// some preparation and saved audio file in audioFileName variable.
getView().updateVisualizer(FileUtils.fileToBytes(new File(audioFileName)));
}
}
UPD: I created a library to address this: github.com/scrobot/SoundWaveView. It is still "WIP" (work in progress), but I will complete it soon.
I believe Scrobot's answer does not work. It assumes the input audio to be in a certain (quite peculiar) encoding (single-channel/mono linear PCM with 5 bit depth). And the algorithm to calculate amplitudes from the wave function is probably flawed. If you use that algorithm with any commonly used audio file format, you will get nothing but random data.
The truth is: it's just a bit more complicated than that.
Here's what's there to be done to achieve the OP's goal:
Use Android's MediaExtractor to read the input audio file (various formats/encodings are supported)
Use Android's MediaCodec to decode the input audio encoding to a linear PCM encoding with certain bit depth (usually 16 bit)
Only after this step you got a byte array which you can linearly read and calculate amplitudes from.
Apply a loudness measure to the PCM-encoded data. There are many of them, some more complicated (e.g. LUFS/LKFS), some more basic (RMS). Let's take RMS (= Root Mean Squares) for example:
Determine the number of samples per bar.
Read all the samples for a single bar. Usually there are 2 channels, so for each sample you will get 2 short ints (16 bit) for PCM-16.
For each sample calculate the mean of all channels.
Maybe you will want to normalize the value in some way, e.g. to get float values between -1 and 1 you can multiply by 2 / (1 << 16) (i.e. divide by 1 << 15)
Square each sample (hence the "S" in RMS)
Calculate the mean of all the samples for a bar (hence the "M" in RMS)
Calculate the square root of the resulting value (hence the "R" in RMS)
Now you get a value which you can base the height of the bar on.
Repeat steps 2-8 for all the bars.
Implementing this is quite an involved task. I could not find any library that provides this whole process already. But at least Android's media APIs provide the machinery for reading audio files in almost any format; a rough sketch of the RMS part follows below.
Note: RMS is considered a not very accurate loudness measure. But it seems to yield results which are at least somewhat related to what you can actually hear. For many applications it should be good enough.
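As an illustration of the RMS steps above (not a complete solution; it assumes you have already decoded the audio to interleaved 16-bit PCM via MediaExtractor/MediaCodec, and computeBars is a hypothetical helper):
public final class RmsBars {
    // Compute one RMS-based bar height (roughly 0..1) per block of PCM-16 sample frames.
    public static float[] computeBars(short[] pcm, int channels, int barCount) {
        int frames = pcm.length / channels;
        int framesPerBar = Math.max(1, frames / barCount);
        float[] bars = new float[barCount];
        for (int bar = 0; bar < barCount; bar++) {
            int start = bar * framesPerBar;
            int end = Math.min(frames, start + framesPerBar);
            double sumOfSquares = 0;
            for (int f = start; f < end; f++) {
                double mean = 0;
                for (int c = 0; c < channels; c++) {
                    mean += pcm[f * channels + c];       // sum the channels for this frame
                }
                mean /= channels;                        // mean of all channels
                double normalized = mean / 32768.0;      // normalize to roughly [-1, 1]
                sumOfSquares += normalized * normalized; // square (the "S" in RMS)
            }
            int n = Math.max(1, end - start);
            bars[bar] = (float) Math.sqrt(sumOfSquares / n); // mean, then root ("M" and "R")
        }
        return bars;
    }
}
The resulting values can then be scaled to pixel heights and drawn as bars.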
JETPACK COMPOSE
AudioWaveform is a lightweight Jetpack Compose library that draws a waveform of audio.
XML
WaveformSeekBar is an Android library that draws a waveform from a local audio file, resource, or URL using android.view.View (the XML approach).
AUDIO PROCESSING
If you're looking for a fast audio processing library, you could use the existing Amplituda library. Amplituda also has caching and compression features out of the box.

Android make animated video from list of images

I want to make an animated video from a list of images by applying a transition animation between two images. I found many similar questions on SO, like:
Android Screen capturing or make video from images
Android- How to make video using set of images from sd card?
All the similar SO questions suggest using animation for that, but how can we store those animated images to a video file? Is there any Android library that supports making a video out of images?
Android does not support AWT's BufferedImage or AWTUtil; those are for Java SE. The SequenceEncoder solution has since been integrated into jcodec's Android version; you can use it from the package org.jcodec.api.SequenceEncoder.
Here is the solution for generating an MP4 file from a series of Bitmaps using jcodec:
try {
File file = this.GetSDPathToFile("", "output.mp4");
SequenceEncoder encoder = new SequenceEncoder(file);
// only 5 frames in total
for (int i = 1; i <= 5; i++) {
// getting bitmap from drawable path
int bitmapResId = this.getResources().getIdentifier("image" + i, "drawable", this.getPackageName());
Bitmap bitmap = this.getBitmapFromResources(this.getResources(), bitmapResId);
encoder.encodeNativeFrame(this.fromBitmap(bitmap));
}
encoder.finish();
} catch (IOException e) {
e.printStackTrace();
}
// get full SD path
File GetSDPathToFile(String filePatho, String fileName) {
File extBaseDir = Environment.getExternalStorageDirectory();
if (filePatho == null || filePatho.length() == 0 || filePatho.charAt(0) != '/')
filePatho = "/" + filePatho;
makeDirectory(filePatho);
File file = new File(extBaseDir.getAbsoluteFile() + filePatho);
return new File(file.getAbsolutePath() + "/" + fileName);// file;
}
// convert from Bitmap to Picture (jcodec native structure)
public Picture fromBitmap(Bitmap src) {
Picture dst = Picture.create((int)src.getWidth(), (int)src.getHeight(), ColorSpace.RGB);
fromBitmap(src, dst);
return dst;
}
public void fromBitmap(Bitmap src, Picture dst) {
int[] dstData = dst.getPlaneData(0);
int[] packed = new int[src.getWidth() * src.getHeight()];
src.getPixels(packed, 0, src.getWidth(), 0, 0, src.getWidth(), src.getHeight());
for (int i = 0, srcOff = 0, dstOff = 0; i < src.getHeight(); i++) {
for (int j = 0; j < src.getWidth(); j++, srcOff++, dstOff += 3) {
int rgb = packed[srcOff];
dstData[dstOff] = (rgb >> 16) & 0xff;
dstData[dstOff + 1] = (rgb >> 8) & 0xff;
dstData[dstOff + 2] = rgb & 0xff;
}
}
}
In case you need to change the fps, you may customize the SequenceEncoder (a sketch with the Android encoder follows).
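With the jcodec Android artifact (0.2.x), the frame rate can instead be passed straight to AndroidSequenceEncoder, as also shown earlier on this page; a minimal sketch, assuming a hypothetical list of bitmaps:
import android.graphics.Bitmap;
import org.jcodec.api.android.AndroidSequenceEncoder;
import org.jcodec.common.io.NIOUtils;
import org.jcodec.common.io.SeekableByteChannel;
import org.jcodec.common.model.Rational;
import java.io.File;
import java.io.IOException;
import java.util.List;

public class FpsEncodeSketch {
    // Encode the given bitmaps at a custom frame rate, here 10 fps.
    public static void encode(File out, List<Bitmap> bitmaps) throws IOException {
        SeekableByteChannel ch = NIOUtils.writableFileChannel(out.getAbsolutePath());
        try {
            AndroidSequenceEncoder encoder = new AndroidSequenceEncoder(ch, Rational.R(10, 1));
            for (Bitmap bitmap : bitmaps) {
                encoder.encodeImage(bitmap);
            }
            encoder.finish();
        } finally {
            NIOUtils.closeQuietly(ch);
        }
    }
}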
You can use a pure java solution called JCodec ( http://jcodec.org ). Here's a CORRECTED simple class that does it using JCodec low-level API:
public class SequenceEncoder {
private SeekableByteChannel ch;
private Picture toEncode;
private RgbToYuv420 transform;
private H264Encoder encoder;
private ArrayList<ByteBuffer> spsList;
private ArrayList<ByteBuffer> ppsList;
private CompressedTrack outTrack;
private ByteBuffer _out;
private int frameNo;
private MP4Muxer muxer;
public SequenceEncoder(File out) throws IOException {
this.ch = NIOUtils.writableFileChannel(out);
// Transform to convert between RGB and YUV
transform = new RgbToYuv420(0, 0);
// Muxer that will store the encoded frames
muxer = new MP4Muxer(ch, Brand.MP4);
// Add video track to muxer
outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 25);
// Allocate a buffer big enough to hold output frames
_out = ByteBuffer.allocate(1920 * 1080 * 6);
// Create an instance of encoder
encoder = new H264Encoder();
// Encoder extra data ( SPS, PPS ) to be stored in a special place of
// MP4
spsList = new ArrayList<ByteBuffer>();
ppsList = new ArrayList<ByteBuffer>();
}
public void encodeImage(BufferedImage bi) throws IOException {
if (toEncode == null) {
toEncode = Picture.create(bi.getWidth(), bi.getHeight(), ColorSpace.YUV420);
}
// Perform conversion
for (int i = 0; i < 3; i++)
Arrays.fill(toEncode.getData()[i], 0);
transform.transform(AWTUtil.fromBufferedImage(bi), toEncode);
// Encode image into H.264 frame, the result is stored in '_out' buffer
_out.clear();
ByteBuffer result = encoder.encodeFrame(_out, toEncode);
// Based on the frame above form correct MP4 packet
spsList.clear();
ppsList.clear();
H264Utils.encodeMOVPacket(result, spsList, ppsList);
// Add packet to video track
outTrack.addFrame(new MP4Packet(result, frameNo, 25, 1, frameNo, true, null, frameNo, 0));
frameNo++;
}
public void finish() throws IOException {
// Push saved SPS/PPS to a special storage in MP4
outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));
// Write MP4 header and finalize recording
muxer.writeHeader();
NIOUtils.closeQuietly(ch);
}
public static void main(String[] args) throws IOException {
SequenceEncoder encoder = new SequenceEncoder(new File("video.mp4"));
for (int i = 1; i < 100; i++) {
BufferedImage bi = ImageIO.read(new File(String.format("folder/img%08d.png", i)));
encoder.encodeImage(bi);
}
encoder.finish();
}
}

How to actually see a Bitmap taken from an Android heap dump

In the process of tracking down severe memory issues in my app, I looked at several heap dumps, and most of the time I have a HUGE bitmap that I don't recognize.
It takes 9.4MB, or 9,830,400 bytes, or actually a 1280x1920 image at 4 bytes per pixels.
I checked in Eclipse MAT, it is indeed a byte[9830400], that has one incoming reference which is a android.graphics.Bitmap.
I'd like to dump this to a file and try to see it. I can't understand where it is coming from. My biggest image in all my drawables is a 640x960 png, which takes less than 3MB.
I tried to use Eclipse to "copy value to file", but I think it simply prints the buffer to the file, and I don't know any image software that can read a stream of bytes and display it as a 4-bytes-per-pixel image.
Any idea?
Here's what I tried: dump the byte array to a file, push it to /sdcard/img, and load an activity like this:
@Override
public void onCreate(final Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
try {
final File inputFile = new File("/sdcard/img");
final FileInputStream isr = new FileInputStream(inputFile);
final Bitmap bmp = BitmapFactory.decodeStream(isr);
ImageView iv = new ImageView(this);
iv.setImageBitmap(bmp);
setContentView(iv);
Log.d("ImageTest", "Image was inflated");
} catch (final FileNotFoundException e) {
Log.d("ImageTest", "Image was not inflated");
}
}
I didn't see anything.
Do you know how the image is encoded? Say it is stored in a byte[] buffer; is buffer[0] red, buffer[1] green, etc.?
See here for an easier answer: MAT (Eclipse Memory Analyzer) - how to view bitmaps from memory dump
TL;DR - Install GIMP and load the image as raw RGB Alpha
OK -- after quite a few unsuccessful tries, I finally got something out of this byte array. I wrote this simple C program to convert the byte array to a Windows Bitmap file. I'm dropping the code in case somebody is interested.
I compiled this against Visual C 6.0 and gcc 3.4.4; it should work on any OS (tested on Windows, Linux and MacOS X).
#include <stdio.h>
#include <math.h>
#include <string.h>
#include <stdlib.h>
/* Types */
typedef unsigned char byte;
typedef unsigned short uint16_t;
typedef unsigned int uint32_t;
typedef int int32_t;
/* Constants */
#define RMASK 0x00ff0000
#define GMASK 0x0000ff00
#define BMASK 0x000000ff
#define AMASK 0xff000000
/* Structures */
struct bmpfile_magic {
unsigned char magic[2];
};
struct bmpfile_header {
uint32_t filesz;
uint16_t creator1;
uint16_t creator2;
uint32_t bmp_offset;
};
struct bmpfile_dibheader {
uint32_t header_sz;
uint32_t width;
uint32_t height;
uint16_t nplanes;
uint16_t bitspp;
uint32_t compress_type;
uint32_t bmp_bytesz;
int32_t hres;
int32_t vres;
uint32_t ncolors;
uint32_t nimpcolors;
uint32_t rmask, gmask, bmask, amask;
uint32_t colorspace_type;
byte colorspace[0x24];
uint32_t rgamma, ggamma, bgamma;
};
/* Displays usage info and exits */
void usage(char *cmd) {
printf("Usage:\t%s <img_src> <img_dest.bmp> <width> <height>\n"
"\timg_src:\timage byte buffer obtained from Eclipse MAT, using 'copy > save value to file' while selecting the byte[] buffer corresponding to an android.graphics.Bitmap\n"
"\timg_dest:\tpath to target *.bmp file\n"
"\twidth:\t\tpicture width, obtained in Eclipse MAT, selecting the android.graphics.Bitmap object and seeing the object member values\n"
"\theight:\t\tpicture height\n\n", cmd);
exit(1);
}
/* C entry point */
int main(int argc, char **argv) {
FILE *in, *out;
char *file_in, *file_out;
int w, h, W, H;
byte r, g, b, a, *image;
struct bmpfile_magic magic;
struct bmpfile_header header;
struct bmpfile_dibheader dibheader;
/* Parse command line */
if (argc < 5) {
usage(argv[0]);
}
file_in = argv[1];
file_out = argv[2];
W = atoi(argv[3]);
H = atoi(argv[4]);
in = fopen(file_in, "rb");
out = fopen(file_out, "wb");
/* Check parameters */
if (in == NULL || out == NULL || W == 0 || H == 0) {
usage(argv[0]);
}
/* Init BMP headers */
magic.magic[0] = 'B';
magic.magic[1] = 'M';
header.filesz = W * H * 4 + sizeof(magic) + sizeof(header) + sizeof(dibheader);
header.creator1 = 0;
header.creator2 = 0;
header.bmp_offset = sizeof(magic) + sizeof(header) + sizeof(dibheader);
dibheader.header_sz = sizeof(dibheader);
dibheader.width = W;
dibheader.height = H;
dibheader.nplanes = 1;
dibheader.bitspp = 32;
dibheader.compress_type = 3;
dibheader.bmp_bytesz = W * H * 4;
dibheader.hres = 2835;
dibheader.vres = 2835;
dibheader.ncolors = 0;
dibheader.nimpcolors = 0;
dibheader.rmask = RMASK;
dibheader.gmask = BMASK;
dibheader.bmask = GMASK;
dibheader.amask = AMASK;
dibheader.colorspace_type = 0x57696e20;
memset(&dibheader.colorspace, 0, sizeof(dibheader.colorspace));
dibheader.rgamma = dibheader.bgamma = dibheader.ggamma = 0;
/* Read picture data */
image = (byte*) malloc(4*W*H);
if (image == NULL) {
printf("Could not allocate a %d-byte buffer.\n", 4*W*H);
exit(1);
}
fread(image, 4*W*H, sizeof(byte), in);
fclose(in);
/* Write header */
fwrite(&magic, sizeof(magic), 1, out);
fwrite(&header, sizeof(header), 1, out);
fwrite(&dibheader, sizeof(dibheader), 1, out);
/* Convert the byte array to BMP format */
for (h = H-1; h >= 0; h--) {
for (w = 0; w < W; w++) {
r = *(image + w*4 + 4 * W * h);
b = *(image + w*4 + 4 * W * h + 1);
g = *(image + w*4 + 4 * W * h + 2);
a = *(image + w*4 + 4 * W * h + 3);
fwrite(&b, 1, 1, out);
fwrite(&g, 1, 1, out);
fwrite(&r, 1, 1, out);
fwrite(&a, 1, 1, out);
}
}
free(image);
fclose(out);
}
So using this tool I was able to recognise the picture used to generate this 1280x1920 bitmap.
I found that starting from a recent version of Android Studio (2.2.2 as of writing), you can view the bitmap directly:
Open the ‘Android Monitor’ tab (at the bottom left) and then the Memory tab.
Press the ‘Dump Java Heap’ button.
Choose the ‘Bitmap’ class name for the current snapshot, then select each instance of Bitmap and check exactly which image consumes more memory than expected.
Finally, right-click on the instance and select View Bitmap.
Just take the input for the image and convert it into a Bitmap object using a FileInputStream/DataInputStream (but note the caveat below). Also add logs to see the data for each image that gets used.
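Note that BitmapFactory only decodes compressed formats such as PNG or JPEG, so for a raw pixel dump you have to rebuild the Bitmap yourself; a minimal sketch, assuming the dump is the 1280x1920, 4-bytes-per-pixel (ARGB_8888) buffer described above:
import android.graphics.Bitmap;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

public class RawBitmapDumpLoader {
    // Rebuild a Bitmap from a raw 4-bytes-per-pixel dump; width and height come from
    // the Bitmap object's fields as seen in the heap dump (e.g. 1280 x 1920).
    public static Bitmap loadRawArgb(File dump, int width, int height) throws IOException {
        byte[] pixels = new byte[width * height * 4];
        FileInputStream in = new FileInputStream(dump);
        try {
            int off = 0;
            while (off < pixels.length) {
                int read = in.read(pixels, off, pixels.length - off);
                if (read < 0) break; // truncated file; keep whatever we got
                off += read;
            }
        } finally {
            in.close();
        }
        Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        bmp.copyPixelsFromBuffer(ByteBuffer.wrap(pixels));
        return bmp;
    }
}
The returned Bitmap can then be set on an ImageView as in the activity above.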
You could enable a USB connection and copy the file to another computer that has more tools to investigate.
Some devices can be configured to dump the current screen to the file system when the start button is pressed. Maybe this is what is happening to you.

FFmpeg sample code for creating a video file from still images JNI Android

How do I modify the following FFmpeg sample code to create a video file from still images that I have on my Android phone? I am using JNI to invoke ffmpeg.
JNIEXPORT void JNICALL videoEncodeExample(JNIEnv *pEnv, jobject pObj, jstring filename)
{
AVCodec *codec;
AVCodecContext *c= NULL;
int i, out_size, size, x, y, outbuf_size;
FILE *f;
AVFrame *picture;
uint8_t *outbuf, *picture_buf;
printf("Video encoding\n");
/* find the mpeg1 video encoder */
codec = avcodec_find_encoder(CODEC_ID_MPEG1VIDEO);
if (!codec) {
fprintf(stderr, "codec not found\n");
exit(1);
}
c= avcodec_alloc_context();
picture= avcodec_alloc_frame();
/* put sample parameters */
c->bit_rate = 400000;
/* resolution must be a multiple of two */
c->width = 352;
c->height = 288;
/* frames per second */
c->time_base= (AVRational){1,25};
c->gop_size = 10; /* emit one intra frame every ten frames */
c->max_b_frames=1;
c->pix_fmt = PIX_FMT_YUV420P;
/* open it */
if (avcodec_open(c, codec) < 0) {
fprintf(stderr, "could not open codec\n");
exit(1);
}
f = fopen(filename, "wb");
if (!f) {
fprintf(stderr, "could not open %s\n", filename);
exit(1);
}
/* alloc image and output buffer */
outbuf_size = 100000;
outbuf = malloc(outbuf_size);
size = c->width * c->height;
picture_buf = malloc((size * 3) / 2); /* size for YUV 420 */
picture->data[0] = picture_buf;
picture->data[1] = picture->data[0] + size;
picture->data[2] = picture->data[1] + size / 4;
picture->linesize[0] = c->width;
picture->linesize[1] = c->width / 2;
picture->linesize[2] = c->width / 2;
/* encode 1 second of video */
for(i=0;i<25;i++) {
fflush(stdout);
/* prepare a dummy image */
/* Y */
for(y=0;y<c->height;y++) {
for(x=0;x<c->width;x++) {
picture->data[0][y * picture->linesize[0] + x] = x + y + i * 3;
}
}
/* Cb and Cr */
for(y=0;y<c->height/2;y++) {
for(x=0;x<c->width/2;x++) {
picture->data[1][y * picture->linesize[1] + x] = 128 + y + i * 2;
picture->data[2][y * picture->linesize[2] + x] = 64 + x + i * 5;
}
}
/* encode the image */
out_size = avcodec_encode_video(c, outbuf, outbuf_size, picture);
printf("encoding frame %3d (size=%5d)\n", i, out_size);
fwrite(outbuf, 1, out_size, f);
}
/* get the delayed frames */
for(; out_size; i++) {
fflush(stdout);
out_size = avcodec_encode_video(c, outbuf, outbuf_size, NULL);
printf("write frame %3d (size=%5d)\n", i, out_size);
fwrite(outbuf, 1, out_size, f);
}
/* add sequence end code to have a real mpeg file */
outbuf[0] = 0x00;
outbuf[1] = 0x00;
outbuf[2] = 0x01;
outbuf[3] = 0xb7;
fwrite(outbuf, 1, 4, f);
fclose(f);
free(picture_buf);
free(outbuf);
avcodec_close(c);
av_free(c);
av_free(picture);
printf("\n");
}
Thanks and Regards
Anish
In your sample code, the encoded image is a dummy.
So I think what you should do is replace the dummy image with the actual images on your Android device, and remember to convert their format to YUV420.
In your code you are using the image format PIX_FMT_YUV420P. You need to convert your images to YUV420P, since Android uses the raw picture format YUV420SP (PIX_FMT_NV21); you will also need to scale your images using the libswscale library provided with ffmpeg. Maybe one of my answers will help you, check it: Converting YUV420SP to YUV420P
