Statement has no effect 'AVPacket' - android

I am developing a decoder using the Android NDK and the FFmpeg native libraries. I added native support to the project using the Android tools, and the code lives in a videodecoder.cpp file. In that file, the following function gives me this problem:
JNIEXPORT jint Java_ssrp_android_ffmpegdecoder_H264Decoder_consumeNalUnitsFromDirectBuffer(
JNIEnv* env, jobject thiz, jobject nal_units, jint num_bytes,
jlong pkt_pts) {
DecoderContext *ctx = get_ctx(env, thiz);
void *buf = NULL;
if (nal_units == NULL) {
D("Received null buffer, sending empty packet to decoder");
} else {
buf = env->GetDirectBufferAddress(nal_units);
if (buf == NULL) {
D("Error getting direct buffer address");
return -1;
}
}
AVPacket packet = {.data = (uint8_t*) buf, .size = num_bytes, .pts = pkt_pts };
int frameFinished = 0;
int res = avcodec_decode_video2(ctx->codec_ctx, ctx->src_frame,&frameFinished, &packet);
if (frameFinished)
ctx->frame_ready = 1;
return res;
}
At the line AVPacket packet = {.data = (uint8_t*) buf, .size = num_bytes, .pts = pkt_pts };
it says "Statement has no effect 'AVPacket'", and at the line
int res = avcodec_decode_video2(ctx->codec_ctx, ctx->src_frame, &frameFinished, &packet);
it says:
Invalid arguments. Candidates are:
int avcodec_decode_video2(AVCodecContext *, AVFrame *, int *, const AVPacket *)

The problem is the line
AVPacket packet = {.data = (uint8_t*) buf, .size = num_bytes, .pts = pkt_pts };
Designated initializers like this are C99 syntax; C++ did not accept them before C++20, so in a .cpp file the compiler cannot parse the initialization and flags the statement. That, in turn, produces the invalid-arguments error on the avcodec_decode_video2 call.
Maybe split the line into:
AVPacket packet;
packet.data = (uint8_t*) buf;
packet.size = num_bytes;
packet.pts = pkt_pts;
This compiles as plain C++ and should also give clearer error output if anything else is wrong.
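A minimal sketch of the split-up version, assuming the FFmpeg release in use still provides av_init_packet() (it does in the releases that have avcodec_decode_video2()); av_init_packet() fills in the optional packet fields so the packet does not carry garbage flags or side data:
AVPacket packet;
av_init_packet(&packet);        // default-initialize the optional packet fields
packet.data = (uint8_t*) buf;   // NAL units from the direct ByteBuffer (may be NULL for an empty packet)
packet.size = num_bytes;
packet.pts = pkt_pts;

int frameFinished = 0;
int res = avcodec_decode_video2(ctx->codec_ctx, ctx->src_frame, &frameFinished, &packet);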

Related

Android swig call changed value of parameters

I have a SWIG wrapper for JNI in the NDK.
The function header is:
//
// Created by Tomasz on 03/11/2017.
//
#ifndef PC_ANDORID_APP_RESIZE_GIF_H
#define PC_ANDORID_APP_RESIZE_GIF_H
int Version();
int ResizeAnimation(const char * infile, const char * outfile);
#endif //PC_ANDORID_APP_RESIZE_GIF_H
The SWIG interface is as simple as this:
%module GifResizer
%inline %{
#include "resize-gif.h"
extern int Version();
extern int ResizeAnimation(const char * infile, const char * outfile);
%}
and the implementation of ResizeAnimation is:
int ResizeAnimation(const char * infile, const char * outfile) {
initialize();
/* ... */
return 0;
}
The problem is that the values of the parameters in the SWIG-generated wrapper:
SWIGEXPORT jint JNICALL Java_org_imagemagick_GifResizerJNI_ResizeAnimation(JNIEnv *jenv, jclass jcls, jstring jarg1, jstring jarg2) {
jint jresult = 0 ;
char *arg1 = (char *) 0 ;
char *arg2 = (char *) 0 ;
int result;
(void)jenv;
(void)jcls;
arg1 = 0;
if (jarg1) {
arg1 = (char *)(*jenv)->GetStringUTFChars(jenv, jarg1, 0);
if (!arg1) return 0;
}
arg2 = 0;
if (jarg2) {
arg2 = (char *)(*jenv)->GetStringUTFChars(jenv, jarg2, 0);
if (!arg2) return 0;
}
result = (int)ResizeAnimation((char const *)arg1,(char const *)arg2);
jresult = (jint)result;
if (arg1) (*jenv)->ReleaseStringUTFChars(jenv, jarg1, (const char *)arg1);
if (arg2) (*jenv)->ReleaseStringUTFChars(jenv, jarg2, (const char *)arg2);
return jresult;
}
are okay (arg1 and arg2 hold the proper values), but once ResizeAnimation is called, the pointers point to different memory addresses: infile (arg1) is null, while outfile (arg2) points to some random memory.
All the sources are built with the standard Android CMake setup for the NDK.
The problem was caused by running x86_64 code on an x86 emulator. Silly :)
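As a quick sanity check for this kind of ABI mismatch, a small diagnostic along these lines (a sketch; the log tag is arbitrary) can be called from JNI_OnLoad to log which ABI the shared library was actually compiled for, using the standard compiler architecture macros:
#include <android/log.h>

static void log_build_abi(void) {
#if defined(__x86_64__)
    __android_log_print(ANDROID_LOG_DEBUG, "GifResizer", "native library built for x86_64");
#elif defined(__i386__)
    __android_log_print(ANDROID_LOG_DEBUG, "GifResizer", "native library built for x86");
#elif defined(__aarch64__)
    __android_log_print(ANDROID_LOG_DEBUG, "GifResizer", "native library built for arm64-v8a");
#elif defined(__arm__)
    __android_log_print(ANDROID_LOG_DEBUG, "GifResizer", "native library built for armeabi-v7a");
#else
    __android_log_print(ANDROID_LOG_DEBUG, "GifResizer", "native library built for an unrecognized ABI");
#endif
}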

env->NewStringUTF(s) crashes, and different ABI versions return different string values

I want to generate a random string of fixed length from a JNI function; for that I have used the function below:
static const char alphanum[] =
"abcdefghijklmnopqrstuvwxyz";
jstring Utils::getRandomString(JNIEnv *env, const int len) {
char s[len];
for (int i = 0; i < len; ++i) {
int p = rand() % (sizeof(alphanum) - 1);
s[i] = alphanum[p];
}
s[len] = 0;
__android_log_print(ANDROID_LOG_DEBUG, "LOG_TAG", "getRandomString %s", s);
jstring temp= env->NewStringUTF(s);
return temp;
}
but the application crashes at the line jstring temp = env->NewStringUTF(s); only for the armeabi-v7a ABI.
As a workaround I tried this:
jstring Utils::getRandomString(JNIEnv *env, const int len) {
char s[len];
for (int i = 0; i < len; ++i) {
int p = rand() % (sizeof(alphanum) - 1);
s[i] = alphanum[p];
}
s[len] = 0;
__android_log_print(ANDROID_LOG_DEBUG, "LOG_TAG", "getRandomString 2 %s", s);
jbyteArray array = env->NewByteArray(len);
env->SetByteArrayRegion(array, 0, len, (const jbyte *) s);
jstring strEncode = env->NewStringUTF("UTF-8");
jclass cls = env->FindClass("java/lang/String");
jmethodID ctor = env->GetMethodID(cls, "<init>", "([BLjava/lang/String;)V");
jstring object = (jstring) env->NewObject(cls, ctor, array, strEncode);
__android_log_print(ANDROID_LOG_DEBUG, "LOG_TAG", "getRandomString 3 %s",
env->GetStringUTFChars(object, 0));
// jstring temp= env->NewStringUTF(s);
return object;
}
but while this works fine on armeabi-v7a, running the same code on the x86 ABI returns unexpected output like PKdhtXMmr18n2L9K�ؾ�����-DL
Please suggest a solution that returns the generated string: env->NewStringUTF() works on x86 but crashes on armeabi-v7a.
thanks in advance
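For what it's worth, both versions write the terminator at s[len], one byte past the end of the variable-length array char s[len]; that out-of-bounds write is undefined behavior and is a plausible cause of the ABI-dependent crash and garbage. A minimal sketch of one likely fix is simply to size the buffer len + 1:
jstring Utils::getRandomString(JNIEnv *env, const int len) {
    char s[len + 1];                      // +1 leaves room for the terminating NUL
    for (int i = 0; i < len; ++i) {
        s[i] = alphanum[rand() % (sizeof(alphanum) - 1)];
    }
    s[len] = 0;                           // now writes inside the array
    return env->NewStringUTF(s);          // plain ASCII, so NewStringUTF is safe here
}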

Configure ffmpeg to use concat

This is my JNI C code for concatenating a list of mp3 files on the sdcard using ffmpeg on Android:
JNIEXPORT jint JNICALL Java_Test_Mp3_Merger_Audio_mergeAudio(JNIEnv *env,
jclass someclass, jstring inputFile, jstring outFile) {
log_message("Starting to trim video");
int numberOfArgs = 8;
char** arguments = calloc(numberOfArgs, sizeof(char*));
char start[5], duration[5];
const char *in, *out;
in = (*env)->GetStringUTFChars(env, inputFile, 0);
out = (*env)->GetStringUTFChars(env, outFile, 0);
//ffmpeg -f concat -i mergelist.txt -c copy a.mp3 // this command worked perfectly in my computer terminal
arguments[0] = "ffmpeg";
arguments[1] = "-f";
arguments[2] = "concat";
arguments[3] = "-i";
arguments[4] = in;
arguments[5] = "-c";
arguments[6] = "copy";
arguments[7] = out;
int i;
for (i = 0; i < numberOfArgs; i++) {
log_message(arguments[i]);
}
log_message("Printed all");
ffmpeg_main(numberOfArgs, arguments);
log_message("Finished");
free(arguments);
(*env)->ReleaseStringUTFChars(env, inputFile, in);
(*env)->ReleaseStringUTFChars(env, outFile, out);
return 0;
}
Below is my config.h, generated from config.mak:
https://drive.google.com/file/d/0B4VBZ6KJJazSWHBqQWJpWHhuZnM/view?usp=sharing
Whenever I call the native method trim(), I get the error message Unknown input format: 'concat'.
But concat.c shows up as compiled when building with the NDK.
What is wrong?
concat.c implements the concat protocol. -f concat -i ... invokes the concat demuxer, which lives in FFmpeg/libavformat/concatdec.c, so seeing concat.c compiled does not mean the demuxer was built in.
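If the FFmpeg build was configured with a restricted set of components (as custom Android builds usually are), the concat demuxer has to be enabled explicitly at configure time, and the generated config.h should then contain CONFIG_CONCAT_DEMUXER 1. A sketch of the flags to add to the existing configure invocation (everything else stays as before), assuming the list file and the mp3 files are read through the plain file protocol:
--enable-demuxer=concat --enable-protocol=file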

A weird error in an implementation of loadDex in android NDK

char* (*loadDex) (char * dexPath, char * odexPath,int flag) = NULL;
JNIEXPORT jint JNI_OnLoad(JavaVM* vm, void* reserved)
{
char* (*loadDex) (char *, char *,int) = NULL;
LOGD("JNI_OnLoad!");
void *ldvm = (void*) dlopen("/system/lib/libdvm.so", RTLD_LAZY);
if(ldvm == NULL)
{
LOGD("ERROR : %s",dlerror());
//is art
void *ldvm = (void*) dlopen("/system/lib/libart.so", RTLD_LAZY);
}
loadDex = (char* (*) (char *, char *,int)) dlsym (ldvm, "loadDex");
void *venv;
if ((*vm)->GetEnv(vm, (void**) &venv, JNI_VERSION_1_4) != JNI_OK)
{
return -1;
}
return JNI_VERSION_1_4;
}
I use the dlsym() function to get the pointer to loadDex(), but it returns 0. Can anyone teach me how to get the correct pointer?
Thanks in advance!
loadDex was a private API of dalvik and doesn't exist in ART. This sort of thing should just be done in Java.
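For reference, a minimal Java-side sketch using the public dalvik.system.DexClassLoader API; the dex path and class name below are hypothetical placeholders, and exception handling is omitted:
DexClassLoader loader = new DexClassLoader(
        dexPath,                                      // path to the .dex/.jar to load (hypothetical)
        context.getCodeCacheDir().getAbsolutePath(),  // directory for optimized output (ignored on newer API levels)
        null,                                         // no extra native library search path
        context.getClassLoader());                    // parent class loader
Class<?> loaded = loader.loadClass("com.example.LoadedClass");  // throws ClassNotFoundException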

How to properly pass an asset FileDescriptor to FFmpeg using JNI in Android

I'm trying to retrieve metadata in Android using FFmpeg, JNI and a Java FileDescriptor, and it isn't working. I know FFmpeg supports the pipe protocol, so I'm trying to emulate "cat test.mp3 | ffmpeg -i pipe:0" programmatically. I use the following code to get a FileDescriptor from an asset bundled with the Android application:
FileDescriptor fd = getContext().getAssets().openFd("test.mp3").getFileDescriptor();
setDataSource(fd, 0, 0x7ffffffffffffffL); // native function, shown below
Then, in my native (C++) code I get the FileDescriptor by calling:
static void wseemann_media_FFmpegMediaMetadataRetriever_setDataSource(JNIEnv *env, jobject thiz, jobject fileDescriptor, jlong offset, jlong length)
{
//...
int fd = jniGetFDFromFileDescriptor(env, fileDescriptor); // function contents show below
//...
}
// function contents
static int jniGetFDFromFileDescriptor(JNIEnv * env, jobject fileDescriptor) {
jint fd = -1;
jclass fdClass = env->FindClass("java/io/FileDescriptor");
if (fdClass != NULL) {
jfieldID fdClassDescriptorFieldID = env->GetFieldID(fdClass, "descriptor", "I");
if (fdClassDescriptorFieldID != NULL && fileDescriptor != NULL) {
fd = env->GetIntField(fileDescriptor, fdClassDescriptorFieldID);
}
}
return fd;
}
I then pass the file descriptor's pipe number (in C) to FFmpeg:
char path[256] = "";
FILE *file = fdopen(fd, "rb");
if (file && (fseek(file, offset, SEEK_SET) == 0)) {
char str[20];
sprintf(str, "pipe:%d", fd);
strcat(path, str);
}
State *state = av_mallocz(sizeof(State));
state->pFormatCtx = NULL;
if (avformat_open_input(&state->pFormatCtx, path, NULL, &options) != 0) { // Note: path is in the format "pipe:<the FD #>"
printf("Metadata could not be retrieved\n");
*ps = NULL;
return FAILURE;
}
if (avformat_find_stream_info(state->pFormatCtx, NULL) < 0) {
printf("Metadata could not be retrieved\n");
avformat_close_input(&state->pFormatCtx);
*ps = NULL;
return FAILURE;
}
// Find the first audio and video stream
for (i = 0; i < state->pFormatCtx->nb_streams; i++) {
if (state->pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO && video_index < 0) {
video_index = i;
}
if (state->pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_AUDIO && audio_index < 0) {
audio_index = i;
}
set_codec(state->pFormatCtx, i);
}
if (audio_index >= 0) {
stream_component_open(state, audio_index);
}
if (video_index >= 0) {
stream_component_open(state, video_index);
}
printf("Found metadata\n");
AVDictionaryEntry *tag = NULL;
while ((tag = av_dict_get(state->pFormatCtx->metadata, "", tag, AV_DICT_IGNORE_SUFFIX))) {
printf("Key %s: \n", tag->key);
printf("Value %s: \n", tag->value);
}
*ps = state;
return SUCCESS;
My issue is that avformat_open_input doesn't fail, but it also doesn't let me retrieve any metadata or frames. The same code works if I use a regular file URI (e.g. file://sdcard/test.mp3) as the path. What am I doing wrong? Thanks in advance.
Note: if you would like to look at all of the code, I'm trying to solve this issue in order to provide this functionality for my library, FFmpegMediaMetadataRetriever.
Java
AssetFileDescriptor afd = getContext().getAssets().openFd("test.mp3");
setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
C
void ***_setDataSource(JNIEnv *env, jobject thiz,
jobject fileDescriptor, jlong offset, jlong length)
{
int fd = jniGetFDFromFileDescriptor(env, fileDescriptor);
char path[20];
sprintf(path, "pipe:%d", fd);
State *state = av_mallocz(sizeof(State));
state->pFormatCtx = avformat_alloc_context();
state->pFormatCtx->skip_initial_bytes = offset;
state->pFormatCtx->iformat = av_find_input_format("mp3");
and now we can continue as usual:
if (avformat_open_input(&state->pFormatCtx, path, NULL, &options) != 0) {
printf("Metadata could not be retrieved\n");
*ps = NULL;
return FAILURE;
}
...
Even better, use <android/asset_manager.h>, like this:
Java
setDataSource(getContext().getAssets(), "test.mp3");
C
#include <android/asset_manager_jni.h>
void ***_setDataSource(JNIEnv *env, jobject thiz,
jobject assetManager, jstring assetName)
{
AAssetManager* mgr = AAssetManager_fromJava(env, assetManager); // renamed: must not clash with the jobject parameter
const char *szAssetName = (*env)->GetStringUTFChars(env, assetName, NULL);
AAsset* asset = AAssetManager_open(mgr, szAssetName, AASSET_MODE_RANDOM);
(*env)->ReleaseStringUTFChars(env, assetName, szAssetName);
off_t offset, length;
int fd = AAsset_openFileDescriptor(asset, &offset, &length);
AAsset_close(asset);
Disclaimer: error checking was omitted for brevity, but resources are released correctly, except for fd. You must close(fd) when finished.
Post scriptum: note that some media formats, e.g. mp4, need a seekable protocol, and pipe: cannot help there. In such a case, you may try sprintf(path, "/proc/self/fd/%d", fd);, or use the custom saf: protocol.
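A sketch of that seekable variant, reusing fd and offset from AAsset_openFileDescriptor() above (error handling trimmed; remember to close(fd) when done):
char path[32];
sprintf(path, "/proc/self/fd/%d", fd);    // seekable, unlike pipe:
AVFormatContext *fmt_ctx = avformat_alloc_context();
fmt_ctx->skip_initial_bytes = offset;     // the asset's data starts at this offset inside the APK
if (avformat_open_input(&fmt_ctx, path, NULL, NULL) != 0) {
    // on failure avformat_open_input frees fmt_ctx and sets it to NULL
}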
Thanks a lot for this post.
It helped me a lot to integrate Android 10 and scoped storage with FFmpeg using a FileDescriptor.
Here is the solution I'm using on Android 10:
Java
Uri uri = ContentUris.withAppendedId(
MediaStore.Audio.Media.EXTERNAL_CONTENT_URI,
trackId // Coming from `MediaStore.Audio.Media._ID`
);
ParcelFileDescriptor parcelFileDescriptor = getContentResolver().openFileDescriptor(
uri,
"r"
);
int pid = android.os.Process.myPid();
String path = "/proc/" + pid + "/fd/" + parcelFileDescriptor.dup().getFd();
loadFFmpeg(path); // Call native code
CPP
// Native code, `path` coming from Java `loadFFmpeg(String)`
avformat_open_input(&format, path, nullptr, nullptr);
OK, I spent a lot of time trying to transfer media data to FFmpeg through an AssetFileDescriptor. Finally, I found that there may be a bug in mov.c: when mov.c parses the trak atom, the corresponding skip_initial_bytes is not applied. I have tried to fix this problem.
For details, refer to FFmpegForAndroidAssetFileDescriptor; for a demo, refer to WhatTheCodec.
FileDescriptor fd = getContext().getAssets().openFd("test.mp3").getFileDescriptor();
I think you should start with AssetFileDescriptor:
http://developer.android.com/reference/android/content/res/AssetFileDescriptor.html
