Why am I getting FPE when using swresample 1.1? - android

I am building a project to view a video feed from an IP camera in Android using FFmpeg 1.1.
I'm attempting to use swresample in an Android project and I'm getting a floating point exception when calling swr_convert. I stepped through the swresample code and found the line in the libswresample/swresample.c function swri_realloc_audio where the FPE occurs: a->bps and a->ch_count are both zero, so the division in the bounds check divides by zero.
int swri_realloc_audio(AudioData *a, int count){
    int i, countb;
    AudioData old;
    LOGD("in swri_realloc_audio - bps[%d], ch_count[%d]", a->bps, a->ch_count);
    if(count < 0 || count > INT_MAX/2/a->bps/a->ch_count)
        return AVERROR(EINVAL);
The LOGD line is one I added; its output confirms that both values are zero:
01-21 17:29:09.612: D/swresample.c(18789): in swri_realloc_audio - bps[0], ch_count[0]
I found bug ticket #1834 in the FFmpeg project that sounds like the exact same issue, but it was resolved by calling swr_init. However, my code does call this function and still crashes. Here is my JNI code:
SwrContext* resampleCtx = swr_alloc_set_opts(NULL,
        AV_CH_LAYOUT_MONO, AV_SAMPLE_FMT_S16, pAudioCodecCtx->sample_rate,
        pAudioCodecCtx->channel_layout, pAudioCodecCtx->sample_fmt,
        pAudioCodecCtx->sample_rate, 0, 0);
swr_init(resampleCtx);
LOGD("Resample context initialized");

int dataSize = swr_convert(resampleCtx,
        &pAudioOutBuffer, AVCODEC_MAX_AUDIO_FRAME_SIZE / 2,
        (const uint8_t**) &(pFrame->data[0]), pFrame->nb_samples);
LOGD("Resample conversion complete");

swr_free(&resampleCtx);
LOGD("Obtained data size - dataSize[%d]", dataSize);
I'm confused because I don't seem to have any control over the variable a in swri_realloc_audio. Stepping through the code, I noticed that it comes from resampleCtx->postin. That variable is copied from resampleCtx->in in swr_init, but I don't see where in is ever set to anything.
What am I doing wrong? Is it in my code or is there a problem in swresample?

The answer here is that I made a mistake in the input. AV_SAMPLE_FMT_S16 was not supported by swr_convert for this conversion, so the call to swr_init was failing; I just wasn't checking its return value, which is why I never saw the error.
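For anyone hitting the same crash, here is a minimal sketch of the defensive version of the setup code. It reuses pAudioCodecCtx and the LOGD macro from the question; createResampler is just an illustrative name. The point is to check both the allocation and the swr_init return value before ever calling swr_convert:

extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/channel_layout.h>
#include <libavutil/error.h>
#include <libswresample/swresample.h>
}

// Sketch: build the mono/S16 resampler from the question, but bail out (and
// log why) if allocation or swr_init fails, instead of calling swr_convert
// with a half-initialized context.
static SwrContext* createResampler(AVCodecContext* pAudioCodecCtx)
{
    SwrContext* resampleCtx = swr_alloc_set_opts(NULL,
            AV_CH_LAYOUT_MONO, AV_SAMPLE_FMT_S16, pAudioCodecCtx->sample_rate,
            pAudioCodecCtx->channel_layout, pAudioCodecCtx->sample_fmt,
            pAudioCodecCtx->sample_rate, 0, NULL);
    if (!resampleCtx) {
        LOGD("swr_alloc_set_opts failed");
        return NULL;
    }

    int ret = swr_init(resampleCtx);
    if (ret < 0) {
        // An uninitialized context is what later trips the FPE inside
        // swri_realloc_audio (bps and ch_count are still 0).
        char errbuf[128];
        av_strerror(ret, errbuf, sizeof(errbuf));
        LOGD("swr_init failed: %s", errbuf);
        swr_free(&resampleCtx);
        return NULL;
    }
    return resampleCtx;
}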

Related

Android NDK Pointer Arithmetic

I am trying to load a TGA file in Android NDK.
I open the file using AssetManager, read in the entire contents of the TGA file into a memory buffer, and then I try to extract the pixel data from it.
I can read the TGA header part of the file without any problems, but when I try to advance the memory pointer past the TGA header, the app crashes. If I don't try to advance the memory pointer, it does not crash.
Is there some sort of limitation in Android NDK for pointer arithmetic?
Here is the code:
This function opens the asset file:
char* GEAndroid::OpenAssetFile( const char* pFileName )
{
    char* pBuffer = NULL;
    AAssetManager* assetManager = m_pState->activity->assetManager;
    AAsset* assetFile = AAssetManager_open(assetManager, pFileName, AASSET_MODE_UNKNOWN);
    if (!assetFile)
    {
        // Error opening the input file from the apk
        LOGD( "Error opening file %s", pFileName );
    }
    else
    {
        LOGD( "File opened successfully %s", pFileName );
        const void* pData = AAsset_getBuffer(assetFile);
        off_t fileLength = AAsset_getLength(assetFile);
        LOGD( "fileLength=%d", (int)fileLength );
        pBuffer = new char[fileLength];
        memcpy( pBuffer, pData, fileLength * sizeof( char ) );
        AAsset_close(assetFile); // release the asset once its contents are copied
    }
    return pBuffer;
}
And down here in my texture class I try to load it:
char* pBuffer = g_pGEAndroid->OpenAssetFile( fileNameWithPath );
TGA_HEADER textureHeader;
char *pImageData = NULL;
unsigned int bytesPerPixel = 4;
textureHeader = *reinterpret_cast<TGA_HEADER*>(pBuffer);
// I double check that the textureHeader is valid and it is.
bytesPerPixel = textureHeader.bits/8; // Divide By 8 To Get The Bytes Per Pixel
m_imageSize = textureHeader.width*textureHeader.height*bytesPerPixel; // Calculate The Memory Required For The TGA Data
pImageData = new char[m_imageSize];
// the line below causes the crash
pImageData = reinterpret_cast<char*>(pBuffer + sizeof( TGA_HEADER)); // <-- causes a crash
If I replace the line above with the following line (even though it is incorrect), the app runs, although obviously the texture is messed up.
pImageData = reinterpret_cast<char*>(pBuffer); // <-- does not crash, but obviously texture is messed up.
Anyone have any ideas?
Thanks.
Why reinterpret_cast? You're adding an integer to a char*; that operation produces a char*. No typecast necessary.
One caveat for pointer juggling on Android (and on ARM devices in general): ARM cannot read or write unaligned data. An int-sized access must be at an address that is a multiple of 4, a short at a multiple of 2; bytes can be at any address. As far as I can see this does not apply to the snippet shown, but keep it in mind; it occasionally trips up binary format parsers, especially ones ported from x86 PCs.
Simply assigning an unaligned value to a pointer does not crash. Dereferencing it might.
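To illustrate the alignment caveat (again, the posted snippet does not need this), a common pattern is to memcpy a multi-byte field out of the raw buffer into an aligned local instead of casting and dereferencing. ReadU32Unaligned below is a hypothetical helper, not part of the question's code:

#include <stdint.h>
#include <stddef.h>
#include <string.h>

// Hypothetical helper: read a 32-bit value that may sit at an unaligned offset
// inside a raw byte buffer. Copying into an aligned local avoids the unaligned
// dereference that can fault on ARM.
static uint32_t ReadU32Unaligned(const char* buffer, size_t offset)
{
    uint32_t value;
    memcpy(&value, buffer + offset, sizeof(value));
    return value;
}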
Sigh, I just realized the mistake. I allocate memory for pImageData and then set the pointer to the buffer instead. This does not sit well when I try to create an OpenGL texture from the pixel data. Changing it so that I memcpy the pixel data from pBuffer + sizeof( TGA_HEADER ) into pImageData fixes the problem.
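For completeness, here is a sketch of the corrected sequence just described, reusing the names from the question (TGA_HEADER, m_imageSize, g_pGEAndroid and OpenAssetFile are assumed to behave as in the original code):

char* pBuffer = g_pGEAndroid->OpenAssetFile( fileNameWithPath );
TGA_HEADER textureHeader = *reinterpret_cast<TGA_HEADER*>(pBuffer);

unsigned int bytesPerPixel = textureHeader.bits / 8;
m_imageSize = textureHeader.width * textureHeader.height * bytesPerPixel;

// Allocate a separate pixel buffer and copy the pixel data that follows the
// header, instead of pointing pImageData back into pBuffer.
char* pImageData = new char[m_imageSize];
memcpy( pImageData, pBuffer + sizeof( TGA_HEADER ), m_imageSize );

delete[] pBuffer; // the raw file buffer is no longer needed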

Android NDK and __android_log_print

I am using this library: https://github.com/mysolution/hyphenator. In JNI I created this function:
#include <android/log.h>   // for __android_log_print

int main2()
{
    // load Russian hyphenation patterns
    struct pattern_list_t* plist = create_pattern_list();
    size_t i = 0;
    while (patterns[i])
    {
        struct pattern_t* p = create_pattern(patterns[i], isdigit_func, ismarker_func, char2digit_func);
        add_patern(plist, p);
        ++i;
    }
    sort_pattern_list(plist);

    // hyphenate the test words
    size_t word_index = 0;
    while (test_words[word_index])
    {
        struct word_hyphenation_t* wh = hyphenate_word(test_words[word_index], plist, marker);
        i = 0;
        while (test_words[word_index][i])
        {
            __android_log_print(ANDROID_LOG_INFO, "HelloNDK!", "%c", test_words[word_index][i]);
            ++i;
        }
        destroy_word_hyphenation(wh);
        ++word_index;
    }

    // cleanup
    destroy_pattern_list(plist);
    return 0;
}
Under the Android NDK this works, but in LogCat I get:
02-21 16:15:18.989: INFO/HelloNDK!(403): �
How can I solve this problem? I think the problem is the encoding, but I don't know how to fix it.
What is your expected output? If the characters fall outside the realm of ASCII you'll of course need a logcat viewer that supports them. Assuming you're outputting UTF-8, Terminator is nice on Linux and Mintty (in combination with Cygwin etc.) on Windows.
I worked it out, and this seems very wrong to me.
For printing char* strings with __android_log_vprint and __android_log_print, it appears you need to use the %s format specifier rather than %c.
This totally scuppers my plans for a cross-platform char* log shared between iOS, Android and BlackBerry, as printf("%s", myString.c_str()); is illegal there. I'll have to get funky with the arguments and parse the string myself. Anyway, that's another problem, and there is your fix.
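A minimal sketch of that fix (log_word is a hypothetical helper; the tag string is the one from the question): print the whole UTF-8 encoded word in a single call with %s instead of looping over it byte by byte with %c:

#include <android/log.h>

// Log one UTF-8 encoded, NUL-terminated word in a single call. Using %s keeps
// multi-byte characters intact, whereas printing byte by byte with %c splits
// them across log lines and shows up as garbage in logcat.
static void log_word(const char* word)
{
    __android_log_print(ANDROID_LOG_INFO, "HelloNDK!", "%s", word);
}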

How to get MJPG stream video from android IPWebcam using opencv

I am using the IP Webcam app on Android and receiving its stream on my PC over WiFi. I want to use OpenCV in Visual Studio (C++) to read that video stream. There is an option to get an MJPG stream at the following URL: http://MyIP:port/videofeed
How can I get it using OpenCV?
Old question, but I hope this can help someone (same as my answer here)
OpenCV expects a filename extension for its VideoCapture argument,
even though one isn't always necessary (like in your case).
You can "trick" it by passing in a dummy parameter which ends in the
mjpg extension:
So perhaps try:
VideoCapture vc;
vc.open("http://MyIP:port/videofeed/?dummy=param.mjpg");
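To flesh that out, here is a minimal self-contained sketch built on the same trick. The address is a placeholder for your phone's IP and port, and see the answer further down about whether the extra / before dummy belongs there:

#include <opencv2/highgui/highgui.hpp>

int main()
{
    // Dummy ".mjpg" suffix so OpenCV picks the MJPEG backend for the URL.
    cv::VideoCapture cap("http://MyIP:port/videofeed?dummy=param.mjpg");
    if (!cap.isOpened())
        return 1;

    cv::Mat frame;
    while (cap.read(frame))
    {
        cv::imshow("IP Webcam", frame);
        if (cv::waitKey(10) == 27) // ESC quits
            break;
    }
    return 0;
}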
Install IP Camera Adapter and configure it to capture the video stream. Then install ManyCam and you'll see "MPEG Camera" in the camera section. (You'll find the same instructions in the link on how to set up IP Webcam for Skype.)
Now you can access your MJPG stream just like a webcam through OpenCV. I tried this with OpenCV 2.2 + Qt and it works well.
Hope this helps.
I did a dirty patch to make OpenCV work with the Android IP Webcam:
In the file OpenCV-2.3.1/modules/highgui/src/cap_ffmpeg_impl.hpp
In the function bool CvCapture_FFMPEG::open( const char* _filename )
replace:
int err = av_open_input_file(&ic, _filename, NULL, 0, NULL);
by
AVInputFormat* iformat = av_find_input_format("mjpeg");
int err = av_open_input_file(&ic, _filename, iformat, 0, NULL);
ic->iformat = iformat;
and comment out:
err = av_seek_frame(ic, video_stream, 10, 0);
if (err < 0)
{
    filename=(char*)malloc(strlen(_filename)+1);
    strcpy(filename, _filename);
    // reopen videofile to 'seek' back to first frame
    reopen();
}
else
{
    // seek seems to work, so we don't need the filename,
    // but we still need to seek back to filestart
    filename=NULL;
    int64_t ts = video_st->first_dts;
    int flags = AVSEEK_FLAG_FRAME | AVSEEK_FLAG_BACKWARD;
    av_seek_frame(ic, video_stream, ts, flags);
}
That should work. Hope it helps.
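As an aside, in FFmpeg versions of that era where av_open_input_file had already been replaced by avformat_open_input, the same idea of forcing the MJPEG demuxer looks roughly like this. This is a sketch under those assumptions, not part of the patch above (recent FFmpeg releases drop av_register_all and const-qualify the format pointer):

extern "C" {
#include <libavformat/avformat.h>
}

// Sketch: open a network MJPEG stream while forcing the demuxer, mirroring
// what the patch above does inside OpenCV's FFmpeg capture backend.
static AVFormatContext* open_mjpeg_stream(const char* url)
{
    av_register_all();
    avformat_network_init();

    AVInputFormat* iformat = av_find_input_format("mjpeg");
    AVFormatContext* ic = NULL;
    if (avformat_open_input(&ic, url, iformat, NULL) < 0)
        return NULL;
    if (avformat_find_stream_info(ic, NULL) < 0) {
        avformat_close_input(&ic);
        return NULL;
    }
    return ic;
}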
This is the solution (I'm using IP Webcam on Android):
CvCapture* capture = 0;
capture = cvCaptureFromFile("http://IP:Port/videofeed?dummy=param.mjpg");
I am not able to comment, so I'm posting a new answer. The original answer has an error: it uses a / before dummy. Thanks for the solution.
A working example for me:
// OpenCVTest.cpp : Defines the entry point for the console application.
//
#include "stdafx.h"
#include "opencv2/highgui/highgui.hpp"

/**
 * @function main
 */
int main( int argc, const char** argv )
{
    // open the MJPEG stream once, outside the display loop
    CvCapture* capture = cvCaptureFromFile("http://192.168.1.129:8080/webcam.mjpeg");
    if ( !capture )
        return -1;

    // create a window to display the stream
    cvNamedWindow("Sample Program", CV_WINDOW_AUTOSIZE);

    IplImage* frame = 0;
    while (true)
    {
        // read the next frame of the video stream
        frame = cvQueryFrame( capture );
        if ( !frame )
            break;

        // display it
        cvShowImage("Sample Program", frame);

        int c = cvWaitKey(10);
        if( (char)c == 27 ) { break; }
    }

    // clean up and release resources (frames returned by cvQueryFrame are
    // owned by the capture, so only the capture itself needs releasing)
    cvReleaseCapture( &capture );
    return 0;
}
Broadcast MJPEG from a webcam with VLC, as described at http://tumblr.martinml.com/post/2108887785/how-to-broadcast-a-mjpeg-stream-from-your-webcam-with

Android app restarts automatically after a crash

My app is partly written in native code using C/C++. The problem is that whenever the C/C++ part crashes, the app dies and then restarts automatically, which causes all kinds of messy problems.
Now of course, it should not crash in the native part and I'm trying to weed out all reasons why it would happen. However, if it does happen I'd like to:
Quit gracefully
If it does die, at least not try to restart automatically.
I'm curious as to why this behaviour happens. After some search I tried putting the following line in the main activity element of the AndroidManifest.xml:
android:finishOnTaskLaunch="true"
but the automatic restart still happens.
Anyone knows why this is happening and how to change it?
UPDATE:
I think a more fundamental question is,
Is there something similar to a callback if there is a native crash?
One of the answers suggested 'handling crash signals'. I'd be grateful for any links on how it can be done at an application or module level.
As it stands currently, if there is a crash the app just disappears, there's nothing in logcat, so no debugging is possible.
Try handling the crash signals (SIGSEGV etc.) and sending a kill to yourself from the signal handler. This trick works for me.
Example:
#include <jni.h>
#include <signal.h>
#include <string.h>
#include <unistd.h>

static void signal_handler(int signal, siginfo_t *info, void *reserved)
{
    kill(getpid(), SIGKILL);
}

extern "C" jint JNI_OnLoad(JavaVM* vm, void* /*reserved*/)
{
    struct sigaction handler;
    memset(&handler, 0, sizeof(handler));
    handler.sa_sigaction = signal_handler;
    handler.sa_flags = SA_SIGINFO;
    sigaction(SIGILL,    &handler, NULL);
    sigaction(SIGABRT,   &handler, NULL);
    sigaction(SIGBUS,    &handler, NULL);
    sigaction(SIGFPE,    &handler, NULL);
    sigaction(SIGSEGV,   &handler, NULL);
    sigaction(SIGSTKFLT, &handler, NULL);
    return JNI_VERSION_1_6;
}
UPDATE 2:
If you want to see the crash log in the Android logcat, you should use this signal handler instead:
#include <errno.h>
#include <signal.h>
#include <stddef.h>
#include <string.h>
#include <sys/socket.h>
#include <sys/un.h>
#include <sys/wait.h>
#include <unistd.h>

/* Retry a syscall while it fails with EINTR (the same helper the Android
 * debuggerd source uses). */
#define RETRY_ON_EINTR(ret, cond) \
    do { \
        ret = (cond); \
    } while (ret < 0 && errno == EINTR)

static void signal_handler(int signal, siginfo_t *info, void *reserved)
{
    struct sockaddr_un addr;
    size_t namelen;
    socklen_t alen;
    int s, err;
    char name[] = "android:debuggerd";

    namelen = strlen(name);
    // Test with length +1 for the *initial* '\0'.
    if ((namelen + 1) > sizeof(addr.sun_path)) {
        errno = EINVAL;
        return;
    }

    /* This is used for the abstract socket namespace, so we need
     * an initial '\0' at the start of the Unix socket path.
     *
     * Note: The path in this case is *not* supposed to be
     * '\0'-terminated. ("man 7 unix" for the gory details.)
     */
    memset(&addr, 0, sizeof addr);
    addr.sun_family = AF_LOCAL;
    addr.sun_path[0] = 0;
    memcpy(addr.sun_path + 1, name, namelen);
    alen = namelen + offsetof(struct sockaddr_un, sun_path) + 1;

    s = socket(AF_LOCAL, SOCK_STREAM, 0);
    if (s < 0) return;

    RETRY_ON_EINTR(err, connect(s, (struct sockaddr *) &addr, alen));
    if (err < 0) {
        close(s);
        s = -1;
    }

    pid_t tid = gettid();
    if (s >= 0)
    {
        /* debuggerd knows our pid from the credentials on the
         * local socket, but we need to tell it our tid. It
         * is paranoid and will verify that we are giving a tid
         * that's actually in our process.
         */
        int ret;
        RETRY_ON_EINTR(ret, write(s, &tid, sizeof(unsigned)));
        if (ret == sizeof(unsigned)) {
            /* if the write failed, there is no point reading from
             * the file descriptor. */
            RETRY_ON_EINTR(ret, read(s, &tid, 1));
            //notify_gdb_of_libraries();
        }
        close(s);
    }

    wait(NULL);
    kill(getpid(), SIGKILL);
}
I took it from the Android source (I can't insert a link because android.git.kernel.org is down), but I am not sure it will keep working in future Android releases.
By default your application should not be automatically restarting. Generally one would have to register for this kind of thing, e.g. via the AlarmManager/keep alives.
Do you have a service as part of your application?

ffmpeg - avcodec_decode_audio3 always returns 0 with aac decoding on android

I am writing an FFmpeg-based audio decoder for Android, where I have to decode AAC audio, but for some reason it always reports 0 bytes decoded.
It looks like I am passing everything correctly. Can anybody tell me what went wrong in my case? I copied the code from ffplay.c.
What is the reason the avcodec_decode_audio3 function always returns zero?
Here is the code from ffplay.c:
AVPacket *pkt_temp = &is->audio_pkt_temp;
AVPacket *pkt = &is->audio_pkt;
AVCodecContext *dec = is->audio_st->codec;
int n, len1, data_size;
double pts;

data_size = sizeof(is->audio_buf1);
len1 = avcodec_decode_audio3(dec, (int16_t *)is->audio_buf1, &data_size, pkt_temp);
if (len1 < 0) {
    pkt_temp->size = 0;
    break;
}
if (data_size <= 0) {
    // This block always gets executed.
    continue;
}
I was able to resolve my problem, though not with the above code: I completely rewrote my audio code so that it doesn't use any threads or waiting mechanism. I suspect the problem wasn't in the code above but in the way I passed the data into the function.
When you do:
data_size = sizeof(is->audio_buf1);
you are getting the size of a pointer, not the size of the buffer that is->audio_buf1 points to.
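A small self-contained illustration of the pitfall (the buffer name and size are made up, not taken from ffplay.c):

#include <stdint.h>
#include <stdio.h>

#define AUDIO_BUF_SIZE (192000 * 3 / 2)   /* made-up size for the sketch */

int main(void)
{
    static uint8_t audio_buf[AUDIO_BUF_SIZE];  /* a real array */
    uint8_t *audio_ptr = audio_buf;            /* a pointer to it */

    /* sizeof(array) is the whole buffer; sizeof(pointer) is only 4 or 8 bytes,
     * so passing it as the output size tells the decoder it has almost no room. */
    printf("sizeof array:   %zu\n", sizeof(audio_buf));
    printf("sizeof pointer: %zu\n", sizeof(audio_ptr));
    return 0;
}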
