Use of android_media_MediaPlayer.cpp in Android

While researching the internal details of the video player, I came across a PDF showing that the MediaPlayer class internally uses android_media_MediaPlayer for every message (i.e., setDataSource(), prepare(), start(), etc.), and android_media_MediaPlayer calls libmedia::MediaPlayer() with the same message. My question is: why can't the MediaPlayer class call libmedia::MediaPlayer directly instead of going through android_media_MediaPlayer?
Thank you!
A link to the image is given below:
http://img600.imageshack.us/img600/2005/capturejij.png

The diagram you linked to wasn't crystal clear, but I assume that the blue MediaPlayer box refers to the MediaPlayer Java class.
The libmedia MediaPlayer is a native class. Calls between Java and C/C++ need to go through the Java Native Interface (JNI), so android_media_MediaPlayer contains the necessary JNI code to communicate with the MediaPlayer Java class, thereby acting as a sort of proxy between the Java class and the native libmedia class.
For example, in MediaPlayer.java you'll find this declaration:
public native void prepareAsync() throws IllegalStateException;
This is listed in android_media_MediaPlayer.cpp as a JNINativeMethod:
{"prepareAsync", "()V", (void *)android_media_MediaPlayer_prepareAsync},
This says that the method which Java knows as "prepareAsync" has the signature "()V" (no arguments, returns void) and corresponds to the native function android_media_MediaPlayer_prepareAsync. When android_media_MediaPlayer_prepareAsync is called, it in turn calls the native MediaPlayer's prepareAsync method.
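To see why that extra layer is unavoidable, here is a minimal, hypothetical sketch of the same pattern (the class and library names below are made up for illustration and are not the actual framework code). Java code can only declare a method as native and load a JNI library; the body of such a method must be supplied by C/C++ glue code, which is exactly the role android_media_MediaPlayer.cpp plays for the framework's MediaPlayer.
// Hypothetical illustration of the Java-to-native bridging pattern.
public class NativePlayer {
    static {
        // Load the JNI glue library (the counterpart of the real media JNI library);
        // this library name is invented for the example.
        System.loadLibrary("nativeplayer_jni");
    }

    // No Java body: the VM dispatches this call to whatever native function was
    // registered for it (e.g. via a JNINativeMethod table). Only that native glue
    // code can then call into a C++ class such as libmedia's MediaPlayer.
    public native void prepareAsync();
}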

Related

Android ExoPlayer What Does "prepare()" Exactly Do?

The only difference I notice is that, if I call prepare() before play(), I will see the progress indicator and it pre-loads data into the PlayerView; besides that, I can't tell the difference if I just call play() without prepare().
Also the documentation says nothing: https://exoplayer.dev/doc/reference/com/google/android/exoplayer2/SimpleExoPlayer.html
public void prepare()
Description copied from interface: Player
Prepares the player. //<- ???
Specified by:
prepare in interface Player

Sound metering react-native

I'm building a react-native application.
I'm trying to meter the current sound level (in decibels).
Libraries in use: react-native-audio and react-native-sound.
Is anybody familiar with this feature?
Thank you.
With the react-native-audio library, you can get the currentMetering value only on iOS out of the box.
For currentMetering on Android, you need to customize the native module. I have updated it; add the following to your package.json dependencies in place of the standard react-native-audio entry, and you will get the currentMetering value on Android as well as iOS.
"react-native-audio": "git+https://github.com/Harsh2402/react-native-audio.git",
You can use react-native-audio's currentMetering value to get the sound level in real time.
First, you will have to initialise your recorder (which I will assume you've done). I use prepareRecordingAtPath in a similar way to the example below:
AudioRecorder.prepareRecordingAtPath(audioPath, {
  SampleRate: 22050,
  Channels: 1,
  AudioQuality: "Low",
  AudioEncoding: "aac",
  MeteringEnabled: true
});
Then call AudioRecorder.startRecording(); (note that you also have access to .pause() and .stop() methods).
Now, for handling the audio level, you are going to have to retrieve the data returned by the onProgress method. From what I remember, there should be a currentMetering value that you can access. Note that defining this handler will trigger the action every time a new decibel reading is retrieved. Like so:
AudioRecorder.onProgress = data => {
  let decibels = Math.floor(data.currentMetering);
  // DO STUFF
};
Hope this helps,

Calling a function through the context pointer using JNI on Android causes a segfault

I found this bit of code in one of the example Tango projects using the JNI, and I have no idea what the context is or how to use it. The example code works, but my code does not.
void OnXYZijAvailableRouter(void *context, const TangoXYZij *xyz_ij) {
SynchronizationApplication *app =
static_cast<SynchronizationApplication *>(context);
app->OnXYZijAvailable(xyz_ij);
}
I tried mimicking it below:
void OnFrameAvailableRouter(void *context, const TangoCameraId id,
const TangoImageBuffer *buffer) {
SynchronizationApplication *app =
static_cast<SynchronizationApplication *>(context);
LOGE("Before onframe call.");
app->onFrameAvailable(id, buffer);
LOGE("After onframe call.");
}
When I try to run it, however, I get this output:
Before onframe call.
Fatal signal 11 (SIGSEGV) at 0x00000308 (code=1), thread 15673 (Binder_2)
Now I managed to find the pointer that causes the seg fault, but I have no idea why it does not work.
Naturally, I might have done something wrong, but I have no idea what since I made an exact copy of the code in the example.
int SynchronizationApplication::TangoConnectCallbacks() {
TangoErrorType depth_ret =
TangoService_connectOnXYZijAvailable(OnXYZijAvailableRouter);
depth_ret = TangoService_connectOnFrameAvailable(TangoCameraId::TANGO_CAMERA_COLOR, NULL,
OnFrameAvailableRouter);
return depth_ret;
}
The functions I call from the routers:
void OnXYZijAvailable(const TangoXYZij *xyz_ij);
void onFrameAvailable(const TangoCameraId id, const TangoImageBuffer *buffer);
What exactly is the context? I have read some explanations, but I still do not understand why I can call the function using the context in the example above, nor why I need the router function at all. I have read this SO answer and the android page on the concept, but I see no link between the context and my class.
In OnXYZijAvailableRouter (the depth callback), the context is the instance passed in from the TangoService_connect function. I believe in the application class there should be a line like this: TangoService_connect(this, tango_config_); So this becomes the context when the callback is called. The same context also applies to the pose and event callbacks.
In the case of OnFrameAvailableRouter, the context is the instance you passed in to TangoService_connectOnFrameAvailable. In this case, the code is setting NULL as the context, but in the callback it's trying to call a function on NULL. That's the crash point.
I believe if you change it to TangoService_connectOnFrameAvailable(TangoCameraId::TANGO_CAMERA_COLOR, this, OnFrameAvailableRouter); it should work fine.
The router function is there for the callbacks; I haven't found a way of giving the API a function pointer to an instance method. But let me know if you find a way to do that, I would like to know as well.

How to catch the value of the dialpad's pressed button?

I am developing a SIP application for making and receiving calls. For that purpose I analysed the open-source project SipDroid. In that project, how do they catch the value of the pressed dialpad button, which is then sent to the particular method for making a SIP call?
I tried to find the code for that task but didn't get anywhere. In which file of the SipDroid project does the code that catches that value reside?
The calls in SipDroid are handled by the SipdroidEngine:
org.sipdroid.sipua.SipdroidEngine
The method that handles the initial operation has the signature public boolean call(String target_url, boolean force); it hands the call over to the SipDroid UserAgent class and so on, until it reaches the network transport layer. Just check the references to this call method across the whole project and see where it's used.
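As a rough sketch of how a dialed number ends up in that method (illustrative only; it assumes you already hold a SipdroidEngine reference, and how you obtain one depends on the app's wiring):
// 'engine' is assumed to be an existing SipdroidEngine instance.
String target = "1234";                      // digits collected from the dialpad
boolean placed = engine.call(target, true);  // call(String target_url, boolean force)
if (!placed) {
    // the call could not be started
}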
The dialpad values are called DTMF (Dual-tone multi-frequency signaling).
Most of SipDroid's DTMF stuff is in dtmf.h.
You can search through the source code to see where it is used.

How to know when MediaRecorder has finished writing data to file

We're using MediaRecorder to record video to a file on the external storage using setOutputFile() before doing the actual recording.
Everything works fine, but the main issue is that as soon as the recording is done, we want to start playing the recorded video back in a VideoView.
How to know when the file is ready to be read and played back?
The FileObserver class suits your needs perfectly. Here is the documentation. It's easy to use. When an observed file is closed after writing, the onEvent callback is called with CLOSE_WRITE as the parameter.
MyFileObserver fb = new MyFileObserver(mediaFile_path, FileObserver.CLOSE_WRITE);
fb.startWatching();

class MyFileObserver extends FileObserver {
    public MyFileObserver(String path, int mask) {
        super(path, mask);
    }

    @Override
    public void onEvent(int event, String path) {
        // start playing
    }
}
Don't forget to call stopWatching().
We solved a similar problem with the following algorithm:
while (file not complete)
    sleep for 1 sec
    read the fourth byte of the file
    if it is not 0 (contains the 'f' of the 'ftyp' header) then
        file is complete, break
The key point is that MediaRecorder writes the ftyp box at the very last moment. If it is in place, then the file is complete.
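A minimal Java sketch of that polling loop (assuming an MP4/3GP output file, where the 'f' of 'ftyp' sits at byte offset 4; adding a timeout is left to you):
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;

public final class FtypWaiter {
    // Polls once per second until the byte where the 'f' of 'ftyp' should sit is non-zero.
    public static void waitForFtyp(File file) throws IOException, InterruptedException {
        boolean complete = false;
        while (!complete) {
            Thread.sleep(1000);                           // sleep for 1 sec
            try (RandomAccessFile raf = new RandomAccessFile(file, "r")) {
                if (raf.length() > 4) {
                    raf.seek(4);                          // offset of the 'f' in 'ftyp'
                    complete = (raf.read() != 0);         // non-zero: header written, file complete
                }
            }
        }
    }
}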
In my tests, irrespective of the size of the recording, mediaRecorder.stop() is a blocking method that only returns after the file has been completely written and closed by the media recorder.
So JPM's answer is actually correct.
You can verify this by calling File.length() immediately after stop(). You will find that the output file length is the final length of the file at this point. In other words media recorder does not write anything further to the file after stop() has returned.
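A quick fragment to check this yourself (it assumes recorder is your MediaRecorder and outputPath is the path you passed to setOutputFile(); both names are placeholders):
// After stop() returns, the output file should already have its final size.
recorder.stop();
File output = new File(outputPath);
Log.d("RecorderCheck", "size=" + output.length() + " mtime=" + output.lastModified());
// Re-reading length() or lastModified() later should show no further writes.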
I haven't tried this myself but this might work:
public void release()
Since: API Level 1
Releases resources associated with this MediaRecorder object. It is good practice to call this method when you're done using the MediaRecorder.
If it does what it says, then I guess if you call this and after this method returns you know the file is ready.
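If you want to try that, here is a minimal sketch of the hand-off to playback (recorder, outputPath and videoView are assumed to exist already and are placeholders for your own fields):
recorder.stop();                     // blocks until the file is written, per the answer above
recorder.release();                  // free the recorder's resources before playback
videoView.setVideoPath(outputPath);  // the same path passed to setOutputFile()
videoView.start();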
Apparently there is no way to detect when the recording has stopped in MediaRecorder, but there is a stop() method that you can override if you create a custom class that extends MediaRecorder. Here I would do something like this:
public class MyRecorder extends MediaRecorder {
    public boolean stopped;

    // ... override any other MediaRecorder methods you need,
    // making sure to call super for each of them.

    @Override
    public void stop() {
        this.stopped = true;
        super.stop();
    }
}
Then you can access the boolean to see if it has stopped recording.
A dirty way would be to check the lastModified() value of the File and open the VideoView once the File hasn't been modified for 2 seconds.
