Equalizer, bass boost and reverb effects not working (using FFmpegMediaPlayer) - Android

I'm currently using FFmpegMediaPlayer from GitHub, and the audio effects do not work on a physical phone but work perfectly in the emulator, even though both run the same API level (22).
The strange thing is that when I switch the code from FFmpegMediaPlayer to the standard Android MediaPlayer, the effects start working again on the real device. But when I switch back to FFmpeg, the effects again work only in the emulator and not on the real device. My code is below:
public void setupVisualizerFxAndUI() {
    try {
        // Attach the visualizer and equalizer to the player's audio session
        mVisualizer = new Visualizer(mMediaPlayer.getAudioSessionId());
        mEqualizer = new Equalizer(0, mMediaPlayer.getAudioSessionId());
        mEqualizer.setEnabled(true);
        try {
            bassBoost = new BassBoost(0, mMediaPlayer.getAudioSessionId());
            bassBoost.setEnabled(false);
            BassBoost.Settings bassBoostSettingTemp = bassBoost.getProperties();
            BassBoost.Settings bassBoostSetting =
                    new BassBoost.Settings(bassBoostSettingTemp.toString());
            bassBoostSetting.strength = (1000 / 19);
            bassBoost.setProperties(bassBoostSetting);
            mMediaPlayer.setAuxEffectSendLevel(1.0f);

            presetReverb = new PresetReverb(0, mMediaPlayer.getAudioSessionId());
            presetReverb.setPreset(PresetReverb.PRESET_NONE);
            presetReverb.setEnabled(false);
            mMediaPlayer.setAuxEffectSendLevel(1.0f);
        } catch (Exception e) {
            e.printStackTrace();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }

    if (homeActivity.isEqualizerEnabled) {
        try {
            bassBoost.setEnabled(true);
            BassBoost.Settings bassBoostSettingTemp = bassBoost.getProperties();
            BassBoost.Settings bassBoostSetting =
                    new BassBoost.Settings(bassBoostSettingTemp.toString());
            if (homeActivity.bassStrength == -1) {
                bassBoostSetting.strength = (1000 / 19);
            } else {
                bassBoostSetting.strength = homeActivity.bassStrength;
            }
            bassBoost.setProperties(bassBoostSetting);
            mMediaPlayer.setAuxEffectSendLevel(1.0f);
            if (homeActivity.reverbPreset == -1) {
                presetReverb.setPreset(PresetReverb.PRESET_NONE);
            } else {
                presetReverb.setPreset(homeActivity.reverbPreset);
            }
            presetReverb.setEnabled(true);
            mMediaPlayer.setAuxEffectSendLevel(1.0f);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    if (homeActivity.isEqualizerEnabled && homeActivity.isEqualizerReloaded) {
        try {
            homeActivity.isEqualizerEnabled = true;
            int pos = homeActivity.presetPos;
            if (pos != 0) {
                mEqualizer.usePreset((short) (pos - 1));
            } else {
                for (short i = 0; i < 5; i++) {
                    mEqualizer.setBandLevel(i, (short) homeActivity.seekbarpos[i]);
                }
            }
            if (homeActivity.bassStrength != -1 && homeActivity.reverbPreset != -1) {
                bassBoost.setEnabled(true);
                bassBoost.setStrength(homeActivity.bassStrength);
                presetReverb.setEnabled(true);
                presetReverb.setPreset(homeActivity.reverbPreset);
            }
            mMediaPlayer.setAuxEffectSendLevel(1.0f);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Here mMediaPlayer is the FFmpegMediaPlayer instance. Other than this, the library works fine with regard to streaming; the only problem is that no effect is applied. I thought this might be a coding problem, so I swapped FFmpeg for the standard Android MediaPlayer as mentioned above, and the effects work. With FFmpeg, bass boost and equalizer work only in the emulator and not on a real phone.
Another strange thing: the effects initially worked in debug mode and stopped working after I signed the APK. From that point on they stopped working in all run modes, debug as well as release. I am not using any ProGuard rules either.
Points to note:
1. Replacing FFmpegMediaPlayer with the standard MediaPlayer makes the effects work.
2. The effects worked before signing the APK, then stopped working in all run modes.
3. With the code above and FFmpegMediaPlayer, the effects work only in the emulator, not on a real device.
4. Apart from the effects problem, FFmpegMediaPlayer is fully functional for streaming and local playback, on the real device as well as the emulator.

Never mind, I found the answer myself. For those of you having the same problem, I suggest changing the libraries manually and hoping it works, because this is an issue or bug in the library being used.
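Before swapping libraries, it can help to confirm whether the player exposes a usable audio session at all. The sketch below is my own diagnostic suggestion, not something from the FFmpegMediaPlayer docs: log the session id and whether the effect actually reports itself enabled. A session id of 0 binds the effect to the global output mix (deprecated), which at least helps tell a library problem apart from a device problem.

import android.media.audiofx.Equalizer;
import android.util.Log;

public class EffectsCheck {
    // Hedged diagnostic: attach an Equalizer to the given session and log
    // what the framework reports back. sessionId == 0 means the global
    // output mix (deprecated), useful only as a comparison point.
    public static Equalizer attachEqualizer(int sessionId) {
        Log.d("EffectsCheck", "audio session id = " + sessionId);
        Equalizer eq = new Equalizer(/* priority */ 0, sessionId);
        eq.setEnabled(true);
        Log.d("EffectsCheck", "equalizer enabled = " + eq.getEnabled());
        return eq;
    }
}

Calling attachEqualizer(mMediaPlayer.getAudioSessionId()) on the real device and in the emulator and comparing the logs narrows down where the chain breaks.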

Related

Tracking Android device in space using device camera

I am looking to get the device position (x, y, and z) in space in the most efficient way possible. Currently I am using ArSceneView (ARCore) to get the camera's pose as it updates.
The problem is that ArSceneView, or the class it extends, seems to have a memory leak that eventually crashes my app. I have tried forcing garbage collection and increasing the heap size, but the app still crashes.
I do not need a view to display the camera stream, so a class with a view is not necessary; I just need a way to get the phone's x, y, and z coordinates. Does anyone know of a way to get these using ML Kit or pure ARCore (no view)?
Side note: my app does not crash when the Android Studio profiler is attached... anyone know why?
WHERE I START AND STOP AR MODULE:
public void initializeSession() {
    if (sessionInitializationFailed) {
        return;
    }
    UnavailableException sessionException;
    try {
        session = new Session(context);
        Config config = new Config(session);
        config.setUpdateMode(Config.UpdateMode.LATEST_CAMERA_IMAGE);
        setupSession(session);
        session.configure(config);
        return;
    } catch (Exception e) {
        e.printStackTrace();
        sessionException = new UnavailableException();
        sessionException.initCause(e);
    }
    sessionInitializationFailed = true;
}

void start() {
    System.gc();
    if (isStarted) {
        return;
    }
    isStarted = true;
    try {
        initializeSession();
        this.resume();
    } catch (CameraNotAvailableException ex) {
        sessionInitializationFailed = true;
    }
}

void stop() {
    if (!isStarted) {
        return;
    }
    isStarted = false;
    this.pause();
    this.stop(); // re-entrant call returns immediately, since isStarted is now false
    System.gc();
}
WHERE I LISTEN FOR CHANGES:
public void onUpdate(FrameTime frameTime) {
    if (arModule == null) {
        return;
    }
    Frame frame = arModule.getArFrame();
    if (frame == null) {
        return;
    }
    Camera camera = frame.getCamera();
    Pose pose = camera.getPose();
    TrackingState trackingState = camera.getTrackingState();
    if (trackingState == TrackingState.TRACKING) {
        float x = pose.tx();
        float y = pose.ty();
        float z = pose.tz();
    }
}
In the HelloAR sample app you can clearly see that they get the pose of the camera, and their app does not crash, so this is not an ARCore bug. It's hard to tell from your code whether the memory leak is somewhere else. The simplest approach is to take the HelloAR sample and add the bit that stores the camera pose.
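For reference, here is a minimal view-less polling sketch in the spirit of the HelloAR sample. It only uses the public ARCore API, but note the assumption spelled out in the comments: ARCore still needs a camera texture (session.setCameraTextureName) even when nothing is drawn, so this is typically driven from a GL renderer callback.

import com.google.ar.core.Camera;
import com.google.ar.core.Frame;
import com.google.ar.core.Pose;
import com.google.ar.core.Session;
import com.google.ar.core.TrackingState;
import com.google.ar.core.exceptions.CameraNotAvailableException;

public class PosePoller {
    // Latest known device position in world space.
    private final float[] position = new float[3];

    // Call once per GL frame (e.g. from GLSurfaceView.Renderer.onDrawFrame).
    // Assumes session.setCameraTextureName(...) was called after resume();
    // ARCore requires a camera texture even if the stream is never rendered.
    public void poll(Session session) {
        try {
            Frame frame = session.update(); // fetches the newest camera frame
            Camera camera = frame.getCamera();
            if (camera.getTrackingState() == TrackingState.TRACKING) {
                Pose pose = camera.getPose();
                // Copy out the primitives instead of keeping the Pose/Frame,
                // so no native ARCore objects are retained between frames.
                position[0] = pose.tx();
                position[1] = pose.ty();
                position[2] = pose.tz();
            }
        } catch (CameraNotAvailableException e) {
            e.printStackTrace();
        }
    }
}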

LibStreaming gives a black screen when changing resolution

I am using spydroid from https://github.com/fyhertz/spydroid-ipcamera.
The requirement is that the stream is sent from the device and can be received on the local network, e.g. shown as an RTSP stream in VLC Media Player.
The issue I am facing: when I change the resolution, e.g. to 640x480, I get a black screen even though the stream is live. The default demo uses 320x240, which works fine. I have also changed the bitrate and framerate to match the 640x480 resolution, but got no result.
Any help would be appreciated.
You might be using the old library that the SpyDroid demo ships with. I had the same issue and fixed it the following way:
Steps:
1.) Include the latest LibStreaming library, as it supports versions above Lollipop.
2.) Find the H263Stream class and change the following method:
From
@SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
    createCamera();
    updateCamera();
    try {
        if (mQuality.resX >= 640) {
            // Using the MediaCodec API with the buffer method for high resolutions is too slow
            mMode = MODE_MEDIARECORDER_API;
        }
        EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
        return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
    } catch (Exception e) {
        // Fall back on the old streaming method using the MediaRecorder API
        Log.e(TAG, "Resolution not supported with the MediaCodec API, we fall back on the old streaming method.");
        mMode = MODE_MEDIARECORDER_API;
        return testH264();
    }
}
To
@SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
    createCamera();
    updateCamera();
    try {
        if (mQuality.resX >= 1080) {
            // Using the MediaCodec API with the buffer method for high resolutions is too slow
            mMode = MODE_MEDIARECORDER_API;
        }
        EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
        return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
    } catch (Exception e) {
        // Fall back on the old streaming method using the MediaRecorder API
        Log.e(TAG, "Resolution not supported with the MediaCodec API, we fall back on the old streaming method.");
        mMode = MODE_MEDIARECORDER_API;
        return testH264();
    }
}
The only difference is the resolution threshold, changed from 640 to 1080. I don't know the exact reason why, but the above change worked for me. Let me know if anything goes wrong.
In my case the problem was in MediaRecorder:
File: VideoStream.java
Method: encodeWithMediaRecorder
MediaRecorder doesn't handle a ParcelFileDescriptor correctly, so I created a local file to save the stream from MediaRecorder:
mMediaRecorder.setOutputFile(this.tmpFileToStream);
//mMediaRecorder.setOutputFile(fd); //disabled
and then I ran a new thread to copy bytes from tmpFileToStream to mParcelWrite:
public void run()
{
    FileDescriptor fd = mParcelWrite.getFileDescriptor();
    try {
        // Read from the local file the MediaRecorder is writing to...
        InputStream isS = new FileInputStream(tmpFileToStream);
        // ...and forward everything into the write side of the pipe.
        FileOutputStream outputStream = new FileOutputStream(fd);
        while (!Thread.interrupted()) {
            int content;
            while ((content = isS.read()) != -1) {
                outputStream.write(content);
            }
            // At EOF, wait briefly; the file keeps growing while recording.
            Thread.sleep(10);
        }
    } catch (Exception e) {
        Log.e(TAG, "E.. " + e.getMessage());
    }
}
/**
 * Video encoding is done by a MediaRecorder.
 */
protected void encodeWithMediaRecorder() throws IOException, ConfNotSupportedException {
    Log.d(TAG, "Video encoded using the MediaRecorder API");
    Log.d(TAG, "Size: " + mRequestedQuality.resX + " x " + mRequestedQuality.resY
            + " frame " + mRequestedQuality.framerate);
    // We need a local socket to forward data output by the camera to the packetizer
    createSockets();
    // Reopens the camera if needed
    destroyCamera();
    createCamera();
    // The camera must be unlocked before the MediaRecorder can use it
    unlockCamera();
    this.tmpFileToStream = this.getOutputMediaFile(MEDIA_TYPE_VIDEO);
    Log.d(TAG, "Video record to " + this.tmpFileToStream.getAbsolutePath());
    try {
        mMediaRecorder = new MediaRecorder();
        mMediaRecorder.setCamera(mCamera);
        mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        // mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        mMediaRecorder.setVideoEncoder(mVideoEncoder);
        mMediaRecorder.setPreviewDisplay(mSurfaceView.getHolder().getSurface());
        mMediaRecorder.setVideoSize(mRequestedQuality.resX, mRequestedQuality.resY);
        mMediaRecorder.setVideoFrameRate(mRequestedQuality.framerate);
        // The bandwidth actually consumed is often above what was requested
        mMediaRecorder.setVideoEncodingBitRate((int) (mRequestedQuality.bitrate * 0.8));
        // We write the output of the camera in a local socket instead of a file!
        // This one little trick makes streaming feasible quite simply: data from the camera
        // can then be manipulated at the other end of the socket
        FileDescriptor fd = null;
        if (sPipeApi == PIPE_API_PFD) {
            fd = mParcelWrite.getFileDescriptor();
        } else {
            fd = mSender.getFileDescriptor();
        }
        mMediaRecorder.setOutputFile(this.tmpFileToStream); // save locally, then read and stream that
        // mMediaRecorder.setOutputFile(fd); // disabled
        // Copy bytes from the local file to mParcelWrite
        if (tInput == null) {
            this.tInput = new Thread(this);
            this.tInput.start();
        }
        mMediaRecorder.prepare();
        mMediaRecorder.start();
    } catch (Exception e) {
        StringWriter sw = new StringWriter();
        PrintWriter pw = new PrintWriter(sw);
        e.printStackTrace(pw);
        Log.d(TAG, "Error stack " + sw.toString());
        throw new ConfNotSupportedException(e.getMessage());
    }

    InputStream is = null;
    if (sPipeApi == PIPE_API_PFD) {
        is = new ParcelFileDescriptor.AutoCloseInputStream(mParcelRead);
    } else {
        is = mReceiver.getInputStream();
    }

    // This will skip the MPEG4 header; if this step fails we can't stream anything :(
    try {
        byte[] buffer = new byte[4];
        // Skip all atoms preceding the mdat atom
        while (!Thread.interrupted()) {
            while (is.read() != 'm');
            is.read(buffer, 0, 3);
            if (buffer[0] == 'd' && buffer[1] == 'a' && buffer[2] == 't') break;
        }
    } catch (IOException e) {
        Log.e(TAG, "Couldn't skip mp4 header :/");
        stop();
        throw e;
    }

    // The packetizer encapsulates the bit stream in an RTP stream and sends it over the network
    mPacketizer.setInputStream(is);
    mPacketizer.start();
    mStreaming = true;
}
That's an unusual solution, but it works for other resolutions.

Apps do not download after VPN is connected

I am making an application that blocks certain websites and apps. The problem is that on devices from some manufacturers, like Motorola and HTC (Android Nougat and Oreo), apps do not download or update once the VPN is connected, even though the internet otherwise works perfectly. I am using a local VPN to monitor network traffic. Please help, I am stuck at this stage.
I tried all the methods mentioned on Google:
https://www.androidpit.com/google-play-not-working
http://appslova.com/android-fix-error-495-in-google-play-store/
http://techknowzone.com/how-to-solve-fix-error-code-495-in-google-play-store/
Below is a snippet of the local VPN code:
private void connect() {
    int i;
    isRunning = true;
    Builder builder = new Builder();
    StringBuilder stringBuilder = new StringBuilder("10.0.0.");
    // Cycle through 10.0.0.0 - 10.0.0.254 for the local interface address
    if (lastInt == 254) {
        i = 0;
        lastInt = 0;
    } else {
        i = lastInt;
        lastInt = i + 1;
    }
    this.localAddress = stringBuilder.append(i).toString();
    try {
        if (this.parcelFileDescriptor != null) {
            this.parcelFileDescriptor.close();
        }
    } catch (Exception e) {
        // Ignore errors while closing the previous interface
    }
    try {
        builder.addAddress(this.localAddress, 24);
        builder.addDnsServer(dns1);
        builder.addDnsServer(dns2);
        this.parcelFileDescriptor = builder.setSession(getString(R.string.app_name))
                .setConfigureIntent(this.pendingIntent)
                .establish();
        Intent intent = new Intent("STARTEDDNSCHANGER");
        intent.setAction(getPackageName() + ".STARTED_DNS_CHANGER");
        sendBroadcast(intent);
    } catch (Throwable e2) {
        throw new IllegalArgumentException(e2);
    }
}
Play downloads apps over HTTPS. If you are getting error 495, it suggests that the VPN is interfering with the HTTPS/TLS handshake, and so Play won't download the apps. Possibly your VPN does not handle the cipher negotiation properly.
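If inspecting Play traffic is not essential to your blocker, one workaround worth trying (an assumption on my part, not something I have verified on Motorola/HTC specifically) is to exclude the Play Store and Play Services from the VPN with the documented addDisallowedApplication API, available since API 21:

import android.content.pm.PackageManager;
import android.net.VpnService;

// Sketch: let Google Play traffic bypass the local VPN (API 21+).
// "com.android.vending" is the Play Store; "com.google.android.gms"
// is Play Services, which performs the actual downloads on some versions.
private void excludePlayStore(VpnService.Builder builder) {
    String[] playPackages = {"com.android.vending", "com.google.android.gms"};
    for (String pkg : playPackages) {
        try {
            builder.addDisallowedApplication(pkg);
        } catch (PackageManager.NameNotFoundException e) {
            // Package not installed on this device; nothing to exclude.
        }
    }
}

Call excludePlayStore(builder) in your connect() method before establish().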

Android play sound without lag

I want to play a horn sound in the app without any lag. I am using the MediaPlayer class, but there is a lag when the file is played again.
Code:
A thread to reduce the lag (mp_horn is the MediaPlayer instance created from the sound file). The thread below gives much better results than mediaPlayer.setLooping(true):
@Override
public void run() {
    try {
        if (mp_horn != null && mp_horn.isPlaying()) {
            final long durationTotal_horn = mp_horn.getDuration();
            long durationCurrent_horn = mp_horn.getCurrentPosition();
            // When 90% of the clip has played, seek back to the start
            // so the loop restarts before the player reaches the end.
            if (durationCurrent_horn >= (.90) * durationTotal_horn) {
                Log.v("arrrrrr", durationCurrent_horn + "......." + durationTotal_horn);
                mp_horn.seekTo((int) (durationTotal_horn * .0000001));
            }
        }
    } catch (IllegalStateException e) {
        e.printStackTrace();
    }
}
Probably
if (mp_horn != null && mp_horn.isPlaying())
should be:
if (mp_horn != null && !mp_horn.isPlaying())
otherwise, while the horn is already playing, you redo all these calculations over and over, which I am guessing is causing the lag.
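As an alternative to patching the MediaPlayer loop: short clips like a horn are usually played lag-free with SoundPool, which preloads the decoded audio into memory. A minimal sketch, where R.raw.horn is a placeholder for your sound resource:

import android.content.Context;
import android.media.AudioManager;
import android.media.SoundPool;

public class HornPlayer {
    private final SoundPool soundPool;
    private int hornId;

    public HornPlayer(Context context) {
        // (maxStreams, streamType, srcQuality) - this constructor is
        // deprecated on API 21+, where SoundPool.Builder is preferred.
        soundPool = new SoundPool(1, AudioManager.STREAM_MUSIC, 0);
        soundPool.setOnLoadCompleteListener(new SoundPool.OnLoadCompleteListener() {
            @Override
            public void onLoadComplete(SoundPool pool, int sampleId, int status) {
                if (status == 0) {
                    // leftVol, rightVol, priority, loop (-1 = loop forever), rate
                    pool.play(sampleId, 1f, 1f, 1, -1, 1f);
                }
            }
        });
        hornId = soundPool.load(context, R.raw.horn, 1); // placeholder resource
    }

    public void stop() {
        soundPool.release();
    }
}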

Using an arbitrary stream as a source for MediaPlayer

I would like to use an arbitrary InputStream as a data source for a MediaPlayer object.
The reason for this is that the InputStream I am using is in fact an authorized HTTPS connection to a media resource on a remote server. Passing the URL in that case will obviously not work, as authentication is required. I can, however, do the authentication separately and get an InputStream to the resource - the problem is what to do once I have it.
I thought about the option of using a named pipe and passing its FileDescriptor to the setDataResource method of MediaPlayer. Is there a way to create named pipes in Android (without using NDK)?
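(For the pipe part of the question: I found ParcelFileDescriptor.createPipe(), available since API 9, which creates a pipe pair without the NDK. A sketch of the idea, though I suspect MediaPlayer may reject the read side because it is not seekable:)

import android.media.MediaPlayer;
import android.os.ParcelFileDescriptor;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Sketch: pump an authenticated InputStream into a pipe and hand the read
// end to MediaPlayer. Caveat: the descriptor is not seekable, which
// MediaPlayer may not accept for all formats.
MediaPlayer playFromPipe(final InputStream in) throws IOException {
    ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
    final ParcelFileDescriptor readSide = pipe[0];
    final ParcelFileDescriptor writeSide = pipe[1];

    new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                OutputStream out =
                        new ParcelFileDescriptor.AutoCloseOutputStream(writeSide);
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
                out.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }).start();

    MediaPlayer player = new MediaPlayer();
    player.setDataSource(readSide.getFileDescriptor());
    player.prepareAsync();
    return player;
}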
Any other suggestion is most welcome.
I think I have found a solution. I would appreciate it if others who are interested would try this on their own and report the results with their device models and SDK version.
I have seen similar posts that point to this approach, but I thought I would post it anyway since it is newer and seems to work on newer versions of the SDK - so far it works on my Nexus One running Android 2.3.6.
The solution relies on buffering the input stream to a local file (I keep this file on external storage, but it should be possible to place it on internal storage as well) and providing that file's descriptor to the MediaPlayer instance.
The following runs in the doInBackground method of an AsyncTask that does audio playback:
@Override
protected Void doInBackground(LibraryItem... params)
{
    ...
    MediaPlayer player = new MediaPlayer();
    setListeners(player);
    try {
        _remoteStream = getMyInputStreamSomehow();
        File tempFile = File.createTempFile(...);
        tempFile.deleteOnExit();
        _localInStream = new FileInputStream(tempFile);
        _localOutStream = new FileOutputStream(tempFile);
        int buffered = bufferMedia(
            _remoteStream, _localOutStream, BUFFER_TARGET_SIZE // = 128KB for instance
        );
        player.setAudioStreamType(AudioManager.STREAM_MUSIC);
        player.setDataSource(_localInStream.getFD());
        player.prepareAsync();
        int streamed = 0;
        while (buffered >= 0) {
            buffered = bufferMedia(
                _remoteStream, _localOutStream, BUFFER_TARGET_SIZE
            );
        }
    }
    catch (Exception exception) {
        // Handle errors as you see fit
    }
    return null;
}
The bufferMedia method buffers nBytes bytes or until the end of input is reached:
private int bufferMedia(InputStream inStream, OutputStream outStream, int nBytes)
    throws IOException
{
    final int BUFFER_SIZE = 8 * (1 << 10);
    byte[] buffer = new byte[BUFFER_SIZE]; // TODO: Do static allocation instead
    int buffered = 0, read = -1;
    while (buffered < nBytes) {
        read = inStream.read(buffer);
        if (read == -1) {
            break;
        }
        outStream.write(buffer, 0, read);
        outStream.flush();
        buffered += read;
    }
    if (read == -1 && buffered == 0) {
        return -1;
    }
    return buffered;
}
The setListeners method sets handlers for various MediaPlayer events. The most important one is the OnCompletionListener, which is invoked when playback is complete. In case of a buffer underrun (due to, say, a temporarily slow network connection) the player will reach the end of the local file and transition to the PlaybackCompleted state. I identify those situations by comparing the position of _localInStream against the size of the input stream. If the position is smaller, then playback is not really complete, and I reset the MediaPlayer:
private void setListeners(MediaPlayer player)
{
    // Set some other listeners as well
    player.setOnSeekCompleteListener(
        new MediaPlayer.OnSeekCompleteListener()
        {
            @Override
            public void onSeekComplete(MediaPlayer mp)
            {
                mp.start();
            }
        }
    );
    player.setOnCompletionListener(
        new MediaPlayer.OnCompletionListener()
        {
            @Override
            public void onCompletion(MediaPlayer mp)
            {
                try {
                    long bytePosition = _localInStream.getChannel().position();
                    int timePosition = mp.getCurrentPosition();
                    int duration = mp.getDuration();
                    if (bytePosition < _track.size) {
                        // Buffer underrun: more data is coming, so re-attach
                        // the file and resume from where playback stopped.
                        mp.reset();
                        mp.setDataSource(_localInStream.getFD());
                        mp.prepare();
                        mp.seekTo(timePosition);
                    } else {
                        mp.release();
                    }
                } catch (IOException exception) {
                    // Handle errors as you see fit
                }
            }
        }
    );
}
Another solution would be to start a proxy HTTP server on localhost. The media player would connect to this server with setDataSource(Context context, Uri uri). This solution works better than the previous one and does not cause playback glitches.
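To illustrate, a bare-bones sketch of such a proxy, under the assumption that one client connects at a time, the content type is known, and seeking (Range requests) is not needed; getMyInputStreamSomehow() is the same authorized stream as above:

import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

class LocalProxy extends Thread {
    private final ServerSocket serverSocket;
    private final InputStream media; // the authorized HTTPS stream

    LocalProxy(InputStream media) throws Exception {
        this.serverSocket = new ServerSocket(0); // pick any free local port
        this.media = media;
    }

    int getPort() { return serverSocket.getLocalPort(); }

    @Override
    public void run() {
        try {
            Socket client = serverSocket.accept();
            // Read and discard the request line and headers (enough for a simple GET).
            client.getInputStream().read(new byte[8192]);
            OutputStream out = client.getOutputStream();
            // Content-Type is an assumption; match it to your media.
            out.write(("HTTP/1.0 200 OK\r\n"
                    + "Content-Type: audio/mpeg\r\n\r\n").getBytes("US-ASCII"));
            byte[] buf = new byte[8192];
            int n;
            while ((n = media.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            out.close();
            client.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

The player then connects with player.setDataSource(context, Uri.parse("http://127.0.0.1:" + proxy.getPort() + "/")).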
