I am trying to create a multi-party (3- or 4-person) video conferencing Android app, and I have successfully connected 4 people together using WebRTC and Socket.IO.
But when one of the users disconnects, I get a WebRTC native crash:
Fatal signal 11 (SIGSEGV), code 1, fault addr 0xb8 in tid 17650 (Thread-648)
The same code works on a Moto C but crashes on other devices.
WebRTC version used: compile 'io.pristine:libjingle:9694@aar'
void onDisconnect() {
    if (null != peerConnection2) {
        peerConnection2.removeStream(localMediaStream);
        peerConnection2.close();
        peerConnection2 = null;
    }
    if (null != peerConnection3) {
        peerConnection3.dispose();
        peerConnection3 = null;
    }
    if (null != localVideoSource) {
        localVideoSource.dispose();
        localVideoSource = null;
    }
    if (null != peerConnectionFactory) {
        audioManager.setMode(AudioManager.MODE_NORMAL);
        audioManager.setSpeakerphoneOn(false);
        peerConnectionFactory.dispose(); // <<<<--- THIS IS WHERE THE APP CRASHES
        peerConnectionFactory = null;
    }
}
I am not sure if updating the version will help with this bug. Even if I do update to a newer version, I am unable to find any proper documentation or blog posts about it. It would be great if you could point me to a known link (blog/documentation) for the latest version of libjingle.
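One suspicion (not confirmed): `peerConnection2` is only `close()`d, never `dispose()`d, so a live native connection may still reference the factory when `peerConnectionFactory.dispose()` runs. As a runnable illustration of that ordering invariant, here is a self-contained Java model; all class names here are invented stand-ins, not the real org.webrtc API:

```java
// Stand-in model of the native lifecycle rule: every PeerConnection a
// factory creates must be disposed before the factory itself is disposed.
// FakePeerConnection/FakeFactory are illustrative names only.
import java.util.ArrayList;
import java.util.List;

class FakePeerConnection {
    boolean disposed = false;
    void close() { /* closing alone does not free the native object */ }
    void dispose() { disposed = true; }
}

class FakeFactory {
    private final List<FakePeerConnection> created = new ArrayList<>();

    FakePeerConnection createPeerConnection() {
        FakePeerConnection pc = new FakePeerConnection();
        created.add(pc);
        return pc;
    }

    // Models the SIGSEGV: disposing the factory while a connection it
    // created is still alive is invalid use of the native layer.
    void dispose() {
        for (FakePeerConnection pc : created) {
            if (!pc.disposed) {
                throw new IllegalStateException("live PeerConnection at factory dispose");
            }
        }
    }
}
```

If this is indeed the cause, calling `peerConnection2.dispose()` before disposing the factory should avoid the crash.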
I made a mini mobile game in Unity3D. It has many scenes (each level in the game is a scene). The levels start from scene number 1, but there is also a scene number 0, which contains an empty GameObject with a script called "Level0" attached to it.
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

public class Level0 : MonoBehaviour
{
    public int levelIndex;

    public IEnumerator delayCoroutine()
    {
        yield return new WaitForSeconds(1f);
        SceneManager.LoadSceneAsync(levelIndex);
    }

    void Awake()
    {
        levelIndex = SaveLoadManager.loadCurrentLevelIndex();
        StartCoroutine(delayCoroutine());
    }
}
The role of this script is to call a static function from "SaveLoadManager" which returns the next level index (the scene index in the build settings).
public static int loadCurrentLevelIndex()
{
    if (File.Exists(Application.persistentDataPath + "/data.txt"))
    {
        BinaryFormatter bf = new BinaryFormatter();
        FileStream stream = new FileStream(Application.persistentDataPath + "/data.txt", FileMode.Open);
        PlayerDataSerializable data = bf.Deserialize(stream) as PlayerDataSerializable;
        stream.Close();
        return data.realCurrentLevel;
    }
    else
    {
        return 1;
    }
}
So the method gets the level index from the data file, but if the file is not found it just returns 1.
Now the problem is that when I build the game to my OPPO A73 (Android 10), go into the app settings, choose "clear data", and reopen the game, it crashes at level 0. However, when I do the same thing on a Nokia 3 (Android 9) or a Xiaomi Redmi Note 10 (Android 11), the game works fine and loads the first level as expected.
I monitored the errors via Logcat in Android Studio, but I didn't get any error from Unity.
Is the problem with my OPPO device? Because I got this error in Logcat:
E/DispScene: [/data/oppo/multimedia/oppo_display_perf_list.xml]:open config fail
Or is it with Android 10?
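One thing worth noting about `loadCurrentLevelIndex` above: `File.Exists` only guards against a missing file. If the file exists but is empty or corrupt, `Deserialize` throws and the `as` cast can yield null. Whether or not that is what happens on the OPPO, the defensive shape is to treat any read failure as "no save". Sketched here in runnable Java for illustration (the game code is C#, and the file format below is a plain int rather than BinaryFormatter):

```java
// Defensive load-with-fallback: any failure reading the save file
// (missing, empty, corrupt) yields the default level index 1,
// mirroring the fallback in loadCurrentLevelIndex().
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

class SaveFile {
    static int loadCurrentLevelIndex(String path) {
        try (DataInputStream in = new DataInputStream(new FileInputStream(path))) {
            return in.readInt();   // throws on a truncated or empty file
        } catch (IOException e) {
            return 1;              // treat unreadable data as "no save yet"
        }
    }
}
```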
I'm working on an auto call recorder app. I'm able to record voice calls below Android 6 using MediaRecorder.AudioSource.VOICE_CALL.
From Android 6 onwards I am not able to record voice calls using VOICE_CALL. I managed to record using MediaRecorder.AudioSource.MIC, but then the incoming voice is not recorded, and I want to record the call in normal mode, not with the speaker on. Please help me with this. (I tried on a Xiaomi Redmi 4A (Android 6); it is not working.)
myRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
myRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
myRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
myRecorder.setMaxDuration(60 * 60 * 1000);

AudioManager audiomanager = (AudioManager) getSystemService(AUDIO_SERVICE);
audiomanager.setMode(2); // 2 == AudioManager.MODE_IN_CALL
Edit: There is no issue with permissions.
Update: Does anyone know how to force another stream to the MIC audio source? This requires native Android code. Please help me with this.
Refer to this question for more details on routing audio.
You need to use the NDK. Here are examples of the functions that need to be implemented.
Load libmedia.so and libutils.so:
int load(JNIEnv *env, jobject thiz) {
    void *handleLibMedia;
    void *handleLibUtils;
    int result = -1;
    lspr func = NULL;
    pthread_t newthread = (pthread_t) thiz;

    handleLibMedia = dlopen("libmedia.so", RTLD_NOW | RTLD_GLOBAL);
    if (handleLibMedia != NULL) {
        func = dlsym(handleLibMedia, "_ZN7android11AudioSystem13setParametersEiRKNS_7String8E");
        if (func != NULL) {
            result = 0;
        }
        audioSetParameters = (lasp) func;
    } else {
        result = -1;
    }

    handleLibUtils = dlopen("libutils.so", RTLD_NOW | RTLD_GLOBAL);
    if (handleLibUtils != NULL) {
        fstr = dlsym(handleLibUtils, "_ZN7android7String8C2EPKc");
        if (fstr == NULL) {
            result = -1;
        }
    } else {
        result = -1;
    }

    cmd = CM_D;
    int resultTh = pthread_create(&newthread, NULL, taskAudioSetParam, NULL);
    return result;
}
Function setParameters
int setParam(jint i, jint as) {
    pthread_mutex_lock(&mt);
    audioSession = (int) (as + 1);
    kvp = "input_source=4";
    kvps = toString8(kvp);
    cmd = (int) i;
    pthread_cond_signal(&cnd);
    pthread_mutex_unlock(&mt);
    return 0;
}
Task AudioSetParameters
void *taskAudioSetParam(void *threadid) {
    while (1) {
        pthread_mutex_lock(&mt);
        if (cmd == CM_D) {
            pthread_cond_wait(&cnd, &mt);
        } else if (audioSetParameters != NULL) {
            audioSetParameters(audioSession, kvps);
        }
        pthread_mutex_unlock(&mt);
    }
}
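The mutex/condition-variable handshake above is a standard one-slot command queue: the worker blocks until `setParam` publishes a command and signals it. For anyone porting this JNI glue, the same pattern sketched as runnable Java (all names here are illustrative, not part of the library):

```java
// One-slot command queue: a worker thread blocks until a command is
// published, then applies it. Mirrors the pthread mutex/condvar pattern
// in taskAudioSetParam above.
class CommandSlot {
    private static final int CM_D = -1;    // "no command" sentinel
    private int cmd = CM_D;
    private String pendingParams;
    private String lastApplied;

    synchronized void setParam(int command, String params) {
        cmd = command;
        pendingParams = params;
        notifyAll();                       // pthread_cond_signal equivalent
    }

    synchronized void workerLoopOnce() throws InterruptedException {
        while (cmd == CM_D) {
            wait();                        // pthread_cond_wait equivalent
        }
        lastApplied = pendingParams;       // stands in for audioSetParameters()
        cmd = CM_D;
    }

    synchronized String lastApplied() { return lastApplied; }
}
```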
There is a library and an example of its use here: https://github.com/ViktorDegtyarev/CallRecLib
Xiaomi devices always have problems with permission requests, whether run-time or install-time.
I have a Xiaomi Redmi 3 Pro, and it always forces some permissions to Deny when I install apps, so I must manually Allow them.
If your problem is the same, I found a workaround that worked for me: How to get MIUI Security app auto start permission programmatically?
First, these three permissions are needed in the Manifest, along with a runtime permission request if the device is on Marshmallow or above:
<uses-permission android:name="android.permission.READ_PHONE_STATE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.CAPTURE_AUDIO_OUTPUT" />
MediaRecorder.AudioSource.VOICE_CALL is not supported on all phones, so you need to continue using MediaRecorder.AudioSource.MIC.
I use this and it works fine on most devices:
recorder = new MediaRecorder();
recorder.setAudioSource(audioSource);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(your_path);
You need to set this to record your calls properly:
audioManager.setMode(AudioManager.MODE_IN_CALL);
Raise the volume level when you start recording:
audioManager.setStreamVolume(AudioManager.STREAM_VOICE_CALL,audioManager.getStreamMaxVolume(AudioManager.STREAM_VOICE_CALL), 0);
When you stop recording, set the mode back to normal and also restore the stream volume to what it was:
audioManager.setMode(AudioManager.MODE_NORMAL);
This could be a permission-related issue.
With the introduction of Android 6.0 Marshmallow, apps are no longer granted permissions at installation time. Instead, the application has to ask the user for each permission at run-time.
I hope you have included the code which explicitly asks for permissions on devices with Marshmallow and above.
In the automatic call recorder (callU) there is an option "SoundFX"; if enabled, it records both sides of the call.
Link
Try
MediaRecorder.AudioSource.VOICE_COMMUNICATION
and see
https://androidforums.com/threads/android-phone-with-call-recording-function.181663/
I have been trying to implement the flashlight/torch feature of the camera using the Google Play Services Vision API (via NuGet in Visual Studio) for the past few days, without success. I have noticed that there is a GitHub implementation of this API which has such functionality, but it is only available to Java users.
I was wondering if there is anything similar for C# Xamarin users.
The Camera object is not exposed by this API, therefore I am not able to alter the camera parameters needed to activate the flashlight.
I would like to be sure that this functionality is unavailable so I don't waste more time on it. It might simply be that the Xamarin developers have not attended to it yet and will in the near future.
UPDATE
https://github.com/googlesamples/android-vision/blob/master/visionSamples/barcode-reader/app/src/main/java/com/google/android/gms/samples/vision/barcodereader/BarcodeCaptureActivity.java
In there you can see that on line 214 there is this method call:
mCameraSource = builder.setFlashMode(useFlash ? Camera.Parameters.FLASH_MODE_TORCH : null).build();
SetFlashMode is not a method of CameraSource in the NuGet package, but it is in the GitHub (open-source) version.
The Xamarin Vision Library didn't expose the method to set the flash mode.
Workaround:
Using reflection, you can get the Camera object from the CameraSource, add the flash parameter, and then set the updated parameters back on the camera.
This should be called after the SurfaceView has been created.
Code:
public Camera getCameraObject(CameraSource _camSource)
{
    Field[] cFields = _camSource.Class.GetDeclaredFields();
    Camera _cam = null;
    try {
        foreach (Field item in cFields) {
            if (item.Name.Equals("zzbNN")) {
                Console.WriteLine("Camera");
                item.Accessible = true;
                try {
                    _cam = (Camera)item.Get(_camSource);
                } catch (Exception e) {
                    Logger.LogException(this, e);
                }
            }
        }
    } catch (Exception e) {
        Logger.LogException(this, e);
    }
    return _cam;
}

public void setFlash(bool isEnable)
{
    try {
        isTorch = !isEnable;
        var _cam = getCameraObject(mCameraSource);
        if (_cam == null) return;

        var _pareMeters = _cam.GetParameters();
        var _listOfSuppo = _cam.GetParameters().SupportedFlashModes;
        _pareMeters.FlashMode = isTorch ? _listOfSuppo[0] : _listOfSuppo[3];
        _cam.SetParameters(_pareMeters);
    } catch (Exception e) {
        Logger.LogException(this, e);
    }
}
Basically, anything you can do with Android can be done with Xamarin.Android. All the underlying APIs are available.
Since you have existing Java code, you can create a binding project that enables you to call the code from your Xamarin.Android project. Here's a good article on how to get started: Binding a Java Library
On the other hand, I don't think you need a library to do what you want to. If you only want torch/flashlight functionality, you just need to adapt the Java code from this answer to work in Xamarin.Android with C#.
I am trying to use GStreamer on Android via Qt C++.
I have already used GStreamer on these platforms, but now I have an issue with the plugins:
G_BEGIN_DECLS
GST_PLUGIN_STATIC_DECLARE(coreelements);
GST_PLUGIN_STATIC_DECLARE(audioconvert);
GST_PLUGIN_STATIC_DECLARE(playback);
G_END_DECLS

void MainWindow::play() {
    GST_PLUGIN_STATIC_REGISTER(coreelements);
    GST_PLUGIN_STATIC_REGISTER(audioconvert);
    GST_PLUGIN_STATIC_REGISTER(playback);

    GstElement *pipeline;
    GError *error = NULL;
    pipeline = gst_parse_launch("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-368p.ogv", &error);
    if (!pipeline) {
        ui->label->setText("error");
        return;
    }
    if (error != NULL) {
        qDebug("GST error: ");
        qDebug(error->message);
    } else {
        qDebug("GST without errors");
    }

    gst_element_set_state(pipeline, GST_STATE_READY);
    gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(pipeline), this->ui->playback_widget->winId());
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    ui->label->setText("Playing...");
}
After executing this code I get neither video in the playback_widget nor audio, but the error variable is clear (equal to NULL) and the label is set to "Playing...". So, maybe I missed something?
OpenSLES SLPlaybackRateItf SetRate works on Android 4.x, but not on Android 5.x.
It seems as if the SetRate interface has no implementation on Android 5.x.
Below are some code snippets, and the complete code can be found here: https://github.com/jimrange/cocos2d-x/blob/audioEngineUpdate/cocos/audio/android/AudioEngine-inl.cpp
SLPlaybackRateItf _fdPlayerPlaybackRate;
SLObjectItf _fdPlayerObject;
Other initialization code not shown for brevity.
// get the playback rate interface
result = (*_fdPlayerObject)->GetInterface(_fdPlayerObject, SL_IID_PLAYBACKRATE, &_fdPlayerPlaybackRate);
if (SL_RESULT_SUCCESS != result) { ERRORLOG("get the rate interface failed"); break; }

SLpermille stepSize;
SLuint32 capabilities;
auto rangeResult = (*_fdPlayerPlaybackRate)->GetRateRange(_fdPlayerPlaybackRate, 0, &_minRate, &_maxRate, &stepSize, &capabilities);
if (SL_RESULT_SUCCESS != rangeResult)
{
    log("%s error:%u", __func__, rangeResult);
}
At this point
_minRate == 500
_maxRate == 2000
stepSize == 0
capabilities == SL_RATEPROP_NOPITCHCORAUDIO
Later when calling the following, the result is SL_RESULT_SUCCESS, but the playback rate does not change.
void AudioEngineImpl::setPitch(int audioID, float pitch)
{
    auto& player = _audioPlayers[audioID];
    SLpermille playbackRate = (SLpermille)1000 * pitch;
    if (playbackRate < player._minRate)
    {
        log("Warning: AudioEngine attempting to set rate:%d which is lower than minRate=%d.", playbackRate, player._minRate);
        playbackRate = player._minRate;
    }
    else if (playbackRate > player._maxRate)
    {
        log("Warning: AudioEngine attempting to set rate:%d which is higher than maxRate=%d.", playbackRate, player._maxRate);
        playbackRate = player._maxRate;
    }

    // This works on Android 4.4.4 on Nexus 7, but not on Android 5.x on Nexus 7 or Nexus 9.
    auto result = (*player._fdPlayerPlaybackRate)->SetRate(player._fdPlayerPlaybackRate, playbackRate);
    if (SL_RESULT_SUCCESS != result)
    {
        log("%s error:%u", __func__, result);
    }
}
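The permille conversion and clamping in setPitch is plain arithmetic: a pitch multiplier of 1.0 corresponds to 1000 permille, bounded by the range reported by GetRateRange. A standalone sketch of just that logic (in Java for a runnable illustration; the engine itself is C++), using the 500-2000 range reported above:

```java
// Convert a pitch multiplier to an OpenSL ES permille rate and clamp it
// to the range reported by GetRateRange (500-2000 in the question).
class PlaybackRate {
    static int toClampedPermille(float pitch, int minRate, int maxRate) {
        int permille = (int) (1000 * pitch);   // 1.0f -> 1000 permille
        if (permille < minRate) return minRate;
        if (permille > maxRate) return maxRate;
        return permille;
    }
}
```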
Here is what I am experiencing:
Requesting the SL_IID_PLAYBACKRATE interface returns SL_RESULT_SUCCESS.
I can query the interface for capabilities and I get a range of 500 to 2000, a step size of 0 and a capabilities flag of SL_RATEPROP_NOPITCHCORAUDIO. This is the same on Android 4.x and 5.x.
So the interface is telling me all is setup good and ready to go.
Calling SetRate returns SL_RESULT_SUCCESS on Android 4.x and Android 5.x.
But on 5.x the SetRate does not have any effect on the audio.
There is another SO post that suggests using SL_RATEPROP_PITCHCORAUDIO, but I want to use SL_RATEPROP_NOPITCHCORAUDIO so that there is no pitch correction when changing the rate of the audio. And since SL_RATEPROP_NOPITCHCORAUDIO is the default setting, there is no need to change it.
From the OpenSL_ES_Specification_1.0.1.pdf:
SL_RATEPROP_NOPITCHCORAUDIO - Plays audio at the current rate, but without pitch correction.
SL_RATEPROP_PITCHCORAUDIO - Plays audio at the current rate, but with pitch correction.
I am working on code to expand the Cocos2d-x AudioEngine.
The Cocos2d-x version is 3.7rc0 and the NDK version is ndk-r10d.