Automatically show the currently talking user in OpenTok - android

I'm currently working on an Android app that enables users to group-chat with each other via the OpenTok API. I want to add a feature that automatically detects which user is talking right now, shows their video to the others, and minimizes the other users' videos until someone else talks.
I cannot find such a feature in OpenTok, so I was wondering if there's a workaround.
private void joinVideoCall(String sessionId, String sessionToken) {
    session = new Session.Builder(activity, OPENTOK_API_KEY, sessionId).build();
    session.setSessionListener(this);
    session.connect(sessionToken);
}

@Override
public void onConnected(Session session) {
    publisher = new Publisher.Builder(activity).build();
    publisher.setPublisherListener(this);
    publisherView.addView(publisher.getView());
    session.publish(publisher);
}

@Override
public void onStreamReceived(Session session, Stream stream) {
    subscriber = new Subscriber.Builder(activity, stream).build();
    session.subscribe(subscriber);
    subscriberView.addView(subscriber.getView());
}
...

In order to do that, you'll need to use a custom audio driver that detects the audio levels.
Take a look at this sample: https://github.com/opentok/opentok-android-sdk-samples/tree/master/Custom-Audio-Driver
And also take a look at the API documentation: https://tokbox.com/developer/sdks/android/reference/com/opentok/android/BaseAudioDevice.html
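If you don't need full control of the audio pipeline, a lighter alternative is the audio-level callback the SDK already exposes on SubscriberKit. A minimal sketch, assuming you keep a subscribers list, a currentSpeaker field, and a switchToMainView() helper of your own (the 0.2 threshold is an arbitrary starting point; raw levels are noisy, so consider smoothing them over a short window):

for (final Subscriber sub : subscribers) {
    sub.setAudioLevelListener(new SubscriberKit.AudioLevelListener() {
        @Override
        public void onAudioLevelUpdated(SubscriberKit subscriber, float audioLevel) {
            // audioLevel ranges from 0.0 (silence) to 1.0 (loudest)
            if (audioLevel > 0.2f && subscriber != currentSpeaker) {
                currentSpeaker = (Subscriber) subscriber;
                switchToMainView(currentSpeaker); // enlarge this view, shrink the rest
            }
        }
    });
}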

Related

How to know if a given device is in the list of supported devices for ARCore?

I am using the Google CoreAR package in my React-Native app for AR support. Some devices support AR and some do not, and I get an error when I run the application on unsupported devices. I want to render a message instead of showing the error on the screen. The Google CoreAR package provides a solution for this, but it is not working for me.
void maybeEnableArButton() {
    ArCoreApk.Availability availability = ArCoreApk.getInstance().checkAvailability(this);
    if (availability.isTransient()) {
        // Continue to query availability at 5Hz while compatibility is checked in the background.
        new Handler().postDelayed(new Runnable() {
            @Override
            public void run() {
                maybeEnableArButton();
            }
        }, 200);
    }
    if (availability.isSupported()) {
        mArButton.setVisibility(View.VISIBLE);
        mArButton.setEnabled(true);
    } else { // The device is unsupported or unknown.
        mArButton.setVisibility(View.INVISIBLE);
        mArButton.setEnabled(false);
    }
}
The problem with the above code snippet is that availability.isSupported() always returns true, so the else branch never runs. Can you guys please help me with this?
Thank you.
I found a solution for this problem. ArCoreApk.Availability is an enum whose values you can find in the documentation. checkAvailability() returns either SUPPORTED_INSTALLED or SUPPORTED_NOT_INSTALLED depending on device support, so we can branch on that return value instead of isSupported().
I did it like this:
@ReactMethod
public String getSupport() {
    ArCoreApk.Availability availability =
            ArCoreApk.getInstance().checkAvailability(this.getReactApplicationContext());
    return availability.name();
}

Pause background service in Xamarin.Forms

I have an application that occasionally speaks via the system's text-to-speech (TTS) engine, but if a background service (like an audiobook or music stream) is running at the same time, they overlap.
I would like to pause the media, play my TTS, then unpause the media. I've looked, but can't find any solutions.
I believe if I were to play actual audio from my app, it would pause the media until my playback was complete (if I understand what I've found correctly). But TTS doesn't seem to have the same effect. The speech is totally dynamic, so I can't just record all the options.
Using the latest Xamarin.Forms, I've looked into all the media NuGet packages I could find, and they all seem pretty centered on controlling media from files.
My only potential thought (I don't like it) is to maybe play an empty audio file while the TTS is running, but I would like a more elegant solution if one exists.
(I don't care about iOS at the moment, so an Android-only solution is fine. And if it's native (Java/Kotlin), I can convert/incorporate it.)
I agree with what rbonestell said: you can use DependencyService and AudioFocus to achieve it. First, create an interface in the PCL:
public interface IControl
{
    void StopBackgroundMusic();
}
When you play your audio, execute the DependencyService with the following code:
private void Button_Clicked(object sender, EventArgs e)
{
    DependencyService.Get<IControl>().StopBackgroundMusic();
    // record the audio
}
In the Android project, you can create a StopMusicService to achieve that.
[assembly: Dependency(typeof(StopMusicService))]
namespace TTSDemo.Droid
{
    public class StopMusicService : IControl
    {
        AudioManager audioMan;
        AudioManager.IOnAudioFocusChangeListener listener;

        public void StopBackgroundMusic()
        {
            audioMan = (AudioManager)Android.App.Application.Context.GetSystemService(Context.AudioService);
            listener = new MyAudioListener(this);
            var ret = audioMan.RequestAudioFocus(listener, Stream.Music, AudioFocus.Gain);
        }
    }

    internal class MyAudioListener : Java.Lang.Object, AudioManager.IOnAudioFocusChangeListener
    {
        private StopMusicService stopMusicService;

        public MyAudioListener(StopMusicService stopMusicService)
        {
            this.stopMusicService = stopMusicService;
        }

        public void OnAudioFocusChange([GeneratedEnum] AudioFocus focusChange)
        {
            // No-op: we only need to hold focus while speaking.
        }
    }
}
Thanks to Leon Lu - MSFT, I was able to go in the right direction. I took his implementation (which has some deprecated calls to the Android API), and updated it for what I needed.
I'll be doing a little more work making sure it's stable and functional. I'll also see if I can clean it up a little too. But here's what works on my first test:
[assembly: Dependency(typeof(MediaService))]
namespace ...Droid.Services
{
    public class MediaService : IMediaService
    {
        public async Task PauseBackgroundMusicForTask(Func<Task> onFocusGranted)
        {
            var manager = (AudioManager)Android.App.Application.Context.GetSystemService(Context.AudioService);
            var builder = new AudioFocusRequestClass.Builder(AudioFocus.GainTransientMayDuck);
            var focusRequest = builder.Build();
            var ret = manager.RequestAudioFocus(focusRequest);
            if (ret == AudioFocusRequest.Granted)
            {
                if (onFocusGranted != null)
                {
                    await onFocusGranted();
                }
                manager.AbandonAudioFocusRequest(focusRequest);
            }
        }
    }
}
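Since the question allows a native Java solution, here is a rough Java equivalent of the same pattern (API 26+): request transient may-duck focus, speak, and abandon focus when the utterance completes. The context and tts references and the utterance ID are assumptions; tts is an initialized android.speech.tts.TextToSpeech.

final AudioManager audioManager =
        (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
final AudioFocusRequest focusRequest =
        new AudioFocusRequest.Builder(AudioManager.AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK)
                .build();

tts.setOnUtteranceProgressListener(new UtteranceProgressListener() {
    @Override
    public void onStart(String utteranceId) {
    }

    @Override
    public void onDone(String utteranceId) {
        // Release focus so the music/audiobook can resume.
        audioManager.abandonAudioFocusRequest(focusRequest);
    }

    @Override
    public void onError(String utteranceId) {
        audioManager.abandonAudioFocusRequest(focusRequest);
    }
});

if (audioManager.requestAudioFocus(focusRequest) == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
    tts.speak("Your dynamic text here", TextToSpeech.QUEUE_FLUSH, null, "tts-utterance-1");
}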

How to cast media to default media receiver of the Chromecast device?

I'm trying to cast media to the default media receiver of the Chromecast from an Android app, but it doesn't cast at all. The following is the code snippet used to find the routes:
MediaRouteSelector selector = new MediaRouteSelector.Builder()
        .addControlCategory(CastMediaControlIntent.categoryForRemotePlayback(
                CastMediaControlIntent.DEFAULT_MEDIA_RECEIVER_APPLICATION_ID))
        .build();
Then it shows the Chromecast devices on the Wi-Fi network, and the following code runs when a device is selected:
MediaRouter.RouteInfo route = adapter.getItem(position).routeInfo;
// select the route for usage
route.select();
// send the play control request with the video uri
route.sendControlRequest(
        new Intent(MediaControlIntent.ACTION_PLAY)
                .setDataAndType(videoUri, "video/mp4")
                .addCategory(MediaControlIntent.CATEGORY_REMOTE_PLAYBACK),
        new MediaRouter.ControlRequestCallback() {
            @Override
            public void onError(String error, Bundle data) {
                super.onError(error, data);
            }

            @Override
            public void onResult(Bundle data) {
                super.onResult(data);
            }
        }
);
It can't cast the media to the device. Any suggestions?
It seems like you are not using the Cast SDK but the Media Route Provider, and I don't see any session being set up. You might want to look at the democastplayer sample code that is distributed along with the Android SDK (under the SDK folder, go to extras/google/google_play_service/samples/cast/democastplayer). In that sample, look at the MrpCastPlayerActivity class.
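For comparison, with the current Cast framework (play-services-cast-framework) loading media on the default receiver goes through a CastSession and RemoteMediaClient rather than raw control requests. A sketch, assuming CastContext has been initialized via an OptionsProvider with the default receiver app ID and the user has already connected a session; the URL is a placeholder:

CastSession castSession = CastContext.getSharedInstance(context)
        .getSessionManager()
        .getCurrentCastSession();
if (castSession != null && castSession.isConnected()) {
    MediaInfo mediaInfo = new MediaInfo.Builder("https://example.com/video.mp4")
            .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
            .setContentType("video/mp4")
            .build();
    RemoteMediaClient remoteMediaClient = castSession.getRemoteMediaClient();
    remoteMediaClient.load(new MediaLoadRequestData.Builder()
            .setMediaInfo(mediaInfo)
            .setAutoplay(true)
            .build());
}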

Built.io upload progress - android

I want to show the progress of my file upload. I use built.io to store my files, but I really don't know how to show progress, because I think built.io does not support this.
This is how I send the file:
final BuiltFile builtFileObject = new BuiltFile();
builtFileObject.setFile(tempFile.getPath());
builtFileObject.save(new BuiltResultCallBack() {
    @Override
    public void onSuccess() {
    }

    @Override
    public void onError(BuiltError builtError) {
        Toast.makeText(getContext(), "file save error", Toast.LENGTH_SHORT).show();
    }

    @Override
    public void onAlways() {
    }
});
This is not possible using the current SDK provided by built.io. There is a plan to support this in the future.
A workaround is to call built.io using the REST APIs (check out the REST section on https://docs.built.io/guide#uploads). The regular HTTP upload call can then be tracked to get the progress details.
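As a sketch of how that tracking can work on Android (using OkHttp here; the built.io endpoint, headers, and form fields are not shown and would come from the docs above), wrap the upload body so each chunk written to the network reports back:

import java.io.IOException;
import okhttp3.MediaType;
import okhttp3.RequestBody;
import okio.Buffer;
import okio.BufferedSink;
import okio.ForwardingSink;
import okio.Okio;

// Counts bytes as OkHttp writes the request body upstream.
public class ProgressRequestBody extends RequestBody {
    public interface ProgressListener {
        void onProgress(long bytesWritten, long contentLength);
    }

    private final RequestBody delegate;
    private final ProgressListener listener;

    public ProgressRequestBody(RequestBody delegate, ProgressListener listener) {
        this.delegate = delegate;
        this.listener = listener;
    }

    @Override
    public MediaType contentType() {
        return delegate.contentType();
    }

    @Override
    public long contentLength() throws IOException {
        return delegate.contentLength();
    }

    @Override
    public void writeTo(BufferedSink sink) throws IOException {
        final long total = contentLength();
        ForwardingSink counting = new ForwardingSink(sink) {
            long written = 0;

            @Override
            public void write(Buffer source, long byteCount) throws IOException {
                super.write(source, byteCount);
                written += byteCount;
                listener.onProgress(written, total);
            }
        };
        BufferedSink buffered = Okio.buffer(counting);
        delegate.writeTo(buffered);
        buffered.flush();
    }
}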
Please mail us at support@built.io if you have any more queries about this! Glad to help.

Search DLNA(Upnp) media renderers for remoteplayback

I am trying to write a DLNA application using the Cling Java library. I can search all the media servers on the DLNA network and play their content, but I also need to find the media renderers available on the network and play content on them, just like UPnPlay does.
Thanks in advance.
public class MyUpnpService extends AndroidUpnpServiceImpl {
    @Override
    protected AndroidUpnpServiceConfiguration createConfiguration(WifiManager wifiManager) {
        return new AndroidUpnpServiceConfiguration(wifiManager) {
            @Override
            public ServiceType[] getExclusiveServiceTypes() {
                return new ServiceType[] {
                        new UDAServiceType("AVTransport")
                };
            }
        };
    }
}
Searching for devices with the "AVTransport" service capability solved the issue of finding media renderers for remote playback. For remote playback itself I found enough documentation from this.
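To round this out, once a device exposing AVTransport has been discovered, pushing a URI and starting playback with Cling's support library looks roughly like this. A sketch: upnpService is the bound AndroidUpnpService, and the device and URL are placeholders.

final Service avTransport = device.findService(new UDAServiceType("AVTransport"));
if (avTransport != null) {
    upnpService.getControlPoint().execute(
            new SetAVTransportURI(avTransport, "http://192.168.1.10:8080/video.mp4") {
                @Override
                public void success(ActionInvocation invocation) {
                    // The renderer accepted the URI; now tell it to play.
                    upnpService.getControlPoint().execute(new Play(avTransport) {
                        @Override
                        public void failure(ActionInvocation invocation,
                                            UpnpResponse operation, String defaultMsg) {
                            Log.e("Cling", "Play failed: " + defaultMsg);
                        }
                    });
                }

                @Override
                public void failure(ActionInvocation invocation,
                                    UpnpResponse operation, String defaultMsg) {
                    Log.e("Cling", "SetAVTransportURI failed: " + defaultMsg);
                }
            });
}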
