I am trying to write a DLNA application using the Cling Java library. I am able to discover all the media servers on the DLNA network and play their content. But I need to discover the Media Renderers available on the network and play content on them, just like UPnPlay does.
Thanks in advance.
public class MyUpnpService extends AndroidUpnpServiceImpl {
    @Override
    protected AndroidUpnpServiceConfiguration createConfiguration(WifiManager wifiManager) {
        return new AndroidUpnpServiceConfiguration(wifiManager) {
            @Override
            public ServiceType[] getExclusiveServiceTypes() {
                return new ServiceType[] {
                    new UDAServiceType("AVTransport")
                };
            }
        };
    }
}
Searching for devices with the "AVTransport" service capability solved the issue of finding Media Renderers. For the remote playback itself I found enough documentation from this
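As a sketch of what that remote playback can look like with Cling's AVTransport support callbacks (this assumes device is a discovered renderer, upnpService is the bound AndroidUpnpService, and the media URL is a placeholder):

// Sketch: push a URI to the renderer's AVTransport service, then start playback,
// using org.fourthline.cling.support.avtransport.callback.{SetAVTransportURI, Play}.
final Service avTransport = device.findService(new UDAServiceType("AVTransport"));
if (avTransport != null) {
    upnpService.getControlPoint().execute(
        new SetAVTransportURI(avTransport, "http://192.168.1.10:8080/video.mp4") {
            @Override
            public void success(ActionInvocation invocation) {
                // The renderer accepted the URI; now send Play
                upnpService.getControlPoint().execute(new Play(avTransport) {
                    @Override
                    public void failure(ActionInvocation invocation,
                                        UpnpResponse response, String defaultMsg) {
                        Log.e("DLNA", "Play failed: " + defaultMsg);
                    }
                });
            }

            @Override
            public void failure(ActionInvocation invocation,
                                UpnpResponse response, String defaultMsg) {
                Log.e("DLNA", "SetAVTransportURI failed: " + defaultMsg);
            }
        });
}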
I am using Google's ARCore package in my React-Native app for AR support. Some devices support AR and some do not, and I get an error when I run the application on unsupported devices. I want to render a message instead of showing the error on the screen. The ARCore package provides a solution for this, but it is not working for me.
void maybeEnableArButton() {
    ArCoreApk.Availability availability = ArCoreApk.getInstance().checkAvailability(this);
    if (availability.isTransient()) {
        // Continue to query availability at 5Hz while compatibility is checked in the background.
        new Handler().postDelayed(new Runnable() {
            @Override
            public void run() {
                maybeEnableArButton();
            }
        }, 200);
    }
    if (availability.isSupported()) {
        mArButton.setVisibility(View.VISIBLE);
        mArButton.setEnabled(true);
    } else { // The device is unsupported or unknown.
        mArButton.setVisibility(View.INVISIBLE);
        mArButton.setEnabled(false);
    }
}
The problem with the above code snippet is that availability.isSupported() always returns true, so the else branch never runs. Can you please help me with this?
Thank you.
I found a solution for this problem. ArCoreApk.Availability is an enum with several values, which are listed in the documentation. checkAvailability() returns either SUPPORTED_INSTALLED or SUPPORTED_NOT_INSTALLED depending on device support, so we can branch on that return value.
I did it like this.
@ReactMethod
public void getSupport(Promise promise) {
    ArCoreApk.Availability availability =
            ArCoreApk.getInstance().checkAvailability(this.getReactApplicationContext());
    // Resolve with the enum name, e.g. "SUPPORTED_INSTALLED" or "UNSUPPORTED_DEVICE_NOT_CAPABLE".
    promise.resolve(availability.name());
}
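On the JavaScript side the result can then gate the AR screen; here is a sketch where the module name ArSupportModule is a placeholder for however the native module is registered:

// Hypothetical usage: decide whether to render the AR view or a message.
import { NativeModules } from 'react-native';

async function checkArSupport() {
    const availability = await NativeModules.ArSupportModule.getSupport();
    return availability === 'SUPPORTED_INSTALLED' ||
           availability === 'SUPPORTED_NOT_INSTALLED';
}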
I have an application that occasionally speaks via the system's text-to-speech (TTS) engine, but if a background audio service (like an audiobook or music stream) is running at the same time, the two overlap.
I would like to pause the media, play my TTS, then unpause the media. I've looked, but can't find any solutions.
I believe that if I were to play actual audio from my app, it would pause the other media until my playback was complete (if I understand what I've found correctly), but TTS doesn't seem to have the same effect. The speech is totally dynamic, so I can't just record all the options.
Using the latest Xamarin.Forms, I've looked into all the media NuGet packages I could find, and they all seem centered on controlling media from files.
My only potential thought (which I don't like) is to play an empty audio file while the TTS is running, but I would like a more elegant solution if one exists.
(I don't care about iOS at the moment, so an Android-only solution is fine. And if it's native Java/Kotlin, I can convert/incorporate it.)
I agree with what rbonestell said: you can use DependencyService and AudioFocus to achieve it. First, create an interface in the PCL.
public interface IControl
{
    void StopBackgroundMusic();
}
When you record the audio, you can execute the DependencyService with the following code.
private void Button_Clicked(object sender, EventArgs e)
{
    DependencyService.Get<IControl>().StopBackgroundMusic();
    // record the audio
}
In the Android project, you can create a StopMusicService to achieve that.
[assembly: Dependency(typeof(StopMusicService))]
namespace TTSDemo.Droid
{
    public class StopMusicService : IControl
    {
        AudioManager audioMan;
        AudioManager.IOnAudioFocusChangeListener listener;

        public void StopBackgroundMusic()
        {
            audioMan = (AudioManager)Android.App.Application.Context.GetSystemService(Context.AudioService);
            listener = new MyAudioListener(this);
            var ret = audioMan.RequestAudioFocus(listener, Stream.Music, AudioFocus.Gain);
        }
    }

    internal class MyAudioListener : Java.Lang.Object, AudioManager.IOnAudioFocusChangeListener
    {
        private StopMusicService stopMusicService;

        public MyAudioListener(StopMusicService stopMusicService)
        {
            this.stopMusicService = stopMusicService;
        }

        public void OnAudioFocusChange([GeneratedEnum] AudioFocus focusChange)
        {
            // React to focus changes here if needed (e.g. when focus is lost or regained).
        }
    }
}
Thanks to Leon Lu - MSFT, I was able to go in the right direction. I took his implementation (which has some deprecated calls to the Android API), and updated it for what I needed.
I'll be doing a little more work making sure it's stable and functional. I'll also see if I can clean it up a little too. But here's what works on my first test:
[assembly: Dependency(typeof(MediaService))]
namespace ...Droid.Services
{
    public class MediaService : IMediaService
    {
        public async Task PauseBackgroundMusicForTask(Func<Task> onFocusGranted)
        {
            var manager = (AudioManager)Android.App.Application.Context.GetSystemService(Context.AudioService);
            var builder = new AudioFocusRequestClass.Builder(AudioFocus.GainTransientMayDuck);
            var focusRequest = builder.Build();
            var ret = manager.RequestAudioFocus(focusRequest);
            if (ret == AudioFocusRequest.Granted)
            {
                await onFocusGranted?.Invoke();
                manager.AbandonAudioFocusRequest(focusRequest);
            }
        }
    }
}
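Hypothetical usage from the shared code, with Xamarin.Essentials' TextToSpeech standing in for whatever TTS call the app actually makes, and IMediaService being the interface the class above implements:

// Ask Android for transient audio focus (ducking other audio), speak, then release focus.
await DependencyService.Get<IMediaService>().PauseBackgroundMusicForTask(
    () => TextToSpeech.SpeakAsync("Hello from TTS"));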
I'm currently working on an Android app that enables users to group-chat with each other via the OpenTok API. I want to add a feature that automatically detects which user is talking right now, shows their video to the others, and minimizes the other users' videos until someone else talks.
I cannot find such a feature in OpenTok, so I was wondering if there's a workaround.
private void joinVideoCall(String sessionId, String sessionToken) {
    session = new Session.Builder(activity, OPENTOK_API_KEY, sessionId).build();
    session.setSessionListener(this);
    session.connect(sessionToken);
}

@Override
public void onConnected(Session session) {
    publisher = new Publisher.Builder(activity).build();
    publisher.setPublisherListener(this);
    publisherView.addView(publisher.getView());
    session.publish(publisher);
}

@Override
public void onStreamReceived(Session session, Stream stream) {
    subscriber = new Subscriber.Builder(activity, stream).build();
    session.subscribe(subscriber);
    subscriberView.addView(subscriber.getView());
}
...
In order to do that, you'll need to use a custom audio driver that detects the audio levels.
Take a look at this sample: https://github.com/opentok/opentok-android-sdk-samples/tree/master/Custom-Audio-Driver
And also take a look at the API documentation: https://tokbox.com/developer/sdks/android/reference/com/opentok/android/BaseAudioDevice.html
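If the custom audio driver feels heavyweight, a possible shortcut (my assumption, not what the sample above does) is the per-stream audio-level callback that SubscriberKit exposes; here is a sketch of naive active-speaker detection built on it:

// Sketch: promote whichever subscriber's audio level crosses a threshold.
// Levels are normalized 0.0-1.0; the threshold and helper are placeholders.
subscriber.setAudioLevelListener(new SubscriberKit.AudioLevelListener() {
    @Override
    public void onAudioLevelUpdated(SubscriberKit subscriber, float audioLevel) {
        if (audioLevel > 0.2f) { // threshold is a guess; tune empirically
            promoteToMainView(subscriber.getStream()); // hypothetical helper that swaps views
        }
    }
});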
I am trying to cast media to the Chromecast's default media receiver from an Android app, but it doesn't cast at all. The following code snippet is used to find the routes:
MediaRouteSelector selector = new MediaRouteSelector.Builder()
.addControlCategory(CastMediaControlIntent
.categoryForRemotePlayback(CastMediaControlIntent.DEFAULT_MEDIA_RECEIVER_APPLICATION_ID))
.build();
This shows the Chromecast devices on the Wi-Fi network, and the following code runs when a device is selected:
MediaRouter.RouteInfo route = adapter.getItem(position).routeInfo;
// select the route for usage
route.select();

// send the play control request with the video uri
route.sendControlRequest(
        new Intent(MediaControlIntent.ACTION_PLAY)
                .setDataAndType(videoUri, "video/mp4")
                .addCategory(MediaControlIntent.CATEGORY_REMOTE_PLAYBACK),
        new MediaRouter.ControlRequestCallback() {
            @Override
            public void onError(String error, Bundle data) {
                super.onError(error, data);
            }

            @Override
            public void onResult(Bundle data) {
                super.onResult(data);
            }
        }
);
It can't cast the media to the device. Any suggestions?
It seems like you are not using the Cast SDK but the Media Route Provider APIs. I don't see any session being set up; you might want to look at the democastplayer sample code distributed with the Android SDK (under the SDK folder, go to extras/google/google_play_service/samples/cast/democastplayer). In that sample, look at the MrpCastPlayerActivity class.
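For reference, a minimal sketch of what a session-based load looks like with the newer Cast framework (play-services-cast-framework); this assumes a CastSession has already been established via the Cast button, and context / videoUri come from the surrounding code:

// Sketch using the Cast v3 framework instead of raw MediaRouter control requests.
CastSession castSession = CastContext.getSharedInstance(context)
        .getSessionManager()
        .getCurrentCastSession();
if (castSession != null && castSession.isConnected()) {
    MediaInfo mediaInfo = new MediaInfo.Builder(videoUri.toString())
            .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
            .setContentType("video/mp4")
            .build();
    RemoteMediaClient remoteMediaClient = castSession.getRemoteMediaClient();
    remoteMediaClient.load(new MediaLoadRequestData.Builder()
            .setMediaInfo(mediaInfo)
            .setAutoplay(true)
            .build());
}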
I want to make a radio web app that is Android-browser-based. The key function of my web app is continuously playing a music list (mp3 files).
The JavaScript code is simple, and it works well in PC browsers, iOS Safari, and the Android Dolphin browser (a third-party browser):
var audioPlayer = new Audio();
audioPlayer.addEventListener('ended', nextSong, false);

function nextSong() {
    audioPlayer.src = nextMusic_src;
    audioPlayer.play();
}
But in the Android default browser, when Android is in background mode (home-screen mode or LCD-off sleep mode), the browser's onPause() prevents the 'ended' event and the execution of nextSong(). So my web app can play only one song and then stops.
The Android Browser's onPause() source code is like this:
BrowserActivity.java
public class BrowserActivity extends Activity
{
    private Controller mController;
    // ...

    @Override
    protected void onPause() {
        if (mController != null) {
            mController.onPause(); // <<==== here
        }
        super.onPause();
    }
}
Controller.java
public class Controller implements WebViewController, UiController {
    protected void onPause() {
        // ...
        mActivityPaused = true;
        // ...
        mUi.onPause();
        mNetworkHandler.onPause(); // <<==== here
        WebView.disablePlatformNotifications(); // <<==== here
    }
}
NetworkStateHandler.java
public class NetworkStateHandler {
    // ...
    void onPause() {
        // unregister network state listener
        mActivity.unregisterReceiver(mNetworkStateIntentReceiver); // <<==== here
    }
}
Is there any browser policy that prevents events in background mode?
If not, how can I report this to the Google developers to request background music playback for web apps?
I'm considering only a browser-based web app, not a WebView-based (hybrid) one.
Thank you.
Nohyun Kwak
Android's 'ended' event doesn't fire consistently. I hear it's fixed in ICS, but Gingerbread and prior have the issue.
Instead of listening for 'ended', can you listen for 'timeupdate' and, in that handler, check whether the playback time has reached the end of the song? That may solve your issue.
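A minimal sketch of that workaround, using the audioPlayer and nextSong() from the question (the half-second margin is a guess to tune):

var advanced = false; // reset this to false in nextSong() after changing src

audioPlayer.addEventListener('timeupdate', function () {
    // Treat "within half a second of the end" as ended, since the real
    // 'ended' event may never arrive while the browser is paused.
    if (!advanced && audioPlayer.duration > 0 &&
            audioPlayer.currentTime >= audioPlayer.duration - 0.5) {
        advanced = true;
        nextSong();
    }
}, false);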