My Android app uses the Remote Display API to cast video to the user's Cast-enabled device. Unfortunately, we have to use a proprietary video player, which is why I can't use the normal video APIs. This is sadly out of my control.
The app waits for the user to select a display and launch a video. I would say it works well approximately 50% of the time. Often, audio will cease to play while the video continues. Sometimes (more rarely) the opposite happens: the Cast screen turns black while the audio continues. And there is often audio and video skipping during playback, which isn't experienced when viewing on the device itself.
I gather that playing video over Remote Display isn't ideal, but I'd think audio and video should continue streaming throughout, especially since Remote Display was created with graphics-intensive games in mind.
Also, the fact that audio stops when the activity is pushed into the background is a bit of a deal-breaker. Are there any plans to change this?
Here are some pieces of code that show how I'm creating the connection and starting the video. Maybe I'm doing something dumb that causes it to perform poorly?
This code is called when the user selects a device from the MediaRouteChooserDialog:
@Override
public void onRouteSelected(MediaRouter router, MediaRouter.RouteInfo info) {
    selectedDevice = CastDevice.getFromBundle(info.getExtras());
    Intent intent = new Intent(mainActivity, ExampleMainActivity.class);
    intent.setFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP | Intent.FLAG_ACTIVITY_SINGLE_TOP);
    intent.putExtra("cast", true);
    PendingIntent notificationPendingIntent = PendingIntent.getActivity(mainActivity, 0, intent, 0);
    CastRemoteDisplayLocalService.NotificationSettings settings =
            new CastRemoteDisplayLocalService.NotificationSettings.Builder()
                    .setNotificationPendingIntent(notificationPendingIntent).build();
    CastRemoteDisplayLocalService.startService(
            mainActivity,
            ExamplePresentationService.class,
            config.getCastId(castButton.getContext()),
            selectedDevice,
            settings,
            new CastRemoteDisplayLocalService.Callbacks() {
                @Override
                public void onRemoteDisplaySessionStarted(CastRemoteDisplayLocalService service) {
                    Log.d(TAG, "onServiceStarted");
                }

                @Override
                public void onRemoteDisplaySessionError(Status errorReason) {
                    Log.d(TAG, "onServiceError: " + errorReason.getStatusCode());
                }
            }
    );
}
My CastRemoteDisplayLocalService creates the CastPresentation in its createPresentation method:
@TargetApi(17)
private void createPresentation(Display display) {
    dismissPresentation();
    mPresentation = new PresentationPlayer(this, display, castHelper, adManager);
    try {
        mPresentation.show();
        //mMediaPlayer.start();
    } catch (WindowManager.InvalidDisplayException ex) {
        Log.e(TAG, "Unable to show presentation, display was removed.", ex);
        dismissPresentation();
    }
}
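For completeness, here is (roughly) how the Display reaches that method; a minimal sketch assuming the standard CastRemoteDisplayLocalService overrides, with dismissPresentation() being the usual dismiss-and-null helper:
public class ExamplePresentationService extends CastRemoteDisplayLocalService {

    private PresentationPlayer mPresentation;

    @Override
    public void onCreatePresentation(Display display) {
        createPresentation(display); // the method shown above
    }

    @Override
    public void onDismissPresentation() {
        dismissPresentation();
    }

    private void dismissPresentation() {
        if (mPresentation != null) {
            mPresentation.dismiss();
            mPresentation = null;
        }
    }
}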
And when the user selects a video, the following code is executed in CastPresentation:
public void startVideo(VideoData data) {
    FrameLayout videoBase = (FrameLayout) findViewById(R.id.cast_video_frame);
    videoBase.setVisibility(View.VISIBLE);
    toggleLogoScreen(false);
    if (player != null) {
        player.stop();
        player.close();
        player = null;
        videoBase.setVisibility(View.VISIBLE);
    }
    player = CvpPlayer.create(PlayerConstants.PlayerType.NEXSTREAM, castHelper.getExampleMainActivity(), videoBase);
    player.setPlayerListener(this);
    player.initPlayer();
}
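One thing I'm double-checking myself (an assumption based on the Presentation documentation, not on the CvpPlayer API): views shown on the remote Display are supposed to be built with the Presentation's own display-specific context rather than the Activity's, so perhaps the player should be created like this:
// getContext() here is CastPresentation's own (display-specific) context
player = CvpPlayer.create(PlayerConstants.PlayerType.NEXSTREAM, getContext(), videoBase);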
Any ideas are greatly appreciated.
Just a quick question about the built-in Google speech capture on most Android devices (not the cloud service); see the attached image. Is there any way, either programmatically or via the settings, to control how long it waits before it asks you to try again? The 'complete silence' extras in the RecognizerIntent don't seem to make any difference. On certain devices it times out very quickly and the user doesn't have enough time to start speaking.
This is the code in my test app:
public void StartSpeechToText(ISpeechResultCallback callback)
{
    // Note: FeatureMicrophone is itself the constant "android.hardware.microphone",
    // so this comparison is always true; PackageManager.HasSystemFeature() would be
    // the real capability check.
    string rec = global::Android.Content.PM.PackageManager.FeatureMicrophone;
    if (rec == "android.hardware.microphone")
    {
        MainActivity activity = MainActivity.CurrentActivity;
        activity.Callback = callback;
        var voiceIntent = new Intent(RecognizerIntent.ActionRecognizeSpeech);
        //var voiceIntent = new Intent(RecognizerIntent.ActionVoiceSearchHandsFree);
        voiceIntent.PutExtra(RecognizerIntent.ExtraLanguageModel, RecognizerIntent.LanguageModelFreeForm);
        voiceIntent.PutExtra(RecognizerIntent.ExtraPrompt, "Speak now");
        voiceIntent.PutExtra(RecognizerIntent.ExtraSpeechInputCompleteSilenceLengthMillis, 1500);
        voiceIntent.PutExtra(RecognizerIntent.ExtraSpeechInputPossiblyCompleteSilenceLengthMillis, 1500);
        voiceIntent.PutExtra(RecognizerIntent.ExtraSpeechInputMinimumLengthMillis, 15000);
        voiceIntent.PutExtra(RecognizerIntent.ExtraMaxResults, 1);
        voiceIntent.PutExtra(RecognizerIntent.ExtraLanguage, Java.Util.Locale.Default);
        activity.StartActivityForResult(voiceIntent, VOICE);
    }
}
I'm currently working on an Android app that enables users to group-chat with each other via the OpenTok API. I want to add a feature that automatically detects which user is talking, shows their video to the others, and minimizes the other users' videos until someone else talks.
I cannot find such a feature in OpenTok, so I was wondering if there's a workaround.
private void joinVideoCall(String sessionId, String sessionToken) {
    session = new Session.Builder(activity, OPENTOK_API_KEY, sessionId).build();
    session.setSessionListener(this);
    session.connect(sessionToken);
}

@Override
public void onConnected(Session session) {
    publisher = new Publisher.Builder(activity).build();
    publisher.setPublisherListener(this);
    publisherView.addView(publisher.getView());
    session.publish(publisher);
}

@Override
public void onStreamReceived(Session session, Stream stream) {
    subscriber = new Subscriber.Builder(activity, stream).build();
    session.subscribe(subscriber);
    subscriberView.addView(subscriber.getView());
}
...
In order to do that, you'll need to use a custom audio driver that detects the audio levels.
Take a look at this sample: https://github.com/opentok/opentok-android-sdk-samples/tree/master/Custom-Audio-Driver
And also take a look at the API documentation: https://tokbox.com/developer/sdks/android/reference/com/opentok/android/BaseAudioDevice.html
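A possibly simpler alternative (my assumption, not something the Custom-Audio-Driver sample does): the SDK also exposes SubscriberKit.setAudioLevelListener(), which reports a per-subscriber audio level in the 0..1 range. A minimal sketch, where currentSpeaker is a SubscriberKit field, maximize()/minimize() are hypothetical layout helpers (null-checks omitted), and the smoothing factor and threshold are arbitrary choices:
// Call once per subscriber, e.g. at the end of onStreamReceived().
private void watchAudioLevel(final Subscriber sub) {
    sub.setAudioLevelListener(new SubscriberKit.AudioLevelListener() {
        private float smoothed = 0f;

        @Override
        public void onAudioLevelUpdated(SubscriberKit subscriber, float audioLevel) {
            smoothed = 0.8f * smoothed + 0.2f * audioLevel; // damp spiky updates
            if (smoothed > 0.25f && subscriber != currentSpeaker) {
                minimize(currentSpeaker); // hypothetical: shrink the previous speaker's view
                maximize(subscriber);     // hypothetical: enlarge this speaker's view
                currentSpeaker = subscriber;
            }
        }
    });
}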
I have been able to successfully cast video to a Chromecast, with the option to let the video keep playing on disconnect, and it all works great. However, if I quit the application while letting the video continue to play, and then try to re-join the currently playing session and use the RemoteMediaPlayer to control the video, I get: "java.lang.IllegalStateException: No current media session".
Just as background: I save the route ID and session ID into preferences on the initial connect, and I can successfully call "Cast.CastApi.joinApplication". In onResult I recreate the media channel and set setMessageReceivedCallbacks like so:
Cast.CastApi.joinApplication(mApiClient, "xxxxxxxx", persistedSessionId).setResultCallback(new ResultCallback<Cast.ApplicationConnectionResult>() {
    @Override
    public void onResult(Cast.ApplicationConnectionResult applicationConnectionResult) {
        Status status = applicationConnectionResult.getStatus();
        if (status.isSuccess()) {
            mRemoteMediaPlayer = new RemoteMediaPlayer();
            mRemoteMediaPlayer.setOnStatusUpdatedListener(
                    new RemoteMediaPlayer.OnStatusUpdatedListener() {
                        @Override
                        public void onStatusUpdated() {
                            Log.d("----Chromecast----", "in onStatusUpdated");
                        }
                    });
            mRemoteMediaPlayer.setOnMetadataUpdatedListener(
                    new RemoteMediaPlayer.OnMetadataUpdatedListener() {
                        @Override
                        public void onMetadataUpdated() {
                            Log.d("----Chromecast----", "in onMetadataUpdated");
                        }
                    });
            try {
                Cast.CastApi.setMessageReceivedCallbacks(mApiClient, mRemoteMediaPlayer.getNamespace(), mRemoteMediaPlayer);
            } catch (IOException e) {
                Log.e("----Chromecast----", "Exception while creating media channel", e);
            }
            //-----------RESOLUTION START EDIT------------------
            mRemoteMediaPlayer.requestStatus(mApiClient).setResultCallback(new ResultCallback<RemoteMediaPlayer.MediaChannelResult>() {
                @Override
                public void onResult(RemoteMediaPlayer.MediaChannelResult mediaChannelResult) {
                    Status stat = mediaChannelResult.getStatus();
                    if (stat.isSuccess()) {
                        Log.d("----Chromecast----", "mMediaPlayer getMediaStatus success");
                        // Enable controls
                    } else {
                        Log.d("----Chromecast----", "mMediaPlayer getMediaStatus failure");
                        // Disable controls and handle failure
                    }
                }
            });
            //-----------RESOLUTION END EDIT------------------
        } else {
            Log.d("----Chromecast----", "in status failed");
        }
    }
});
If I declare the RemoteMediaPlayer as static:
private static RemoteMediaPlayer mRemoteMediaPlayer;
I can join the existing session as well as control the media using commands like:
mRemoteMediaPlayer.play(mApiClient);
or
mRemoteMediaPlayer.pause(mApiClient);
But once I quit the application, the static object is obviously destroyed and the app produces the aforementioned "No current media session" exception. I am definitely missing something; perhaps after I join the session and register the callback I need to start the media session, just as it was created when I initially loaded the media using mRemoteMediaPlayer.load(...).
Can someone please help as this is very frustrating?
The media session ID is part of the internal state of the RemoteMediaPlayer object. Whenever the receiver state changes, it sends updated state information to the sender, which then causes the internal state of the RemoteMediaPlayer object to get updated.
If you disconnect from the application, then this state inside the RemoteMediaPlayer will be cleared.
When you re-establish the connection to the (still running) receiver application, you need to call RemoteMediaPlayer.requestStatus() and wait for the OnStatusUpdatedListener.onStatusUpdated() callback. This will fetch the current media status (including the current session ID) from the receiver and update the internal state of the RemoteMediaPlayer object accordingly. Once this is done, if RemoteMediaPlayer.getMediaStatus() returns non-null, then it means that there is an active media session that you can control.
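Restated as code against the question's own fields (nothing new beyond the ordering; the listener must be registered before the status request so the callback isn't missed):
mRemoteMediaPlayer.setOnStatusUpdatedListener(new RemoteMediaPlayer.OnStatusUpdatedListener() {
    @Override
    public void onStatusUpdated() {
        // Fires once the receiver's state has been synced into mRemoteMediaPlayer.
        if (mRemoteMediaPlayer.getMediaStatus() != null) {
            // An active media session exists; play()/pause()/seek() are safe now.
        }
    }
});
mRemoteMediaPlayer.requestStatus(mApiClient); // asks the receiver for its current status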
As user3408864 pointed out, requestStatus() after rejoining the session works. Here is how I managed to solve it in my case, and it should work in yours:
if (MAIN_ACTIVITY.isConnected()) {
    if (MAIN_ACTIVITY.mRemoteMediaPlayer == null) {
        MAIN_ACTIVITY.setRemoteMediaPlayer();
    }
    MAIN_ACTIVITY.mRemoteMediaPlayer.requestStatus(MAIN_ACTIVITY.mApiClient).setResultCallback(new ResultCallback<RemoteMediaPlayer.MediaChannelResult>() {
        @Override
        public void onResult(RemoteMediaPlayer.MediaChannelResult mediaChannelResult) {
            if (playToggle == 0) {
                try {
                    MAIN_ACTIVITY.mRemoteMediaPlayer.pause(MAIN_ACTIVITY.mApiClient);
                    playToggle = 1;
                } catch (IOException e) {
                    e.printStackTrace();
                }
            } else {
                try {
                    MAIN_ACTIVITY.mRemoteMediaPlayer.play(MAIN_ACTIVITY.mApiClient);
                    playToggle = 0;
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    });
}
Ignore MAIN_ACTIVITY; it is just a static reference to my activity, since I run this piece of code from a Service. Also, setRemoteMediaPlayer() is a method where I create a new RemoteMediaPlayer() and attach the corresponding listeners.
Hopefully this helps. Apologies for any mistakes; this is my first post on Stack Overflow.
I'm testing playing online video using a Chromecast.
After onRouteSelected(), I create the ApplicationSession and attach a MediaProtocolMessageStream.
Then I call mSession.startSession() with no APP_ID, so I assume the built-in app inside the Chromecast plays the video for me. This code worked perfectly, and I could play online MP4 videos without writing my own receiver.
But when I try to leave the video player app, I can't go back anymore; there is always an error message from onSessionStartFailed() which says:
StartSessionTask failed with error: failed to start application: no application is running
I don't remember how I first got into the video player app, which I didn't leave for a few days.
But I do know how I left it. Here is what I did, after which I could never startSession() again:
open the YouTube app and get a device connected
play some YouTube videos
disconnect from the Chromecast, after which the Chromecast returns to the starting page
So, does anybody know what's going on here? How can I open the built-in video app again?
By the way, my Chromecast got a system update just after I returned to the starting page; I don't know whether Google updated something that causes startSession() to fail.
Below is the code where I start the session and attach a media stream.
mSession = new ApplicationSession(mCastContext, mSelectedDevice);
ApplicationSession.Listener listener = new ApplicationSession.Listener() {
    @Override
    public void onSessionStarted(ApplicationMetadata appMetadata) {
        mChannel = mSession.getChannel();
        mStream = new MediaProtocolMessageStream();
        mChannel.attachMessageStream(mStream);
        if (mStream.getPlayerState() == null) {
            ContentMetadata metaData = new ContentMetadata();
            metaData.setTitle("Test Video");
            String url = "http://www.auby.no/files/video_tests/h264_720p_hp_5.1_6mbps_ac3_planet.mp4";
            try {
                mCommand = mStream.loadMedia(url, metaData, true);
                mCommand.setListener(new MediaProtocolCommand.Listener() {
                    @Override
                    public void onCompleted(MediaProtocolCommand arg0) {
                        onSetVolume(0.5);
                    }

                    @Override
                    public void onCancelled(MediaProtocolCommand arg0) {
                    }
                });
            } catch (IllegalStateException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    @Override
    public void onSessionStartFailed(SessionError error) {
        Log.d("TEST", "Session Started failed");
    }

    @Override
    public void onSessionEnded(SessionError error) {
        Log.d("TEST", "Session Started end");
    }
};
mSession.setListener(listener);
try {
    mSession.startSession();
} catch (IllegalStateException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
You will have to use your own app id and own receiver. Google's default receiver doesn't play video streams anymore (it used to). It only handles Chrome tab mirroring now.
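If it helps, with a registered app ID the only change at session start should be something like the following (assuming the preview SDK's startSession(String) overload; YOUR_APP_ID is a placeholder for the ID you receive when registering your receiver):
mSession.setListener(listener);
mSession.startSession("YOUR_APP_ID"); // launches your registered receiver instead of the default app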
I know it is possible, as the camera app that came with my Droid phone does it, but for the life of me I can't seem to switch cameras on the fly, either for video or a standard camera (which leads me to suspect I'm not doing it right!).
Currently, I have an event for the button
btnSwitchCamera.Click += new EventHandler(btnSwitchCamera_Click);
Prior to that, I check the number of cameras; if there is only one camera, the event is not enabled.
The switch code looks like this:
private void btnSwitchCamera_Click(object s, EventArgs e)
{
    if (isBackCamera == false)
    {
        try
        {
            RunOnUiThread(delegate
            {
                camera.Release();
                camera = Android.Hardware.Camera.Open(1);
            });
        }
        catch (Java.Lang.RuntimeException)
        {
            alertMsg(context, Application.Context.Resources.GetString(Resource.String.videoErrorTitle),
                Application.Context.Resources.GetString(Resource.String.videoFailToConnect));
            return;
        }
        isBackCamera = true;
    }
    else
    {
        try
        {
            RunOnUiThread(delegate
            {
                camera.Release();
                camera = Android.Hardware.Camera.Open(0);
            });
        }
        catch (Java.Lang.RuntimeException)
        {
            alertMsg(context, Application.Context.Resources.GetString(Resource.String.videoErrorTitle),
                Application.Context.Resources.GetString(Resource.String.videoFailToConnect));
            return;
        }
        isBackCamera = false;
    }
}
If I click the button, the app dies, claiming that it cannot connect to the service.
The video recording code is nothing special; it's a bog-standard set-the-surface, do-the-holder, start/stop-recording affair.
Am I doing this right? From the docs, I need to release the camera and then open the camera with the appropriate camera number (Android.Hardware.Camera.NumberOfCameras - 1).
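For reference, a minimal sketch of the sequence that is usually needed on the legacy camera API; this is plain Android Java (the Xamarin calls map one-to-one), and the key detail is stopping the preview and reattaching the surface around the release/open pair:
import android.hardware.Camera;
import android.view.SurfaceHolder;
import java.io.IOException;

class CameraSwitcher {
    // Hypothetical helper: 'holder' is the SurfaceHolder already showing the preview.
    static Camera switchCamera(Camera current, int newCameraId, SurfaceHolder holder)
            throws IOException {
        current.stopPreview();          // stop streaming to the surface first
        current.release();              // free the hardware before opening the other camera
        Camera next = Camera.open(newCameraId);
        next.setPreviewDisplay(holder); // reattach the existing preview surface
        next.startPreview();
        return next;
    }
}
If a MediaRecorder is involved, it also needs to be stopped, and the camera unlocked with Camera.unlock() before handing it to a new recorder.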
The manifest is correctly set.
Thanks
Paul