Android native WebRTC: add video after already connected

I have successfully been running WebRTC in my Android app for a while, using libjingle.so and PeerConnectionClient.java, etc., from Google's code library. However, I am now running into a problem where a user starts a connection as audio only (i.e., an audio call) but then toggles video on. I augmented the existing setVideoEnabled() in PeerConnectionClient as follows:
public void setVideoEnabled(final boolean enable) {
    executor.execute(new Runnable() {
        @Override
        public void run() {
            renderVideo = enable;
            if (localVideoTrack != null) {
                localVideoTrack.setEnabled(renderVideo);
            } else {
                if (renderVideo) {
                    // AC: create a video track
                    String cameraDeviceName = VideoCapturerAndroid.getDeviceName(0);
                    String frontCameraDeviceName =
                            VideoCapturerAndroid.getNameOfFrontFacingDevice();
                    if (numberOfCameras > 1 && frontCameraDeviceName != null) {
                        cameraDeviceName = frontCameraDeviceName;
                    }
                    Log.i(TAG, "Opening camera: " + cameraDeviceName);
                    videoCapturer = VideoCapturerAndroid.create(cameraDeviceName);
                    if (createVideoTrack(videoCapturer) != null) {
                        mediaStream.addTrack(localVideoTrack);
                        localVideoTrack.setEnabled(renderVideo);
                        peerConnection.addStream(mediaStream);
                    } else {
                        Log.d(TAG, "Local video track is still null");
                    }
                } else {
                    Log.d(TAG, "Local video track is null");
                }
            }
            if (remoteVideoTrack != null) {
                remoteVideoTrack.setEnabled(renderVideo);
            } else {
                Log.d(TAG, "Remote video track is null");
            }
        }
    });
}
This allows me to successfully see a local inset of the device's video camera, but it doesn't send the video to the remote client. I thought the peerConnection.addStream() call would do that, but perhaps I am missing something else?

To avoid building an external signaling mechanism between the peers (in which the second peer would have to answer before the new stream can be added), you can always start with an existing (but initially empty) video stream. Then it is just a matter of filling this stream with content when (and if) necessary. A sketch of this approach follows.
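Here is a minimal sketch of that idea, based on the old libjingle API already used in PeerConnectionClient above. Assumptions not in the original post: factory is the PeerConnectionFactory, videoConstraints are the usual MediaConstraints, and the "ARDAMSv0" track id follows the AppRTC demo convention.
// During initial call setup (even for an audio-only call), create and
// negotiate the video track up front so no renegotiation is needed later.
private void addEmptyVideoTrack() {
    videoCapturer = VideoCapturerAndroid.create(
            VideoCapturerAndroid.getNameOfFrontFacingDevice());
    videoSource = factory.createVideoSource(videoCapturer, videoConstraints);
    localVideoTrack = factory.createVideoTrack("ARDAMSv0", videoSource);
    localVideoTrack.setEnabled(false); // "empty" stream: no frames rendered or sent yet
    mediaStream.addTrack(localVideoTrack);
    peerConnection.addStream(mediaStream); // do this before createOffer()
}

// Toggling video later is then just flipping the track; no mid-call
// addStream() (and thus no renegotiation) is required.
public void setVideoEnabled(final boolean enable) {
    executor.execute(new Runnable() {
        @Override
        public void run() {
            if (localVideoTrack != null) {
                localVideoTrack.setEnabled(enable);
            }
        }
    });
}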

Microblink recognizer set up RegexParserSettings

I am trying to scan an image taken from resources using a Recognizer with RegexParserSettings inside a fragment. The problem is that the BaseRecognitionResult obtained through the onScanningDone callback is always null. I have tried to set up the RecognitionSettings with MRTDRecognizer and it worked fine, so I think the library is properly integrated. This is the source code that I am using:
@Override
public void onAttach(Context context) {
    ...
    try {
        mRecognizer = Recognizer.getSingletonInstance();
        mRecognizer.setLicenseKey(context, LICENSE_KEY);
    } catch (FeatureNotSupportedException | InvalidLicenceKeyException e) {
        Log.d(TAG, e.getMessage());
    }
    buildRecognitionSettings();
    mRecognizer.initialize(context, mRecognitionSettings, new DirectApiErrorListener() {
        @Override
        public void onRecognizerError(Throwable t) {
            // Handle exception
        }
    });
}

private void buildRecognitionSettings() {
    mRecognitionSettings = new RecognitionSettings();
    mRecognitionSettings.setRecognizerSettingsArray(setupSettingsArray());
}

private RecognizerSettings[] setupSettingsArray() {
    RegexParserSettings regexParserSettings = new RegexParserSettings("[A-Z0-9]{17}");
    BlinkOCRRecognizerSettings sett = new BlinkOCRRecognizerSettings();
    sett.addParser("myRegexParser", regexParserSettings);
    return new RecognizerSettings[] { sett };
}
I scan the image like:
mRecognizer.recognizeBitmap(bitmap, Orientation.ORIENTATION_PORTRAIT, FragMicoblink.this);
And this is the callback handled in the fragment
@Override
public void onScanningDone(RecognitionResults results) {
    BaseRecognitionResult[] dataArray = results.getRecognitionResults();
    // dataArray is null
    for (BaseRecognitionResult baseResult : dataArray) {
        if (baseResult instanceof BlinkOCRRecognitionResult) {
            BlinkOCRRecognitionResult result = (BlinkOCRRecognitionResult) baseResult;
            if (result.isValid() && !result.isEmpty()) {
                String parsedAmount = result.getParsedResult("myRegexParser");
                if (parsedAmount != null && !parsedAmount.isEmpty()) {
                    Log.d(TAG, "Result: " + parsedAmount);
                }
            }
        }
    }
}
Thanks in advance!
Hello Spirrow.
The difference between your code and SegmentScanActivity is that your code uses DirectAPI, which can only process the single bitmap image you send it, while SegmentScanActivity processes camera frames as they arrive from the camera. While doing so, it can exploit time-redundant information, i.e. it combines consecutive OCR results from multiple video frames to obtain a better-quality OCR result.
This feature is not available via DirectAPI; you need to use either SegmentScanActivity, or a custom scan activity with our camera management.
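For illustration, launching the built-in segment scanning UI looks roughly like the sketch below. This is based on the BlinkOCR sample app, and the exact names (ScanConfiguration, SegmentScanActivity.EXTRAS_LICENSE_KEY, EXTRAS_SCAN_CONFIGURATION, and the string resource ids) are assumptions that may differ between SDK versions, so check the sample project:
// Sketch only: reuses the same regex parser configured for DirectAPI above.
private static final int SCAN_REQUEST_CODE = 42; // arbitrary request code

private void startSegmentScan() {
    RegexParserSettings regexParserSettings = new RegexParserSettings("[A-Z0-9]{17}");
    ScanConfiguration conf = new ScanConfiguration(
            R.string.scan_title, R.string.scan_message, // hypothetical UI strings
            "myRegexParser", regexParserSettings);
    Intent intent = new Intent(getActivity(), SegmentScanActivity.class);
    intent.putExtra(SegmentScanActivity.EXTRAS_LICENSE_KEY, LICENSE_KEY);
    intent.putExtra(SegmentScanActivity.EXTRAS_SCAN_CONFIGURATION,
            new ScanConfiguration[] { conf });
    startActivityForResult(intent, SCAN_REQUEST_CODE);
}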
You can also find out more here:
https://github.com/BlinkID/blinkid-android/issues/54
Regards

Client's objects movement not syncing across the Network

I'm making a 2-player Android game using UNET. All of the host's object movements sync across the network, so that part works fine. But when an object on the client's side moves, it moves locally without moving on the host's screen, i.e. the movement is not syncing.
I already attached NetworkIdentity, NetworkTransform, and the PlayerController script to it, as well as a box collider (for the raycast).
The host and the client have the same PlayerController script; the only difference is that the host can only move objects tagged Player, while the client can only move objects tagged Tagger.
void Update () {
    if (!isLocalPlayer) {
        return;
    }
    if (isServer) {
        Debug.Log("Server here.");
        if (Input.GetMouseButtonDown(0)) {
            Vector2 cubeRay = Camera.main.ScreenToWorldPoint(Input.mousePosition);
            RaycastHit2D cubeHit = Physics2D.Raycast(cubeRay, Vector2.zero);
            if (cubeHit) {
                if (cubeHit.transform.tag == "Player") {
                    if (this.target != null) {
                        SelectMove sm = this.target.GetComponent<SelectMove>();
                        if (sm != null) { sm.enabled = false; }
                    }
                    target = cubeHit.transform.gameObject;
                    selectedPlayer();
                }
            }
        }
    }
    if (!isServer) {
        Debug.Log("Client here.");
        if (Input.GetMouseButtonDown(0)) {
            Vector2 cubeRay = Camera.main.ScreenToWorldPoint(Input.mousePosition);
            RaycastHit2D cubeHit = Physics2D.Raycast(cubeRay, Vector2.zero);
            if (cubeHit) {
                if (cubeHit.transform.tag == "Tagger") {
                    if (this.target != null) {
                        SelectMove sm = this.target.GetComponent<SelectMove>();
                        if (sm != null) { sm.enabled = false; }
                    }
                    target = cubeHit.transform.gameObject;
                    selectedPlayer();
                }
            }
        }
    }
}
I'm using (!isServer) to identify the client because isClient sometimes doesn't work correctly in my project. I also tried using it again to test, but still no luck.
You don't need to use tags to move players; the single PlayerController script is enough, using only the isLocalPlayer check, and disabling the script on remote instances (i.e., when !isLocalPlayer) on both clients. Use http://docs.unity3d.com/Manual/UNetSetup.html for reference and check their sample tutorial.

stream screen to chromecast using Presentation

I'm using com.android.support:appcompat-v7:21.0.3 to develop an app that streams the screen to the TV using Chromecast.
The problem is that when I retrieve the presentationDisplay, it is null!
I'm using the default receiver app, and it seems that Chromecast does not support
MediaControlIntent.CATEGORY_LIVE_VIDEO
This is the code:
private void updatePresentation() {
    Log.d(TAG, "updatePresentation()");
    MediaRouter.RouteInfo route = mMediaRouter.getSelectedRoute();
    Display presentationDisplay = route != null ? route.getPresentationDisplay() : null;
    Log.d(TAG, "MediaRouter.RouteInfo: " + (route != null ? route.getName() : "null"));
    if (presentationDisplay != null) {
        Log.d(TAG, "presentationDisplay " + presentationDisplay.getName());
    } else {
        Log.d(TAG, "presentationDisplay is null");
    }
    // Dismiss the current presentation if the display has changed.
    if (mPresentation != null && mPresentation.getDisplay() != presentationDisplay) {
        Log.i(TAG, "Dismissing presentation because the current route no longer "
                + "has a presentation display.");
        mPresentation.dismiss();
        mPresentation = null;
    }
    // Show a new presentation if needed.
    if (mPresentation == null && presentationDisplay != null) {
        Log.i(TAG, "Showing presentation on display: " + presentationDisplay);
        mPresentation = new DemoPresentation(this, presentationDisplay);
        mPresentation.setOnDismissListener(new DialogInterface.OnDismissListener() {
            @Override
            public void onDismiss(DialogInterface dialog) {
                if (mPresentation != null) mPresentation.dismiss();
            }
        });
        try {
            Log.d("mPresentation", "showing");
            mPresentation.show();
        } catch (WindowManager.InvalidDisplayException ex) {
            Log.w(TAG, "Couldn't show presentation! Display was removed in "
                    + "the meantime.", ex);
            mPresentation = null;
        }
    }
}
On my own Nexus 10 I used the Chromecast app to set up the Chromecast device, and on my Nexus 4 it all happened automatically.
No, I am not referring to just the Chromecast app.
Chromecast natively supports the Cast SDK and RemotePlaybackClient. It also supports serving as an external display, which can be used for Presentation. However, the user has to manually go into Settings > Displays > Cast Screen and choose the Chromecast. Then, you will get screen mirroring, and Presentation will work.
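Once the user has enabled screen casting, the Chromecast shows up as an ordinary presentation display, so you can also find it through the framework DisplayManager rather than through a media route. A minimal sketch, reusing the DemoPresentation class from the question:
// Sketch: look for a presentation display after the user has turned on
// "Cast Screen" in system settings (API 17+).
DisplayManager dm = (DisplayManager) getSystemService(Context.DISPLAY_SERVICE);
Display[] displays = dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION);
if (displays.length > 0) {
    // While mirroring, the Chromecast appears here like any external display.
    DemoPresentation presentation = new DemoPresentation(this, displays[0]);
    presentation.show();
}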

Connecting to existing Google Chromecast Session from Android (for generic remote control)

I am creating a generic Chromecast remote control app. Most of the guts of the app are already created, and I've managed to get Chromecast volume control working (by connecting to a Chromecast device alongside another app that is casting, YouTube for example).
What I'm having difficulty with is performing other media commands such as play, pause, seek, etc.
Use case example:
1. User opens YouTube on their android device and starts casting a video.
2. User opens my app and connects to the same Chromecast device.
3. Volume control from my app (works now)
4. Media control (play, pause, etc) (does not yet work)
I found the Cast API reference that explains that you can sendMessage(ApiClient, namespace, message) with media commands; however, the "message" (JSON) requires the sessionId of the currently running application (YouTube in this case). I have tried the following, but the connection to the current application always fails; status.isSuccess() is always false:
Cast.CastApi
        .joinApplication(mApiClient)
        .setResultCallback(
                new ResultCallback<Cast.ApplicationConnectionResult>() {
                    @Override
                    public void onResult(Cast.ApplicationConnectionResult result) {
                        Status status = result.getStatus();
                        if (status.isSuccess()) {
                            ApplicationMetadata applicationMetadata =
                                    result.getApplicationMetadata();
                            sessionId = result.getSessionId();
                            String applicationStatus = result.getApplicationStatus();
                            boolean wasLaunched = result.getWasLaunched();
                            Log.i(TAG, "Joined Application with sessionId: " + sessionId
                                    + " Application Status: " + applicationStatus);
                        } else {
                            // teardown();
                            Log.e(TAG, "Could not join application: " + status.toString());
                        }
                    }
                });
Is it possible to get the sessionId of an already running cast application from a generic remote control app (like the one I am creating)? If so, am I right in assuming that I can then perform media commands on the connected Chromecast device using something like this:
JSONObject message = new JSONObject();
message.put("mediaSessionId", sessionId);
message.put("requestId", 9999);
message.put("type", "PAUSE");
Cast.CastApi.sendMessage(mApiClient,
"urn:x-cast:com.google.cast.media", message.toString());
Update:
I have tried the recommendations provided by @Ali Naddaf, but unfortunately they are not working. After creating mRemoteMediaPlayer in onCreate, I also call requestStatus(mApiClient) in the onConnected callback (in the ConnectionCallbacks). When I try to .play(mApiClient) I get an IllegalStateException stating that there is no current media session. I also tried doing joinApplication and, in the callback, performed result.getSessionId(), which returns null.
A few comments and answers:
You can get the sessionId from the callback of launchApplication or joinApplication; in onResult(result), you can get it from result.getSessionId().
YouTube is still not on the official SDK, so YMMV; for apps using the official SDK, you should be able to use the above approach (most of it).
Why are you trying to assemble a message yourself? Why not build a RemoteMediaPlayer and use the play/pause methods it provides? Whenever you work with media playback through the official channel, always use RemoteMediaPlayer (and don't forget to call requestStatus() on it after creating it); a sketch follows below.
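A minimal sketch of that approach with the (v2) Cast SDK, assuming mApiClient is an already-connected GoogleApiClient and with error handling mostly elided:
// Create the player and route media-channel messages to it.
mRemoteMediaPlayer = new RemoteMediaPlayer();
try {
    Cast.CastApi.setMessageReceivedCallbacks(mApiClient,
            mRemoteMediaPlayer.getNamespace(), mRemoteMediaPlayer);
} catch (IOException e) {
    Log.e(TAG, "Exception while creating media channel", e);
}

// Sync up with the receiver's current media session first...
mRemoteMediaPlayer.requestStatus(mApiClient).setResultCallback(
        new ResultCallback<RemoteMediaPlayer.MediaChannelResult>() {
            @Override
            public void onResult(RemoteMediaPlayer.MediaChannelResult result) {
                if (result.getStatus().isSuccess()) {
                    // ...after which play/pause are one-liners.
                    mRemoteMediaPlayer.pause(mApiClient);
                }
            }
        });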
Yes, it is possible. First you have to save the sessionId and the CastDevice device id; when the app is removed from the background and opened again, check whether there is a saved sessionId and then call the line below:
Cast.CastApi.joinApplication(apiClient, APP_ID, sid).setResultCallback(connectionResultCallback);
If you get a success result, continue the process in the connectionResultCallback listener.
// Get the device you selected before
@Override
public void onRouteAdded(MediaRouter router, MediaRouter.RouteInfo route) {
    if (router != null && router.getRoutes() != null && router.getRoutes().size() > 1) {
        // Show the button when a device is discovered.
        mMediaRouteButton.setVisibility(View.VISIBLE);
        titleLayout.setVisibility(View.GONE);
        castName.setVisibility(View.VISIBLE);
        selectedDevice = CastDevice.getFromBundle(route.getExtras());
        routeInfoArrayList = router.getRoutes();
        if (!isCastConnected) {
            String deid = MyPref.getInstance(homeScreenActivity).readPrefs(MyPref.CAST_DEVICE_ID);
            for (int i = 0; i < routeInfoArrayList.size(); i++) {
                if (routeInfoArrayList.get(i).getExtras() != null
                        && CastDevice.getFromBundle(routeInfoArrayList.get(i).getExtras())
                                .getDeviceId().equalsIgnoreCase(deid)) {
                    selectedDevice = CastDevice.getFromBundle(routeInfoArrayList.get(i).getExtras());
                    routeInfoArrayList.get(i).select();
                    ReSelectedDevice(selectedDevice, routeInfoArrayList.get(i).getName());
                    break;
                }
            }
        }
    }
}

// Reconnect the GoogleApiClient
public void reConnectGoogleApiClient() {
    if (apiClient == null) {
        Cast.CastOptions apiOptions =
                new Cast.CastOptions.Builder(selectedDevice, castClientListener).build();
        apiClient = new GoogleApiClient.Builder(this)
                .addApi(Cast.API, apiOptions)
                .addConnectionCallbacks(reconnectionCallback)
                .addOnConnectionFailedListener(connectionFailedListener)
                .build();
        apiClient.connect();
    }
}

// Join the application
private final GoogleApiClient.ConnectionCallbacks reconnectionCallback =
        new GoogleApiClient.ConnectionCallbacks() {
    @Override
    public void onConnected(Bundle bundle) {
        try {
            String sid = MyPref.getInstance(homeScreenActivity).readPrefs(MyPref.CAST_SESSION_ID);
            String deid = MyPref.getInstance(homeScreenActivity).readPrefs(MyPref.CAST_DEVICE_ID);
            if (sid != null && deid != null && sid.length() > 0 && deid.length() > 0) {
                Cast.CastApi.joinApplication(apiClient, APP_ID, sid)
                        .setResultCallback(connectionResultCallback);
            }
            isApiConnected = true;
        } catch (Exception e) {
            // ignored
        }
    }

    @Override
    public void onConnectionSuspended(int i) {
        isCastConnected = false;
        isApiConnected = false;
    }
};

Android Player Error (-38,0)

I made a streaming program that plays an ad + audio + ad. The first ad plays fine, then I switch to the audio, which is also fine, but playing the last ad fails with error (-38, 0). I have checked that I set the data source and the OnPreparedListener, and I have tried everything I could find so far, but I still get this error on Android 4.1.1.
The error occurs after my MPStarting method; for the final ad I never even reach the onPrepared method. If there is any more info you need, please let me know. Thanks.
Here is the relevant part of the code:
void MPStarting(Track track) {
    try {
        if (_playlist != null && _playlist.GetCurrent() != null) {
            Episode ep = (Episode) _playlist.GetCurrent();
            _player = new MediaPlayer();
            AdsInfo startAd = ep.getAdWithType(PlayTime.start_ad);
            AdsInfo endAd = ep.getAdWithType(PlayTime.end_ad);
            if (currAudio == null && startAd != null)
                currAudio = startAd;
            else if (currAudio == startAd)
                currAudio = ep;
            else if (currAudio instanceof Episode && endAd != null)
                currAudio = ep.getAdWithType(PlayTime.end_ad);
        }
        if (_player != null) {
            _player.setDataSource(dataSource);
            _player.setOnPreparedListener(this);
            _player.setOnCompletionListener(this);
            _player.setOnBufferingUpdateListener(this);
            _player.setOnSeekCompleteListener(this);
            _player.setOnErrorListener(this);
            _player.prepareAsync();
        }
    } catch (Exception e) {
        Log.i("mpcPlayer", "MPStarting " + e.getLocalizedMessage());
    }
}

@Override
public void onCompletion(MediaPlayer mp) {
    // Here I check what is currently playing.
    // I always stop the player if it is playing, then reset, release, and set player = null;
    // then I call MPStarting, pass it the current audio, and return.
}
I think I found my problem: I was sometimes calling getCurrentPosition() when the player was not ready. It seems this error comes from calling a method while the player is not in a valid state.
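A common defensive pattern for this (a sketch, not necessarily the exact fix used here) is to track readiness with a flag that is set in onPrepared() and cleared before reset()/release(), and to query the player only while the flag is set:
// mIsPrepared is an assumed field; calling getCurrentPosition() in an
// invalid state is a typical cause of error (-38, 0) on some devices.
private volatile boolean mIsPrepared = false;

@Override
public void onPrepared(MediaPlayer mp) {
    mIsPrepared = true;
    mp.start();
}

private int safeGetCurrentPosition() {
    // Only query the position once the player is prepared.
    return (mIsPrepared && _player != null) ? _player.getCurrentPosition() : 0;
}

private void releasePlayer() {
    mIsPrepared = false; // the player is about to leave the prepared state
    if (_player != null) {
        _player.reset();
        _player.release();
        _player = null;
    }
}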
