MvvmCross Xamarin.Android 6.1.2 Library Navigation - android

Could you help me understand whether it is possible to perform navigation in the platform library layer in the latest MvvmCross version?
My solution has the following structure:
Core Layer
Library Layer (Android Library)
WL (White Label) Layer (a bunch of Android apps)
All the code the Android apps need is placed in the Library Layer; in the WL layer I just change some resources and images.
Earlier I used MvvmCross 5.1.1 and a custom presenter worked fine for me, but in the new MvvmCross 6.1.2 the default presenter doesn't: it couldn't find the View for the ViewModel.
If I move the Activity from the Library Layer into any app in the WL Layer, it works fine.
Does [MvxActivityPresentation] not work in library projects?

In your Setup.cs you need to override GetViewAssemblies and add the assembly where your Activity lives:
public override IEnumerable<Assembly> GetViewAssemblies()
{
    var viewsAssemblies = new List<Assembly>(base.GetViewAssemblies());
    viewsAssemblies.Add(typeof(MyActivity).Assembly);
    return viewsAssemblies;
}
Doing this ensures that the assembly will be taken into account when looking up the View corresponding to your ViewModel.
More info in Providing additional View and ViewModel Assemblies.
Hope it helps!

Related

Loading glb model in react native three using expo-three component not working on android device. The model appears on the web version though

I'm trying to create a React Native app with three.js using the expo-gl and expo-three frameworks.
Following is the list of imports:
import { ExpoWebGLRenderingContext, GLView } from 'expo-gl';
import ExpoTHREE, { Renderer, TextureLoader } from 'expo-three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader';
import * as React from 'react';
import {
    AmbientLight,
    HemisphereLight,
    BoxBufferGeometry,
    Fog,
    GridHelper,
    Mesh,
    MeshStandardMaterial,
    PerspectiveCamera,
    PointLight,
    Scene,
    SpotLight,
    Camera,
    InstancedMesh,
} from 'three';
import OrbitControlsView from 'expo-three-orbit-controls';
Apart from the basic scene, camera and light setup, I'm trying to load a glb model using the ExpoTHREE.loadAsync method as below:
const loadGlb = async () => {
    const obj = await ExpoTHREE.loadAsync(
        [require('./assets/suzanne.glb')],
        null,
        null
    )
        .then((e) => {
            scene.add(e.scene);
            e.scene.traverse((f) => {
                if (f.isMesh) { f.material = new THREE.MeshNormalMaterial(); }
            });
        })
        .catch((err) => { console.log(err); });
};
loadGlb();
Using the ref: https://www.npmjs.com/package/expo-three
The model loads when I run the code in the desktop browser, but not on my Android phone using the Expo app. Please let me know what I am doing wrong.
You can access the app here https://expo.io/#praful124/expo3
I had the same problem and ended up running into the fact that it does not support textures inside the glb file. But if you find a solution, I will be very happy if you share it.

Xamarin Forms and Prism to native view - Navigating from a view in prism project to a view in shared project

My Project:
MyAppB (External Shared Project)
includes multiple views (XAML files) which extend ContentPage
uses e.g. await Navigation.PushAsync(new Chat());
MyApplicationA (Code Sharing - .NET Standard 2.0)
is built with the Prism framework (Model-View-ViewModel)
uses e.g. await NavigationService.NavigateAsync("name", parameters);
MyApplicationA.Android
references both (MyApplicationA and MyAppB) -> Is this correct?
MyApplicationA.iOS
references both (MyApplicationA and MyAppB) -> Is this correct?
My Goal:
I want to navigate from a view or viewmodel in MyApplicationA to a view in MyAppB.
Fortunatly navigating to a view in MyAppB from a viewmodel in MyApplicationA is working.
I'm using a interface definition in MyApplicationA and calls it in the viewmodel.
The interface is implemented in the MyApplicationA.Android and MyApplicationA.iOS.
WORKS: To navigate i use: App.Current.MainPage = new NavigationPage(new MainPage(screenCaptureIntent)); // (the intent is the reason why i have to navigate from MyApplicationA to MyApplicationA.Android and then to MyAppB)
DONT WORK: Using await App.Current.MainPage.Navigation.PushAsync(new MainPage(screenCaptureIntent)); is not working and gives the following error:
PushAsync is not supported globally on Android, please use a NavigationPage.
But i have problems to go back from the view in MyAppB to the application in MyApplicationA e.g. a view which loads data, in best case it would be the last page in the navigation stack...
Does someone have a hint for me how to solve this? Am I doing it the right way? Or can i reference MyAppB in MyApplicationA - but what happens to the device specific code e.g. #if __IOS__
Thx
Daniel
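For reference, the interface-based bridge the question describes can be sketched roughly like this. This is only an illustration of the pattern, not code from the project; the type name IScreenCaptureService and its members are hypothetical:

```csharp
// In MyApplicationA (.NET Standard): the interface the viewmodel calls.
// IScreenCaptureService is a hypothetical name, not from the question.
public interface IScreenCaptureService
{
    void StartCaptureAndNavigate();
}

// In MyApplicationA.Android: the platform implementation that owns the
// Intent and resets MainPage, as the question describes.
public class ScreenCaptureService : IScreenCaptureService
{
    private readonly Android.Content.Intent screenCaptureIntent;

    public ScreenCaptureService(Android.Content.Intent intent)
    {
        screenCaptureIntent = intent;
    }

    public void StartCaptureAndNavigate()
    {
        // Wrapping the page in a NavigationPage is what makes later
        // PushAsync/PopAsync calls legal on Android.
        App.Current.MainPage =
            new NavigationPage(new MainPage(screenCaptureIntent));
    }
}
```

Because the new MainPage becomes the root of a fresh NavigationPage, pages pushed afterwards can be popped normally; going "back" to a page in MyApplicationA, however, still means setting MainPage again, since replacing MainPage discards the previous navigation stack.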

Xamarin crashes on Android when creating custom renderer

I want to implement a custom Entry in Xamarin. I followed some YouTube tutorials step by step and it works for them, but mine crashes when launching it via the Live Player.
Here is the code in the shared project
using Xamarin.Forms;

namespace QuickTest.CustomControls
{
    public class PlainEntry : Entry
    {
    }
}
And here is the Android-specific version:
using Android.Content;
using QuickTest.CustomControls;
using QuickTest.Droid.CustomAndroidControls;
using Xamarin.Forms;
using Xamarin.Forms.Platform.Android;

[assembly: ExportRenderer(typeof(PlainEntry), typeof(PlainEntryAndroid))]
namespace QuickTest.Droid.CustomAndroidControls
{
    public class PlainEntryAndroid : EntryRenderer
    {
        public PlainEntryAndroid() : base(null) { }
        public PlainEntryAndroid(Context context) : base(context) { }
    }
}
It's a basic implementation; I commented out the OnElementChanged method just so I could get it to run first. Is there something I am doing wrong? Any help would be highly appreciated because I have already wasted enough time on this, thanks.
Btw, I have also tried it without either of the constructors and it failed.
It didn't give out any error; the Xamarin Live Player just crashed.
Please refer to the documentation: Limitations of Xamarin Live Player
For Xamarin.Forms:
Custom Renderers are not supported.
Effects are not supported.
Custom Controls with Custom Bindable Properties are not supported.
Embedded resources are not supported (ie. embedding images or other resources in a PCL).
Third party MVVM frameworks are not supported (ie. Prism, Mvvm Cross, Mvvm Light, etc.).
That's why the issue happened. It is suggested that you deploy your project to an Android emulator or a real device instead.

Using UserDialogs in Android

I'm stuck with the usage of the Acr.UserDialogs plugin in an Android app based on MvvmCross.
In the PCL project I used IUserDialogs via viewmodel constructor injection.
I have installed the Acr.UserDialogs package both in the PCL and in the Droid project, but when I run the app, it throws:
In android, you must call UserDialogs.Init(Activity) from your first
activity OR UserDialogs.Init(App) from your custom application OR
provide a factory function to get the current top activity via
UserDialogs.Init(() => supply top activity)
I tried to call, in my viewmodel:
UserDialogs.Init(this);
But Init is not recognized.
And calling UserDialogs.Instance.Loading().Hide(); in the app throws the same issue.
How should it be initialized in the Android project?
Upd: The final solution to work around this looks like:
In the PCL project's App.cs add: Mvx.RegisterSingleton(() => UserDialogs.Instance);
In your first loaded activity's OnCreate add: UserDialogs.Init(() => this);
This error is very clear. You can't initialize it in a viewmodel; you can only do that in your main activity.
FAQ
I'm getting a NullReferenceException when using Loading.
This happens when you run loading (or almost any dialog) from the
constructor of your page or viewmodel. The view hasn't been rendered
yet, therefore there is nothing to render to.
Android initialization - in your MainActivity:
UserDialogs.Init(this);
OR UserDialogs.Init(() => provide your own top level activity provider)
OR MvvmCross - UserDialogs.Init(() => Mvx.Resolve<IMvxTopActivity>().Activity)
OR Xamarin.Forms - UserDialogs.Init(() => (Activity)Forms.Context)
GitHub docs.
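Putting the accepted workaround together, a minimal sketch looks like the following. It assumes a standard MvvmCross 5.x + Acr.UserDialogs setup; MyViewModel and its field names are illustrative only:

```csharp
// Droid project - first loaded activity:
protected override void OnCreate(Bundle bundle)
{
    base.OnCreate(bundle);
    // Give UserDialogs a factory that returns the current top activity.
    UserDialogs.Init(() => this);
}

// PCL project - register the singleton so it can be constructor-injected:
Mvx.RegisterSingleton<IUserDialogs>(() => UserDialogs.Instance);

// Any viewmodel can now receive IUserDialogs, but per the FAQ above it
// must not show dialogs from its constructor (the view isn't rendered yet).
public class MyViewModel : MvxViewModel
{
    private readonly IUserDialogs _userDialogs;

    public MyViewModel(IUserDialogs userDialogs)
    {
        _userDialogs = userDialogs;
    }
}
```

The key point is that Init runs in the activity (platform) layer, while the viewmodel only ever sees the IUserDialogs abstraction.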

Same Titanium code base for Android & iPhone

I have been trying to create a single codebase for both iPhone & Android for an intermediate-level app (4 tabs, multiple windows, maps, etc.) using the Titanium 2.1 API.
However, I have found that things on the Android platform don't work as smoothly or willingly as on iPhone, especially tableviews & UI elements. The UI responsiveness on Android is also sluggish.
The Kitchen Sink examples are pretty straightforward. I am looking at an enterprise-ready app which has to be maintained for at least the next couple of years.
Has anybody worked on similar lines with platform quirks and been successful in creating fully functional iOS & Android apps from the SAME codebase?
I'm having a lot of success using the compile-time CommonJS mechanism for having a root view that then has OS-specific capabilities.
For instance, my OS-independent view might be ui/MyView.js:
var createAddButton = require("ui/MyView.AddButton");

var MyView = function() {
    var self = Ti.UI.createWindow();
    createAddButton(self, function() { alert('ADD!'); });
    return self;
};
module.exports = MyView;
Then, I create OS-specific functions to handle it:
iphone/ui/MyView.AddButton.js
module.exports = function(view, addHandler) {
    var addButton = Titanium.UI.createButton({
        systemButton: Titanium.UI.iPhone.SystemButton.ADD
    });
    addButton.addEventListener("click", addHandler);
    view.rightNavButton = addButton;
};
android/ui/MyView.AddButton.js
module.exports = function(view, addHandler) {
    view.activity.onCreateOptionsMenu = function(e) {
        var menuItem = e.menu.add({ title: "Add" });
        menuItem.addEventListener("click", addHandler);
    };
};
The CommonJS system they have implemented will pick the appropriate version of MyView.AddButton.js so that the button is added to the right place. It allows for the majority of the view to be the same, but the os-specific things to be separated properly.
Titanium is not meant for one codebase for all. You do need to rewrite stuff for every OS. However, some app developers claim to have reused 95% of their code, so only 5% of the code is OS-specific. But I am sure their code is full of if-elses.
What I recommend, to be able to maintain it properly without thousands of if-else constructions, is to build a single backend core and write OS-specific code only for UI-related matters. This way, you have some UI-related code for Android, some UI-related code for iOS, and one core working for both.
Since Android and iOS differ a lot, writing a single codebase will mean you can never use OS-specific features (like the Android hardware menu button, or the iOS NavigationGroup), and will make the UI look non-intuitive.
