VideoPlayerController not working inside existing Android App - android

I am trying to integrate Flutter with an existing Android app. A basic Flutter page works fine. However, when I try to play a video using VideoPlayerController, it breaks with the error below. The same code works in a pure Flutter app. Any idea how to fix it?
Error
VideoPlayerApi
exception = {PlatformException} PlatformException(channel-error, Unable to establish connection on channel., null)
replyMap = null
_controller = VideoPlayerController.network(
    'https://flutter.github.io/assets-for-api-docs/assets/videos/bee.mp4')
  ..initialize().then((_) {
    setState(() {
      _controller.setLooping(true);
      _controller.play();
    });
  });
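For reference, here is the same controller wired into a complete widget. This is only a minimal sketch of standard video_player usage (the widget name and layout are illustrative, assuming video_player is already a dependency); it is not a fix for the channel error itself:

import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';

class BeeVideoPage extends StatefulWidget {
  @override
  State<BeeVideoPage> createState() => _BeeVideoPageState();
}

class _BeeVideoPageState extends State<BeeVideoPage> {
  late VideoPlayerController _controller;

  @override
  void initState() {
    super.initState();
    _controller = VideoPlayerController.network(
        'https://flutter.github.io/assets-for-api-docs/assets/videos/bee.mp4')
      ..initialize().then((_) {
        setState(() {
          _controller.setLooping(true);
          _controller.play();
        });
      });
  }

  @override
  void dispose() {
    _controller.dispose(); // release the underlying platform player
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    // Show the video only once the controller reports it is initialized.
    return _controller.value.isInitialized
        ? AspectRatio(
            aspectRatio: _controller.value.aspectRatio,
            child: VideoPlayer(_controller),
          )
        : const Center(child: CircularProgressIndicator());
  }
}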

Related

Noticing different behavior when updating playback status on iOS and Android using Expo's Audio Module

I'm currently attempting to implement an audio player in React Native using Expo's Audio module. Everything works as expected on Android, but on iOS the playbackStatusUpdate() callback does not appear to be called. I've pasted the relevant snippet below.
// Assuming the usual expo-av imports, e.g.:
// import { Audio, INTERRUPTION_MODE_IOS_DO_NOT_MIX, INTERRUPTION_MODE_ANDROID_DO_NOT_MIX } from 'expo-av';
await Audio.setAudioModeAsync({
  playsInSilentModeIOS: true,
  interruptionModeIOS: INTERRUPTION_MODE_IOS_DO_NOT_MIX,
  interruptionModeAndroid: INTERRUPTION_MODE_ANDROID_DO_NOT_MIX,
});
const { sound } = await Audio.Sound.createAsync({ uri });
await sound.setProgressUpdateIntervalAsync(100);
sound.setOnPlaybackStatusUpdate((data) => {
  if (data.isLoaded) {
    setCurrentPosition(data.positionMillis);
    setIsPlaying(data.isPlaying);
    setClipDuration(data.durationMillis || 0);
    if (data.didJustFinish && !data.isLooping) {
      sound.setPositionAsync(0);
      options.onPlaybackComplete();
    }
  }
});
On iOS, the callback is simply not called at all, so no statuses, etc. are set.
I'm using the current position to update a slider at a higher level in my component tree. Again - this all works exactly as expected on Android. Any idea what could be happening here?

How to create a WebXR AR app with Cordova (iOS and Android)

To build the application, I set up Xcode for iOS and Android Studio for Android. Unfortunately, augmented reality does not appear in the application. I guess the main issue comes from this part of the code:
if ('xr' in navigator) {
  navigator.xr.isSessionSupported('immersive-ar').then((supported) => {
    if (supported) {
      const collection = document.getElementsByClassName("ar-button");
      [...collection].forEach((el) => {
        el.style.display = 'block';
      });
    }
  });
} else {
  alert("not supported");
}
When I try to start the session with the code above, it does not work, because I believe 'xr' is not supported. I tried cordova-webxr-plugin as well, but it didn't work either.
Does anyone have any idea about my problem, or any related suggestions?

Show a Camera Activity which runs tflite on image frames in a Flutter app

I have an Android app with a CameraActivity that periodically runs a tflite classifier on image frames from the preview stream. The camera and tflite implementation works great on the Android side and gives good FPS.
I want to show this CameraActivity as a screen in my Flutter app. The Flutter app already has all the frontend and UI implemented.
I've already tried using the official Flutter camera plugin to implement the same thing with camera.startImageStream, but I was unable to get a matching FPS, and the camera preview lags when calling the tflite model asynchronously over a MethodChannel.
I also came across AndroidView, which embeds an Android view in the widget hierarchy, but the docs say it is an expensive operation and should be avoided when a Flutter equivalent is possible (a rough sketch of that route follows).
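For reference, the AndroidView route looks roughly like this on the Dart side. This is only a sketch; the 'native-camera-view' viewType is a hypothetical name that would have to match a PlatformViewFactory registered in the Android code:

import 'package:flutter/material.dart';

class NativeCameraPreview extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    // Embeds the native Android view registered under this viewType.
    return AndroidView(viewType: 'native-camera-view');
  }
}

This keeps the preview inside the Flutter widget tree, but with the platform-view overhead the docs warn about.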
Is there a way to write a plugin that shows the CameraActivity (i.e. its UI) on the Flutter end, similar to the way a MethodChannel is used for exchanging data between Flutter and native code? Or if there's another possible way of achieving this, please let me know.
Thanks in advance!
// Flutter side (requires: import 'dart:io'; and import 'package:flutter/services.dart';)
Future<void> showNativeView() async {
  if (Platform.isAndroid) { // only Android has the native camera activity
    const methodChannel = MethodChannel("methodchannelname"); // channel name must match the Android side
    await methodChannel.invokeMethod('showNativeCameraView'); // method name must match the Android handler
  }
}
Container(
  child: InkWell(
    onTap: () {
      showNativeView(); // call to open the native Android activity
    },
    child: Center(
      child: Text("takeimage"))))
// Android side
// inside MainActivity's onCreate
val channel = MethodChannel(flutterView, "methodchannelname") // same channel name as on the Flutter side
channel.setMethodCallHandler { call, result ->
    if (call.method == "showNativeCameraView") { // same method name as on the Flutter side
        // The native camera activity code can live in the android folder of the Flutter application.
        val intent = Intent(this, ActivityCamera::class.java)
        startActivity(intent)
        result.success(true)
    } else {
        result.notImplemented()
    }
}
This link may help you further:
https://medium.com/@Chetan/flutter-communicate-with-android-activity-or-ios-viewcontroller-through-method-channel-c11704429cd0

Direct call without pressing the green button in Flutter iOS build

I'm using the call_number plugin for direct calls in Flutter, but the problem is that it only works on Android, not on iOS. How can I build this functionality for the iOS build?
Create your calling function as:
Future<void> _call() async {
  if (numberToCall != null) {
    await CallNumber().callNumber(numberToCall);
  }
}
Example usage:
IconButton(icon: Icon(Icons.call), onPressed: _call)

How to fix 'backgroundColor' error on app load in expo

I am using Expo v32. My app was running fine; the next day, when I ran my Expo app, it wouldn't launch and gave me an error related to 'backgroundColor', without showing the location where it occurs.
I am trying to debug the error but am unable to do so. When I enable the 'Remotely debug JS' option, the app crashes after loading reaches 100%.
I tried debugging with console.log and found that the code runs fine before the Font.loadAsync call, but after the call the promise never resolves and no exception is thrown in the try/catch block.
async componentDidMount() {
  try {
    console.log("before font") // this will execute
    await Font.loadAsync({
      Roboto: require('native-base/Fonts/Roboto.ttf'),
      Roboto_medium: require('native-base/Fonts/Roboto_medium.ttf')
    });
    console.log("after font") // this will not execute
  } catch (e) {
    console.log("error", e) // this will not execute
  }
  this.setState({ fontLoaded: true })
}
The app should launch the page, but it throws an error about 'backgroundColor'. It should also log the "after font" or "error" message, but nothing is logged after "before font". [Error image from the Expo client app]
Looks like an issue with the latest expo client. Try looking for anywhere you're using backgroundColor with any sort of transparency and remove the transparency.
https://forums.expo.io/t/error-with-last-expo-client-and-sdk32/23334
