I am trying to create a custom FollowMe mission by sending a vehicle's GPS data in Android Studio. I can send the vehicle coordinates, but updateFollowingTarget gives a timeout error. I'm using a Mavic 2 Zoom and DJI SDK v1.14. Did someone manage to fix this issue?
Thanks in advance.
It's a bug: updateFollowingTarget always returns a timeout.
Just ignore the error and the mission will still work.
Note that the mission is speed-limited to roughly 15 km/h, so don't expect too much from it.
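If it helps, the update call can simply log and swallow that error. This is only a minimal sketch, assuming the FollowMeMissionOperator API from the Mobile SDK (the operator lookup and callback signature may differ slightly in your SDK version), with the vehicle position coming from your own code:

import android.util.Log;

import dji.common.error.DJIError;
import dji.common.model.LocationCoordinate2D;
import dji.common.util.CommonCallbacks;
import dji.sdk.mission.MissionControl;
import dji.sdk.mission.followme.FollowMeMissionOperator;

// Push the vehicle's latest GPS fix to the running FollowMe mission and
// deliberately ignore the timeout that updateFollowingTarget reports.
private void pushVehiclePosition(double vehicleLat, double vehicleLng) {
    FollowMeMissionOperator operator =
            MissionControl.getInstance().getFollowMeMissionOperator();

    operator.updateFollowingTarget(
            new LocationCoordinate2D(vehicleLat, vehicleLng),
            new CommonCallbacks.CompletionCallback() {
                @Override
                public void onResult(DJIError error) {
                    if (error != null) {
                        // Known quirk: a timeout is reported even though the
                        // target update is applied, so just log and move on.
                        Log.d("FollowMe", "updateFollowingTarget: " + error.getDescription());
                    }
                }
            });
}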
Edit, in reply to the follow-up question "Do you know another function that I can use to follow a vehicle's GPS signal?":
Yes, though it involves more programming.
You have to use virtual sticks to control the drone. This is the only other way to control the drone programmatically.
I have done it here; the drone follows a tracker app running on a phone mounted on my head:
https://www.youtube.com/watch?v=i3axYfIOHTY
I'm working on a Python API for DJI. In that framework the top-level code looks like the snippet below; the virtual stick calls are inside move_towards() (a rough sketch of the corresponding Java calls follows the snippet):
while True:
    tracker_location = api.tracker.get_location()
    drone_target_location = copy.deepcopy(tracker_location).move_to(
        Changeable.radius.get_value(), Changeable.bearing.get_value())
    drone_location = api.drone.get_location()
    course_to_tracker = drone_location.get_course(tracker_location)
    heading = api.drone.smooth_heading(course_to_tracker)
    drone_target_distance, drone_speed, course = api.drone.move_towards(
        drone_target_location,
        height=current_altitude,
        heading=heading,
        lowest_altitude=Changeable.lowest_altitude.get_value(),
        deadzone=[Changeable.dead_zone0.get_value(), Changeable.dead_zone1.get_value()],
        k_distance=[float(Changeable.k_dist0.get_value()), float(Changeable.k_dist1.get_value())],
        speed_deadzone=Changeable.speed_deadzone.get_value())
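For reference, in the Android Mobile SDK the virtual stick plumbing that a helper like move_towards() would wrap looks roughly like the sketch below. Treat it as an outline under assumptions (velocity roll/pitch mode, ground coordinate system, commands resent at about 10 Hz); the axis mapping of FlightControlData is a common source of confusion, so check the SDK docs for your version:

import android.util.Log;

import dji.common.error.DJIError;
import dji.common.flightcontroller.virtualstick.FlightControlData;
import dji.common.flightcontroller.virtualstick.FlightCoordinateSystem;
import dji.common.flightcontroller.virtualstick.RollPitchControlMode;
import dji.common.flightcontroller.virtualstick.VerticalControlMode;
import dji.common.flightcontroller.virtualstick.YawControlMode;
import dji.common.util.CommonCallbacks;
import dji.sdk.flightcontroller.FlightController;
import dji.sdk.products.Aircraft;
import dji.sdk.sdkmanager.DJISDKManager;

public class VirtualStickHelper {
    private static final String TAG = "VirtualStick";

    // Enable virtual sticks once; after that, control data must keep being
    // sent (roughly 10 Hz) or the flight controller leaves virtual stick mode.
    public void enableVirtualSticks() {
        Aircraft aircraft = (Aircraft) DJISDKManager.getInstance().getProduct();
        FlightController fc = aircraft.getFlightController();

        fc.setRollPitchControlMode(RollPitchControlMode.VELOCITY);     // m/s
        fc.setYawControlMode(YawControlMode.ANGLE);                    // degrees
        fc.setVerticalControlMode(VerticalControlMode.POSITION);       // metres
        fc.setRollPitchCoordinateSystem(FlightCoordinateSystem.GROUND);

        fc.setVirtualStickModeEnabled(true, new CommonCallbacks.CompletionCallback() {
            @Override
            public void onResult(DJIError error) {
                if (error != null) {
                    Log.e(TAG, "Enabling virtual sticks failed: " + error.getDescription());
                }
            }
        });
    }

    // One control tick: velX/velY are horizontal velocities in the chosen
    // coordinate system, heading is in degrees, height in metres.
    public void sendMoveCommand(FlightController fc, float velX, float velY,
                                float heading, float height) {
        fc.sendVirtualStickFlightControlData(
                new FlightControlData(velX, velY, heading, height),
                new CommonCallbacks.CompletionCallback() {
                    @Override
                    public void onResult(DJIError error) {
                        if (error != null) {
                            Log.e(TAG, "Virtual stick command failed: " + error.getDescription());
                        }
                    }
                });
    }
}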
I'm building a React Native application and trying to meter the current sound level (in decibels).
Libraries in use: react-native-audio and react-native-sound.
Is anybody familiar with this feature?
Thank you.
With the stock react-native-audio library, you can get the currentMetering value on iOS only.
To get currentMetering on Android you need to customize the native module. I have updated it; add the following to your package.json instead of the stock dependency, and you will get the currentMetering value on Android as well as iOS:
"react-native-audio": "git+https://github.com/Harsh2402/react-native-audio.git",
You can use the react-native-audio currentMetering value to get the sound level in real time.
First, you will have to initialise your recorder (which I will assume you've done). I use prepareRecordingAtPath in a similar way to the code below:
AudioRecorder.prepareRecordingAtPath(audioPath, {
SampleRate: 22050,
Channels: 1,
AudioQuality: "Low",
AudioEncoding: "aac",
MeteringEnabled: true
});
Then call AudioRecorder.startRecording(); (note that you also have access to .pause() and .stop() methods).
To handle the audio level, read the data returned by the onProgress callback. From what I remember, it contains a currentMetering value that you can access. Note that this callback fires every time a new metering reading is retrieved, like so:
AudioRecorder.onProgress = data => {
let decibels = Math.floor(data.currentMetering);
//DO STUFF
};
Hope this helps,
I'm using the HERE Mobile Android SDK and trying to simulate a GPX track in order to test my map behaviour. I use the PositionSimulator class and set a callback for position updates, but when I call startPlayback(filename) it parses my GPX file with no errors on Android, yet it does not simulate fake GPS coordinates.
Can someone provide a workable GPX file for PositionSimulator or a working code sample?
My code:
posManager = PositioningManager.getInstance();
posManager.start(PositioningManager.LocationMethod.GPS_NETWORK);
posManager.addListener(
new WeakReference<PositioningManager.OnPositionChangedListener>(positionListener));
mapFragment.getPositionIndicator().setVisible(true);
simulator = new PositionSimulator();
PositionSimulator.PlaybackError err = simulator.startPlayback(trackFileName);
The simulator instance receives the correct number of points, but my location marker and camera don't move at all. I have tried different GPX files with routes, waypoints and tracks. Maybe I need to set up timestamps inside the GPX somehow? Permissions for mock locations have been added.
Is there a better way to test camera movements when simulating driving along a route? Any help will be appreciated.
Below is an example of a GPX file that can be played by PositionSimulator:
<gpx>
  <metadata>
    <name>london</name>
    <time>2017-01-19T17:41:11Z</time>
  </metadata>
  <trk>
    <name>test</name>
    <trkseg>
      <trkpt lat="51.47785480" lon="-0.14754295">
        <ele>8.0000000</ele>
        <time>2010-01-01T00:00:00Z</time>
        <hdop>33</hdop>
      </trkpt>
      <trkpt lat="51.47788554" lon="-0.14778173">
        <ele>8.0000000</ele>
        <time>2010-01-01T00:00:01Z</time>
        <hdop>42</hdop>
      </trkpt>
      <trkpt lat="51.47787976" lon="-0.14807005">
        <ele>6.0000000</ele>
        <time>2010-01-01T00:00:02Z</time>
        <hdop>20</hdop>
      </trkpt>
      ....
    </trkseg>
  </trk>
</gpx>
The only thing I did wrong was not providing the hdop field (the accuracy value). Even if you turn off the accuracy display for the PositionIndicator, the track apparently will not be played without that hdop value.
AndrewJC, thanks for the help.
Have you set your app in the "Settings > Developer Options > Select mock location app" dialog? More info
Also, you can try troubleshooting by using the PositionSimulator getPositionCount, getCurrentPositionIndex and getPosition APIs to see if it seems to be handling your GPX log properly.
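For instance, something like the sketch below can confirm what the simulator loaded (note: the exact return types of these PositionSimulator getters are an assumption on my side, so adjust to what the SDK javadoc actually declares):

// Debugging sketch: dump what the simulator thinks it has loaded.
// Assumed signatures: getPositionCount()/getCurrentPositionIndex() return int,
// getPosition(int) returns a GeoPosition.
int count = simulator.getPositionCount();
int current = simulator.getCurrentPositionIndex();
Log.d("PositionSim", "playback points: " + count + ", current index: " + current);
if (count > 0) {
    GeoPosition first = simulator.getPosition(0);
    Log.d("PositionSim", "first point: " + first.getCoordinate());
}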
You can also try creating a GPX log with the HERE SDK itself by using the PositioningManager setLogType API with LogType#DATA_SOURCE.
I'm sending notifications to a wearable from a handheld and then displaying background images on cards without text. I'd like to optimize the images for round and square wearables without creating a standalone app and wearable activities.
How would I send a message to the wearable asking what size it is and whether it's round or square? I know how to send messages; I'm just looking for the API to check whether the screen is round or square.
From the following sample, you can see how you would detect this in an activity, but I'd like to detect this in a background service since I don't want to create a standalone wearable app.
https://github.com/mauimauer/AndroidWearable-Samples/blob/8287982332b82cada7bf68a6c5aa88df1bbbcbbe/GridViewPager/Wearable/src/main/java/com/example/android/wearable/gridviewpager/MainActivity.java
My other question shows how to detect if there is a wearable paired, but it only returns node name and node id, no other useful information about the actual wearable.
How to detect if android device is paired with android wear watch
The official way to determine round vs. square is to use the WindowInsets class and its isRound() method: https://developer.android.com/reference/android/view/WindowInsets.html#isRound()
There is a sample named GridViewPager included in the Android SDK Manager for API 20 that shows how to use the isRound() method.
For the rest of your question: you will need to implement an app that runs on the watch and performs this query for you. You can then send a message to the watch, have it perform the query, and have it send a message back to the phone for whatever else you want to do.
If you look at the DataLayer sample (also in the same place as GridViewPager) it shows how to detect the connection status of the wearable to the phone.
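If it helps, the phone-side half of that round trip can look roughly like the sketch below (the message paths and the one-byte payload are made up for the example; the watch app would answer on the reply path with its own isRound() result):

import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.MessageEvent;
import com.google.android.gms.wearable.Node;
import com.google.android.gms.wearable.NodeApi;
import com.google.android.gms.wearable.Wearable;

// Phone side: ask every connected watch for its screen shape.
// "/screen_shape_request" and "/screen_shape_reply" are hypothetical paths.
private void requestScreenShape(final GoogleApiClient client) {
    new Thread(new Runnable() {
        @Override
        public void run() {
            NodeApi.GetConnectedNodesResult nodes =
                    Wearable.NodeApi.getConnectedNodes(client).await();
            for (Node node : nodes.getNodes()) {
                Wearable.MessageApi.sendMessage(
                        client, node.getId(), "/screen_shape_request", null).await();
            }
        }
    }).start();
}

// Phone side, e.g. in a WearableListenerService: receive the watch's answer.
@Override
public void onMessageReceived(MessageEvent event) {
    if ("/screen_shape_reply".equals(event.getPath())) {
        byte[] data = event.getData();
        boolean isRound = data != null && data.length > 0 && data[0] == 1;
        // Choose the round or square card background here.
    }
}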
An unofficial way, but for me it was much easier:
https://github.com/tajchert/ShapeWear
Just copy the ShapeWear.java class and subscribe to the screen-shape detection event with setOnShapeChangeListener(), or call ShapeWear.isRound() (which can throw an error if the shape is not yet determined) or ShapeWear.getShape(), which can return ShapeWear.SHAPE_UNSURE in the same situation.
Override this method in your Engine class that extends CanvasWatchFaceService.Engine:
@Override
public void onApplyWindowInsets(WindowInsets insets) {
super.onApplyWindowInsets(insets);
Log.d(TAG, "onApplyWindowInsets");
if (insets.isRound()) {
Log.d(TAG, "Round");
} else {
Log.d(TAG, "Square");
}
}
I am using Android's MediaRouter / Presentation API (the support.v7 version).
Everything works fine so far. The only thing that doesn't is:
When I quit my activity (i.e. tear down and remove the callbacks), everything still works fine.
However, when I start this activity again at some later point (the previous MediaRouter activity was forcefully finished, so onPause/onDestroy was called for sure, meaning those callbacks are gone too, as my debug messages also show), the callbacks get created and added as before, yet onRouteAdded is never called anymore; only onProviderChanged is, and only with the default provider, which is useless.
It always behaves like this (with a Wi-Fi display [Miracast], an emulated secondary display, and a Chromecast secondary display). Are there any solutions that are not in the examples?
Would you like to see some code? Which parts specifically? (I can't post it all.)
I couldn't find anything so far; thanks in advance for your help.
If you change the Google Cast sample app to use a MediaRouter.Callback:
https://github.com/googlecast/CastPresentation-android
then onRouteAdded is called every time for me.
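For reference, the callback registration would look roughly like the sketch below (support.v7 MediaRouter; the control categories are only an example for the Cast/presentation case, so adjust the selector to the routes you care about):

import android.content.Context;
import android.support.v7.media.MediaControlIntent;
import android.support.v7.media.MediaRouteSelector;
import android.support.v7.media.MediaRouter;
import android.util.Log;

// e.g. called from onCreate(); remove the callback again in onPause()/onDestroy().
private void registerRouteCallback(Context context) {
    MediaRouter mediaRouter = MediaRouter.getInstance(context);

    MediaRouteSelector selector = new MediaRouteSelector.Builder()
            .addControlCategory(MediaControlIntent.CATEGORY_LIVE_VIDEO)
            .addControlCategory(MediaControlIntent.CATEGORY_REMOTE_PLAYBACK)
            .build();

    MediaRouter.Callback callback = new MediaRouter.Callback() {
        @Override
        public void onRouteAdded(MediaRouter router, MediaRouter.RouteInfo route) {
            Log.d("MediaRouterDemo", "onRouteAdded: " + route.getName());
        }

        @Override
        public void onRouteRemoved(MediaRouter router, MediaRouter.RouteInfo route) {
            Log.d("MediaRouterDemo", "onRouteRemoved: " + route.getName());
        }
    };

    // CALLBACK_FLAG_REQUEST_DISCOVERY keeps route discovery active while
    // this callback is registered.
    mediaRouter.addCallback(selector, callback, MediaRouter.CALLBACK_FLAG_REQUEST_DISCOVERY);
}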
Using getSelectedRoute() instead of the RouteInfo provided by the callbacks did the job for me:
MediaRouter.RouteInfo selectedRoute = getHelper().getMediaRouter().getSelectedRoute();
if (provider != null && getCurrentRoute() != null && getCurrentRoute().equals(selectedRoute)) {
    Log.d(TAG, "only provider changes, dont do anything");
    return false;
}
if (selectedRoute != null) {
    setCurrentRoute(selectedRoute);
}
return updateContents();
This is definitely weird (the rest of the code looks exactly like the provided Google Android developer samples), but it works.
I know this problem was resolved over a year ago, but that probably wasn't the perfect solution, so maybe this will be useful for somebody else.
I had a similar problem with exactly the same symptoms (onRouteAdded no longer being called). In my case it was caused by improperly implemented deactivation of the MediaRouter: to deactivate it properly, you should not only remove all of the callbacks but also select the default MediaRoute:
if (!mMediaRouter.getDefaultRoute().isSelected()) {
    mMediaRouter.getDefaultRoute().select();
}
Since Android 4.2, Google has shipped an AppOps feature in the Settings app, but after a few weeks they removed its entry point. Only the entry point was removed; the implementation can still be found in frameworks/base/services/java/com/android/server/AppOpsService.java.
In that source code I found two functions for checking whether an operation is permitted: startOperation() and noteOperation(). From reading the comments I understand that startOperation() is for a long-term permission check and that finishOperation() must be called after the operation is done, while noteOperation() is for short-term checks. Even knowing this, I still can't tell when I should use startOperation() and when I should use noteOperation().
Has anyone read this source code? Please give me some suggestions.
Thank you in advance.
Well, a few days later, I finally found the answer.
The difference between startOp and noteOp is:
startOp is meant for operations that run for a long time, such as GPS or the vibrator. You start monitoring a permission with startOp and end the monitoring with finishOp; remember, finishOp must be called once the operation ends.
noteOp is for short-term operations, for example checking whether the app has permission to send or receive SMS.
That is the difference between the two methods.
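The same pair is exposed through the public AppOpsManager wrapper as noteOp / startOp / finishOp, which may be easier to experiment with. A rough sketch, using location ops purely as an example:

import android.app.AppOpsManager;
import android.content.Context;
import android.os.Process;

void demoAppOps(Context context) {
    AppOpsManager appOps = (AppOpsManager) context.getSystemService(Context.APP_OPS_SERVICE);

    // Short-term check: noteOp is a one-shot "this op is happening right now".
    int mode = appOps.noteOp(AppOpsManager.OPSTR_COARSE_LOCATION,
            Process.myUid(), context.getPackageName());
    if (mode == AppOpsManager.MODE_ALLOWED) {
        // perform the short operation, e.g. read a single cached location
    }

    // Long-running operation: bracket it with startOp / finishOp so the
    // system can track that the op is still in progress (e.g. GPS tracking).
    int gpsMode = appOps.startOp(AppOpsManager.OPSTR_FINE_LOCATION,
            Process.myUid(), context.getPackageName());
    if (gpsMode == AppOpsManager.MODE_ALLOWED) {
        try {
            // keep requesting location updates here
        } finally {
            appOps.finishOp(AppOpsManager.OPSTR_FINE_LOCATION,
                    Process.myUid(), context.getPackageName());
        }
    }
}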