Unable to override the HERE SDK to disable the sound effect on the onSpeedExceeded event.
Using the HERE developer tutorial (https://developer.here.com/blog/android-premium-sdk-speed-limit-warning-example), I succeeded in running the sample app. But...
While driving, when I exceed the speed limit, an audible "doot doot doot" plays. I want to override this behaviour, as I intend to use my own sounds.
I guessed that I might override the behaviour by creating a NavigationManager.SpeedWarningListener. Unfortunately, I cannot disable or suppress the onSpeedExceeded sound effects.
NavigationManager.SpeedWarningListener speedWarningListener = new NavigationManager.SpeedWarningListener() {
    @Override
    public void onSpeedExceeded(String roadName, float speedLimit) {
        //super.onSpeedExceeded(roadName, speedLimit);
        //Log.v(Global.TAG, "onSpeedExceeded");
        Global.SpeedLimitExceeded = true;
    }

    @Override
    public void onSpeedExceededEnd(String roadName, float speedLimit) {
        //super.onSpeedExceededEnd(roadName, speedLimit);
        //Log.v(Global.TAG, "onSpeedExceededEnd");
        Global.SpeedLimitExceeded = false;
    }
};
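To play a custom sound from the listener instead, something like this works (a minimal sketch; ToneGenerator is a standard Android API, and the tone type, volume, and duration below are arbitrary examples):

import android.media.AudioManager;
import android.media.ToneGenerator;

// Call this from onSpeedExceeded(...) to play your own alert
// in place of HERE's built-in one.
private void playCustomSpeedAlert() {
    ToneGenerator toneGenerator = new ToneGenerator(AudioManager.STREAM_MUSIC, 80);
    toneGenerator.startTone(ToneGenerator.TONE_PROP_BEEP, 200); // 200 ms beep
}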
EDITED ANSWER: this method needs to be amended to stop the speed warning:

private void startNavigationManager() {
    NavigationManager.Error navError = NavigationManager.getInstance().startTracking();
    // added by suggestion from Stack Overflow: disables the built-in speed warning sound
    NavigationManager.getInstance().stopSpeedWarning();
    if (navError != NavigationManager.Error.NONE) {
        Log.d(Global.TAG, "NavigationManager: false");
        // handle the error, e.g. log navError.toString()
    } else {
        //Log.d(Global.TAG, "NavigationManager: true");
    }
}
For iOS, please set speedWarningEnabled accordingly on NMANavigationManager.
navigationManager:didUpdateSpeedingStatus:forCurrentSpeed:speedLimit: will be sent to the delegate when speeding is detected or when a correction is made.
Also refer to http://developer.here.com/documentation/ios-premium/api_reference_jazzy/Classes/NMANavigationManager.html
Related
I'm developing an Android app in Android Studio using the OpenCV library. When I try to open my app, it opens and then immediately closes, displaying a crash message. I'm new to mobile development.
Using: OpenCV310, Android Studio 3.0.
public class ScanLicensePlateActivity extends AppCompatActivity {
protected AnylineOcrScanView scanView;
private LicensePlateResultView licensePlateResultView;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
//Set the flag to keep the screen on (otherwise the screen may go dark during scanning)
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
setContentView(R.layout.activity_anyline_ocr);
String license = getString(R.string.anyline_license_key);
// Get the view from the layout
scanView = (AnylineOcrScanView) findViewById(R.id.scan_view);
// Configure the view (cutout, the camera resolution, etc.) via json
// (can also be done in xml in the layout)
scanView.setConfig(new AnylineViewConfig(this, "license_plate_view_config.json"));
// Copies given traineddata-file to a place where the core can access it.
// This MUST be called for every traineddata file that is used
// (before startScanning() is called).
// The file must be located directly in the assets directory
// (or in tessdata/ but no other folders are allowed)
scanView.copyTrainedData("tessdata/GL-Nummernschild-Mtl7_uml.traineddata",
"8ea050e8f22ba7471df7e18c310430d8");
scanView.copyTrainedData("tessdata/Arial.traineddata", "9a5555eb6ac51c83cbb76d238028c485");
scanView.copyTrainedData("tessdata/Alte.traineddata", "f52e3822cdd5423758ba19ed75b0cc32");
scanView.copyTrainedData("tessdata/deu.traineddata", "2d5190b9b62e28fa6d17b728ca195776");
// Configure the OCR for license plate scanning via a custom script file
// This is how you could add custom scripts optimized by Anyline for your use-case
AnylineOcrConfig anylineOcrConfig = new AnylineOcrConfig();
anylineOcrConfig.setCustomCmdFile("license_plates.ale");
// set the ocr config
scanView.setAnylineOcrConfig(anylineOcrConfig);
// initialize with the license and a listener
scanView.initAnyline(license, new AnylineOcrListener() {
@Override
public void onReport(String identifier, Object value) {
// Called with interesting values, that arise during processing.
// Some possibly reported values:
//
// $brightness - the brightness of the center region of the cutout as a float value
// $confidence - the confidence, an Integer value between 0 and 100
// $thresholdedImage - the current image transformed into black and white
// $sharpness - the detected sharpness value (only reported if minSharpness > 0)
}
@Override
public boolean onTextOutlineDetected(List<PointF> list) {
// Called when the outline of a possible text is detected.
// If false is returned, the outline is drawn automatically.
return false;
}
@Override
public void onResult(AnylineOcrResult result) {
// Called when a valid result is found
String[] results = result.getText().split("-");
// Guard: a result without "-" would make results[1] throw an ArrayIndexOutOfBoundsException
String licensePlate = results.length > 1 ? results[1] : results[0];
licensePlateResultView.setLicensePlate(licensePlate);
licensePlateResultView.setVisibility(View.VISIBLE);
}
@Override
public void onAbortRun(AnylineOcrError code, String message) {
// Is called when no result was found for the current image.
// E.g. if no text was found or the result is not valid.
}
});
// disable the reporting if set to off in preferences
if (!PreferenceManager.getDefaultSharedPreferences(this).getBoolean(
SettingsFragment.KEY_PREF_REPORTING_ON, true)) {
// The reporting of results - including the photo of a scanned meter -
// helps us in improving our product, and the customer experience.
// However, if you wish to turn off this reporting feature, you can do it like this:
scanView.setReportingEnabled(false);
}
addLicensePlateResultView();
}
private void addLicensePlateResultView() {
RelativeLayout mainLayout = (RelativeLayout) findViewById(R.id.main_layout);
RelativeLayout.LayoutParams params = new RelativeLayout.LayoutParams(
ViewGroup.LayoutParams.WRAP_CONTENT, ViewGroup.LayoutParams.WRAP_CONTENT);
params.addRule(RelativeLayout.CENTER_HORIZONTAL, RelativeLayout.TRUE);
params.addRule(RelativeLayout.CENTER_VERTICAL, RelativeLayout.TRUE);
licensePlateResultView = new LicensePlateResultView(this);
licensePlateResultView.setVisibility(View.INVISIBLE);
mainLayout.addView(licensePlateResultView, params);
licensePlateResultView.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
startScanning();
}
});
}
private void startScanning() {
licensePlateResultView.setVisibility(View.INVISIBLE);
// this must be called in onResume, or after a result to start the scanning again
scanView.startScanning();
}
@Override
protected void onResume() {
super.onResume();
startScanning();
}
@Override
protected void onPause() {
super.onPause();
scanView.cancelScanning();
scanView.releaseCameraInBackground();
}
@Override
public void onBackPressed() {
if (licensePlateResultView.getVisibility() == View.VISIBLE) {
startScanning();
} else {
super.onBackPressed();
}
}
@Override
protected void onDestroy() {
super.onDestroy();
    }
}
The source code is here.
Please help if possible.
The logcat error is shown here.
Ideally, more information about the error would help, e.g. the OpenCV library version. Given that it seems to be an Android issue, I would advise:
File an issue, or view issues pertaining to this error, on the OpenCV GitHub page. Search for related Android errors to see if they match.
If you cannot find a related error, file an issue there.
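Also worth ruling out before filing an issue: an immediate crash on launch is often caused by OpenCV's native library not being loaded before first use. A minimal sketch of the standard initialization check (an assumption that the OpenCV Android SDK module is a project dependency; the version constant shown matches OpenCV 3.1.0):

import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;

private final BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
    @Override
    public void onManagerConnected(int status) {
        if (status == LoaderCallbackInterface.SUCCESS) {
            // Safe to call OpenCV code from here on
        } else {
            super.onManagerConnected(status);
        }
    }
};

@Override
protected void onResume() {
    super.onResume();
    if (OpenCVLoader.initDebug()) {
        // Native libs bundled with the app loaded successfully
        mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
    } else {
        // Fall back to the OpenCV Manager service
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_1_0, this, mLoaderCallback);
    }
}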
I am using the Vuforia 6.2 AR SDK in Unity, but when I test the application on an Android phone the camera looks blurry. I searched Vuforia's developer website and found some camera focus modes, but I can't implement them because that guideline was written for an older Vuforia SDK, and I can't find the script they mention on their website. Here is their code sample, but it's not working. I created a different script and ran this line in the Start() function, but it still doesn't work.
CameraDevice.Instance.SetFocusMode(
CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
Try this:
void Start ()
{
VuforiaARController.Instance.RegisterVuforiaStartedCallback(OnVuforiaStarted);
VuforiaARController.Instance.RegisterOnPauseCallback(OnPaused);
}
private void OnVuforiaStarted()
{
CameraDevice.Instance.SetFocusMode(
CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
}
private void OnPaused(bool paused)
{
if (!paused) // resumed
{
// Set again autofocus mode when app is resumed
CameraDevice.Instance.SetFocusMode(
CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
}
}
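Calling SetFocusMode directly in Start() usually has no effect because Vuforia has not finished initializing the camera at that point; registering the callbacks above ensures the focus mode is applied only after the camera has actually started, and again on resume, since the mode can be reset while the app is paused.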
This is the right code:
bool cameramode = false;
public void OnCameraChangeMode()
{
    if (!cameramode) {
        RestartCamera(Vuforia.CameraDevice.CameraDirection.CAMERA_FRONT);
        camBtnTxt.text = "Back Camera";
    } else {
        RestartCamera(Vuforia.CameraDevice.CameraDirection.CAMERA_BACK);
        camBtnTxt.text = "Front Camera";
    }
    cameramode = !cameramode; // remember which camera is currently active
}
private void RestartCamera(Vuforia.CameraDevice.CameraDirection newDir)
{
Vuforia.CameraDevice.Instance.Stop();
Vuforia.CameraDevice.Instance.Deinit();
Vuforia.CameraDevice.Instance.Init(newDir);
Vuforia.CameraDevice.Instance.Start();
}
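Stopping and deinitializing the camera before Init(newDir) matters because Vuforia applies the camera direction at initialization time; changing the direction of a running camera has no effect.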
I can check whether the burn-in protection property is enabled, but is there a way to tell when burn-in mode is currently active, i.e. specifically when the screen shifts?
Basically something like onAmbientModeChanged, but for burn-in.
Thanks!
In an activity, extend WearableActivity and override onEnterAmbient; from its Bundle parameter you can retrieve the property you want.
(Check WearableActivity.)
@Override
public void onEnterAmbient(Bundle ambientDetails) {
super.onEnterAmbient(ambientDetails);
boolean burnIn = ambientDetails.getBoolean(EXTRA_BURN_IN_PROTECTION);
boolean lowBit = ambientDetails.getBoolean(EXTRA_LOWBIT_AMBIENT);
}
In a CanvasWatchFaceService.Engine, override onPropertiesChanged:
@Override
public void onPropertiesChanged(Bundle properties) {
super.onPropertiesChanged(properties);
boolean lowBit = properties.getBoolean(PROPERTY_LOW_BIT_AMBIENT, false);
boolean burnIn = properties.getBoolean(PROPERTY_BURN_IN_PROTECTION, false);
}
Override onAmbientModeChanged(boolean inAmbientMode); it is called whenever the watch face switches from interactive to ambient mode and vice versa:
@Override
public void onAmbientModeChanged(boolean inAmbientMode) {
super.onAmbientModeChanged(inAmbientMode);
if (mState.isAmbient() != inAmbientMode) {
mState.setAmbient(inAmbientMode);
//make your updates on your drawing parameters if needed
invalidate();
}
}
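Note there is no dedicated callback for the exact moment the screen shifts: a watch face normally performs the shift itself in onDraw when burn-in protection is required. A minimal sketch of that pattern, assuming mBurnInProtection was captured in onPropertiesChanged above and drawWatchFace is a hypothetical helper for your normal rendering:

@Override
public void onDraw(Canvas canvas, Rect bounds) {
    int offsetX = 0;
    int offsetY = 0;
    if (mState.isAmbient() && mBurnInProtection) {
        // Alternate a small pixel offset each minute so no pixel
        // stays lit in the same place during ambient mode.
        int minute = Calendar.getInstance().get(Calendar.MINUTE);
        offsetX = (minute % 2 == 0) ? 0 : 2;
        offsetY = (minute % 4 < 2) ? 0 : 2;
    }
    canvas.save();
    canvas.translate(offsetX, offsetY);
    drawWatchFace(canvas, bounds); // hypothetical: your usual drawing code
    canvas.restore();
}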
I'm developing an application that displays books. There is a screen (Activity) which shows a book. It has a custom view, similar to a ViewSwitcher, and every page is a bitmap rendered by a custom View.
Now I need to implement an accessibility feature: the book should be read aloud by the phone.
I've read the Accessibility section here, https://developer.android.com/guide/topics/ui/accessibility/index.html, but it is not clear enough.
I use the Support Library for accessibility management, and now I have this code in the ViewGroup (which manages the book pages). Code 1:
private class EditionPagesViewSwitcherAccessibilityDelegate extends AccessibilityDelegateCompat {
private int mPageCount;
private double[] mPageRange;
@Override
public void onInitializeAccessibilityEvent(final View host, final AccessibilityEvent event) {
super.onInitializeAccessibilityEvent(host, event);
event.setClassName(EditionPagesViewSwitcher.class.getName());
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.ICE_CREAM_SANDWICH) {
event.setScrollable(canScroll());
}
if (event.getEventType() == AccessibilityEventCompat.TYPE_VIEW_SCROLLED && updatePageValues()) {
event.setItemCount(mPageCount);
// we use +1 because of user friendly numbers (from 1 not 0)
event.setFromIndex((int) (mPageRange[0] + 1));
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.ICE_CREAM_SANDWICH) {
event.setToIndex((int) (mPageRange[1] + 1));
}
}
}
@Override
public void onInitializeAccessibilityNodeInfo(final View host, final AccessibilityNodeInfoCompat info) {
super.onInitializeAccessibilityNodeInfo(host, info);
info.setClassName(EditionPagesViewSwitcher.class.getName());
info.setScrollable(canScroll());
info.setLongClickable(true);
if (canScrollForward()) {
info.addAction(AccessibilityNodeInfoCompat.ACTION_SCROLL_FORWARD);
}
if (canScrollBackward()) {
info.addAction(AccessibilityNodeInfoCompat.ACTION_SCROLL_BACKWARD);
}
}
@Override
public boolean performAccessibilityAction(final View host, final int action, final Bundle args) {
if (super.performAccessibilityAction(host, action, args)) {
return true;
}
switch (action) {
case AccessibilityNodeInfoCompat.ACTION_SCROLL_FORWARD: {
if (canScrollForward()) {
showNext();
return true;
}
}
return false;
case AccessibilityNodeInfoCompat.ACTION_SCROLL_BACKWARD: {
if (canScrollBackward()) {
showPrevious();
return true;
}
}
return false;
}
return false;
    }
}
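For reference, a delegate like this is typically attached with the Support Library's ViewCompat (a minimal sketch; the class names come from the code above):

// e.g. in the EditionPagesViewSwitcher constructor:
ViewCompat.setAccessibilityDelegate(this, new EditionPagesViewSwitcherAccessibilityDelegate());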
Here is the code from the page view. Code 2:
@Override
public void onInitializeAccessibilityEvent(final View host, final AccessibilityEvent event) {
super.onInitializeAccessibilityEvent(host, event);
event.setClassName(EditionPageView.class.getName());
if (hasText()) {
event.getText().add(getPageRangeText());
final String trimText = mSurfaceUpdateData.getPageText().trim();
if (trimText.length() > MAX_TEXT_LENGTH) {
event.getText().add(trimText.substring(0, MAX_TEXT_LENGTH));
// event.getText().add(trimText.substring(MAX_TEXT_LENGTH, trimText.length()));
}
else {
event.getText().add(trimText);
}
}
}
@Override
public void onInitializeAccessibilityNodeInfo(final View host, final AccessibilityNodeInfoCompat info) {
super.onInitializeAccessibilityNodeInfo(host, info);
info.setClassName(EditionPageView.class.getName());
}
Because the page text data is loaded asynchronously, accessibility has no text the first time the onInitializeAccessibilityEvent code executes. Then, when the data has been loaded, I fire AccessibilityEvent.TYPE_VIEW_SELECTED and AccessibilityEvent.TYPE_VIEW_TEXT_CHANGED events. onInitializeAccessibilityEvent then executes again and the phone "reads" the book text.
So my questions:
Is my accessibility implementation right, or is the design wrong? I couldn't find any good tutorial about this feature.
Why do I need the SDK version checks in the Support Library implementation in Code 1? Why doesn't the support implementation handle this correctly?
Is firing TYPE_VIEW_SELECTED and TYPE_VIEW_TEXT_CHANGED really needed, or should some other code be implemented?
The main question: in Code 2 there is a commented-out code line. The active statement truncates the text to less than MAX_TEXT_LENGTH (3800) characters, because if the text is longer, nothing is played at all. Is this an accessibility restriction? Any text shorter than this value is played well.
Does anyone know where I can find a good tutorial? (Yes, I have seen the samples.)
Does anyone have custom implementations to look through?
UPDATED
Well, here are some answers:
As far as I can see, the TYPE_VIEW_SELECTED and TYPE_VIEW_TEXT_CHANGED events are not needed if you don't want the text to be read as soon as you get it.
On a Nexus 7, all large text is played well (up to 8000 symbols), so the issue doesn't reproduce there; but it does on a Samsung Galaxy Tab 10.1 (Android 4.0.4) and on a Genymotion emulator of the Tab 10.1 with Android 4.3. And this is strange...
4. According to the documentation of String.substring():
The first argument you pass is the start index in the original string; the second argument is the (exclusive) end index in the original string.
Example:
String text = "Hello";
String partOfText = text.substring(2, text.length());
partOfText equals "llo" (the first char is index 0).
So by passing your constant MAX_TEXT_LENGTH as the first argument, the substring would start at index 3800.
http://developer.android.com/reference/java/lang/String.html#substring(int)
You are right, MAX_TEXT_LENGTH is 3800.
About your doubt, in this code:
event.getText().add(trimText.substring(MAX_TEXT_LENGTH, trimText.length()));
you are trying to substring trimText from MAX_TEXT_LENGTH to trimText.length().
Supposing trimText = "STACK", trimText.length() = 5; then what would trimText.substring(3800, 5) be?
That makes no sense. Correct usage would look like this:
trimText.substring(0, 2) = "ST";
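For completeness, here is a sketch of how the commented-out line could be replaced so the remainder is appended safely in bounded chunks (assuming the goal is to expose all of the text without any single piece exceeding MAX_TEXT_LENGTH):

// Append the text in pieces no longer than MAX_TEXT_LENGTH each
for (int start = 0; start < trimText.length(); start += MAX_TEXT_LENGTH) {
    int end = Math.min(start + MAX_TEXT_LENGTH, trimText.length());
    event.getText().add(trimText.substring(start, end));
}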
How can you read out data, i.e. convert simple text strings to voice (speech), in Android?
Is there an API where I can do something like this:
TextToVoice speaker = new TextToVoice();
speaker.Speak("Hello World");
Using the TTS is a little more complicated than you might expect, but it's easy to write a wrapper that gives you the API you desire.
There are a number of issues you must overcome to get it working nicely. They are:
Always set the utterance ID (or else onUtteranceCompleted will not be called).
Set the onUtteranceCompleted listener only after the speech system is properly initialized.
public class TextSpeakerDemo implements OnInitListener
{
private TextToSpeech tts;
private Activity activity;
private static final HashMap<String, String> DUMMY_PARAMS = new HashMap<String, String>();
static
{
DUMMY_PARAMS.put(TextToSpeech.Engine.KEY_PARAM_UTTERANCE_ID, "theUtId");
}
private ReentrantLock waitForInitLock = new ReentrantLock();
public TextSpeakerDemo(Activity parentActivity)
{
activity = parentActivity;
tts = new TextToSpeech(activity, this);
//don't do speak until initing
waitForInitLock.lock();
}
public void onInit(int status)
{ //unlock it so that speech will happen
waitForInitLock.unlock();
}
public void say(WhatToSay say)
{
say(say.toString());
}
public void say(String say)
{
tts.speak(say, TextToSpeech.QUEUE_FLUSH, null);
}
public void say(String say, OnUtteranceCompletedListener whenTextDone)
{
if (waitForInitLock.isLocked())
{
try
{
waitForInitLock.tryLock(180, TimeUnit.SECONDS);
}
catch (InterruptedException e)
{
Log.e("speaker", "interruped");
}
//unlock it here so that it is never locked again
waitForInitLock.unlock();
}
int result = tts.setOnUtteranceCompletedListener(whenTextDone);
if (result == TextToSpeech.ERROR)
{
Log.e("speaker", "failed to add utterance listener");
}
//note: here pass in the dummy params so onUtteranceCompleted gets called
tts.speak(say, TextToSpeech.QUEUE_FLUSH, DUMMY_PARAMS);
}
/**
* make sure to call this at the end
*/
public void done()
{
tts.shutdown();
}
}
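A hypothetical usage sketch, using the names defined above:

// e.g. in an Activity:
TextSpeakerDemo speaker = new TextSpeakerDemo(this);
speaker.say("Hello World");
// ... later, when finished (e.g. in onDestroy()):
speaker.done();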
Here you go: a tutorial on using the library. The big downside is that it requires an SD card to store the voices.
A good working example of TTS usage can be found in the book Pro Android 2. Have a look at the source code for chapter 15.
There are third-party text-to-speech engines. Rumor has it that Donut contains a text-to-speech engine, suggesting it will be available in future versions of Android. Beyond that, though, there is nothing built into Android for text-to-speech.
Donut has this: see the android.speech.tts package.