From my initial research and experience with wireless chargers for my Nexus 7, it appears that wireless charging can interfere with NFC functionality. Note that I am not talking about doing both simultaneously; rather, I am asking whether wireless charging has detrimental effects on the NFC antenna over time. I gather that both inductive charging and NFC use the same coil on the Nexus 7 (?)
I have personally seen NFC scans/tags fail to register on wirelessly charged Nexus 7s, but I don't know for sure that the two are related. My basic understanding is that the NFC antenna sits right next to the wireless charging area on the back of the device.
Moreover, wireless charging is too slow; often it seems to be merely slowing the battery drain rather than actively charging the device.
My questions to the community are:
1- What's the best wireless charger out there? Specifically, one that does not interfere with NFC functionality at either a hardware or software level?
2- If NFC functionality has been hurt, what's the best way to troubleshoot it? I am experiencing inconsistent scans -- about 1 out of 5 scans is not registered and the device needs to be power cycled.
We've had some experience with this behavior, specifically with our NFC Patch Kit products rather than with wireless charging. The issue is with the Automatic Gain Control (AGC) system built into NFC controllers. Its main purpose is to adjust the power output on the antenna coil to optimize the field while minimizing battery drain. Here's a snapshot from the BCM20793S spec sheet (the same as the BCM20793M in the Nexus 7, only without the Secure Element):
A “false alarm rate” can occur when the [low power target detect] can be triggered by
metallic objects, but any trigger will be qualified by a full poll
event and therefore can be discounted if it is not a real target. The
control algorithm includes a background calibration, so it
auto-adjusts to a background baseline to account for drift and
changing conditions.
So what's happening is that the Nexus 7 antenna gets desensitized when the wireless charging pad enters the field. It seems the Nexus 7 firmware stack is not smart enough to detect the charger's presence and deactivate the NFC controller altogether to prevent the problem, or at least trigger a recalibration of the LPTD mode. We do this with our FloBLE product.
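The desensitization can be pictured with a toy model of such a background-calibration loop. Everything here is illustrative -- the class name, threshold, and smoothing factor are assumptions, not values from the Broadcom spec -- but it shows the mechanism: the baseline slowly absorbs whatever field it sees, so a charging pad left in the field raises the baseline and makes real tags harder to trigger on.

```python
# Hypothetical sketch of an LPTD-style background-calibration loop.
# Names and numeric values are illustrative, not from the BCM20793 spec.

class LowPowerTargetDetect:
    def __init__(self, threshold=5.0, alpha=0.05):
        self.baseline = None        # calibrated background field level
        self.threshold = threshold  # delta above baseline that counts as a trigger
        self.alpha = alpha          # how fast the baseline tracks slow drift

    def sample(self, field_level):
        """Return True if this reading should trigger a full poll event."""
        if self.baseline is None:
            self.baseline = field_level
            return False
        triggered = field_level > self.baseline + self.threshold
        if not triggered:
            # Background calibration: slowly absorb drift into the baseline.
            self.baseline += self.alpha * (field_level - self.baseline)
        return triggered

    def recalibrate(self):
        """What a charger-aware stack could do: throw away the baseline."""
        self.baseline = None

lptd = LowPowerTargetDetect()
lptd.sample(10.0)                  # first reading sets the baseline
assert lptd.sample(10.2) is False  # small drift is absorbed silently
assert lptd.sample(18.0) is True   # a big jump qualifies as a trigger
```

A `recalibrate()` call like the one sketched above is essentially what a firmware stack could do when it detects a charging pad, which is the behavior this answer says is missing from the Nexus 7.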
So to answer your questions explicitly:
what's the best wireless charger out there? Specifically, one that does not interfere with NFC functionality both at a hardware and software level?
To achieve good coupling between the wireless charger and the device, they need tuned induction coils. So you can't really swap out the Nexus charger for another and expect to improve anything; the issues above will remain, and you'll probably end up with a weaker charging experience. That said, I've found the Duracell Powermats to deliver good wireless charging performance.
if the NFC functionality is hurt, what's the best way to troubleshoot it? I am experiencing inconsistent scans-- 1 out 5 scans are not registered and the device needs to be power cycled.
This is a software bug in the Nexus 7's low-level NFC management system. You should be able to reset the LPTD calibration by enabling/disabling NFC from the device's Settings screen, though I've found power cycling is the only way to actually achieve it. You could log a defect against the Nexus 7 2013 board support package (BSP) and reference this post.
Related
The output voltage from a USB port is 5 V and can't be changed, right?
But what is the maximum current I can draw from a smartphone?
Does it depend on the battery or hardware, or is it limited by the OS?
Is there a difference between Android, iPhone, and other-OS phones?
Can I control the output current with an app? What if the phone is rooted?
An important example: what is the maximum current an iPhone 6 can provide over USB?
The official maximum current provided by a USB 2.0 OTG port is 500 mA, and the minimum is 8 mA, when the port is in host mode.
Source: http://www.usb.org/developers/onthego/otg1_0.pdf, section 2.5
The USB 2.0 specification allows the voltage output from a USB port to be anywhere between 4.40 V and 5.25 V, but it is typically 5 V.
While you could construct a USB port that has a different voltage, you should probably not do that because a normal USB device you plug into that port could malfunction or be damaged. It's unlikely that your phone provides a feature for changing the voltage of your USB port.
The USB specification allows devices to draw 100 mA from a USB port before they have reached the "configured" state (see Chapter 9 of the USB specification for more information about USB device states). Once the device is configured, it can draw more current, as long as it doesn't exceed the amount of current specified in its configuration descriptor.
If your phone does not get the device into the configured state by default, it might be possible to write an app to do it, thus allowing your device to draw more than 100 mA without violating the USB specification.
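The 100 mA rule can be made concrete with a small sketch of the budget logic. The descriptor bytes below are a made-up example; the layout and the fact that bMaxPower (byte 8 of the configuration descriptor) is expressed in 2 mA units do come from the USB 2.0 specification.

```python
# Sketch of the USB 2.0 current-budget rules described above.
# The descriptor contents are an invented example; the bMaxPower
# offset and 2 mA units follow the standard configuration descriptor.

UNCONFIGURED_LIMIT_MA = 100  # one unit load before reaching "configured"

def max_current_ma(config_descriptor: bytes, configured: bool) -> int:
    """Current a device may legally draw, per USB 2.0 chapter 9."""
    bMaxPower = config_descriptor[8]   # byte 8, in units of 2 mA
    declared = bMaxPower * 2
    return declared if configured else min(declared, UNCONFIGURED_LIMIT_MA)

# Example: a configuration descriptor declaring bMaxPower = 250 (500 mA).
desc = bytes([9, 2, 34, 0, 1, 1, 0, 0x80, 250])
assert max_current_ma(desc, configured=False) == 100
assert max_current_ma(desc, configured=True) == 500
```

This is why an app that merely pushes the device into the configured state can legitimately raise the budget from 100 mA to whatever the descriptor declares, and no further.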
Not sure what you're trying to achieve...
An iPhone 6 can provide some (limited) power to devices plugged into its lightning port (it doesn't have a USB port).
There are cables that connect the phone to a host computer (or charger) over USB, but you're not going to get power out of the phone that way (only power into the phone).
A lightning connector uses active electronics inside the connector's cable to get the phone to assign functions to the different pins.
Moreover, there's authentication involved, preventing you from making anything for it yourself without Apple's involvement. You'd need to join the MFi program to get the needed documentation.
There's an NDA involved, and Apple is notorious about such things, so you're unlikely to find much reliable and/or verifiable info on the Lightning connector until you join the program.
See also here.
A list of USB standards and their voltage/current ratings is shown on Wikipedia.
The Nexus 5X and 6P seem to be the first phones to support power delivery via USB Type C connection: Link
However there is conflicting information on how well this is implemented.
Your goal of a 3 A output current seems to be possible with USB-PD, but it would appear that no smartphones currently support this.
An important example: what is the maximum current an iPhone 6 can provide over USB?
An iPhone with a Camera Connection Kit will limit you to 100 mA. If you connect a device that claims more than this in its descriptors, or tries to draw more than this, it will get cut off with an error:
As reported here, the iPhone 6 and 6 Plus may be able to supply a current of about 1 ampere at the voltage at which they are charged, though I can't be sure about that unless the specifications of the USB device used are provided.
3 A of current seems to be way too much. If it were possible, manufacturers could have used it to charge phones faster (I don't know whether they use 3 A chargers, but the rating can be seen on the charger's output label).
I would not attempt such an experiment unless I was very sure about it.
Disclaimer: whatever experiment you do, you do at your own risk. In the above answer I have just tried to explain what I would have done, and have provided reasons to back that up.
The current drawn from the power supply depends on these factors:
1- The device's net impedance (including the battery and other feeder circuits).
2- The device's internal current limiter.
3- The power supply's maximum current limit.
Power supplies usually provide less current than the device demands; if they provided more, their regulated voltage (i.e. 5 V) would drop below its standard. Therefore, the limitation is imposed by the supply in the first place.
Devices provide another current limiter of their own, to lower the risk of failure from nonstandard power supplies. I'm not sure whether you can change the maximum threshold of the internal current limiter to draw more power, but I'm sure you cannot change it in the power supply.
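The way these three limits combine can be sketched in a few lines (all numbers are illustrative): the actual charge current is simply the lowest of what Ohm's law demands, what the device's limiter allows, and what the supply can deliver.

```python
# Sketch of the three limits listed above. The current that actually
# flows is bounded by whichever limit is lowest. Values are illustrative.

def charge_current_ma(voltage_v, device_impedance_ohm,
                      device_limit_ma, supply_limit_ma):
    demanded_ma = voltage_v / device_impedance_ohm * 1000  # Ohm's law
    return min(demanded_ma, device_limit_ma, supply_limit_ma)

# A 5 V supply into a net 2.5 ohm load would demand 2000 mA,
# but here the supply's own 1000 mA limit wins:
assert charge_current_ma(5.0, 2.5, 1500, 1000) == 1000
```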
This blog post from radius networks discusses how Android devices can not yet be used as iBeacons (AKA: BLE peripheral mode) even if the device's hardware supports BLE, because Android has no APIs for BLE peripheral mode.
NOTE: BLE means Bluetooth Low Energy
After reading this section, I'm going to suggest something that may be stupid:
Because their SDK only supports the central role, “advertising” a
service as a central server means sitting there quietly, only
revealing (or “advertising”) its service characteristics to another
device in peripheral mode after a connection is already established.
This connection establishment requires another device to do the actual
radio advertising first. Samsung’s SDK isn’t going to do it.
Is it possible to trick the device in central mode (the Samsung phone) into thinking another device in peripheral mode has established a connection to it, and then get the central device to "advertise"? Can you trick the phone by faking the connection in software?
It's probably a stupid idea: it may be possible to trick the device in central mode, but the "advertisement" would not be a BLE peripheral-mode advertisement and would not resemble an iBeacon.
I very much need BLE peripheral mode support on Android, and for the moment I would be OK with hacking something together, in the hope that Android will eventually support this feature set, which, by the way, is already a feature request.
I don't think this technique will work, even if it is possible to "trick" Android into thinking it has a connection to another BLE device. Although this question is about the Samsung BLE SDK, I think the same answer applies to the dedicated Android BLE APIs (android.bluetooth.BluetoothAdapter) that shipped starting with Android 4.3.
Let's put aside the tricking part for a minute -- even if Android actually has a connection to another BLE device, can you make it advertise? If by "advertising" you mean revealing its services to the other device, the answer is yes. But this radio transmission is over a private channel between the two devices, and could not be picked up by a BLE scan the way iBeacon advertisements are. It really isn't an advertisement at all -- which is why I said in the blog post that the Samsung documentation is misleading. So if you can't do this with a real connection, then tricking a connection won't work either.
My understanding of bluetooth stacks is limited, but I think that the actual radio broadcast advertisements that scans can pick up are controlled by the hardware BLE chipset, which means you have to have access to the parts of the bluetooth stack that tell the hardware to make this happen. I don't think there is anything in Samsung BLE SDK or the Android SDK that intentionally allows this.
An alternative may be to make direct JNI calls to the BlueDroid stack. I'm not sure if Android permissions allow this, or if advertising is even implemented in the BlueDroid stack.
UPDATE: Transmitting as a peripheral is now possible in Android 4.4.3 and Android L. See here.
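For reference, the payload that peripheral mode would need to broadcast to look like an iBeacon is a manufacturer-specific AD structure following Apple's published iBeacon format. Here is a sketch; the UUID, major, minor, and TX-power values are made up for illustration.

```python
import struct
import uuid

# Sketch of the iBeacon manufacturer-specific AD structure.
# Field layout follows Apple's published iBeacon format; the actual
# UUID/major/minor/tx_power values below are invented examples.

def ibeacon_adv(proximity_uuid: uuid.UUID, major: int, minor: int,
                tx_power: int) -> bytes:
    payload = (bytes([0x4C, 0x00,        # Apple company ID (little-endian)
                      0x02, 0x15])       # iBeacon type + data length (21)
               + proximity_uuid.bytes    # 16-byte proximity UUID
               + struct.pack(">HHb", major, minor, tx_power))
    # AD-structure header: length byte (type + payload) and 0xFF
    # (manufacturer-specific data type).
    return bytes([len(payload) + 1, 0xFF]) + payload

adv = ibeacon_adv(uuid.UUID("e2c56db5-dffb-48d2-b060-d0f5a71096e0"),
                  major=1, minor=2, tx_power=-59)
assert adv[0] == 0x1A   # AD length byte: 26 bytes follow the length field
assert adv[1] == 0xFF   # manufacturer-specific data
```

This is exactly the kind of raw advertisement a central-mode "trick" cannot produce: it has to come from the controller's advertising channels, which only a real peripheral-mode API can drive.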
We would like to connect sixteen vibrators to an Android phone using Bluetooth, and control the vibrators individually.
As far as I know, you can only have eight devices in a piconet, so that would place a limit of seven vibrators (the phone itself being the eighth device). First of all: Is that correct?
And do up to seven connected devices work well and reliably in Android? Or is there some additional limit or problems from Android's Bluetooth implementation or APIs?
For our sixteen vibrators, will we have to build a scatternet with additional devices that bridge between the phone's piconet and additional piconets with some of the vibrators? Does anyone have experience with this, and does it work well?
(And no, it's not a sex toy!)
As far as I know, you can only have eight devices in a piconet, so
that would place a limit of seven vibrators (the phone itself being
the eighth device). First of all: Is that correct?
OK, to be technically precise: Bluetooth Classic can connect to, and be in active connection with, up to 7 devices at a time. But an active device can then be put into park mode, and a master can have a large number of devices in park mode, so devices can be moved from the connected-active state to parked and vice versa.
Still, at any one point you can have only 7 active devices, so the master must manage a larger number of devices by keeping up to 7 active and the rest parked, switching them between active and parked modes.
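The park/unpark juggling described above amounts to a simple rotation. A sketch (the constant and fixed round count are illustrative; a real stack would schedule on link-level events, not fixed rounds):

```python
from collections import deque

# Sketch of the park/unpark rotation: at most 7 slaves active at once,
# the rest parked, rotated so every device gets serviced eventually.

MAX_ACTIVE = 7

def service_rounds(devices, rounds):
    """Rotate `devices` through the active set; return the service log."""
    queue = deque(devices)
    log = []
    for _ in range(rounds):
        active = [queue[i] for i in range(min(MAX_ACTIVE, len(queue)))]
        log.append(active)           # poll these while the rest are parked
        queue.rotate(-MAX_ACTIVE)    # park the serviced batch, unpark next
    return log

vibrators = [f"vib{i}" for i in range(16)]
log = service_rounds(vibrators, rounds=3)
assert all(len(active) <= MAX_ACTIVE for active in log)
# Three rounds of 7 cover all 16 devices at least once:
assert set().union(*log) == set(vibrators)
```

The cost of this scheme is latency: with 16 devices and 7 active slots, each device is parked for a large fraction of the time, which may or may not be acceptable for driving vibrators.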
And do up to seven connected devices work well and reliably in
Android? Or is there some additional limit or problems from Android's
Bluetooth implementation or APIs?
Well, in Android the problem is that there is no single implementation; different manufacturers use many different Bluetooth radio chipsets. So the answer is: it depends. Some are pretty reliable, some are really bad.
There are no public APIs to control or use the park mode I described above, but if you can operate on the internals, or have access to them from your app, you could do what you are asking for.
On scatternets:
Again, Android does not have any API for you to control this. It will be complicated, but you could force it into a scatternet configuration. Even then there are limits: the best I have seen in commercial devices is a device participating in 2 or 3 piconets at the same time, which means you can be connected to (7+2) 9 devices at a time (that does not meet your requirement of 16).
A bridging/mesh configuration may be feasible, where two of your devices form their own piconets, i.e. with 8 devices in each group; the leader (master) of each group connects to the Android device, and you manage the data relay at the application layer.
Now, having said all this: have you looked at Bluetooth Low Energy? It's a perfect candidate for connecting a bunch of sensor devices. There is no theoretical limit on the number of devices that can be connected at a time, and in practice 16 or even more is very feasible.
Android currently does not have public APIs for it (as of today).
But most (almost all) of the latest Android devices come with Bluetooth 4.0 hardware, meaning they are capable of Bluetooth Low Energy.
And iOS devices (Mac, iPhone, iPad) have great support and developer APIs for it.
So it will be the way to go, and I am pretty sure Android will come with developer APIs for BLE soon (at least I hope so).
I'm contemplating the development of an Android app that detects all or most nearby mobile devices (iPhone, Android, etc) in the immediate neighborhood that are turned on. I don't need to interact with these devices, just detect them, and a requirement is that the detected devices can't need to have any special / unusual apps installed on them. The app only needs to work for typical U.S. devices and networks.
I've thought about a few ways to do this (somehow detecting bluetooth, wifi, or cellular transmissions / identifiers), but I'm looking for specific implementation methods for a way to detect a relatively large proportion of nearby devices. I'm not sure which of these methods is possible / feasible or how to put them into practice...
Perhaps using Bluetooth: Is there a way using the Android SDK to detect non-discoverable Bluetooth devices (not in discoverable mode)? The Nokia Developer site seems to suggest this is possible using Service Discovery Protocol (SDP), but I'm not sure if this is possible more generally in Android.
Perhaps using cell tower mast switching simulation? Ok, this is almost certainly beyond the reach of Android, but this article suggests that there may be a way to "mimic cell mast switching process to trigger quiescent phones into transmitting. Phones respond with their ID and authentication signals...".
I think you should see this paper. You cannot view it for free, but the abstract clearly states:
Concerns about Bluetooth device security have led the specification of the “non-discoverable” mode, which prevents devices from being listed during a Bluetooth device search process. However, a nondiscoverable Bluetooth device is visible to devices that know its address or can discover its address. This paper discusses the detection of non-discoverable Bluetooth devices using an enhanced brute force search attack. Our results indicate that the average time to attack a non-discoverable Bluetooth device using multiple search devices and condensed packet timing can be reduced to well under 24 hours.
But for an Android application, you need the detection time to be well under a few seconds, not merely under 24 hours, so a practical solution may not yet be available.
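A back-of-the-envelope estimate shows why. Even with the 24-bit manufacturer prefix of a BD_ADDR known, sweeping the remaining 24-bit space takes hours. The probe rate below is an assumption for illustration, not a figure from the paper:

```python
# Rough sketch of the brute-force search time for a non-discoverable
# Bluetooth address. The probe rate and searcher count are assumptions,
# not numbers from the cited paper.

def search_hours(address_space, probes_per_second, searchers):
    """Worst-case time to sweep the address space, in hours."""
    return address_space / (probes_per_second * searchers) / 3600

# With the 24-bit OUI known, 2**24 lower-address values remain.
remaining = 2 ** 24
hours = search_hours(remaining, probes_per_second=250, searchers=8)
assert 2 < hours < 3   # on the order of hours, not seconds
```

Even under generous assumptions the result is hours per device, which is why this attack is interesting for research but useless for an app that must detect bystanders' phones in real time.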
I want to facilitate video calling from one Android device to another. My question is: can I connect one Android device to another over WiFi without any use of an Internet connection? I want it to work just like Skype. Is this possible or not? If it is possible, how can I implement it? Can I get some code snippets as well? Please give me a link to download such an app.
First, your idea works completely differently from Skype, which is entirely dependent on a functional Internet connection for its core functionality.
Second, while you could create an ad-hoc WiFi network between two Android devices, their range will be the limiting factor:
WiFi is intended as a short-range wireless medium. There's a reason nobody wanted the 2.4 GHz band (and therefore it is unlicensed): there is significant noise and signal loss at these frequencies, noticeable even at short range.
Moreover, wireless equipment in mobile devices is engineered for power efficiency - which translates to lower broadcast power when compared to on-the-grid devices.
Also, the antennae in such devices are omnidirectional - this is rather useful for normal use, but again lowers your available broadcast power
Even if you had huge, high-quality directional external antennae connected to each device, pointing very precisely at each other (btw that also means each of them is stuck in one place; see e.g. this for a dish size calculator), you'd need to make some pretty drastic changes to their networking stack, as the latency inherent in long-distance comms will screw up TCP/IP pretty badly.
Even so, the setup would be very brittle, dependent even on the weather (water vapour absorbs a significant amount of power in that part of the spectrum).
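The range argument can be quantified with the standard free-space path loss formula. This is the best case -- real 2.4 GHz links lose considerably more due to absorption, multipath, and antenna inefficiency:

```python
import math

# Free-space path loss at 2.4 GHz. This is a lower bound on real-world
# loss; absorption and multipath make actual links worse.

def fspl_db(distance_km, freq_mhz):
    """FSPL in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44"""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

loss_100m = fspl_db(0.1, 2400)   # roughly 80 dB over just 100 m
loss_1km = fspl_db(1.0, 2400)    # roughly 100 dB at 1 km
# Every 10x increase in distance costs another 20 dB of link budget:
assert abs((loss_1km - loss_100m) - 20) < 1e-6
```

With a phone's low transmit power and omnidirectional antenna, that budget is exhausted within tens to hundreds of meters, which is exactly why WiFi between two handsets is a short-range proposition.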