The output voltage from a USB port is 5 V, and it can't be changed, can it?
But what is the maximum current I can draw from a smartphone?
Does it depend on the battery or the hardware, or is it limited by the OS?
Is there a difference between Android, iPhone, and other phones?
Can I control the output current with an app? What if the phone is rooted?
An important example: what is the maximum current an iPhone 6 can provide over USB?
The official maximum current provided on a USB 2.0 OTG port in host mode is 500 mA, and the minimum is 8 mA.
Source: http://www.usb.org/developers/onthego/otg1_0.pdf, section 2.5
The USB 2.0 specification allows the voltage output from a USB port to be anywhere between 4.40 V and 5.25 V, but it is typically 5 V.
While you could construct a USB port that has a different voltage, you should probably not do that because a normal USB device you plug into that port could malfunction or be damaged. It's unlikely that your phone provides a feature for changing the voltage of your USB port.
The USB specification allows devices to draw 100 mA from a USB port before they have reached the "configured" state (see Chapter 9 of the USB specification for more information about USB device states). Once the device is configured, it can draw more current, as long as it doesn't exceed the amount of current specified in its configuration descriptor.
If your phone does not get the device into the configured state by default, it might be possible to write an app to do it, thus allowing your device to draw more than 100 mA without violating the USB specification.
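The "amount of current specified in its configuration descriptor" is the bMaxPower field, which the USB 2.0 specification defines as byte 8 of the 9-byte configuration descriptor, expressed in units of 2 mA. A minimal sketch of reading it (the descriptor bytes below are a made-up example of a device declaring 500 mA, not data from any real device):

```python
def max_current_ma(config_descriptor: bytes) -> int:
    """Return the maximum current (mA) a device declares in its USB 2.0
    configuration descriptor. bMaxPower (offset 8) is in units of 2 mA."""
    if len(config_descriptor) < 9 or config_descriptor[1] != 0x02:
        raise ValueError("not a configuration descriptor")
    return config_descriptor[8] * 2

# Hypothetical descriptor: bMaxPower = 0xFA = 250, i.e. 250 * 2 mA = 500 mA
desc = bytes([0x09, 0x02, 0x20, 0x00, 0x01, 0x01, 0x00, 0x80, 0xFA])
print(max_current_ma(desc))  # 500
```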
Not sure what you're trying to achieve...
An iPhone 6 can provide some (limited) power to devices plugged into its Lightning port (it doesn't have a USB port).
There are cables that connect the phone to a host computer (or charger) over USB, but you're not going to get power out of the phone that way (only power into the phone).
A Lightning connector uses active electronics inside the cable to get the phone to assign functions to the different pins.
Moreover, there's authentication involved, preventing you from making anything for it yourself without Apple's involvement. You'd need to join the MFi program to get the necessary documentation.
There's an NDA involved, and Apple is notorious about such things, so you're unlikely to find much reliable or verifiable information on the Lightning connector until you join the program.
A list of USB standards and their voltage/current ratings is available on Wikipedia.
The Nexus 5X and 6P appear to be the first phones to support power delivery over a USB Type-C connection.
However there is conflicting information on how well this is implemented.
Your goal of a 3 A output current seems possible with USB-PD, but it appears that no smartphones currently support it.
An important example: what is the maximum current an iPhone 6 can provide over USB?
An iPhone with a Camera Connection Kit will limit you to 100 mA. If you connect a device that claims more than this in its descriptors, or tries to draw more than this, it will get cut off with an error:
As reported here, the iPhone 6 and 6 Plus may be able to supply about 1 A at the voltage at which they charge, though I can't be sure without the specifications of the USB device used.
3 A seems far too much. If it were possible, manufacturers would use it to charge phones faster (I don't know whether they use 3 A chargers or not, but you can check the output rating printed on the charger).
I would not attempt such an experiment unless I were very sure about it.
Disclaimer: whatever experiment you do, you do at your own risk. In this answer I have only explained what I would do and given my reasoning.
The current drawn from a power supply depends on these factors:
1- The device's net impedance (including the battery and other feeder circuits).
2- The device's internal current limiter.
3- The power supply's maximum current limit.
Power supplies usually provide less current than the device demands. If they provided more, their nominally constant 5 V output would drop below spec. So the supply imposes the limit in the first place.
Devices also include their own current limiter to reduce the risk of failure with nonstandard power supplies. I'm not sure whether you can raise the threshold of the device's internal current limiter to draw more power, but I'm sure you cannot change it in the power supply.
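The interplay of those three factors reduces to a minimum: the current that actually flows is whichever cap kicks in first. A trivial sketch with made-up example numbers:

```python
def delivered_current_ma(device_demand_ma, device_limit_ma, supply_limit_ma):
    """The current that actually flows is capped by the smallest of:
    what the device's load asks for, the device's internal limiter,
    and what the supply can source without its voltage sagging."""
    return min(device_demand_ma, device_limit_ma, supply_limit_ma)

# Hypothetical: the phone's load wants 1200 mA, its charger IC caps at
# 1000 mA, but a standard USB 2.0 port only sources 500 mA:
print(delivered_current_ma(1200, 1000, 500))  # 500
```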
Related
For a research project, I must work on the design of 3G/WiMAX access point (AP) protocols. I have access to two Android smartphones.
Android devices can become WiFi hotspots for other devices, allowing WiFi tethering.
In this scheme, does the Android device behave like a regular WiFi AP (like the one present in your home WLAN), or does it just forward the connection from the AP it is connected to?
Similarly, is there a way to turn one of my smartphones into a 3G/WiMAX AP, so that the other can connect to it as if it were a 3G/WiMAX tower? If so, can you provide some references and resources that would allow me to do that?
EDIT:
OK, so the keyword is "AP mode". It is a mode available in some wireless network drivers (e.g. ath9k, bcm) but absent from most others. It is used to turn the device into an AP (or, more accurately, a simulated one).
The answer to the first question ("Does it behave like an AP?") is yes: the Android device can behave exactly like an AP (management frames, ACKs, etc.), AND internally it forwards the connection from the real AP.
The answer to the second question ("Can we turn it on?") is "it depends on your wireless driver, and thus on your wireless card". Some drivers (ath9k, bcmon) have the option; others don't. You will have to look up your network card and check.
In my case, I wanted to change the behaviour of an AP (by modifying the driver/kernel), but found out that my driver is firmware (and proprietary). I ended up adding a USB WiFi adapter supported by ath9k (the open-source driver for Atheros chipsets).
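On a Linux host with an ath9k adapter, AP mode is typically driven through hostapd. A minimal configuration sketch, assuming the adapter shows up as wlan0 (the interface name, SSID, and passphrase below are placeholders, not values from any real setup):

```
# hostapd.conf - minimal open 2.4 GHz AP sketch for an nl80211 driver
interface=wlan0
driver=nl80211
ssid=TestAP
hw_mode=g
channel=6
wpa=2
wpa_passphrase=ChangeMe123
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP
```

Whether this works at all depends on the card reporting AP among its supported interface modes, which is exactly the per-driver lottery described above.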
It behaves like a 'regular' AP, taking into account that both Android and WiFi are constantly evolving and hence the definition of regular may be a moving target.
You can test the behaviour by finding a location with no 2G/3G/4G coverage and running a local network with your phone as the AP for multiple devices. A long-distance flight, for example (where in-flight device rules allow...).
For testing, I am not aware of any free or open-source 3G network elements yet, but there are some 2G open-source projects worth looking at:
http://openbsc.osmocom.org/trac/
http://openbts.org
The former has definitely mentioned plans for an open RNC (3G access point) but it is not clear what stage these are at.
Turning a smartphone into a 3G access point (RNC) is definitely not going to be just a matter of developing an app. Given that the hardware is probably all there, I am sure it is technically possible if you are prepared to write a lot of low-level software, but you would probably find it much easier to look at the approaches the above projects have taken and see what their plans are for 3G.
From my initial research and experience with wireless chargers for my Nexus 7, it appears that there are issues with wireless chargers interfering and causing issues with NFC functionality. Note, I am not talking about doing both simultaneously. Rather, I am speaking of whether there are detrimental effects on the NFC antenna due to wireless charging over time. I gather that both inductive charging and NFC use the same coil on Nexus 7 (?)
I have personally seen a wirelessly charged Nexus 7 fail to register NFC scans/tags, but I don't know for sure whether the two are related. My basic understanding is that the NFC antenna sits right next to the wireless charging area on the back of the device.
Moreover, wireless charging is too slow; often it seems to merely slow the battery drain rather than actively charge the device.
My questions to the community are:
1- What's the best wireless charger out there? Specifically, one that does not interfere with NFC functionality at either the hardware or software level?
2- If the NFC functionality is hurt, what's the best way to troubleshoot it? I am experiencing inconsistent scans: 1 out of 5 scans is not registered and the device needs to be power cycled.
We've had some experience with this behavior, specifically with our NFC Patch Kit products rather than wireless charging. The issue is with the Automatic Gain Control (AGC) system built into NFC controllers. Its main purpose is to adjust the power output on the antenna coil to optimize the field while minimizing battery drain. Here's a snapshot from the BCM20793S spec sheet (same as the BCM20793M in the Nexus 7, only without the Secure Element):
A “false alarm rate” can occur when the [low power target detect] can be triggered by metallic objects, but any trigger will be qualified by a full poll event and therefore can be discounted if it is not a real target. The control algorithm includes a background calibration, so it auto-adjusts to a background baseline to account for drift and changing conditions.
So what's happening is that the Nexus 7 antenna is getting desensitized as the wireless charging pad enters the field. It seems the Nexus 7 firmware stack is not smart enough to detect the wireless charger's presence and deactivate the NFC controller altogether to prevent the problem, or at least trigger a recalibration of the LPTD mode. We do this with our FloBLE product.
So to answer your questions explicitly:
what's the best wireless charger out there? Specifically, one that does not interfere with NFC functionality both at a hardware and software level?
In order to achieve good coupling between a wireless charger and a device, they need a tuned induction coil, so you can't really swap out the Nexus charger for another and expect to improve anything. The issues above will remain, and you'll probably end up with a weaker recharge experience. That said, I've found that the Duracell Powermats deliver good wireless charging performance.
if the NFC functionality is hurt, what's the best way to troubleshoot it? I am experiencing inconsistent scans-- 1 out 5 scans are not registered and the device needs to be power cycled.
This is a SW bug of the Nexus 7 low level NFC management system. You should be able to reset the LPTD calibration by enabling/disabling NFC from the device Settings screen. I've found power cycling is the only way to achieve it though. You could log a defect against the Nexus 7 2013 board support package (BSP) and reference this post.
If I make one device (e.g. an Android tablet) indefinitely discoverable and make a second one (e.g. a phone with Bluetooth 3.0 support) search for devices, it seems I'll be able to extract the server device's name (http://developer.android.com/guide/topics/connectivity/bluetooth.html). From the device name, I can deduce where in the world the second device is. What is wrong with such an approach? (I'm completely new to Android; I'm just validating the feasibility of an idea.)
The problem with BLE is that 70% of Android devices still run an OS version below 4.3.
You can certainly do it within limits - I have some software running on my Mac that automatically locks the screen when my Bluetooth phone goes out of range. It doesn't use BLE.
One issue that you will have is the power consumption is greater than BLE, so battery life may be affected.
Another is that, as the transmit power of older Bluetooth can be higher than BLE's, you may find that the devices stay "in contact" longer than you would like. It can certainly give you an idea of proximity/presence, though not really location.
Also, if you pair the devices then they should be able to scan for one another without needing to be 'discoverable' - this is the approach used by the Mac software I mentioned. I am not familiar enough with the Android BT APIs to know whether you can detect a paired device but then not connect.
Also, taking the device MAC from the advertisement rather than name is probably 'safer' - A user may rename their device, but they won't change the MAC.
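If you do want to turn that rough notion of proximity into a number, the usual trick is mapping RSSI to distance with a log-distance path-loss model. A crude sketch; the 1 m calibration constant and the path-loss exponent below are assumed values that vary per device and environment, so treat the result as an order of magnitude, not a measurement:

```python
def estimate_distance_m(rssi_dbm, tx_power_dbm=-59, n=2.0):
    """Log-distance path-loss model. tx_power_dbm is the expected RSSI
    at 1 m (a device-specific calibration constant, assumed here);
    n is the path-loss exponent (2.0 in free space, higher indoors)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

print(estimate_distance_m(-59))  # 1.0 m at the calibration point
print(estimate_distance_m(-79))  # 10.0 m in ideal free space
```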
We would like to connect sixteen vibrators to an Android phone using Bluetooth, and control the vibrators individually.
As far as I know, you can only have eight devices in a piconet, so that would place a limit of seven vibrators (the phone itself being the eighth device). First of all: Is that correct?
And do up to seven connected devices work well and reliably in Android? Or is there some additional limit or problems from Android's Bluetooth implementation or APIs?
For our sixteen vibrators, will we have to build a scatternet with additional devices that bridge between the phone's piconet and additional piconets with some of the vibrators? Does anyone have experience with this, and does it work well?
(And no, it's not a sex toy!)
As far as I know, you can only have eight devices in a piconet, so that would place a limit of seven vibrators (the phone itself being the eighth device). First of all: Is that correct?
OK, to be technically precise: Bluetooth Classic can maintain active connections with up to 7 devices at a time. But an active device can be put into park mode, and a piconet can hold a large number of parked devices, so devices can be moved from the connected-active state to parked and back.
Still, at any one point you can have only 7 active devices. So the master should manage a larger set by keeping up to 7 active and the rest parked, switching them between active and parked modes.
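That active/parked rotation can be sketched as a simple round-robin over the full device set. This is only an illustration of the scheduling idea, not real Bluetooth API code (device names are placeholders):

```python
from collections import deque

def schedule_rounds(devices, active_slots=7, rounds=3):
    """Sketch of time-sharing more peripherals than Bluetooth Classic's
    7-active-device limit: keep up to 7 active, park the rest, and
    rotate which devices are active each round."""
    queue = deque(devices)
    history = []
    for _ in range(rounds):
        # The front of the queue is the active set; the rest are parked.
        active = [queue[i] for i in range(min(active_slots, len(queue)))]
        history.append(active)
        queue.rotate(-active_slots)  # parked devices move up next round
    return history

rounds = schedule_rounds([f"dev{i}" for i in range(16)])
print(rounds[0])  # dev0..dev6 active; dev7..dev15 parked
print(rounds[1])  # dev7..dev13 active
```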
And do up to seven connected devices work well and reliably in Android? Or is there some additional limit or problems from Android's Bluetooth implementation or APIs?
Well, in Android the problem is that there is no single implementation; different manufacturers use many different Bluetooth radio chips. So the answer is: it depends. Some are pretty reliable, some are really bad.
There are no public APIs to control or use the park mode described above, but if you can operate on the internals, or have access to them from your app, you could do what you are asking for.
On scatternets:
Again, Android has no API for you to control this. It will be complicated, but you could force a scatternet configuration. There are limits, though: the best I have seen in commercial devices is a device participating in 2 or 3 piconets at the same time, which means you can be connected to 9 devices (7+2) at a time; that does not meet your requirement of 16.
A bridging/mesh configuration may be feasible: two of your devices form their own piconets, i.e. with 8 devices in each group, the leader (master) of each group connects to the Android device, and you manage the data relay at the application level.
Now, having said all this: have you looked at Bluetooth Low Energy? It is a perfect candidate for connecting a bunch of sensor devices. There is no theoretical limit on the number of devices that can be connected at a time, and in practice 16 or even more is very feasible.
Android currently does not have public APIs for it (as of today).
But most (almost all) recent Android devices ship with version 4.0 Bluetooth hardware, meaning they are capable of Bluetooth Low Energy.
And Apple devices (Mac, iPhone, iPad) have great support and developer APIs for it.
So it would be the way to go, and I am pretty sure Android will ship developer APIs for BLE soon (at least I hope so).
I want to facilitate video calling from one Android device to another. My question is: can I connect one Android WiFi device to another Android WiFi device without any use of an Internet connection? I want it to work just like Skype. Is this possible or not? If it is possible, how can I implement it? Can I get some code snippets as well? Please give me a link to download such an app.
First, your idea works completely differently from Skype, which is completely dependent on a functional Internet connection for its core functionality.
Second, while you could create an ad-hoc WiFi network between two Android devices, their range will be the limiting factor:
WiFi is intended as a short-range wireless medium. There's a reason nobody wanted the 2.4 GHz band (which is why it is unlicensed): there is significant noise and signal loss at these frequencies, noticeable even at short range.
Moreover, wireless equipment in mobile devices is engineered for power efficiency, which translates to lower broadcast power compared to on-the-grid devices.
Also, the antennae in such devices are omnidirectional; this is useful for normal use, but again lowers your effective broadcast power.
Even if you had huge, high-quality directional external antennae connected to each device, pointing very precisely at each other (btw that also means each of them is stuck in one place; see e.g. this for a dish size calculator), you'd need to make some pretty drastic changes to their networking stack, as the latency inherent in long-distance comms will screw up TCP/IP pretty badly.
Even so, the setup would be very brittle, dependent even on the weather (water vapour absorbs a significant amount of power in that part of the spectrum).
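To put numbers on why range is the limiting factor: even in ideal free space, path loss grows with the square of distance, and real 2.4 GHz links lose far more to noise and absorption. A back-of-the-envelope sketch using the standard free-space path-loss formula (the ~15 dBm transmit power and ~-85 dBm sensitivity mentioned in the comment are typical assumed figures, not specs of any particular phone):

```python
import math

def fspl_db(distance_m, freq_mhz=2437.0):
    """Free-space path loss in dB for an ideal line-of-sight link
    (default frequency: WiFi channel 6, 2437 MHz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

# With ~15 dBm transmit power and ~-85 dBm receiver sensitivity, the
# link budget is roughly 100 dB, which free space alone exhausts
# around 1 km; in practice, range is far shorter.
print(round(fspl_db(100)))   # ~80 dB at 100 m
print(round(fspl_db(1000)))  # ~100 dB at 1 km
```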