I've been tasked with adding support to an app for beaming large data files (tens of megabytes) from device to device via 'NFC' on Android.
I'm aware that genuine NFC on Android is painfully slow, but I know that ICS supports handing off the bulk data transfer to Bluetooth, and Samsung has a proprietary mechanism for doing the same via Wi-Fi Direct (S-Beam). So that's the approach I'd want to take.
Unfortunately I cannot find any information on how to actually do this.
I've looked at the Android Beam documentation, and there's no mention of a special mechanism for large bulk data; so I took the standard AndroidBeamDemo app and simply added a large byte array to the payload, in the hope that it would all Just Work. It seems not: sending a 10 kB message takes about five seconds, and trying to send a 1 MB message does nothing at all (although it reports that the message was sent successfully).
For Samsung's S-Beam, I simply cannot find any documentation whatsoever.
Has anybody made this work, and if so, can they point me at an example?
For Android Beam, you need to provide URIs to the files containing the data using setBeamPushUris() (or setBeamPushUrisCallback() if the data is not fixed).
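As a minimal sketch of that hand-off (assuming API level 16+, where setBeamPushUris() is available; the file path is a placeholder):

    import android.app.Activity;
    import android.net.Uri;
    import android.nfc.NfcAdapter;
    import android.os.Bundle;

    import java.io.File;

    public class BeamActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);

            NfcAdapter adapter = NfcAdapter.getDefaultAdapter(this);
            if (adapter == null) {
                return; // NFC is not available on this device
            }

            // Hypothetical file; in a real app this would point at your data.
            File file = new File(getExternalFilesDir(null), "large-data-file.bin");

            // Hand the file off to Android Beam. The NFC tap only negotiates
            // the transfer; the bulk data itself goes over Bluetooth.
            adapter.setBeamPushUris(new Uri[] { Uri.fromFile(file) }, this);
        }
    }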
For S-Beam, I am not aware of any API that can be used. AFAICT, S-Beam only works with the built-in apps for pictures, video and music.
Related
I've learned that the NFC tag spec offers a few standardized formats (plain text, e-mail, Wi-Fi SSID, business card data, URL, etc.) that compatible phones are capable of natively responding to simply by enabling NFC functionality in OS settings.
Am I correct in understanding that no 3rd-party app installation is needed for this to function? Does this invariant hold true for both Android and iOS 11+? I see that iPhone 7 and up supports Core NFC.
If the above is true, my actual question follows.
I have an application that displays some data from an RFID chip; the tags would be deployed fairly widely.
Under normal circumstances, a dedicated app would be used to fetch and display this data, to guarantee consistency (everything's always in the same place) along with presentational clarity.
However, in rare situations, the devices (phones) scanning the RFID tags may not have the reader app installed. They may also not have any cellular connectivity, making app installation difficult. (This is a rare/unlikely but plausible edge-case.)
In such a scenario, would it be possible for the tag to deliver a "natively-actionable" piece of information to the phone, such as a plaintext fragment of data, similar to how NFC tags work?
It would be great if I could offer a URL and also a text fragment. Two actions on one NFC tag suggests this may be doable but it sounds like it's a hack exploiting undefined behavior (?).
The RFID tag itself does not need much onboard capacity; the maximum capability required would be retrieving a few hundred bytes of data that is occasionally overwritten. Some read and write counters would probably be the only other feature required.
Android devices will usually handle a few data types automatically using built-in apps; e.g. URLs are shown/opened by default without the need for a 3rd-party app. Core NFC on iOS does not seem to have such a capability: NFC tags can only be read when an app explicitly starts reading.
It's possible to put multiple NDEF records on one tag. For Android, you would need to make sure that the first record can trigger an automatic action (i.e. is a URL) or that one of the records is an Android Application Record. Two actions on one NFC tag is certainly not a hack but exactly the way to do this.
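For illustration, a sketch of building such a multi-record message (the URL, text, and package name are placeholders; createTextRecord() requires API 21):

    import android.nfc.NdefMessage;
    import android.nfc.NdefRecord;

    public final class TagPayload {
        public static NdefMessage build() {
            // First record is a URL so Android can act on it with no app installed.
            NdefRecord url = NdefRecord.createUri("https://example.com/tag/1234");
            // A plain-text record carrying the few hundred bytes of data.
            NdefRecord text = NdefRecord.createTextRecord("en", "payload goes here");
            // Fallback: launch (or open the store page for) the dedicated app.
            NdefRecord aar = NdefRecord.createApplicationRecord("com.example.readerapp");
            return new NdefMessage(new NdefRecord[] { url, text, aar });
        }
    }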
I have a requirement to transfer images between two devices which do not have access to the internet/cellular data and are not on the same network.
I need this to work across Android and iOS, which rules out Wi-Fi Direct (Android-only) and Multipeer Connectivity (iOS-only).
After lots of research, I've concluded that the p2pkit library is my best bet, however data transfer is limited to just 440 bytes which is nothing for an image.
What I'm about to do is create a process that splits images into chunks (roughly like the sketch below) and transfers them via p2pkit, but I wanted to double-check whether anyone is aware of a better way/alternative libraries for achieving this. (Please don't suggest AllJoyn, as I painfully learned that devices must be on the same network!)
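The chunking step itself is simple; here's a sketch, assuming some of the 440 bytes are reserved for a small sequence header the receiver needs for reassembly (the exact overhead depends on your own framing):

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    public final class Chunker {
        // Assumed headroom: 430 bytes of payload leaves ~10 bytes for a header.
        private static final int CHUNK_SIZE = 430;

        // Split an encoded image into fixed-size chunks for transfer.
        public static List<byte[]> split(byte[] image) {
            List<byte[]> chunks = new ArrayList<>();
            for (int offset = 0; offset < image.length; offset += CHUNK_SIZE) {
                int end = Math.min(image.length, offset + CHUNK_SIZE);
                chunks.add(Arrays.copyOfRange(image, offset, end));
            }
            return chunks;
        }
    }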
Thanks.
Desperately need help!
The problem is as follows: there is a bunch of medical diagnostic devices packed in a box. They are all fed from the same battery, and their data is supposed to be visualized on a Nexus tablet, also enclosed in the box. Only one device at a time is connected to the tablet. The connection is via USB, and all processing is offline. Data streams in real time (some devices may have recording capability, some don't) and needs to be visualized in real time as well.
The devices are "dumb" and there are no SDKs. Seemingly, the devices were never intended to be connected to any external visualizer or other device. All we have to work with is the raw stream of data: the output of a device is not even a file but a stream of 256 vectors. This stream needs to be captured, written to a predefined buffer or series of buffers (how do I determine a buffer size generic enough to satisfy every device in the box?), and then translated into some format that an Android tablet can visualize.
Is my understanding of the required architecture correct? What language should this software be written in? Can it be done in something truly cross-platform like Python? Does any open-source functionality exist for capturing a stream (if so, please kindly recommend some)? Is it possible to make this software generic enough that changing the device/tablet/OS could be accommodated without excruciating pain?
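As a rough starting point for the capture side, here is a sketch using Android's USB host API. Every device-specific detail here (interface index, endpoint layout, buffer size, permission handling) is an assumption that would have to be verified per device:

    import android.content.Context;
    import android.hardware.usb.UsbConstants;
    import android.hardware.usb.UsbDevice;
    import android.hardware.usb.UsbDeviceConnection;
    import android.hardware.usb.UsbEndpoint;
    import android.hardware.usb.UsbInterface;
    import android.hardware.usb.UsbManager;

    public final class UsbCapture {
        private static final int BUFFER_SIZE = 16384; // assumed; size for the chattiest device
        private static final int TIMEOUT_MS = 1000;

        public static void capture(Context context, UsbDevice device) {
            UsbManager manager = (UsbManager) context.getSystemService(Context.USB_SERVICE);
            // Note: requesting user permission for the device is omitted here.
            UsbDeviceConnection connection = manager.openDevice(device);
            if (connection == null) return;

            UsbInterface intf = device.getInterface(0); // assumed: first interface
            connection.claimInterface(intf, true);

            // Find the first bulk IN endpoint (device -> host).
            UsbEndpoint in = null;
            for (int i = 0; i < intf.getEndpointCount(); i++) {
                UsbEndpoint ep = intf.getEndpoint(i);
                if (ep.getType() == UsbConstants.USB_ENDPOINT_XFER_BULK
                        && ep.getDirection() == UsbConstants.USB_DIR_IN) {
                    in = ep;
                    break;
                }
            }
            if (in == null) return; // no bulk IN endpoint on this device

            byte[] buffer = new byte[BUFFER_SIZE];
            while (true) { // simplified: loop until the capture is stopped
                int read = connection.bulkTransfer(in, buffer, buffer.length, TIMEOUT_MS);
                if (read > 0) {
                    // Hand the raw vectors to the visualization layer here.
                }
            }
        }
    }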
I'm currently working on an app with the end goal of being roughly analogous to an Android version of AirPlay for iDevices.
Streaming media and all that is easy enough, but I'd like to be able to include games as well. The problem with that is that to do so I'd have to stream the screen.
I've looked around at various things about taking screenshots (this question and the derivatives from it in particular), but I'm concerned about the frequency/latency. When gaming, anything less than 15-20 fps simply isn't going to cut it, and I'm not certain that's possible with the methods I've seen so far.
Does anyone know if such a thing is plausible, and if so what it would take?
Edit: To make it more clear, I'm basically trying to create a more limited form of "remote desktop" for Android. Essentially, capture what the device is currently doing (movie, game, whatever) and replicate it on another device.
My initial thoughts are to simply grab the audio buffer and the frame buffer and pass them through a socket to the other device, but I'm concerned that the methods I've seen for capturing the frame buffer are too slow for the intended use. I've seen people throwing around comments of 3 FPS limits and whatnot on some of the more common ways of accessing the frame buffer.
What I'm looking for is a way to get at the buffer without those limitations.
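For what it's worth, the transport half of this is straightforward; a minimal sketch that length-prefixes captured frames over a TCP socket, with the capture step (the actual hard part of the question) left as a hypothetical placeholder:

    import java.io.DataOutputStream;
    import java.net.Socket;

    public final class FrameStreamer {
        public static void stream(String host, int port) throws Exception {
            try (Socket socket = new Socket(host, port);
                 DataOutputStream out = new DataOutputStream(socket.getOutputStream())) {
                while (true) { // simplified: loop until streaming stops
                    byte[] frame = grabFrame();  // hypothetical capture hook
                    out.writeInt(frame.length);  // length-prefix each frame
                    out.write(frame);            // so the receiver can re-frame the stream
                }
            }
        }

        // Placeholder: device-specific frame capture is exactly the open problem.
        private static byte[] grabFrame() {
            throw new UnsupportedOperationException("frame capture not implemented");
        }
    }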
I am not sure what you are trying to accomplish when you refer to "streaming" a video game.
But if you are trying to mimic AirPlay, all you need to do is connect to the other device via Bluetooth or an internet connection and allow sound, then save the results or handle them accordingly.
But video games do not "stream" the screen, because the mobile device cannot handle that much of a workload. There are other problems too: how will you handle the game if the player loses their internet connection while playing? On top of that, this would require a lot of servers and bandwidth to support the game workload on the backend.
But if you are trying to create an online game, essentially all you need to do is send and receive messages from a server, which is simple. If you want to "stream" to another device, simply connect the mobile device to speakers or a TV. Just about all mobile video games and applications send simple messages via JSON or something similar; this reduces overhead, keeps the syntax simple, and can be used across multiple platforms.
It sounds like you should take a look at this (repost):
https://stackoverflow.com/questions/2885533/where-to-start-game-programming-for-android
If not, this is more of an open question about how to implement a video game.
Having a design discussion with some co-workers about our app. Looking for the best way to transfer large data files on, say, a weekly basis from a phone to a remote server. The server will be in the DMZ, and the phone will be on either Wi-Fi or GSM. Some of the files will be 100 MB and can even reach 400 MB. I'm just not sure of the best way to approach this in my Android code. I was looking at MTOM or even just plain FTP. Any advice would be appreciated.
I have investigated the use of MTOM under Android but found nothing; I don't know whether there's any implementation working with Android yet.
But this is something you can do over FTP, which I think would be a good choice. And you could check the integrity of the file using a checksum (calculated on both sides and then compared).
Using 3G for huge files is likely to take a long time and be expensive, so to me the best option is Wi-Fi. You can detect whether your device is connected through Wi-Fi as described here.
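A sketch of those two pieces, the checksum and the Wi-Fi check (class and method names are mine, not from any library):

    import android.content.Context;
    import android.net.ConnectivityManager;
    import android.net.NetworkInfo;

    import java.io.FileInputStream;
    import java.security.MessageDigest;

    public final class UploadHelpers {

        // SHA-256 of a file, to compare against the server-side checksum after upload.
        public static byte[] checksum(String path) throws Exception {
            MessageDigest digest = MessageDigest.getInstance("SHA-256");
            try (FileInputStream in = new FileInputStream(path)) {
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    digest.update(buffer, 0, read);
                }
            }
            return digest.digest();
        }

        // True when the device is connected through Wi-Fi, so large uploads
        // can be deferred until then.
        public static boolean onWifi(Context context) {
            ConnectivityManager cm =
                    (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
            NetworkInfo wifi = cm.getNetworkInfo(ConnectivityManager.TYPE_WIFI);
            return wifi != null && wifi.isConnected();
        }
    }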