I'll try to sum this up briefly. I have an app that can synchronize its data with a USB flash drive connected via an OTG adapter. The problem is this: some files end up in a folder called LOST.DIR in the root of the flash drive. They are all exactly 4 KB in size and have random three-digit numeric names with no file extension. I know they contain data from my app, because if I open them in Notepad I can see the data that my app is outputting. This data, however, is sometimes mixed with random symbols. Based on my rudimentary knowledge of file systems, the consistent size of the files and their seemingly random content make me think these are blocks of memory marked bad by Android and moved to this folder.
There is one caveat: I am treating the flash drives as if they are hot-swappable, as they would be on a Windows device. I understand that it may not be valid to think of them this way. To get around this, I'm running the sync command via su after I finish reading from and writing to the flash drive. My understanding is that this should sync the in-RAM buffer/cache with the physical flash drive, thereby making it safe to remove. This may be a faulty assumption.
So, my question is two-fold:
What is causing data to randomly disappear and be moved to LOST.DIR?
Is it safe to be treating flash drives as hot swappable? If not, is there a way to make them behave that way?
If my question is not clear enough or you need more information, I can clarify things for you. Thank you.
What comes to mind is that you are not the only one accessing the flash drive (the media scanner, for instance). sync will flush the buffers, but another process's write could still be in progress, and sync exits anyway. I think you should also unmount the drive; the unmount will fail until it is really safe to remove.
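A minimal sketch of that sync-then-unmount sequence via su. The mount point `/mnt/usbdrive` is an assumption for illustration; adjust it to wherever Android actually mounted the OTG drive on your device:

```java
import java.util.Arrays;
import java.util.List;

public class SafeEject {
    // Hypothetical mount point; adjust to where Android mounted the OTG drive.
    static final String MOUNT_POINT = "/mnt/usbdrive";

    // Build the su command that flushes buffers and then unmounts.
    // umount refuses while any process still holds files open on the drive,
    // which is exactly the signal that it is not yet safe to remove.
    static List<String> ejectCommand() {
        return Arrays.asList("su", "-c", "sync && umount " + MOUNT_POINT);
    }

    // Returns true only once the unmount actually succeeded.
    static boolean eject() throws Exception {
        Process p = new ProcessBuilder(ejectCommand()).start();
        return p.waitFor() == 0; // non-zero exit: unmount refused, try again later
    }
}
```

If `eject()` keeps returning false, something (the media scanner, another app) still has the drive open, and pulling it out would risk exactly the kind of orphaned data you are seeing in LOST.DIR.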
I know there are ways to take a screenshot programmatically.
1. Use MonkeyRunner through ADB, with a USB cable connecting the PC and the Android device.
http://developer.android.com/tools/help/MonkeyDevice.html#takeSnapshot
2. Get it from the drawing cache, but only for your own app.
https://stackoverflow.com/a/24280494/2080233
3. Get it from the frame buffer, but this does not work for many apps.
http://www.pocketmagic.net/android-native-screen-capture-application-using-the-framebuffer/
4. Use /system/bin/screencap as root, but it needs to write to storage.
https://stackoverflow.com/a/15208592/2080233
I want to know if there is a way to grab a screenshot directly from memory that works reliably.
Your analysis of #4 is not quite accurate, as the usage information for screencap states:
If FILENAME is not given, the results will be printed to stdout
Therefore if you can invoke screencap as a user with sufficient permissions, you can capture the output of the process and do whatever you like with that data.
I believe this is what the screenshot button in DDMS ultimately does, piping the data back through the adb connection. Note however that if you pass binary data by way of ADB's standard output, you may have to undo a CRLF translation inserted in dubious consideration of Windows users.
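A small sketch of that stdout-capture approach in Java. Invoked as `captureStdout("screencap", "-p")` by a process with sufficient permissions, the returned bytes would be a complete PNG, with no intermediate file on storage; the helper itself is just generic process plumbing:

```java
import java.io.ByteArrayOutputStream;
import java.io.InputStream;

public class ScreenGrab {
    // Run a command and return its raw stdout as a byte array.
    // With screencap -p, those bytes are the PNG image itself.
    static byte[] captureStdout(String... cmd) throws Exception {
        Process p = new ProcessBuilder(cmd).start();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        InputStream in = p.getInputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        p.waitFor();
        return out.toByteArray();
    }
}
```

Reading stdout directly like this also sidesteps the CRLF-translation problem mentioned above, since no adb shell pipe is involved.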
My phone runs Android 4.0.4 and is in Media Transfer Protocol (MTP) mode. My app is attempting to overwrite the same text file on the phone's SD card with progressively larger files, using successive calls of the following code:
File mDir = new File(Environment.getExternalStorageDirectory(), "Location");
mDir.mkdirs(); // make sure the target directory exists
File mFile = new File(mDir, "Location.txt");
PrintWriter mPW = new PrintWriter(mFile); // can throw FileNotFoundException
mPW.println(sData); // sData is a string of a few hundred characters
mPW.flush();
mPW.close();
During testing in USB debugging mode, I had the following problem. In Windows 7, I could open a file written with the latest data, but its Date modified time and Size corresponded to an earlier write of the file, and I could only see as many characters as corresponded to that size. Effectively, new data in an old file.
I scoured the Web but no one seems to have reported this sort of problem. I tried lots of fancier ways of writing the file, including using buffered classes and even deleting and recreating the file before re-writing it, all to no avail.
After rebooting the phone, the problem seems to have gone away, but I wonder if this rings a bell with anyone.
What is published via the MTP interface is what the MediaStore knows about the file. The MediaStore will eventually pick up your changes, faster if you use something like MediaScannerConnection to proactively tell it that the file has changed.
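As a sketch of that proactive notification, using the real Android API `MediaScannerConnection.scanFile` (the path and MIME type below are illustrative, matching the Location.txt example from the question):

```java
import android.content.Context;
import android.media.MediaScannerConnection;
import android.net.Uri;

public class ScanHelper {
    // Tell the MediaStore that a file changed, so MTP clients can pick up
    // the new size and modification time sooner than the next periodic scan.
    static void notifyChanged(Context context, String path) {
        MediaScannerConnection.scanFile(
                context,
                new String[] { path },          // files to scan
                new String[] { "text/plain" },  // corresponding MIME types
                new MediaScannerConnection.OnScanCompletedListener() {
                    @Override
                    public void onScanCompleted(String scannedPath, Uri uri) {
                        // uri is the MediaStore entry for the scanned file
                    }
                });
    }
}
```

Calling `notifyChanged(context, mFile.getAbsolutePath())` right after closing the PrintWriter keeps the MediaStore's metadata in step with what is actually on disk.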
However, what MTP clients choose to do is up to them. In general, they seem to cache results, and I am not aware of whether MTP has any sort of "push" semantics to tell the MTP client "hey, something you asked for a bit ago has changed". Some might reload contents based upon a "reload" or "refresh" option in the MTP client UI. Others might assume the contents are unchanging, until such time as the device is unplugged and re-plugged in.
In general, I would not advocate your approach (continuously rewriting the same file, with the user expecting to be able to see those changes in real time), simply due to the apparent limitations of MTP.
When I read a media file from the SD card for the first time after inserting it into the device, the read performance is much worse than on the second read. Does anybody have any idea about this phenomenon, and how I can avoid it? I tried both open and fopen, but the results are the same. I just want read performance to be consistent, no matter when I insert the SD card. Thanks.
Using O_DIRECT (see open(2)) when opening the file will bypass the buffer cache. This is often not a good idea, but I would expect it to be more consistent from run to run.
Keep in mind that using O_DIRECT requires the buffer you read into to be aligned to SC_PAGESIZE, and reads to be done in blocks that are multiples of SC_PAGESIZE.
Are you saying it's worse for the first read than subsequent reads before you remove the device? If so, this is normal - it's due to buffering. Basically the system is using the system RAM to speed up the perceived speed of the device.
If you remove the card after unmounting it and then put it back and remount it I would expect the first read would again be slower, then subsequent reads would appear to be faster again.
So I'm creating a project on an Android Galaxy Tab 10.1 that uses its accelerometers and gyroscopes. One of the steps is to collect a lot of data and determine its accuracy and drift. To do this I need to move the device while the sensors take their readings, and then run those readings through some analysis.
My admittedly primitive way of doing this is to write each reading to the log as an error, copy the log from Eclipse, paste it into Notepad, and format it, getting rid of unwanted errors and timestamps. This method is not very good, and the Eclipse log discards old entries once the list gets too long, meaning I can only look at about 30 seconds of data.
I need much more time than this so I thought of a couple ways to fix the issue:
If I could somehow write the readings directly from the Android device to a document on the computer I'm using, that would solve all my problems.
If I could write the readings to a file on the Android device that could be saved and later transferred to the computer, that would also work.
Writing a file to the internal SD card is quite easy and would probably solve your second point. You can access the file later via USB, which mounts the internal SD card like a USB thumb drive.
For example, this question and answer gives you some starting points.
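As a sketch of that second option, each sensor sample can be appended to a CSV file. The file name and the `Environment.getExternalStorageDirectory()` location mentioned in the comment are illustrative assumptions; on the device, `append` would be called from the sensor listener's callback:

```java
import java.io.FileWriter;
import java.io.IOException;
import java.util.Locale;

public class SensorLogger {
    // Format one accelerometer/gyroscope sample as a CSV row: timestamp,x,y,z.
    static String toCsv(long timestampNs, float x, float y, float z) {
        return String.format(Locale.US, "%d,%.6f,%.6f,%.6f", timestampNs, x, y, z);
    }

    // Append a row to the log file. On Android, path would be something like
    // new File(Environment.getExternalStorageDirectory(), "sensorlog.csv").
    static void append(String path, String row) throws IOException {
        try (FileWriter w = new FileWriter(path, true)) { // true = append mode
            w.write(row);
            w.write('\n');
        }
    }
}
```

Unlike the Eclipse log, this keeps every sample for the whole run, and the resulting CSV imports directly into whatever analysis tool you use on the PC.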
Having a design discussion with some co-workers about our app. Looking for the best way to transfer large data files on, say, a weekly basis from a phone to a remote server. The server will be in the DMZ and the phone will be on either WiFi or GSM. Some of the files will be 100 MB and can even get up to 400 MB. Just not sure of the best way to approach this in my Android code. I was looking at MTOM or even just pure FTP. Any advice would be appreciated.
I have investigated the use of MTOM on Android but found nothing. I don't know whether there is any implementation working on Android yet.
But this is something you can do by FTP, which would be a good choice I think. And you could check the integrity of the file using a checksum (calculated on both sides and then compared).
Using 3G for huge files is likely to take a long time and to be expensive, so to me the best option is to use WiFi. You can detect whether your device is connected through WiFi as described here.
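A minimal sketch of the checksum part using the standard `java.security.MessageDigest` API (MD5 here purely as an integrity check, not for security). Running the same function on the phone before the upload and on the server afterwards lets you compare the two hex strings:

```java
import java.security.MessageDigest;

public class Checksum {
    // Hex-encoded MD5 digest of a byte array. Compute it on the phone before
    // the FTP upload and again on the server after the transfer; if the two
    // strings match, the file arrived intact.
    static String md5Hex(byte[] data) throws Exception {
        byte[] digest = MessageDigest.getInstance("MD5").digest(data);
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) {
            sb.append(String.format("%02x", b & 0xff));
        }
        return sb.toString();
    }
}
```

For 100-400 MB files you would stream the file through `MessageDigest.update` in chunks rather than loading it all into memory, but the comparison logic is the same.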