I am trying to use the network-to-file image plugin. Since the images come from my Firebase Storage, it goes without saying that I do not want to overload my available bandwidth, so I want to use this plugin to save files locally and avoid re-downloading them every time.
The problem is, the Widget requires a File to store it (of course).
NetworkToFileImage(
  url: "http://example.com/someFile.png",
  file: myFile,
)
But I cannot specify a path, because the only way to get a writable location is through getApplicationSupportDirectory(), which is an async method. A huge workaround would be to make tons of things asynchronous, but it seems weird that getting this location (which is not an I/O operation, merely returning a constant String) is asynchronous. In Android Studio with Java, for example, getting the folder can be done synchronously through context.getFilesDir(). So why not with Flutter?
Related
In my Android app, there is a list of objects fetched from the server. Each object's JSON contains some textual data along with a set of image and video URLs. The use case is that the user can download all of this data with a single click, and it is then stored in local storage.
I have tried different ways to make this happen using WorkManager, but I am not able to find a way to track the download progress as a whole to show in the UI, since multiple files are being downloaded. Also, how can I make sure, from the downloaded files, that the whole data set of a specific object was downloaded completely in case any of the downloads failed?
track this download progress as a whole to show in the UI as there are multiple files being downloaded.
In the old days we needed to use a ForegroundService, but this capability is now available in WorkManager. To use WorkManager with the foreground-service feature, check out these articles:
WorkManager: use foreground service for executing long running tasks
Use WorkManager for immediate background execution
Support for long-running workers
how can I make sure if the whole data of that specific object is downloaded completely from the downloaded files in case any of them failed?
You can do a null check on a relevant field: record the local path for each downloaded file, and treat the object as complete only when every expected file has a non-null path.
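A minimal sketch of that check, assuming the app records a local path (or null on failure) for each expected URL; the names `mediaUrls` and `downloads` are hypothetical, not part of WorkManager:

```javascript
// Sketch: decide whether an object's media set is fully downloaded, and
// report aggregate progress across its files. `downloads` maps each remote
// URL to a local file path, or null if that download failed / is pending.
function isFullyDownloaded(mediaUrls, downloads) {
  return mediaUrls.every((url) => downloads[url] != null);
}

// Overall progress across multiple files, as a fraction between 0 and 1.
function overallProgress(mediaUrls, downloads) {
  if (mediaUrls.length === 0) return 1;
  const done = mediaUrls.filter((url) => downloads[url] != null).length;
  return done / mediaUrls.length;
}
```

The same two functions cover both parts of the question: the fraction drives the UI progress indicator, and the completeness check tells you whether a retry is needed.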
I am using axios for API calls and AsyncStorage for storing some details like the user profile. But AsyncStorage seems very slow.
My app's key functionality needs the internet to work, but a lot of screens have content that doesn't update in real time, or at all. How do I save all that info locally so that whenever a new session of the app is launched, the screens show fully loaded instead of querying the API and showing a loader, with any updates happening in the background? This is how all the major apps seem to work: instead of re-loading already loaded content, they save it on the device itself, while still allowing updates when new info comes in.
Would some sort of in-app DB to store all the info work? If so, what is the best option to pair with a Redux/axios implementation that is fast, unlike AsyncStorage?
AsyncStorage should work fine. To implement this, you will need to make sure to use the API data in the foreground only on the first app start, not on subsequent starts, and then add an updater function that checks for updates in the background. Here's a rough implementation:
1. When the app loads, try to get your data from local storage. If it's the first start, there's nothing in storage, so fetch the data from the API.
2. Once the data comes back, save it to local storage for the next app start, then display it.
3. If the data exists in local storage (subsequent starts), display it, then send an API call to fetch the latest data; once it comes back, put it in local storage. Make sure this is not a blocking call. It's up to you whether to update the data currently on screen.
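The steps above can be sketched as follows. Here `storage` stands in for AsyncStorage with the same promise-based getItem/setItem API, implemented as an in-memory stub so the example is self-contained; `fetchFromApi` and the "profile" key are hypothetical:

```javascript
// In-memory stand-in for AsyncStorage (same getItem/setItem contract).
function makeMemoryStorage() {
  const map = new Map();
  return {
    getItem: async (key) => (map.has(key) ? map.get(key) : null),
    setItem: async (key, value) => { map.set(key, value); },
  };
}

async function loadData(storage, fetchFromApi, onDisplay) {
  const cached = await storage.getItem("profile");
  if (cached !== null) {
    // Subsequent start: show cached data immediately...
    onDisplay(JSON.parse(cached));
    // ...then refresh in the background without blocking the UI.
    fetchFromApi().then(async (fresh) => {
      await storage.setItem("profile", JSON.stringify(fresh));
    });
  } else {
    // First start: nothing cached, so fetch, persist, and display.
    const fresh = await fetchFromApi();
    await storage.setItem("profile", JSON.stringify(fresh));
    onDisplay(fresh);
  }
}
```

The background refresh here only persists the fresh data for the next launch; whether to also push it to the screen is the choice mentioned in step 3.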
I have worked on several projects using the redux-persist library. In short, it caches the information you handle with Redux, without you having to worry about saving it with AsyncStorage yourself.
If you really don't want to use AsyncStorage directly and you already use Redux, I think it's the perfect option. I have been using this library for about 2 years and so far I have not had any problems.
For more information visit:
https://github.com/rt2zz/redux-persist
I am creating a data repository layer in my application which serves data from two sources: network API calls or local storage. When a request comes to the repository layer, I first need to check whether the data is present in local storage; if not, I make a call to my API to get it. To perform this check, I was thinking of two ways to implement it:
On app startup, I load all the data from the database into memory (e.g. lists, maps, etc.), and whenever I have to check for the existence of data in local storage, I check memory instead. This has a possible problem that I have faced before: when the app is in the background, the Android system might clear this allocated memory to serve other apps, which complicates things for me.
Whenever I want to check for the existence of data in local storage, I directly query the SQL tables and decide based on the result. This is a much leaner and cleaner solution than the one above. My only worry here is the performance hit. I wanted to know whether the SQLite database runs from memory after being loaded. If it does, the memory layer I described above is useless; if not, is it safe to keep querying the SQLite database for each and every small data request?
SQLite caches some data, and the Android OS will keep the database file in its file cache, as long as there is enough memory.
So when using SQLite, your app's performance is automatically optimized, depending on available memory.
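Option 2 from the question can be sketched as a repository that queries local storage directly on every request and falls back to the API on a miss. `db` stands in for the SQLite layer and `fetchFromApi` for the network call; both are hypothetical stubs so the example is self-contained:

```javascript
// Repository that always consults local storage first. With real SQLite,
// the page cache (plus the OS file cache) keeps repeated small queries
// cheap, which is why no extra in-memory layer is needed.
function makeRepository(db, fetchFromApi) {
  return {
    async get(id) {
      // Query local storage first.
      const local = await db.get(id);
      if (local !== undefined) return local;
      // Miss: fetch from the API and persist for next time.
      const remote = await fetchFromApi(id);
      await db.put(id, remote);
      return remote;
    },
  };
}
```

Because every read goes through the same path, there is no in-memory state for Android to reclaim while the app is backgrounded, which avoids the problem described in option 1.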
I've been trying to figure out whether I'm doing something wrong or whether this is the intended behaviour. I am creating an app that uploads files to Google Drive using the Google Play Services Drive API. Before I create the parent folder, I check whether the folder already exists. For testing, I delete the folder, then check in the app; however, the app always detects that the folder is still there.
I have checked isTrashed on the MetadataBuffer and it's always reported as false.
Is this a syncing issue between app and drive server?
This is the query that I use:
new Query.Builder()
        .addFilter(Filters.and(
                Filters.eq(SearchableField.TRASHED, false),
                Filters.eq(SearchableField.MIME_TYPE, MIME_TYPE_FOLDER),
                Filters.eq(SearchableField.TITLE, name)))
        .build();
Welcome to the club :-). This issue is one of the known quirks (pardon, features) of the GDAA, discussed all over the place. What you see is the side effect of having a 'buffering' layer between your app and Google Drive. When placing any GDAA-related request, you're talking to the Google Play Services layer, which has no clue what other apps (or http://drive.google.com) did to the Drive. So many weird things can happen, for instance:
1/ create a folder with your app
2/ trash and permanently delete that folder using http://drive.google.com
3/ let your app create a file in that folder - there is no indication of failure, and the file is created in a folder somewhere in Google never-never-land
If you really need to know the current Drive status, you have to poll the Drive using the REST Api.
Another option to keep it under control is to write your folders/files into an appfolder, making them inaccessible to other apps. Then not even you can delete a file 'from behind'. Be careful, though: there is a related problem with the appfolder that you can run into when testing.
Good Luck
As Sean mentioned, GDAA has a caching layer that can cause your app to get stale query results.
The way to ensure that the cache is up to date before you query it is to call DriveApi#requestSync.
The caching layer allows your app to operate when offline. If the device is offline, the Drive API will upload your file when the device goes back online.
The REST API is indeed more straightforward and can be a good option if you do not care about the offline case or are happy to handle it in your app yourself.
Hope this helps.
This question is a bit old, but the answer might be useful for others having the exact same problem.
Before making any query / operation that requires an up to date metadataBuffer you must call DriveClient.requestSync() to update it.
But be careful when using this method. According to the documentation below:
public abstract Task<Void> requestSync()

Requests synchronization with the server to download any metadata changes that have occurred since the last sync with the server.

Typically, this method should be called when the user requests a refresh of their list of files. Once this method returns, performing a query will return fresh results.

In order to avoid excessive load on the device and server, sync requests are rate-limited. If the request has been rate-limited, the operation will fail with the DRIVE_RATE_LIMIT_EXCEEDED status. This indicates that a sync has already occurred recently so there is no need for another. After a sufficient backoff duration, the operation will succeed when re-attempted.
Source: Google APIs for Android.
My implementation is as follows:
private void refreshMetadata(final File databaseFile) {
    mDriveClient.requestSync()
            .addOnSuccessListener(aVoid -> checkIfBackupFolderIsInDrive(databaseFile))
            .addOnFailureListener(e -> Utils.log("SettingsFragment",
                    "Could not update metadata buffer: " + e.getMessage()));
}
I request the sync. If it succeeds, I start the operations I want to perform. If it fails, right now I'm only logging the event.
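The rate-limiting behaviour described in the docs suggests retrying with a backoff rather than treating DRIVE_RATE_LIMIT_EXCEEDED as fatal. A minimal sketch of that pattern, with `requestSync` as a hypothetical async stand-in for the real DriveClient.requestSync() call and "RATE_LIMIT" as a made-up error code:

```javascript
// Retry a sync request with exponential backoff when it is rate-limited.
// Any other failure (or exhausting the retry budget) is rethrown.
async function syncWithBackoff(requestSync, retries = 3, delayMs = 1000) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await requestSync();
    } catch (err) {
      if (err.code !== "RATE_LIMIT" || attempt >= retries) throw err;
      // Back off before re-attempting, doubling the wait each time.
      await new Promise((resolve) => setTimeout(resolve, delayMs * 2 ** attempt));
    }
  }
}
```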
I am integrating Dropbox into my Android application. The requirement is to get metadata for all files. I have downloaded the Dropbox SDK and I am able to authenticate the user's credentials as well.
I am unable to get all files using the call below; it provides metadata for the contents of the root folder only:
mDBApi.metadata("/", 0, null, true, null);
I need to chain API calls to get the metadata of all files and folders. I want to do it lazily: get 100 files, dump the metadata into a local cache, then get the next 100 files' metadata (irrespective of file location, i.e. no matter whether the file is in the root or nested under 10 folders).
I am using "AccessType.DROPBOX" access to get all the contents. I know that there is a parameter for specifying how many files to fetch in one go ("0" in the above statement) but I want to keep fetching metadata till all data has been received in my client. Is there a way for this or a sample code may be.
Thanks.
The Dropbox API metadata call is meant only for retrieving a single file or folder's metadata. The best practices page indicates that you should not recursively call this endpoint:
https://www.dropbox.com/developers/reference/bestpractice
You may instead be interested in using the delta call to build and maintain your local state:
https://www.dropbox.com/developers/core/api#delta
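The delta call returns pages of changes plus a cursor, and you keep calling it until has_more is false. A sketch of that loop, where `callDelta` is a hypothetical wrapper around the /delta endpoint (the response fields entries, cursor, and has_more are from the Core API; the demo below drives it with a stub):

```javascript
// Page through /delta until the server reports no more changes, collecting
// all entries. Pass the saved cursor back in on later syncs to fetch only
// what changed since.
async function fetchAllMetadata(callDelta) {
  const entries = [];
  let cursor = null;
  let hasMore = true;
  while (hasMore) {
    // Each response carries one page of entries, a new cursor, and has_more.
    const page = await callDelta(cursor);
    entries.push(...page.entries);
    cursor = page.cursor;
    hasMore = page.has_more;
  }
  return { entries, cursor }; // keep the cursor to resume on the next sync
}
```

This matches the lazy-fetch requirement from the question: each call returns one batch that can be dumped into the local cache before the next batch is requested.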