I use OkHttp 3 on Android 4.4.
Most answers I found are about caching the content of the HTTP responses of some request.
I want to cache the requests themselves, so that when I am offline for some time and the connection then comes back, all pending requests are sent.
How do I do this with OkHttp? Is there a built-in request queue?
I read about silent retry, but I am not sure how it behaves exactly. What if my connection is bad for, say, 6 hours? What if I have 30 pending requests?
You might want to look at Tape.
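Tape is Square's persistent queue library: the idea is to write each request to a durable queue while you are offline and drain the queue once connectivity returns. Below is a minimal sketch of that pattern with OkHttp; the queue here is a plain in-memory ArrayDeque standing in for a Tape-backed queue, and the class and method names (PendingRequestQueue, drain) are made up for illustration.

```java
import java.io.IOException;
import java.util.ArrayDeque;
import java.util.Deque;

import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

// Illustrative sketch only: an in-memory deque stands in for a persistent
// Tape-backed queue, so these pending requests would not survive a restart.
public class PendingRequestQueue {
    private final OkHttpClient client = new OkHttpClient();
    private final Deque<Request> pending = new ArrayDeque<>();

    // Call this instead of executing the request directly while offline.
    public synchronized void enqueue(Request request) {
        pending.add(request);
    }

    // Call this when your connectivity listener reports that the network is back.
    public synchronized void drain() {
        while (!pending.isEmpty()) {
            Request next = pending.peek();
            try (Response response = client.newCall(next).execute()) {
                if (!response.isSuccessful()) {
                    break; // server rejected it; keep it queued and retry later
                }
                pending.remove();
            } catch (IOException e) {
                break; // still offline; stop draining and retry later
            }
        }
    }
}
```

With Tape the deque would be replaced by a queue file on disk, so requests queued while offline also survive app restarts.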
Related
I am currently storing my transactions in an SQLite DB when there is no internet connection, and when internet becomes available I send the pending transactions along with the new ones. This amounts to quite a lot of data and a lot of API hits, which chokes the device and makes it unresponsive. So I need help with a proper way to sync these transactions to the server. They are also being sent to a socket as well as to the server.
I tried using AsyncTask for it, but it also causes problems if there are more than 200 transactions. I tried Retrofit, and to some extent the count I could handle went from 200 to almost 350, but the issue and the unresponsiveness remain.
You should give the PriorityJobScheduler library or WorkManager from Jetpack a try.
When there is no network connection you can queue those requests, and they will be sent once a network connection is available (so you don't need to wait until someone makes a new transaction to send the old queued data too).
Also, in your current scenario, rather than sending a single request for each transaction, ask your API developer to accept the request body as a list of objects. Then you only need to build a list of request body objects and send it to the server in one call.
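A minimal WorkManager sketch of this, assuming a hypothetical SyncTransactionsWorker that reads the pending transactions from the local DB and posts them as one batch; the network constraint makes the work run only when a connection is available:

```java
import android.content.Context;

import androidx.annotation.NonNull;
import androidx.work.Constraints;
import androidx.work.NetworkType;
import androidx.work.OneTimeWorkRequest;
import androidx.work.WorkManager;
import androidx.work.Worker;
import androidx.work.WorkerParameters;

// Hypothetical worker: the DB access and upload code is up to you.
public class SyncTransactionsWorker extends Worker {
    public SyncTransactionsWorker(@NonNull Context context, @NonNull WorkerParameters params) {
        super(context, params);
    }

    @NonNull
    @Override
    public Result doWork() {
        // 1. Load pending transactions from SQLite.
        // 2. POST them to the server as a single list-of-objects request.
        // 3. Mark them as synced on success.
        boolean uploaded = uploadPendingTransactions(); // placeholder
        return uploaded ? Result.success() : Result.retry();
    }

    private boolean uploadPendingTransactions() {
        return true; // placeholder for the real sync logic
    }
}

// Enqueue the work; it runs only when the device has a connection
// and survives process death and reboots.
class SyncScheduler {
    static void schedule(Context context) {
        Constraints constraints = new Constraints.Builder()
                .setRequiredNetworkType(NetworkType.CONNECTED)
                .build();
        OneTimeWorkRequest request = new OneTimeWorkRequest.Builder(SyncTransactionsWorker.class)
                .setConstraints(constraints)
                .build();
        WorkManager.getInstance(context).enqueue(request);
    }
}
```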
What is the best way to defer a simple okhttp3 web request like:
Request request = new Request.Builder().url(url).post(body).build();
Response response = client.newCall(request).execute();
so that it is executed as soon as the client goes online, and only then handles the result?
For example, the client changes some data locally and this change should be posted to the server. But there are cases when the user does this offline, and we still have to handle it.
You can use a broadcast receiver that listens for internet connectivity changes, together with a deferred library such as JDeferred for deferring the request.
If a deferred library is too much work and overkill, you can use a CountDownLatch and wait on a separate thread until the connectivity changes (a sketch is shown below), but the wait time can be huge, so choose wisely.
You can also consider CompletableFuture, but it is only available from API 24.
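A minimal sketch of the CountDownLatch variant, assuming you only need to defer one call; CONNECTIVITY_ACTION is deprecated on newer API levels but works on older devices, and the class and method names here are made up for illustration:

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.net.ConnectivityManager;
import android.net.NetworkInfo;

import java.io.IOException;
import java.util.concurrent.CountDownLatch;

import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.RequestBody;
import okhttp3.Response;

// Block a worker thread on a latch until a connectivity broadcast
// reports that the network is back, then run the deferred call.
public class DeferredPostExample {

    public static void postWhenOnline(final Context context, final String url, final RequestBody body) {
        final CountDownLatch online = new CountDownLatch(1);

        final BroadcastReceiver receiver = new BroadcastReceiver() {
            @Override
            public void onReceive(Context ctx, Intent intent) {
                ConnectivityManager cm =
                        (ConnectivityManager) ctx.getSystemService(Context.CONNECTIVITY_SERVICE);
                NetworkInfo info = cm.getActiveNetworkInfo();
                if (info != null && info.isConnected()) {
                    online.countDown(); // release the waiting thread
                }
            }
        };
        context.registerReceiver(receiver, new IntentFilter(ConnectivityManager.CONNECTIVITY_ACTION));

        new Thread(new Runnable() {
            @Override
            public void run() {
                try {
                    online.await(); // blocks until connectivity is reported
                    OkHttpClient client = new OkHttpClient();
                    Request request = new Request.Builder().url(url).post(body).build();
                    Response response = client.newCall(request).execute();
                    response.close();
                    // handle the result here
                } catch (InterruptedException | IOException e) {
                    // handle the failure here
                } finally {
                    context.unregisterReceiver(receiver);
                }
            }
        }).start();
    }
}
```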
Let me know if you need more clarification.
I am using retrofit on the android side and node.js on the backend side.
Retrofit allows async requests, so I can either send two requests from Android in parallel and get the results back, or send one request and have the server use caolan async to run the two operations in parallel.
Which approach should I use with Retrofit: send parallel requests, or combine them into one request that the server executes in parallel?
I think the second approach is better, because the first approach increases the number of server requests, which adds to the server load.
Please tell me, which approach should I use?
Your mileage may vary, but in general batching saves opening and closing multiple connections. I've had performance gains once I started batching multiple requests together rather than opening and closing a connection for each request. The best way to find out for your use case is to write a test for both and compare; a small sketch of both options is below.
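A small sketch of both options with Retrofit 2, assuming hypothetical endpoints (profile, settings, and a combined dashboard) and placeholder response types; the real API surface is up to you and your backend:

```java
import retrofit2.Call;
import retrofit2.Callback;
import retrofit2.Response;
import retrofit2.http.GET;

// Hypothetical endpoints for illustration only.
interface ApiService {
    @GET("profile")
    Call<Profile> getProfile();

    @GET("settings")
    Call<Settings> getSettings();

    // Combined endpoint: the node.js side runs both queries in parallel
    // (e.g. with caolan async or Promise.all) and returns one payload.
    @GET("dashboard")
    Call<Dashboard> getDashboard();
}

class Example {
    // Option 1: two parallel requests from the client.
    static void loadSeparately(ApiService api) {
        api.getProfile().enqueue(new Callback<Profile>() {
            @Override public void onResponse(Call<Profile> call, Response<Profile> response) { /* ... */ }
            @Override public void onFailure(Call<Profile> call, Throwable t) { /* ... */ }
        });
        api.getSettings().enqueue(new Callback<Settings>() {
            @Override public void onResponse(Call<Settings> call, Response<Settings> response) { /* ... */ }
            @Override public void onFailure(Call<Settings> call, Throwable t) { /* ... */ }
        });
    }

    // Option 2: one batched request; a single connection and round trip.
    static void loadBatched(ApiService api) {
        api.getDashboard().enqueue(new Callback<Dashboard>() {
            @Override public void onResponse(Call<Dashboard> call, Response<Dashboard> response) { /* ... */ }
            @Override public void onFailure(Call<Dashboard> call, Throwable t) { /* ... */ }
        });
    }
}

// Placeholder model classes.
class Profile {}
class Settings {}
class Dashboard {}
```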
I'm using Volley as network library on Android. I ran into 'limited functionality' problems when using ImageLoader. It seems to be quite useful class with caching and stuff, so I want to continue using it. However, it doesn't give any access to the Request objects it creates. As a result, I'm not able to do some stuff that I can do in other cases (like setting a tag on the request for cancelling it from queue).
My current problem is - how can I set a retry policy on requests made using ImageLoader?
I think there is no way to set a retry policy on ImageLoader itself. But you have access to all requests through your Volley singleton (if you use one): try changing the retry policy in its addToRequestQueue method. If you need a different retry specification for images than for other requests, you can simply create two request queues (bad practice).
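A minimal sketch of that idea, assuming the usual Volley singleton pattern with an addToRequestQueue helper; the timeout and retry values are arbitrary examples:

```java
import android.content.Context;

import com.android.volley.DefaultRetryPolicy;
import com.android.volley.Request;
import com.android.volley.RequestQueue;
import com.android.volley.toolbox.Volley;

// Typical Volley singleton; the retry policy is applied to every request
// that is added through addToRequestQueue.
public class VolleySingleton {
    private static VolleySingleton instance;
    private final RequestQueue requestQueue;

    private VolleySingleton(Context context) {
        requestQueue = Volley.newRequestQueue(context.getApplicationContext());
    }

    public static synchronized VolleySingleton getInstance(Context context) {
        if (instance == null) {
            instance = new VolleySingleton(context);
        }
        return instance;
    }

    public <T> void addToRequestQueue(Request<T> request) {
        // Arbitrary example values: 5 s timeout, 3 retries, 2x backoff.
        request.setRetryPolicy(new DefaultRetryPolicy(5000, 3, 2f));
        requestQueue.add(request);
    }
}
```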
I have two AsyncTask classes for handling requests: one for sync and another for other requests. But when I send the sync request and move to another page, the response for the second request only arrives after the sync request has responded. How can I solve this?
I hope someone can help me.
Basically, you can use Volley.
Volley offers the following benefits:
Automatic scheduling of network requests.
Multiple concurrent network connections.
Transparent disk and memory response caching with standard HTTP cache coherence.
Support for request prioritization.
Cancellation request API. You can cancel a single request, or you can set blocks or scopes of requests to cancel.
Ease of customization, for example, for retry and backoff.
Strong ordering that makes it easy to correctly populate your UI with data fetched asynchronously from the network.
Debugging and tracing tools.
You can easily find a tutorial for it, and it is much faster than AsyncTask.
For reference, check this.
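To give a feel for the API, here is a minimal Volley sketch, assuming a hypothetical URL and that you just want the response as a string:

```java
import android.content.Context;

import com.android.volley.Request;
import com.android.volley.RequestQueue;
import com.android.volley.Response;
import com.android.volley.VolleyError;
import com.android.volley.toolbox.StringRequest;
import com.android.volley.toolbox.Volley;

public class VolleyExample {
    public static void fetch(Context context) {
        RequestQueue queue = Volley.newRequestQueue(context);
        String url = "https://example.com/api/data"; // hypothetical endpoint

        StringRequest request = new StringRequest(Request.Method.GET, url,
                new Response.Listener<String>() {
                    @Override
                    public void onResponse(String response) {
                        // runs on the main thread; update the UI here
                    }
                },
                new Response.ErrorListener() {
                    @Override
                    public void onErrorResponse(VolleyError error) {
                        // handle the error here
                    }
                });

        // Requests are scheduled and executed concurrently by the queue.
        queue.add(request);
    }
}
```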
You can make AsyncTask run in parallel by replacing execute() with executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR).
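For example, assuming two hypothetical task classes standing in for your sync and other-request tasks, each one then runs on the shared thread pool instead of the default serial executor:

```java
import android.os.AsyncTask;

public class ParallelTasksExample {

    // Hypothetical tasks standing in for the two AsyncTask classes in the question.
    static class SyncTask extends AsyncTask<Void, Void, Void> {
        @Override protected Void doInBackground(Void... params) { /* long sync work */ return null; }
    }

    static class OtherTask extends AsyncTask<Void, Void, Void> {
        @Override protected Void doInBackground(Void... params) { /* other request */ return null; }
    }

    static void runBoth() {
        // execute() would run these one after another on the default serial executor;
        // executeOnExecutor(THREAD_POOL_EXECUTOR) lets them run at the same time.
        new SyncTask().executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR);
        new OtherTask().executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR);
    }
}
```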