Low-resolution thumbnail management in chat applications - Android

I am unsure about the approach usually followed by popular chat applications like WhatsApp, WeChat, etc. These apps share a low-resolution, blurred-out preview of the actual image/video file. How is this managed?
My concern is space management at the server end. Does the client process the original image, create a low-res version, and send two requests (original + blurred file) to the server? Following which, the blurred image, being smaller, is shared with others, who then trigger a GET request for the original image/video file.
Or does the server itself do some processing on the original file it receives, make a low-res version out of it, and proceed as above?
In both cases I could think of, space is being eaten up at the server end with two instances of each image/video being stored.
Kindly let me know how this is generally carried out.
Would be grateful!

You need to upload the original file to your web server, and the web server can then send a base64-encoded thumbnail of the file to the ejabberd server. Create the blurred image on the client side, not the server side (less workload on the server if you do it client-side). In this case you need to create a custom ejabberd module; through this custom module the HTTP server communicates with the ejabberd server.
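As a rough sketch of that client-side step (not taken from any particular app), the Android code below downsamples the original image, compresses it at low JPEG quality, and returns it as a base64 string; the sample size and quality values are arbitrary choices:
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.util.Base64;
import java.io.ByteArrayOutputStream;

public final class ThumbnailHelper {

    // Decode a heavily downsampled copy of the original and encode it as base64,
    // so it can travel inside the chat message while the original goes to the web server.
    public static String createBase64Thumbnail(String imagePath) {
        BitmapFactory.Options opts = new BitmapFactory.Options();
        opts.inSampleSize = 16;                       // ~1/16 of each dimension; looks blurry when scaled up
        Bitmap tiny = BitmapFactory.decodeFile(imagePath, opts);
        if (tiny == null) {
            return null;                              // not a decodable image
        }

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        tiny.compress(Bitmap.CompressFormat.JPEG, 40, out);   // low quality keeps the payload tiny
        tiny.recycle();

        return Base64.encodeToString(out.toByteArray(), Base64.NO_WRAP);
    }
}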

Related

Retrieve an image from an Android device and send it to a server

I am currently working on an Android application and I would like to add a feature for sending images. But I don't know how to get an image from the local storage of an Android device, nor how to send it to a server. I am currently using a Flask server to receive JSON payloads, and I don't know if receiving images would be possible with this kind of server.
Would you have any clue how to retrieve a file on a button press and send it to a server, please?
You can use this library to pick a file from local storage and upload it to the server with ease:
https://github.com/gotev/android-upload-service
At this point, it doesn't matter what server you use, as long as it can receive the data.
You can also do it using Retrofit. For detailed instructions, check this Medium post!
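A minimal Retrofit sketch for the upload, assuming a Flask route at /upload that accepts multipart/form-data (the base URL, path, and form field name are placeholders):
import java.io.File;
import okhttp3.MediaType;
import okhttp3.MultipartBody;
import okhttp3.RequestBody;
import retrofit2.Call;
import retrofit2.Retrofit;
import retrofit2.http.Multipart;
import retrofit2.http.POST;
import retrofit2.http.Part;

public class ImageUploader {

    // Hypothetical endpoint; point it at whatever route your Flask app exposes.
    interface UploadApi {
        @Multipart
        @POST("upload")
        Call<Void> uploadImage(@Part MultipartBody.Part image);
    }

    // Call this off the main thread (or use enqueue()) to avoid NetworkOnMainThreadException.
    public static void upload(File imageFile) throws Exception {
        Retrofit retrofit = new Retrofit.Builder()
                .baseUrl("https://example.com/")      // placeholder base URL
                .build();

        RequestBody body = RequestBody.create(MediaType.parse("image/jpeg"), imageFile);
        MultipartBody.Part part = MultipartBody.Part.createFormData("file", imageFile.getName(), body);

        retrofit.create(UploadApi.class).uploadImage(part).execute();
    }
}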

How do I upload large files to AWS S3 in chunks using Sinatra?

I have written an API in Ruby/Sinatra that connects with our Android app through HTTP(S) endpoints. One action involves uploading (zlib-compressed) files from the Android app to our S3 bucket via the application server. Using the aws-sdk gem, I've been doing so with a single POST /files endpoint, in which the Android client sends a zlib-compressed, base64-encoded string of the entire file in one go. This isn't best practice on either the Android side or our backend, though, and I'd like to be able to read file data in chunks on a POST request and upload each part in a multipart upload to S3. I have successfully implemented a streaming GET route in Sinatra, but I'm unable to do the same for this purpose. How do I achieve this? And is Sinatra streaming plus S3 multipart upload the right way to go?
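Not a full answer, but for the Android half of the chunked approach described above, a sketch like the following streams the zlib-compressed bytes instead of building one base64 string in memory (the endpoint URL and buffer size are placeholders):
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.zip.DeflaterInputStream;

public class ChunkedFileUpload {

    public static int upload(String filePath, String endpoint) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setChunkedStreamingMode(64 * 1024);      // avoid buffering the whole body in memory
        conn.setRequestProperty("Content-Type", "application/octet-stream");

        // DeflaterInputStream compresses with zlib as the file is read.
        try (InputStream in = new DeflaterInputStream(
                     new BufferedInputStream(new FileInputStream(filePath)));
             OutputStream out = conn.getOutputStream()) {
            byte[] buffer = new byte[64 * 1024];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);           // the server can hand each piece to an S3 part upload
            }
        }
        return conn.getResponseCode();
    }
}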

How do I configure my Apache2 web server?

I am trying to develop an Android application that sends images to my server for processing. What is the best way to configure the server in order to receive data from Android users (I am using HTTP requests)?
Thanks
Apache is simply a web server. Generally, when you send images, they are sent as POST data. This means you'll need to make sure that Apache can handle the sizes of the images sent to it. The directive LimitRequestBody controls this limit for Apache. It's set to '0' by default, which means it won't limit the POST body size.
The next thing you need is a server-side script to take care of the processing, storage, and response to the client. You may use Ruby (Rails), PHP, or Java to accomplish this. Each framework has default limits. For instance, PHP has the 'upload_max_filesize' directive, which needs to be raised if you expect images beyond the default 2 MB. This is set in php.ini.
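A minimal sketch of the two limits mentioned above; the 20 MB / 25 MB values are purely illustrative, not recommendations:
# Apache (apache2.conf, a vhost, or a <Directory> block):
# 0 means unlimited; set an explicit cap if you want to reject oversized uploads.
LimitRequestBody 20971520

; php.ini: raise the per-file and whole-request limits above the 2 MB default.
upload_max_filesize = 20M
post_max_size = 25M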

Android + NodeJS: Client-Server communication

I have some questions about developing a Android application which shall be able to communicate with a NodeJS server.
The Android application gathers some data and saves everything in a .csv file.
This file now needs to be uploaded to a NodeJS server. The NodeJS server should save the file as well as storing the content in a MongoDB.
My question now is how I should implement the communication between the Android device and the server.
I know how to upload a single file to a NodeJS server using an HttpURLConnection with a DataOutputStream.
But I need more than just uploading the file, because I need a unique identification of each Android device.
I thought about using the (encrypted) Google account email address of the user to distinguish the devices. I am not interested in knowing who uploads which data, but I need to store the data for each device separately.
The problem is that I don't know how to communicate between the device and the server.
If I upload a file via HttpURLConnection and DataOutputStream, it seems that I can only upload the file without any additional information like the unique key for the device.
I also thought about uploading the file via sockets. But I am not sure how to handle huge file sizes (5 MB or more).
I am not looking for code fragments. I rather need some hints to the right direction. Hopefully my problem was stated clearly and someone can help me with this.
Using an HttpURLConnection on the Android side and a RESTful server on the Node side would be a straightforward option.
You can embed information into the URL in a RESTful way:
pathParam: www.address.com/api/save/{clientId}/data
queryParam: www.address.com/api/save/data?c={clientID}
each uniquely identifying the client. This can be whatever scheme you choose. You will have to build the HttpURLConnection each time, as the URI is unique and important!
The server side can then route the URL however you see fit. Node has a number of packages to aid in that (Express, Restify, etc.). Basically you'll grab the body of the request to store into your DB, but the other parameters are available too so it's all a unique and separated transaction.
Edit: The package you use for RESTful handling can stream large files for you as well. Processing of the request can really begin once the data is fully uploaded to the server.
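A rough sketch of the Android side under the path-parameter scheme (the host name is the placeholder from the example URLs above, and the content type is an assumption):
import java.io.DataOutputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class CsvUploader {

    // The device identifier is embedded in the URL; the .csv file is streamed
    // in the POST body using the DataOutputStream approach from the question.
    public static int uploadCsv(String clientId, String csvPath) throws Exception {
        URL url = new URL("https://www.address.com/api/save/" + clientId + "/data");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/csv");

        try (InputStream in = new FileInputStream(csvPath);
             DataOutputStream out = new DataOutputStream(conn.getOutputStream())) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
        // On the Node side an Express/Restify route reads the client id from the
        // path (or query) parameter to keep each device's data separate.
        return conn.getResponseCode();
    }
}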
Using a socket would be nearly just as easy. The most difficult part will be 'making your own protocol' which in reality could be very simple.
Upload one file at a time by sending data to the socket like this:
54::{filename:'myfilename.txt',length:13023,hash:'ss23vd'}xxxxxxxxxxx...
54= length of the JSON txt
:: = the delimiter between the length and the JSON
{JSON} = additional data you need
xxx... = 13023 bytes of data
Then once all the data is sent you can disconnect... OR if you need to send another file, you know where the next set of data should be.
And since node.js is javascript you already have wonderful JSON support to parse the JSON for you.
Would I suggest using a socket? Probably not. Because if you ever have to upload additional files at the same time, HTTP and node.js HTTP modules might do a better job. But if you can guarantee nothing will ever change, then sure, why not... But that's a bad attitude to have towards development.
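For completeness, a client-side sketch of that framing in Java might look like this (the hash value is just the placeholder from the example, and error handling is omitted):
import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class SocketFileSender {

    // Sends: "<headerLength>::<JSON header><raw file bytes>"
    public static void send(String host, int port, File file) throws Exception {
        String header = "{\"filename\":\"" + file.getName() + "\","
                + "\"length\":" + file.length() + ","
                + "\"hash\":\"ss23vd\"}";             // placeholder hash from the example above
        byte[] headerBytes = header.getBytes(StandardCharsets.UTF_8);

        try (Socket socket = new Socket(host, port);
             OutputStream out = socket.getOutputStream();
             InputStream in = new BufferedInputStream(new FileInputStream(file))) {

            out.write(String.valueOf(headerBytes.length).getBytes(StandardCharsets.UTF_8));
            out.write("::".getBytes(StandardCharsets.UTF_8));
            out.write(headerBytes);

            byte[] buffer = new byte[8192];           // then exactly "length" bytes of file data
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            out.flush();
        }
    }
}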

How would I use the Blobstore to process Android images?

Can someone please clarify this for me? I am reading the developer page about the Blobstore at https://developers.google.com/appengine/docs/java/blobstore/overview. I can't seem to wrap my head around the process of saving and retrieving blobs. It sounds like:
the Android app would directly send an image to the Blobstore
after saving the image, the Blobstore would then return a blob key to my backend for me to put in the datastore
Is that the process? Maybe it's because I have had a long day, but I just can't see it. If someone has an example they don't mind sharing, please post it. I just need to save images from Android in the Blobstore and then be able to retrieve them with a blob key or otherwise.
I have already looked at:
Upload to Appengine Blobstore in Android
Using Google BlobStore with an Android application
Android Interaction with Google App Engine Blobstore Service
What is the syntax to get a Blobstore upload url from Android?
For the life of me, I don't know why they are not doing it for me.
I suppose some questions are:
How does Android know where to send the blob? I mean, does Google distinguish between my instance of the Blobstore and other people's instances, similar to how it distinguishes my instances of the datastore? In other words, could I go to the App Engine application overview and see all the blobs that belong to my app, the way I can in the datastore? I suppose a complete, working piece of code could help me see these answers.
Part of my problem could be that I have never used servlets. I am presently using Google Cloud Endpoints for my API.
Actually, there are two ways to upload to the Blobstore:
Using a direct upload handler:
The server gets a unique, one-time, secret upload URL via createUploadUrl(..) and sends this URL to the client.
The client uses a multipart/form-data POST to upload data to this URL.
The upside is that you can upload large files (>32 MB).
Using the Blobstore FileService API, which is deprecated and should not be used any more:
You create your own POST upload handler to which the client uploads data.
You use the FileService API to save the data to the Blobstore.
The downside is that you can upload at most 32 MB of data (the generic GAE request limit).
The upside is that you have access to the data, so you can edit its contents if needed.
Your description of the process is correct. The only step you are missing is the first one: the server side calls blobstoreService.createUploadUrl(redirecturl) to generate the URL to upload to. Then the handler at redirecturl saves the blob key to the datastore.
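A condensed sketch of those two server-side pieces, assuming servlet paths /get-upload-url and /upload-complete and a multipart form field named "image" (all three names are placeholders):
import com.google.appengine.api.blobstore.BlobKey;
import com.google.appengine.api.blobstore.BlobstoreService;
import com.google.appengine.api.blobstore.BlobstoreServiceFactory;
import java.io.IOException;
import java.util.List;
import java.util.Map;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// 1) The Android client first asks for a one-time upload URL.
public class UploadUrlServlet extends HttpServlet {
    private final BlobstoreService blobstore = BlobstoreServiceFactory.getBlobstoreService();

    @Override
    public void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        resp.setContentType("text/plain");
        resp.getWriter().print(blobstore.createUploadUrl("/upload-complete"));
    }
}

// 2) The client POSTs the image as multipart/form-data to that URL; the Blobstore
// stores the bytes and forwards the request here with the blob key attached.
class UploadCompleteServlet extends HttpServlet {
    private final BlobstoreService blobstore = BlobstoreServiceFactory.getBlobstoreService();

    @Override
    public void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        Map<String, List<BlobKey>> uploads = blobstore.getUploads(req);
        BlobKey key = uploads.get("image").get(0);    // "image" = form field name used by the client
        // Persist key.getKeyString() in the datastore so the image can be served later.
        resp.setContentType("text/plain");
        resp.getWriter().print(key.getKeyString());
    }
}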
