I have a project and I'm curious what the community would recommend as the best approach to implementing it.
Process:
On an Android device, allow the user to assemble an image composite that includes a background, some clip art (positioned, rotated, and/or resized by the user), maybe a photo, and some text.
do some stuff...
print the image on the server.
The only mandatory requirement is that the server has to be a PC laptop, though I can replace Windows with Linux if I need to (don't ask).
Should I go down the road of constructing a finished JPEG right on the Android device and delivering that to the server for printing? If so, what Java image library is best for such a task?
Or should I try to run ImageMagick on the server (is that even possible if I keep the Windows that's pre-installed on it?), or some other image automation?
I could build this as an Adobe AIR app and run it on Android, but I'm worried about the portability of AIR to Android, and I don't have the time to get stuck halfway into something that involved only to have to start from scratch with a totally different approach.
All ideas welcome.
Well, Android actually ships with both a JPEG codec and a PNG codec. The UI work to overlay the images in the correct locations is up to you, but to construct the final version of the image, you can simply grab your main raster (android.graphics.Bitmap) and call Bitmap.compress() with a Bitmap.CompressFormat (JPEG or PNG) to save it to a stream somewhere. Then the file can be sent to the server. As long as the raster is relatively small (which I assume it is, or else you wouldn't really be able to provide a simple UI for compositing the image), this will work.
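To make that concrete, here's a minimal sketch of the composite-then-encode step. It uses the desktop java.awt equivalents (BufferedImage/Graphics2D/ImageIO) so it runs anywhere; on Android you'd do the same drawing with Canvas and a Matrix, then call Bitmap.compress(Bitmap.CompressFormat.JPEG, quality, stream). The class and method names here are just illustrative.

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

public class CompositeSketch {
    // Draw an overlay onto a background at (x, y), rotated about the overlay's
    // center, then encode the result as JPEG bytes ready to send to the server.
    public static byte[] composite(BufferedImage background, BufferedImage overlay,
                                   int x, int y, double rotationDegrees) throws IOException {
        BufferedImage out = new BufferedImage(
                background.getWidth(), background.getHeight(), BufferedImage.TYPE_INT_RGB);
        Graphics2D g = out.createGraphics();
        g.drawImage(background, 0, 0, null);
        g.rotate(Math.toRadians(rotationDegrees),
                 x + overlay.getWidth() / 2.0, y + overlay.getHeight() / 2.0);
        g.drawImage(overlay, x, y, null);
        g.dispose();
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        ImageIO.write(out, "jpg", bytes); // JPEG encode, analogous to Bitmap.compress
        return bytes.toByteArray();
    }
}
```

The same pattern extends to as many layers (clip art, photo, text) as the user stacks up: draw each one onto the single output raster, then encode once at the end.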
Hope this helps.
I need to download large audio files from a web server from within a corona sdk app, and have been looking at using network.download() or possibly network.request() for this purpose.
Because much of the user base will be in areas that have poor or intermittent network coverage, I would like to improve robustness of the download process by allowing for resuming download from where it left off if the network drops out.
From the documentation, neither network.download() nor network.request() seems to support this directly, but is there a way I can use either of these functions to achieve what I'm looking for? If not, is there another technique I can use?
My app will eventually be for both iOS and Android, but for now I am developing the iOS version first. Therefore, I am ok with having to use a different solution for each platform if there is not an easy solution that covers both platforms. However, I would prefer not to have to use native code if possible as I don't currently have an Enterprise subscription for Corona.
I don't think you can do this using Corona without native plugins.
One way to work around this problem is to split the large files into smaller ones. I don't mean creating multiple audio files, just splitting each large audio file into smaller files.
Then you can download one chunk at a time, and when one fails (by getting an error in the handler, or by waiting a reasonable timeout and then checking whether the file is present in the file system), you can start downloading it again.
After all chunks are downloaded, you can recreate the large file using the ltn12 library.
You can read some more about the ltn12 library here, and I think you need to take a close look at the pump functions.
On your server side, create a simple program that splits an audio file into multiple sub-files of the maximum size you would like to specify.
On your client side, create a function that combines multiple chunks into one single audio file.
For example, if you would like to limit your file size to 1MB, create a server-side program that splits any audio file above 1MB into chunks: a 4.5MB file would be split into part1 (1MB), part2 (1MB), part3 (1MB), part4 (1MB), and part5 (0.5MB).
Glue the chunks together into one single file in your Lua code once you have fetched them with network.request().
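The split/merge logic described above is just straight byte-stream work. Here's a rough sketch (written in Java for illustration; in Corona you'd do the client-side merge in Lua with ltn12, and the server-side split in whatever language your server runs):

```java
import java.io.ByteArrayOutputStream;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class Chunker {
    // Server side: split a file's bytes into chunks of at most maxChunkSize bytes.
    public static List<byte[]> split(byte[] data, int maxChunkSize) {
        List<byte[]> chunks = new ArrayList<>();
        for (int offset = 0; offset < data.length; offset += maxChunkSize) {
            int end = Math.min(offset + maxChunkSize, data.length);
            chunks.add(Arrays.copyOfRange(data, offset, end));
        }
        return chunks;
    }

    // Client side: glue the downloaded chunks back together, in order.
    public static byte[] merge(List<byte[]> chunks) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (byte[] chunk : chunks) {
            out.write(chunk, 0, chunk.length);
        }
        return out.toByteArray();
    }
}
```

Because each chunk is small, a failed download only costs you one chunk's worth of retransfer, which is the whole point on a flaky connection.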
I don't know whether this title clearly reflects the question content. Please help rephrase it if you think it is confusing. Thanks!
This is a generic question about some common architecture. I am working on an Android app that can take and share photos, just like Instagram, and I have a backend web service powered by Django.
The issue I am facing is that, in the app, I will need to show images at different resolutions (for example, thumbnails for profile images, mid-resolution for previews, and full resolution for expanded image views). I want to ask about a common pattern for making this happen.
I have two proposals for doing this task, and I'm not sure which way I should go:
1. When the user uploads photos from the mobile app, I can compress them locally and send three different sizes (low-res thumbnail, mid-res, and high-res), so the server side can store them and return them in the different cases. The con of this approach, as far as I can tell, is that it will use more of the user's data allowance, since the user now needs to send multiple images. It may also make uploading take longer, with a bigger impact on user experience.
2. When the user uploads photos from the mobile app, they only upload the original image. The server-side logic does the compression for each incoming image and stores the results accordingly. The con of this approach is that the server has to carry a lot more workload; if the user base grows a lot, it may crash the server.
I am hoping to get some pointer on this issue, or any reference about this topic will be helpful!
Upload the full-size image to the server and have the server do the heavy lifting. Create three versions of the image (small, medium, and large) and store them on the server (or a content delivery network). Create a database table to keep track of the image ID for each image and its various versions.
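A minimal sketch of that server-side step (the ImageVariants name and the 150/600 target widths are illustrative choices, not part of any standard):

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;
import java.util.LinkedHashMap;
import java.util.Map;

public class ImageVariants {
    // Produce the three stored versions of an uploaded image, scaled to fixed
    // target widths while preserving aspect ratio.
    public static Map<String, BufferedImage> makeVariants(BufferedImage original) {
        Map<String, BufferedImage> variants = new LinkedHashMap<>();
        variants.put("small", scaleToWidth(original, 150));
        variants.put("medium", scaleToWidth(original, 600));
        variants.put("large", original); // store the upload as-is
        return variants;
    }

    private static BufferedImage scaleToWidth(BufferedImage src, int width) {
        int height = Math.max(1, src.getHeight() * width / src.getWidth());
        BufferedImage dst = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = dst.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                           RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        g.drawImage(src, 0, 0, width, height, null);
        g.dispose();
        return dst;
    }
}
```

Each variant then gets written out and its path recorded in the database table alongside the image ID.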
With Android you won't know how powerful the phone running your app is, so never assume it can handle whatever image manipulation you need; also try to keep network I/O as low as possible.
Alan - think of smartphones as relatively dumb terminals made for the purpose of consuming content. Most of the business logic and heavy processing should be implemented on the server side.
In terms of architecture, you are facing a scalability problem. You cannot expand the CPU, memory, or storage of a device to any level you want. However, you can scale your servers horizontally or vertically by adding more RAM/cores/disks, etc. You can cluster your servers and run a whole farm of them if the data grows to that level.
So it's always advisable to just upload the original image without local processing. Local processing also drains the battery, on top of the other downsides you mentioned.
Also, if your business logic or processing technique changes, you have to redeploy all the apps, whereas server-side deployments are very much in your control.
What I would do is one first resize on the mobile device, just one, because you don't want to transfer a 13-megapixel picture, and you also don't want your server receiving a picture two screens wide (this is the first thing Instagram does). After that, upload the file and have the server do the rest of the work.
This avoids a lot of data usage.
It ensures the app will work fine on any device (every one has different capabilities, so you can't trust them for heavy work).
And you can change decisions about settings or configuration centrally; if you change any common behavior, you don't need all the devices to get the latest version of the app.
About crashing the server, there is one first measure I would take:
Don't do any operation, like resizing or making copies of the image, at the moment of the upload. You can choose one of these approaches:
Do it when there is a request for the file; that is, don't do it when the user sends it, but when somebody needs it.
Do it in a continuous background process, like a cron job running every minute, for instance.
Or a combination of the two: a background process handles all the pending images, but if somebody visits the website and needs an image that hasn't been generated yet, it is generated at that moment.
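The "generate on first request" idea can be as simple as a memoizing lookup in front of the expensive resize step. A hedged sketch (the LazyVariantCache name and the counter are made up for illustration; in production the cache would be disk or CDN backed rather than an in-memory map):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class LazyVariantCache {
    private final Map<String, byte[]> cache = new HashMap<>();
    private final Function<String, byte[]> generator; // the expensive resize step
    private int generations = 0;

    public LazyVariantCache(Function<String, byte[]> generator) {
        this.generator = generator;
    }

    // Generate the variant only when it is first requested, then reuse it.
    public byte[] get(String key) {
        byte[] cached = cache.get(key);
        if (cached == null) {
            cached = generator.apply(key);
            generations++;
            cache.put(key, cached);
        }
        return cached;
    }

    public int generationCount() { return generations; }
}
```

A background cron job can then simply walk the pending keys and call get() on each, so both strategies share one code path.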
I am looking to develop a transit app using GTFS static data. One of the constraints I've set to myself is that the app should use minimal mobile data transfers. Therefore, I would like to embed all the data in the app.
My issue is that GTFS data sets are usually quite large (85MB uncompressed for the city of Sydney for example). I've done a bit of reverse engineering on other apps out there and found out that some of them have managed to compress all that data into a much smaller file (I'm talking about a few MB at most).
Using 7zip, I've managed to compress my 85MB data set down to 5MB, which is acceptable for me. The next step is to use that 7z file in my app, and that's where I'm stuck. There's no way I'm going to uncompress it and put it in a SQL database, as that would use too much space on the phone. So I was wondering what my other options are.
Thanks
First, for embedding, I recommend using the Embedded XZ library (similar to 7zip). I have embedded this in a project and had good luck with it. Just be sure to compress data using 'xz --check=crc32' so it's compatible with Embedded XZ, and remember to initialize the CRC table.
As for a decompression strategy, you may need to segment the data in such a way that you can decompress different parts of it on demand (i.e., a tree of databases). I'm not familiar with your data's characteristics. Will a user need it all loaded at the same time? Or can it easily be compartmentalized?
Also, XZ can be a bit slow, even to decode. Have you evaluated how well regular gzip performs? That tends to be A) very fast; and B) available as a standard part of all embedded and mobile frameworks.
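Since gzip ships in the Java standard library (java.util.zip), it's easy to benchmark its ratio and speed against XZ on your own GTFS data. A quick round-trip sketch:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipCheck {
    // Compress a byte array with gzip.
    public static byte[] gzip(byte[] data) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(out)) {
            gz.write(data);
        }
        return out.toByteArray();
    }

    // Decompress gzip bytes back to the original data.
    public static byte[] gunzip(byte[] compressed) throws IOException {
        try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = gz.read(buf)) != -1) out.write(buf, 0, n);
            return out.toByteArray();
        }
    }
}
```

GTFS files are CSV text with lots of repetition, so gzip tends to do well on them even if it won't match XZ's ratio.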
Use the protocol binary format (PBF), formerly from Google and now open source. It is compact and very fast to search, so there is no need to decompress it on the device and load it into a database there, because PBF acts as a database. Just include a PBF library in your code to query it. Of course, you have to compress the data once before distributing it online.
I'm working on a game that would like to allow users to select different character skins, using PlayN (targeting Android as the first platform). These character skins will be made available later on, and due to their size, players may not want to download ALL skins. So rather than create a large bundle with all the skins, is there a way for PlayN to access different resource files at runtime? We can set up a server backend for players to browse the latest available skins.
Any help/pointer is greatly appreciated.
You can use Assets.getRemoteImage to load images remotely.
There is currently no way to cache the downloaded images, which makes this non-optimal for your purposes, but I'm going to add support for caching the downloaded images which should make it work reasonably well if you use texture atlases and don't have a ton of images to download.
If you do need to download dozens or hundreds of images in one fell swoop, then you'll need to write custom per-backend code to handle that (and you'll have to give up using the HTML backend, because it cannot do things like download a zip file and unpack it into local storage).
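Until caching lands in PlayN itself, the disk-cache side of that custom code can be sketched generically. SkinCache here is a hypothetical helper (not a PlayN API); the actual HTTP fetch is injected as a Supplier so the caching logic stays backend-agnostic:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.function.Supplier;

public class SkinCache {
    private final Path cacheDir;

    public SkinCache(Path cacheDir) throws IOException {
        this.cacheDir = cacheDir;
        Files.createDirectories(cacheDir);
    }

    // Return the cached skin if present; otherwise fetch it (e.g. over HTTP)
    // and keep a copy on disk so the next launch skips the download.
    public byte[] load(String skinName, Supplier<byte[]> fetch) throws IOException {
        Path file = cacheDir.resolve(skinName);
        if (Files.exists(file)) {
            return Files.readAllBytes(file);
        }
        byte[] data = fetch.get();
        Files.write(file, data);
        return data;
    }
}
```

On Android the cache directory would typically be the app's internal storage; the zip-of-skins download mentioned above would unpack into the same directory.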
I am writing a program to manipulate images, i.e. change their color, brightness, contrast, etc.
The DVM doesn't support the manipulation of images beyond a certain size. Can anyone tell me whether using OpenCV will solve the issue (as this seems to be a better option than the NDK)?
Or will I have to use the NDK?
I have done a lot of searching and was not able to find an answer.
First of all, there are different options for image processing on Android. For a comparison of the most popular options, see Android Computer Vision JavaCV OpenCV FastCV comparison and Image processing library for Android and Java.
Coming back to your question: if the images you deal with are really so large that they do not fit into the memory of the device, you need to process them in small chunks called tiles.
If your images are not that big, I recommend using OpenCV if you have to do anything more than very simple tasks such as brightness/contrast adjustment.
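A toy illustration of the tiling idea, using a brightness adjustment: the image is traversed one fixed-size tile at a time, so only a bounded region is touched per step. A real implementation for images too big for memory would read each tile from storage (e.g. region reads through an image decoder) rather than hold one BufferedImage, but the traversal pattern is the same.

```java
import java.awt.image.BufferedImage;

public class TiledBrightness {
    // Process an image tile by tile; each tile gets a simple brightness boost.
    public static void brighten(BufferedImage img, int tileSize, int delta) {
        for (int ty = 0; ty < img.getHeight(); ty += tileSize) {
            for (int tx = 0; tx < img.getWidth(); tx += tileSize) {
                int w = Math.min(tileSize, img.getWidth() - tx);
                int h = Math.min(tileSize, img.getHeight() - ty);
                for (int y = ty; y < ty + h; y++) {
                    for (int x = tx; x < tx + w; x++) {
                        int rgb = img.getRGB(x, y);
                        int r = clamp(((rgb >> 16) & 0xFF) + delta);
                        int g = clamp(((rgb >> 8) & 0xFF) + delta);
                        int b = clamp((rgb & 0xFF) + delta);
                        img.setRGB(x, y, (r << 16) | (g << 8) | b);
                    }
                }
            }
        }
    }

    private static int clamp(int v) { return Math.max(0, Math.min(255, v)); }
}
```

Per-pixel operations like brightness and contrast tile trivially; operations with neighborhoods (blur, edge detection) need tiles that overlap by the kernel radius.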