Most lightweight data format for an Android-consumed web service - android

I'm working on an Android app that consumes a web service that has the potential to return quite a lot of data. The web service is also going to be built by my team, and we're looking into ways of transferring data to the client with the least amount of load. We've looked into REST and SOAP, and can't decide between the two. Would JSON be a good alternative? We might need to fetch quite a lot of data in some use cases.
We'd really rather not use SOAP if we can avoid it. The emphasis is on reducing the work that needs to be done on the Android device as much as possible.
Could anybody please help us out? As far as technology scope is concerned, we have full freedom of choice as long as we stay within the JAVA umbrella.
Thanks in advance.
EDIT: Not sure how to add comments, so I'll do it here (I read the FAQs, but couldn't find the necessary section).
Hi,
Let me start by apologising for my vague problem statement; I was a bit rushed when I posted the query. Anyway, to answer your questions: by "a lot of data" I mean that there are going to be a few large requests (2-3, depending on the use case) that MIGHT return a lot of data. For example, if I search by entity name (complete or partial), the result, which is going to be textual data only, would be considerably smaller than if the search were made with country as a parameter, in which case it might return several hundred separate results. One particular search parameter has the potential to return a few thousand rows of data. What we can control is the amount of data in each row (so to speak). Currently we're attempting to reduce the number of items displayed per row (we're trying to keep it below 5), but it's a business decision, I'm afraid, so if we're stuck with the current scenario of 12 items per row, we need to be prepared for that as well.

What you just said was the equivalent of "I cannot decide between unleaded fuel and 15" tyres".
RESTful web services tend to be lighter than SOAP, but there are use cases for both. They are both technologies used to deliver a service-oriented architecture. From your brief description I would use REST, although with more detail there may be a case for SOAP. You say a lot of data, but does this mean many small requests or a few large ones?
Your RESTful web service may return a variety of data formats such as XML, YAML and JSON. JSON's syntax makes it lighter than XML, so that could be a good choice. Examples of this sort of setup include Facebook's Open Graph API and Drupal's Services module.
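For a rough sense of what the client side could look like, here is a minimal sketch of parsing such a JSON response with the org.json classes that ship with Android (the sample payload and field names are made up for illustration, not part of any actual service):

```java
import org.json.JSONArray;
import org.json.JSONObject;

public class JsonParseExample {
    public static void main(String[] args) throws Exception {
        // A made-up response body; the field names are assumptions about your service.
        String body = "[{\"name\":\"Acme Ltd\",\"country\":\"DE\"},"
                    + "{\"name\":\"Beta GmbH\",\"country\":\"DE\"}]";

        // org.json is bundled with Android, so no extra parser library is needed.
        JSONArray results = new JSONArray(body);
        for (int i = 0; i < results.length(); i++) {
            JSONObject entity = results.getJSONObject(i);
            System.out.println(entity.getString("name") + " / " + entity.getString("country"));
        }
    }
}
```

The equivalent XML payload would need opening and closing tags around every field, which is where most of JSON's size advantage comes from.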

Related

What database should I use for a data visualization app?

I want to create an Android/iOS tablet app that will visualize data from a number of desktop apps that have the same function (facilitating orienteering events) but may be very different in their construction. The idea I have is that when anything changes in the desktop server database, the change is communicated to my tablet app.
Now, I don't know what would be a good form of communication between the server and the app (JSON?), but I think that before anyone would want to consider modifying their desktop app to be able to share data with mine, my app needs to actually work.
So I'm looking to write my first line of code, but I think before I do that, I need to decide on a database. In the tablet app, the user would only be performing read operations. The data itself would be small (short strings and some ints/longs) and structured and work well with a relational DB. Assuming the server communicates all updates immediately, there could be an update on average every 5 seconds for a normal event.
Considering how you described the problem, the data doesn't seem too complex or big; this is more a question of "what do you prefer to work with" than "what do you need".
JSON in this case is a good format for your data, and if you like working with it, then that's a good solution.
If you want to use the JSON format, MongoDB might be the best choice: easy to learn, big community, pretty advanced and complete.
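If you do go that route, a minimal sketch of writing and reading such documents with the official MongoDB Java driver might look like the following. The connection string, database, collection and field names are placeholders invented for this example:

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
import org.bson.Document;

public class EventStore {
    public static void main(String[] args) {
        // Placeholder connection string; point this at your own server.
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoDatabase db = client.getDatabase("orienteering");
            MongoCollection<Document> punches = db.getCollection("punches");

            // The desktop server pushes one update (a JSON-like document).
            punches.insertOne(new Document("runnerId", 42)
                    .append("control", 31)
                    .append("timestampMillis", System.currentTimeMillis()));

            // The tablet app only reads: fetch all punches for one runner.
            for (Document d : punches.find(new Document("runnerId", 42))) {
                System.out.println(d.toJson());
            }
        }
    }
}
```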

Expose data from same back end to multiple clients

I am new to full-stack web application development. I want to design a web application whose data is stored in back-end databases. I want to build a desktop web client as well as an Android application, both of which can fetch data from the back end. How should I start? What APIs can be used, or how can I expose data from the same back end to multiple clients?
I also want to handle a massive number of requests. How do I design such a system? What should I use in the back end to store data and handle requests efficiently?
Any video / document / reference containing useful information will be much appreciated.
Wow, you have a whole forest of questions to settle. You are going to need to do your own research on such things as algorithms and data flow for your application before you can make any reasonable choice of platform. Here are a couple of basic ideas to get you going:
1) Look at Java and Node.js. There are lots of other possible platforms, but chances are you will end up using one of those two. Try to think about what the actual code you will write in each of those will look like. A little or a lot?
2) Just store your data in files, most probably using JSON (a minimal sketch follows below). Maybe you will end up doing something more fancy after you figure out where you are going with your project, but you will be surprised how well the simple file-based solution scales.
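As a very rough sketch of idea 2, assuming Java and the org.json library, the file-based approach can be as simple as serializing a list of records to a single file and reading it back (the file name and record fields are made up):

```java
import org.json.JSONArray;
import org.json.JSONObject;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class FileStore {
    public static void main(String[] args) throws Exception {
        Path file = Paths.get("records.json"); // placeholder file name

        // Write: serialize a small list of records into one JSON file.
        JSONArray records = new JSONArray();
        records.put(new JSONObject().put("id", 1).put("name", "first record"));
        records.put(new JSONObject().put("id", 2).put("name", "second record"));
        Files.write(file, records.toString(2).getBytes(StandardCharsets.UTF_8));

        // Read: load the whole file back and parse it.
        String json = new String(Files.readAllBytes(file), StandardCharsets.UTF_8);
        JSONArray loaded = new JSONArray(json);
        System.out.println(loaded.length() + " records loaded");
    }
}
```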
When you have done a bunch more research, and maybe even coded up a few ideas on your platform of choice, then come back and massively edit your question. Only then will specific suggestions for tool choices be possible.

MySQL vs Parse.com

My app is growing very fast, and we have used Parse.com as the main database.
We want to create a web version of this to work together with the app for iOS and Android.
The only limitation that we know of with Parse.com is that the maximum number of results per query is 1,000, so beyond that you need to write a function that issues additional queries.
So, talking about a very large database, server-side performance, users running queries over a 3G network, and scalability, is moving to MySQL a good idea?
Not considering time and cost.
Only talking about performance, scalability and queries to work with Android, iOS and web.
Thank you
In terms of scalability, I guess you really only need to worry about that if you are managing your own servers. Worst case, you go with a professional hosting service but use a VPS.
Parse uses MongoDB, which is entirely different from MySQL. The 1,000 limit shouldn't really be an issue regardless of which database you use, because you should structure your app so that you never grab anywhere near 1,000 items at once. That is where lazy loading comes into play.
Parse is a REST API. This means you're using HTTP requests to get information. You will need to write an API or find an open source option to put on your server. You need to make sure that your code runs efficiently so you can get results fast and put as little strain as possible on your servers.
Another thing is to try to avoid having to make 1,000 queries in the first place. If there's a way to download data once and reuse it in the app instead of re-downloading it each time, do it: less strain on the servers, quicker responses, and less data going back and forth.
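To illustrate the paging/lazy-loading idea, the client could request one small page at a time from a hypothetical REST endpoint instead of pulling hundreds of items in one go. The URL and the limit/offset parameters below are assumptions for the sketch, not Parse's actual API:

```java
import org.json.JSONArray;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class PagedClient {
    // Hypothetical endpoint that supports limit/offset paging.
    private static final String BASE_URL = "https://api.example.com/items";
    private static final int PAGE_SIZE = 50;

    // Fetch a single page instead of anywhere near 1,000 items at once.
    public static JSONArray fetchPage(int page) throws Exception {
        URL url = new URL(BASE_URL + "?limit=" + PAGE_SIZE
                + "&offset=" + (page * PAGE_SIZE));
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Accept", "application/json");
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
            return new JSONArray(body.toString());
        } finally {
            conn.disconnect();
        }
    }
}
```

Each page can then be cached locally so scrolling back up never triggers another download.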
That's really all I have to offer.
Not sure if this will help you, but I'm moving my data out as well. Keep in mind, though, that you are comparing two very different databases. I'm moving my data to Cassandra or MongoDB because the data is key-value. As for MySQL, a lot of companies use it for large-scale problems and it runs fine for them. I hope this helps.

Store big JSON in Titanium for iOS and Android

How the app works
Currently an app is in the works whose purpose is to let users explore activities in 5 regions. Each activity is represented as a JS object with a fair number of properties. Activities can be viewed through different filters, each in its own tab, for example categories or a map. Inside each main filter, there are options to filter on date, region, accessibility, etc.
The challenge
There is a lot of JSON that needs to be stored on the device, and support is required for both iOS and Android.
In the best-case scenario the data stays in sync with the database, and all data is available on the device. The app needs to feel snappy for a good experience, which means both fetching and filtering data need to be as fast as possible.
Viable solutions considered so far (which don't quite cut it yet)
MongloDB with the MongloDB Titanium Store adapter, silver bullet?
This approach at first seemed to be the silver bullet. Although the project looks promising, it is maintained by one heroic hacker, and it is in need of some documentation. I have inspected the source and hacked my way through the API, but to no avail; console.log and Jasmine tests won't cut it this time. More importantly, it is not quite finished yet, and features are missing compared to MongoDB. A great project; I hope it catches on and becomes capable enough to assist desperate Titanium developers in the future.
JSONDB, only for iOS
This app really needs to work on both platforms, iOS and Android, so there is no point trying this. Moreover, JSONDB works within a single execution context only, which would be a serious concern as well.
Ti Filesystem and JSON.stringify + JSON.parse, not memory efficient
A viable solution for saving a small list of saved items, which is also a feature in the app. But other posts have reported memory-limit issues on Android when using these JSON methods. And that might not even be the least of my problems: memory efficiency overall will be a huge concern. I have never seen performance benchmarks for file reading and writing in Titanium, so I am not sure how big an impact reading and writing would have. Filtering big objects is a huge concern as well; Underscore won't manage this kind of big data. Iterating big objects is a huge problem no matter what approach I choose.
Big ass global object
Practically the same approach as the filesystem one, only keeping everything in a global. This has the same issues and is just plain bad practice.
SQLite, yuck
Cramming highly document-oriented JSON data into SQLite sounds worse than Samsung Galaxy fanboys. Any feedback on this?
Multiple files + SQLite to maintain + lazy load, unicorns and rainbows?
Desperate for a solution, I might be onto something in the course of writing this post. There are probably something like 10-16 main categories, each with 1 to 4 subcategories. Keep all the activities for a subcategory in its own file, which would be a quite slim piece of JSON. When browsing through categories, each subcategory is rendered in its own TableViewSection and appended independently to the table based on how far the user has scrolled, effectively lazy loading the content. There is only one fairly quick file read. Within this view, adding more subfilters affects only the already loaded items, and iterating over those items is reasonably affordable.
Updating the data is also quite efficient: only files that are subject to change get updated. An SQLite database can maintain the dates of all activities that have an expiry date, and it can dynamically build its own JSON file for the upcoming seven days or month. This will make the calendar view quite smooth for most usage. Picking future dates will be a nightmare, though.
Still, the map is an issue...
If you have read all of this, thank you. If you have experience with something similar, or might be onto something, feel free to reply! I have to quit writing, quit coding and start sleeping.
Sorry for the crappy MongloDB docs. I developed it for some internal projects and really wanted to share it with the community, but the lack of docs does make it hard to use. Great news though: I have docs now, lol. I've also slimmed down and cleaned up the source code. Hope it works better for you now. http://monglodb.com
I'm the original author of JSONDB and thought I'd drop in and provide an answer for anyone finding this question via Googlefu.
JSONDB is now deprecated software - it's been replaced by another project called SculeJS. SculeJS aims to provide a full featured NoSQL database written in pure JavaScript for use in Titanium, NodeJS, and web apps.
JSONDB was originally only available for use in iOS applications due to limitations in the way Ti native modules were built - the current versions of JSONDB and SculeJS are compatible with both iOS and Android apps.
In a lot of ways MongloDB and SculeJS are similar, where they diverge is in the way SculeJS has been engineered. SculeJS is intended more to provide powerful, generic data structures with a rich query layer rather than being a straight port of MongoDB. No insult to Monglo - it looks like great software, I just wanted to point out the difference in intent between the two projects.
As a side note - all pure JavaScript modules are limited to a single execution context within Titanium applications.
For what you're building I think MongloDB, JSONDB, SculeJS and TaffyDB would all do the job, the details of the implementation would just be slightly different.
I was encountering the same problem. I had about 5 MB of data which I wanted to ship with the app rather than have it downloaded.
I finally ended up with an SQLite database, with high performance. It is not as bad as you think. It might not be a pretty solution, but for lack of a better choice it is a very good one IMHO.
Just create a couple of tables, plus functions to map your data into the database and back out again, and I promise you, you will be happy.
DO NOT store the JSON in the database, but store the values appropriately.

Android, preferred method for accessing 10,000+ record database on server

So I am in need of some assistance in trying to determine what I am going to need in order to accomplish a task.
Plain and simple... I am looking at accessing, from Android, multiple databases, some of which may contain over 10,000 records. From what I have seen, web services that return JSON are the way to go for something of this nature, but I don't think that fully answers my question, nor do I know whether it is the preferred way to go about this.
Digging a bit deeper... I have a few apps on the market now, but this will be my first attempt at an enterprise-style app, and the public web services I have accessed so far had a much smaller footprint than this one will have. I have little to no experience in the realm of server/network administration, which is where I am getting tripped up. This is from the ground up, and I have the ability to obtain almost any resources I need to complete this task.
It appears that there is SQL Server 2008 on the back end, if that helps. If I need to provide further details, let me know. I am looking for a solution that will handle organizational growth, scalability, authentication and ease of use, so keep that in mind too.
So what is the best practice/preferred method for doing an enterprise application with a substantial data set? What are the big dogs doing, and how? Both on the client side and server side. I am trying not to "screw the pooch" out of the gates on this, and this is one of those measure twice and cut once situations which is why I am trying to garner plenty of input and assistance.
Thanks in advance!
If you don't have an API/service yet, you need to write one on top of your database.
I can think of two approaches, depending upon your use case.
Paging: Set up an API that supports paging, and show the results page by page. The user can't possibly view 10,000 records in one go.
Search and suggest: Try building a suggestion list when the user starts typing something. Fetch results that start with the characters entered so far. However, the API should limit the results to a comfortable number, so that you don't have to parse a lot. (A rough sketch of both approaches follows below.)
Depending on your use case, you could try one of these.
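To make that slightly more concrete, here is a minimal, non-authoritative sketch of both approaches over plain JDBC. The records table and name column are placeholders, and the ROW_NUMBER() form is used because SQL Server 2008 does not support OFFSET/FETCH:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.ArrayList;
import java.util.List;

// Placeholder table ("records") and column ("name") names for illustration.
public class RecordDao {
    private final Connection conn;

    public RecordDao(Connection conn) {
        this.conn = conn;
    }

    // Paging: return one page of rows, never the whole 10,000+ record table.
    public List<String> fetchPage(int page, int pageSize) throws Exception {
        String sql = "SELECT name FROM ("
                   + "  SELECT name, ROW_NUMBER() OVER (ORDER BY name) AS rn FROM records"
                   + ") t WHERE rn BETWEEN ? AND ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setInt(1, page * pageSize + 1);
            ps.setInt(2, (page + 1) * pageSize);
            return readNames(ps.executeQuery());
        }
    }

    // Suggest: return a small, capped list of matches for a typed prefix.
    public List<String> suggest(String prefix) throws Exception {
        String sql = "SELECT TOP 10 name FROM records WHERE name LIKE ? ORDER BY name";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, prefix + "%");
            return readNames(ps.executeQuery());
        }
    }

    private List<String> readNames(ResultSet rs) throws Exception {
        List<String> names = new ArrayList<>();
        while (rs.next()) {
            names.add(rs.getString("name"));
        }
        return names;
    }
}
```

Either method keeps each response small, which matters just as much for the Android client parsing the JSON as for the server producing it.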
