
Performance Optimization

Over the last few months, we have been developing iPhone / iPad apps that frequently pull JSON data from a web API. We used Ruby on Rails to build the first version of both the website and the API service. After trying the first version, customers complained about loading speed. They said that data loading on the mobile app was SLOW. There were two reasons the Rails-based API calls were slow:
  1. Rails is heavy and blocking (after an API call we do some logging, and the system shouldn't have to wait for the logging to finish before returning data to clients)
  2. We need to join four SQL tables to query the data, combine it, then convert it to JSON
Let's optimize them!

Solutions for (1) are:
  • Rack apps + job queues (Delayed Job, Resque …)
  • A non-blocking server (Node.js, Goliath …) that fires an action to log info to the database, forgets it, and returns JSON to clients
The non-blocking approach was chosen because it's fast, it handles more requests, and it avoids having to manage additional job queues.
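The fire-and-forget idea can be sketched in plain Ruby, with a background thread standing in for the non-blocking server (LOG_QUEUE, LOGGED, and the handler are hypothetical names for illustration, not our production code):

```ruby
require "json"

# Log entries go onto a queue; the request never waits for the database.
LOG_QUEUE = Queue.new

# A single background worker drains the queue and writes the logs
# (simulated here with an in-memory array instead of a real table).
LOGGED = []
Thread.new do
  while (entry = LOG_QUEUE.pop)
    LOGGED << entry # in the real app: insert into the logging table
  end
end

def handle_request(params)
  LOG_QUEUE << { path: "/items", params: params, at: Time.now } # fire and forget
  { items: [{ id: 1, name: "Item 1" }] }.to_json                # respond right away
end

response = handle_request(month: 5, category: 2)
```

The response is built and returned immediately; the logging write happens whenever the worker gets to it.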

For (2), de-normalizing and pre-calculating JSON is the first step. But there is one table that cannot be de-normalized in SQL: the listings table, which states which item will be shown in which month and in which category. Say item I will be shown in category C in months M1 and M2; then two rows, (I, C, M1) and (I, C, M2), must exist in the listings table. After de-normalization and JSON pre-calculation, we only need to join two tables to get the items for a given category and month, then combine the pre-calculated item JSON and return it to clients.
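The de-normalized layout can be sketched like this (table and field names are illustrative, not the original schema): each listings row says an item appears in a category for a month, and the item row carries its pre-calculated JSON so no further serialization is needed.

```ruby
require "json"

# Listings: one row per (item, category, month) appearance.
LISTINGS = [
  { item_id: 1, category_id: 10, month: 5 },
  { item_id: 1, category_id: 10, month: 6 },  # same item, second month
  { item_id: 2, category_id: 10, month: 5 },
]

# Items keyed by id, each with its JSON already rendered.
ITEMS = {
  1 => { id: 1, json: { id: 1, name: "Item 1" }.to_json },
  2 => { id: 2, json: { id: 2, name: "Item 2" }.to_json },
}

# One "join" of listings to items, then concatenate the cached JSON strings.
def items_json(category_id, month)
  ids = LISTINGS.select { |l| l[:category_id] == category_id && l[:month] == month }
                .map { |l| l[:item_id] }
  "[" + ids.map { |id| ITEMS[id][:json] }.join(",") + "]"
end

result = items_json(10, 5) # JSON array combining both cached item blobs
```

No per-request object serialization happens at all; the response is pure string concatenation.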

The second step for (2) is choosing a database that is faster and better suited to de-normalizing and querying the data. Key-value stores are super fast but only support querying by key. Distributed / Map-Reduce stores (Riak, HBase …) are overkill. In this case, MongoDB seems to be a perfect fit: it is very fast (memory-mapped storage), has a flexible data structure, and supports rich, fast queries (indexing). In MongoDB, the listings can be stored in the item document itself as an array of [category, month] pairs, which can then be indexed for fast querying.
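A sketch of that document shape, with the query simulated over plain Ruby hashes (the field name cm is illustrative; with the real MongoDB driver you would create a multikey index on the array and query with something like find("cm" => [category, month]), since MongoDB matches an array field when any element equals the query value):

```ruby
# Each item document embeds its listings as [category, month] pairs,
# so fetching items for a page is a single indexed query, no joins.
items = [
  { "_id" => 1, "name" => "Item 1", "cm" => [[10, 5], [10, 6]] },
  { "_id" => 2, "name" => "Item 2", "cm" => [[10, 5]] },
  { "_id" => 3, "name" => "Item 3", "cm" => [[11, 5]] },
]

# Mimics MongoDB's array-element match: a document matches when the
# embedded array contains the exact [category, month] pair.
def find_by_listing(items, category, month)
  items.select { |doc| doc["cm"].include?([category, month]) }
end

matches = find_by_listing(items, 10, 5)
```

The (I, C, M1) / (I, C, M2) duplication from the SQL listings table collapses into one document per item.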

The third step for (2) is minimizing the data size of each item. MongoDB tries to keep indexes and recently used data in memory, so a smaller data size means less memory is needed. A smaller data size also means faster data transfer between the database and app instances. To minimize data size, we use the following tricks:
  • Use short field names (cm instead of categories_months …)
  • Use short image names (/items/1.jpg instead of /items/this_is_the_best_image_for_item_1.jpg)
  • Zlib to compress pre-calculated JSON
  • Smaz to compress small strings
After a lot of micro-optimization, the item data size was reduced to one third (33%) of the original.
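The Zlib trick looks roughly like this with Ruby's standard library (the sample item JSON is made up; Smaz, a separate C library specialized for short strings, is not shown here):

```ruby
require "json"
require "zlib"

# A pre-calculated item JSON blob (illustrative data).
item_json = {
  "id"   => 1,
  "name" => "Item 1",
  "img"  => "/items/1.jpg",
  "desc" => "Lorem ipsum dolor sit amet " * 10, # repetitive text compresses well
}.to_json

# Compress before storing in MongoDB ...
compressed = Zlib::Deflate.deflate(item_json)

# ... and inflate on the way back out to the client.
restored = Zlib::Inflate.inflate(compressed)
```

The compressed blob is what gets stored and shipped between the database and the app instances, which is where the memory and transfer savings come from.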

The final result is amazing. On the server side, API queries are 20 times faster and often take less than 10ms. Everybody is happy 🙂