We are trying to implement a feature that requires an SQLite database of about 15 MB (roughly 115,000 rows) to be downloaded and displayed on screen. We gzip the database and download it, but the download takes a long time on a 3G network. The queries also run slowly against the database: it takes a long time to fetch even a small chunk of data and display it on screen.
The solution we are thinking of is splitting the database into multiple chunks.
Any suggestions on how we can improve the performance? Also, is there an ideal chunk size to split the database into for fast query performance?
The queries you run surely have WHERE clauses. Make sure that the columns used in those conditions are indexed; this will hugely boost performance. As for the large size, why don't you ship the database (or a part of it) inside the application binary?
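To illustrate the indexing point, here is a minimal Python sqlite3 sketch (the `items` table and `category` column are hypothetical stand-ins for your schema); `EXPLAIN QUERY PLAN` shows whether SQLite actually uses the index:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # in practice, the path to your downloaded .sqlite file
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, category TEXT, name TEXT)")
conn.executemany("INSERT INTO items (category, name) VALUES (?, ?)",
                 [("fruit", "apple"), ("fruit", "pear"), ("tool", "hammer")])

# Index the column used in WHERE clauses so lookups do a B-tree search
# instead of a full table scan.
conn.execute("CREATE INDEX idx_items_category ON items(category)")

# The last field of the plan row describes the chosen strategy.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT name FROM items WHERE category = ?", ("fruit",)
).fetchone()
print(plan[-1])
```

On a 115,000-row table the difference between a scan and an index search is usually the dominant factor in query time.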
Thanks for the indexing suggestion. The database is dynamic, generated based on certain conditions, so I cannot bundle it in the binary.
Would splitting up the database into multiple logical databases help the cause?
I don't believe that splitting the data into multiple databases would help query performance much. In my experience, correct indexing gives the biggest performance boost. However, you could split the data into chunks and give the user the option to download each chunk separately; whether that is acceptable depends entirely on your business case. One more thing that could reduce the download size: don't download the SQLite database file itself, but rather download the gzipped raw data, then parse it and insert it into a local database as it arrives. Finally, try running VACUUM on your database and see if the size decreases.
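A minimal sketch of that last idea in Python sqlite3 (the tab-separated payload and `items` schema are hypothetical; in practice the gzipped bytes would come from your server):

```python
import gzip
import sqlite3

# Simulated server payload: gzipped raw rows instead of a gzipped .sqlite file.
raw = "1\tapple\n2\tpear\n3\thammer\n"
payload = gzip.compress(raw.encode("utf-8"))

# On the device: decompress, parse, and insert into a locally created database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
rows = (line.split("\t") for line in gzip.decompress(payload).decode("utf-8").splitlines())
conn.executemany("INSERT INTO items (id, name) VALUES (?, ?)", rows)
conn.commit()

# VACUUM rebuilds the file, reclaiming free pages so it stays as small as possible.
conn.execute("VACUUM")
print(conn.execute("SELECT COUNT(*) FROM items").fetchone()[0])
```

Raw data compresses at least as well as the database file, and you avoid shipping SQLite's page overhead over the network at all.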
Oh, I forgot to mention this. If you are going to index a database of this size, you'd better build the indexes on the device rather than include them in the downloaded SQLite file, since the indexes take extra space. Also keep in mind that each index you add slightly increases the time needed to insert/update/delete a record, but hugely speeds up queries that use the indexed column in a WHERE clause. So my advice is: index the right columns, and do it on the device after the data is downloaded.
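In code, that ordering looks like the following (an illustrative Python sqlite3 sketch; the `records` table and `city` column are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, city TEXT)")

# 1. Bulk-insert the downloaded rows first, with no secondary indexes in place,
#    so each insert does not also have to update an index B-tree.
rows = [(i, "city_%d" % (i % 100)) for i in range(10_000)]
conn.executemany("INSERT INTO records (id, city) VALUES (?, ?)", rows)
conn.commit()

# 2. Build the index afterwards, on the device: one bulk sort instead of
#    10,000 incremental index updates, and no index bytes in the download.
conn.execute("CREATE INDEX idx_records_city ON records(city)")
print(conn.execute("SELECT COUNT(*) FROM records WHERE city = 'city_7'").fetchone()[0])
```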
I would like to share what I did to improve database operations.
I am downloading a bunch of records from a server, nearly 30,000. Inserting those records into the database took around 2 minutes and 10 seconds on an iPod touch (I use the iPod touch for testing since performance matters most on the weakest device). You won't believe it, but when I wrapped the database operation (inserting the rows) in BEGIN TRANSACTION and COMMIT TRANSACTION, the data inserted in only 4-5 seconds.
I was able to save 40% of the time on the whole operation.
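The pattern can be sketched in Python sqlite3 like this (the `t` table is hypothetical); one explicit transaction around the whole batch means SQLite journals and syncs once at COMMIT instead of once per INSERT:

```python
import sqlite3

# isolation_level=None puts the connection in autocommit mode, so the
# BEGIN/COMMIT below are the only transaction boundaries.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE t (id INTEGER, val TEXT)")

rows = [(i, "row %d" % i) for i in range(30_000)]

conn.execute("BEGIN")
conn.executemany("INSERT INTO t VALUES (?, ?)", rows)
conn.execute("COMMIT")

print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 30000
```

Without the explicit transaction, each INSERT is its own implicit transaction with its own journal write, which is exactly the per-row overhead that turned 4-5 seconds of work into 2 minutes.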