I have a website that includes a database containing all the members and pertinent information for my organization. It gets updated throughout the day as people register their dogs and clubs with our organization.
I am developing an offline application that is dependent on this information. I need to be able to pull the information from the website database and populate a locally running MySQL database. (If it matters, the website is running MySQL as well.)
I envisioned running a SELECT for all the information that I need, doing a DROP on the current tables, and then an INSERT of all the information from the SELECT into the local database. My concern is that there will be a LOT of records to deal with. Something in the neighborhood of 2,000 records to start, and as time goes on it could get into the tens of thousands of records.
Is there a more efficient way to go about doing this type of an operation?
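For what it's worth, the dump-and-reload approach described above can be made to handle tens of thousands of rows reasonably well if the inserts are batched rather than issued one at a time. Here is a minimal sketch of that pattern in Python, using the stdlib `sqlite3` module as a stand-in for both connections (in production you would use two MySQL connections, e.g. via mysql-connector-python, where the placeholder style is `%s` rather than `?`). The `members` table and its columns are made up for illustration:

```python
import sqlite3

def refresh_local_copy(remote, local):
    """Copy the members table wholesale: SELECT everything from the
    remote side, DROP and recreate the local table, then bulk-INSERT."""
    rows = remote.execute("SELECT id, name, dog, club FROM members").fetchall()

    cur = local.cursor()
    cur.execute("DROP TABLE IF EXISTS members")
    cur.execute("CREATE TABLE members "
                "(id INTEGER PRIMARY KEY, name TEXT, dog TEXT, club TEXT)")
    # executemany batches the inserts instead of issuing thousands of
    # individual INSERT statements, and the whole refresh is committed
    # as one transaction
    cur.executemany("INSERT INTO members VALUES (?, ?, ?, ?)", rows)
    local.commit()

# Demo: two in-memory databases standing in for the remote and local servers.
remote = sqlite3.connect(":memory:")
remote.execute("CREATE TABLE members "
               "(id INTEGER PRIMARY KEY, name TEXT, dog TEXT, club TEXT)")
remote.executemany("INSERT INTO members VALUES (?, ?, ?, ?)",
                   [(i, f"member{i}", f"dog{i}", "club A") for i in range(2000)])
remote.commit()

local = sqlite3.connect(":memory:")
refresh_local_copy(remote, local)
print(local.execute("SELECT COUNT(*) FROM members").fetchone()[0])  # → 2000
```

A further refinement, if the tables get large, is to track a last-modified timestamp and only pull rows changed since the previous sync, instead of dropping and reloading everything each time.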
Brett M. Williams
Co-Founder - United Flyball League International
Website Administrator
Software Developer
Database Admin
Anything else "techie"
Originally posted by Brett Williams: Is this more efficient than just pulling out specific portions of the database that I need?
Yes. You can usually configure the replicator to only copy certain tables or schemas. And the replicator is built to do this, so it does a faster job of it.
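For reference, that per-table filtering is configured on the replica side of MySQL replication in its my.cnf. A sketch, with the schema and table names below as placeholders:

```ini
# my.cnf on the local (replica) MySQL server -- replicate only the
# tables the offline application actually needs (names are examples)
[mysqld]
server-id          = 2
replicate-do-table = org_db.members
replicate-do-table = org_db.dogs
replicate-do-table = org_db.clubs
```

With this in place, the replica stays continuously up to date with just those tables, so there is no periodic drop-and-reload at all.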