We are looking to tune our SQL queries. One suggestion from one of our developers is to merge multiple small queries into larger ones, since the small queries spend a lot of time crossing the network to reach the database. The drawback is that the merged queries become very hard for the average person to understand. What should our strategy be for running one large query versus multiple small queries? Can anybody give pointers on how to write efficient queries and what constitutes a good query?
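To make the trade-off concrete, here is a minimal sketch of what "merging small queries" can mean: collapsing N single-row lookups into one query with an IN clause. The table and data are hypothetical, and Python's built-in sqlite3 is used only as a self-contained stand-in since the thread does not name a database.

```python
import sqlite3

# Hypothetical schema, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO customer VALUES (?, ?)",
                 [(1, "Ann"), (2, "Bob"), (3, "Cid")])

ids = [1, 3]

# Many small round trips: one query per id.
rows_small = [conn.execute("SELECT name FROM customer WHERE id = ?",
                           (i,)).fetchone()[0]
              for i in ids]

# One merged query: a single round trip using an IN clause.
placeholders = ",".join("?" for _ in ids)
rows_merged = [r[0] for r in conn.execute(
    f"SELECT name FROM customer WHERE id IN ({placeholders}) ORDER BY id",
    ids)]

assert rows_small == rows_merged  # same data, fewer round trips
```

Against an in-memory database the difference is invisible, but over a real network each eliminated round trip saves a full latency hop.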
1. Consider using stored procedures if the multiple queries are part of a single transaction.
2. If you are calling the same statement, only with different parameters, more than 65 times, use prepared statements; otherwise use plain statements, unless you need to set IN parameters (then use prepared statements in all cases).
3. Retrieving a large number of fields should not scare anyone, even novices. Loops, Hashtables, and other collections are your friends; they can help you make your code a lot cleaner.
More importantly, though, you need to analyze your database design. If you are forced to make a large number of queries during a single transaction, it is probably due to bad design. Although you might be using a relational database, you can model it in a pseudo object-oriented fashion (storing and retrieving collections and objects).
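Point 2 above can be sketched as follows: one parameterized statement is prepared once and then re-executed with different parameters, instead of building a fresh SQL string per call. Python's built-in sqlite3 (which caches prepared statements behind the scenes) stands in here for a JDBC PreparedStatement; the orders table is hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")

# One parameterized statement reused with many different parameters;
# the SQL text is parsed once and bound repeatedly.
insert = "INSERT INTO orders (id, amount) VALUES (?, ?)"
for i in range(100):
    conn.execute(insert, (i, i * 1.5))

total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
```

Besides the parse-once saving, binding parameters this way also avoids SQL injection, which string concatenation does not.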
Originally posted by Syed Hussain: One suggestion from one of our developers is to merge multiple small queries into larger ones, since the small queries spend a lot of time crossing the network to reach the database.
How do you know this?
Originally posted by Brahim Bakayoko: 1. Consider using stored procedures if the multiple queries are part of a single transaction.
Brahim, thanks for your suggestions, especially the one about replacing very large queries with stored procedures. I'll check to see if we can do this.
Originally posted by Ilja Preuss:
How do you know this?
Ilja, we do not have any process for evaluating how much time the application spends in business logic versus database logic. So this statement is only an impression, based on the fact that retrieving data from a database is generally the most expensive operation. Thanks for your help.
Originally posted by Syed Hussain: Ilja, we do not have any process for evaluating how much time the application spends in business logic versus database logic. So this statement is only an impression, based on the fact that retrieving data from a database is generally the most expensive operation.
Then it could also be that your problem is not the network transfer but the actual execution of the statements. You could try a tool like IronEye SQL to analyze where your performance problem is really coming from...
Syed, I agree with Ilja that the first place to look is the queries themselves. We happen to have a problem with network traffic ourselves, and it isn't tied so much to the number of queries as to the amount of data returned. If this winds up being your problem too, make sure that you are not returning any unused columns (probably a good thing to do anyway) and that you are not returning columns whose values can be determined from the input parameters.
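The "don't return unused columns" advice can be sketched like this: SELECT * drags every column of every row across the network, while listing only the columns you actually use keeps the result set small. The employee table and its wide bio column are hypothetical, and Python's built-in sqlite3 again stands in for whatever database you use.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (id INTEGER, name TEXT, bio TEXT)")
# A wide column (10 KB per row) that the application rarely needs.
conn.execute("INSERT INTO employee VALUES (1, 'Ann', ?)", ("x" * 10_000,))

# SELECT * hauls the wide bio column back on every row...
wide = conn.execute("SELECT * FROM employee").fetchone()

# ...while naming only the columns you use keeps each row small.
narrow = conn.execute("SELECT id, name FROM employee").fetchone()

assert len(wide) == 3 and len(narrow) == 2
```

Explicit column lists also make queries more robust: adding a column to the table later cannot silently change what your code receives.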