JavaRanch » Java Forums » Databases » JDBC

Jenny and bind variables

paul wheaton
Trailboss

Joined: Dec 14, 1998
Posts: 20493

Currently, Jenny uses bind variables only for the fields where the developer explicitly asks for them.
It has been suggested that I should make the default be to use bind variables.
What do you guys think?


Stefan Wagner
Ranch Hand

Joined: Jun 02, 2003
Posts: 1923

Who or what is Jenny?


http://home.arcor.de/hirnstrom/bewerbung
Jeanne Boyarsky
internet detective
Marshal

Joined: May 26, 2003
Posts: 30085
    

Stefan,
Jenny is a db code generator.
Paul, what would be the advantage of using them by default all the time?


[Blog] [JavaRanch FAQ] [How To Ask Questions The Smart Way] [Book Promos]
Blogging on Certs: SCEA Part 1, Part 2 & 3, Core Spring 3, OCAJP, OCPJP beta, TOGAF part 1 and part 2
paul wheaton
Trailboss

Joined: Dec 14, 1998
Posts: 20493

I've been told that if you use bind variables for everything, Oracle can do some optimizations that it otherwise cannot. An interesting point. I can't come up with a substantial reason not to, other than that literal SQL is more 'normal', and possibly easier to read.
Rudy Dakota
Ranch Hand

Joined: Jul 27, 2002
Posts: 54
Hi guys,

Not based on any practical experience, just theoretical reasoning: using bind variables makes it impossible to take the detailed distribution of values in a given column into account when building the execution plan. That distribution is one of the main advantages claimed for Oracle's Cost Based Optimizer.

I think you'll find an advantage in using bind variables in the lion's share of cases anyway. Still, it's high on my list of interesting points to investigate, even without a business justification.

Good riding,

Rudy.
Loren Rosen
Ranch Hand

Joined: Feb 12, 2003
Posts: 156
I suppose if you've got very unevenly distributed data, there could be some advantage to making a new query plan every time. But more usually there's greater gain to be had by re-using old query plans, and you get more re-use if you use bind variables.
Carl Trusiak
Sheriff

Joined: Jun 13, 2000
Posts: 3340
I have heard the hoopla about the performance of PreparedStatements over Statements, but I have never seen any benchmarks showing the performance enhancements. If someone has documents they can point me at, it would be appreciated. My own limited tests against an Oracle database showed virtually no difference in the times between the two.

However, having said that, PreparedStatements make some things easier: they remove the need to escape special characters in VARCHAR2 data, there is no need to convert Dates and Timestamps to Strings and use a database function to convert them back in the SQL, and they make it easy to insert and update binary data.
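A minimal sketch of the escaping point (class, helper, and table names here are made up for illustration): with a plain Statement you must double single quotes yourself, while a PreparedStatement lets the driver handle it.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

class EscapingDemo {
    // Hand-rolled escaping you need when concatenating into a plain Statement:
    // double every single quote so the literal survives inside '...'.
    static String escape(String value) {
        return value.replace("'", "''");
    }

    static String byConcatenation(String lastName) {
        return "SELECT id FROM customer WHERE last_name = '" + escape(lastName) + "'";
    }

    // With a PreparedStatement none of that is needed; the driver handles it.
    static PreparedStatement byBinding(Connection conn, String lastName) throws SQLException {
        PreparedStatement ps = conn.prepareStatement(
                "SELECT id FROM customer WHERE last_name = ?");
        ps.setString(1, lastName);  // a name like O'Brien needs no special treatment here
        return ps;
    }
}
```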

Reading PreparedStatements raw can be tricky; it is easy to have the ?s run together. However, if the PreparedStatement is logged after all the bind variables are set, it is fairly easy to read, depending on the JDBC driver. Oracle's puts [&lt;position number&gt;, &lt;value&gt;] beside each ?.


I Hope This Helps
Carl Trusiak, SCJP2, SCWCD
Bear Bibeault
Author and ninkuma
Marshal

Joined: Jan 10, 2002
Posts: 60774
    

I use PreparedStatement almost exclusively, and always when there are parameters to bind.


[Asking smart questions] [Bear's FrontMan] [About Bear] [Books by Bear]
Mark Spritzler
ranger
Sheriff

Joined: Feb 05, 2001
Posts: 17249
    

The reason you should use bind variables is that they will be faster. Here is why.

When Oracle receives a query, it looks in the Library Cache in the SGA, which is memory-resident. It tries to find an exact copy; if it does, it can reuse the already-built explain plan. If it does not find an exact copy, it has to parse the query and create an explain plan, which takes a small amount of time; this is called a hard parse. An exact copy means exact, down to the number of spaces.

By using bind variables, Oracle can keep the statement in memory and reuse the explain plan every time. Whatever values you put in the bind variables, it will use that explain plan without having to do a hard parse.
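A toy model of that library-cache behavior (the table and column names are made up): the cache key is the exact SQL text, so literal-built statements create a new entry per value, while the bound form is one entry for all of them.

```java
import java.util.HashSet;
import java.util.Set;

class LibraryCacheDemo {
    static String withLiterals(int empno) {
        // Different text for every value -> Oracle must hard-parse each one.
        return "SELECT ename FROM emp WHERE empno = " + empno;
    }

    static String withBinds() {
        // Same text for every value -> one cached plan, soft parses after the first.
        return "SELECT ename FROM emp WHERE empno = :empno";
    }

    static int cacheEntriesFor(int... empnos) {
        Set<String> libraryCache = new HashSet<>();
        for (int empno : empnos) libraryCache.add(withLiterals(empno));
        return libraryCache.size();
    }
}
```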

In Thomas Kyte's book "Expert One-on-One Oracle" he specifically states that you should use bind variables all the time. Mr. Kyte is a VP at Oracle and the answer guy at Oracle's Ask Tom website, which is a must-have "Favorite" for any database/query person using Oracle. He has all the answers.

Good luck

Mark


Perfect World Programming, LLC - Two Laptop Bag - Tube Organizer
How to Ask Questions the Smart Way FAQ
Chris Mathews
Ranch Hand

Joined: Jul 18, 2001
Posts: 2712
From a developer perspective PreparedStatements are much nicer, and they make portability between databases that much easier since the JDBC driver takes care of character escaping, date formatting, and other such mundane details. So even ignoring performance characteristics, it makes a whole lot of sense to go with PreparedStatements. Like Bear, I use them exclusively on those occasions when I have to drop down and write JDBC code (which happens less and less frequently).
Tim Holloway
Saloon Keeper

Joined: Jun 25, 2001
Posts: 15952
    

How much performance boost you get from PreparedStatements depends on the DBMS and driver. I've seen some systems where all the PreparedStatement handler did was convert the incoming data into plain old SQL internally, but at the other extreme, you might get all sorts of optimizations. Mileage may vary depending on driver (urp!) and road conditions.

Like Mark said, the extra portability and elimination of the need to throw together Yet Another Set of SQL-Escape Utilities is worth something in and of itself.

Of course, the more intense the optimization is, the more overhead setting up a PreparedStatement can be. I've seen a particular problem in webapps, where I'd prepare a statement, use it in a request, then have to throw it away because I returned the connection it was based on to the pool.

Lately, I've been playing with JDO. One of the neat things about the Kodo implementation (hey, do I get "plug money" for that? ) is that they CACHE their prepared statements and the connections are managed external to the in-and-out-flow of the web requests, which can potentially make things flow more efficiently.

Bottom line: Do your homework and see which tools match your project's needs. And, of course, keep things abstract in case you need to tune later!


Customer surveys are for companies who didn't pay proper attention to begin with.
Chris Mathews
Ranch Hand

Joined: Jul 18, 2001
Posts: 2712
Originally posted by Tim Holloway:
Lately, I've been playing with JDO. One of the neat things about the Kodo implementation (hey, do I get "plug money" for that? ) is that they CACHE their prepared statements and the connections are managed external to the in-and-out-flow of the web requests, which can potentially make things flow more efficiently.

Actually, many Application Servers and other Connection Pooling products do this as well. See this thread for a recent discussion.
paul wheaton
Trailboss

Joined: Dec 14, 1998
Posts: 20493

What about when things go wrong? An exception is thrown. Currently, the SQL is dumped and you can try the SQL in a SQL client.

Could this be a case of oracle optimization (and do you really gain that much here? It seems that the lookup and the network call would far outweigh whatever you gained here) vs. error handling readability?
Jim Yingst
Wanderer
Sheriff

Joined: Jan 30, 2000
Posts: 18671
What about when things go wrong? An exception is thrown. Currently, the SQL is dumped and you can try the SQL in a SQL client.

Naked PreparedStatements are generally harder to debug. JavaWorld addressed this problem in an article awhile ago, wrapping the PreparedStatement in a decorator to provide additional debugging info. Seems like their approach could be applied in Jenny.

Could this be a case of oracle optimization (and do you really gain that much here? It seems that the lookup and the network call would far outweigh whatever you gained here) vs. error handling readability?

I don't really know (and would be interested in hearing from anyone who does know). But I'd think that the lookup at least could be a relatively simple hash table lookup. The network call seems the more significant factor here. So I guess the question is, how long does it take to build a query plan (compared to network overhead)? Dunno.
[ May 27, 2004: Message edited by: Jim Yingst ]

"I'm not back." - Bill Harding, Twister
Chris Mathews
Ranch Hand

Joined: Jul 18, 2001
Posts: 2712
Originally posted by Jim Yingst:
Naked PreparedStatements are generally harder to debug. JavaWorld addressed this problem in an article awhile ago, wrapping the PreparedStatement in a decorator to provide additional debugging info. Seems like their approach could be applied in Jenny.

I actually implemented this using Dynamic Proxies (so it would work with any version of JDBC) if anyone is interested... maybe something we could add to our JavaRanch class library.
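For anyone curious, a rough sketch of the dynamic-proxy idea (the class and method names are my own invention, not Chris's code): intercept the setXxx(index, value) calls, remember the bound values, and report them from toString() so the logged statement is readable. The demo wraps a no-op stand-in, so no database is needed to try it.

```java
import java.lang.reflect.Proxy;
import java.sql.PreparedStatement;
import java.util.Map;
import java.util.TreeMap;

class DebuggablePreparedStatement {
    static PreparedStatement wrap(PreparedStatement target, String sql) {
        Map<Integer, Object> binds = new TreeMap<>();
        return (PreparedStatement) Proxy.newProxyInstance(
                PreparedStatement.class.getClassLoader(),
                new Class<?>[] { PreparedStatement.class },
                (proxy, method, args) -> {
                    if (method.getName().equals("toString")) {
                        return sql + " -- binds: " + binds;    // the debugging payoff
                    }
                    if (method.getName().startsWith("set")
                            && args != null && args.length == 2
                            && args[0] instanceof Integer) {
                        binds.put((Integer) args[0], args[1]); // record position -> value
                    }
                    return method.invoke(target, args);        // delegate to the real statement
                });
    }

    // Demo against a no-op stand-in PreparedStatement.
    static String demo() {
        PreparedStatement noop = (PreparedStatement) Proxy.newProxyInstance(
                PreparedStatement.class.getClassLoader(),
                new Class<?>[] { PreparedStatement.class },
                (p, m, a) -> null);
        try {
            PreparedStatement ps = wrap(noop, "SELECT id FROM customer WHERE last_name = ?");
            ps.setString(1, "O'Brien");
            return ps.toString();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

Because it proxies the java.sql.PreparedStatement interface rather than extending a driver class, the same wrapper works with any JDBC version, which was Chris's stated motivation.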
Marilyn de Queiroz
Sheriff

Joined: Jul 22, 2000
Posts: 9044
    
Is your stuff available, Chris?


JavaBeginnersFaq
"Yesterday is history, tomorrow is a mystery, and today is a gift; that's why they call it the present." Eleanor Roosevelt
Dave Churchville
Greenhorn

Joined: Jun 07, 2004
Posts: 23
Pros and cons of using "bind variables" (PreparedStatements in JDBC):

When executing a PreparedStatement, drivers will usually "prepare" the statement first (either by making a server call or within the driver), then pass just the data on subsequent calls, already encoded in the proper datatype. Normal SQL statements must be re-parsed, and the data types converted, on each call.

So if the same call is going to be made repeatedly, the PreparedStatement should be faster.

In practice, when using connection pooling you can't reuse a prepared statement yourself, since you're not keeping connections open for more than one request. However, in app-server containers (WebLogic, for example) there are options for caching prepared statements at the container level, which adds the performance gains back.
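The container-level cache described above boils down to keying statements by their SQL text. A toy version (real pools also scope the cache per connection and hold actual PreparedStatements, not placeholders):

```java
import java.util.HashMap;
import java.util.Map;

class StatementCacheDemo {
    int hardPrepares = 0;                          // how many real prepares happened
    private final Map<String, Object> cache = new HashMap<>();

    // Stand-in for prepareStatement(): prepare only on a cache miss,
    // hand back the cached handle on every later call with the same text.
    Object prepare(String sql) {
        return cache.computeIfAbsent(sql, s -> {
            hardPrepares++;
            return new Object();                   // placeholder for the prepared handle
        });
    }
}
```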

If for no other reason, as stated above, it does free you from all of the syntactic issues associated with creating valid SQL, escaping quotes, etc.

It's a good investment, but you will need to account for database and driver differences. For example, if you're trying to bind decimal values, bits, booleans, etc., Oracle drivers might behave slightly differently from SQL Server drivers. Unfortunately, that's exactly the complexity you were trying to avoid in the first place (or at least hide from developers).

This is why some people are using Hibernate, Toplink, EJB CMP and other ORM solutions, because they have already accounted for variations in SQL syntax, datatypes, etc.

So my advice is to try it out on a few databases before committing to the internal implementation choice. Watch out for "odd" datatypes like big decimal, byte, bit, and boolean. I assume BLOB support is out of scope for Jenny.
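As one concrete instance of that datatype caveat: Oracle SQL (at the time) had no BOOLEAN column type, so a portable layer often maps booleans itself rather than trusting setBoolean(). A hedged sketch; the 'Y'/'N' convention is an assumption for illustration, not something Jenny necessarily does:

```java
class BooleanBinding {
    // Map a Java boolean to whatever the target database can store.
    static Object toColumnValue(boolean value, String dialect) {
        if ("oracle".equals(dialect)) {
            return value ? "Y" : "N";   // common CHAR(1) convention on Oracle
        }
        return value;                   // databases with a real BIT/BOOLEAN type
    }
}
```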
 