I have a Seam application that generates some big queries. I have simplified it down to 7 tables and am still having a problem: 7 tables, 2 of which are mapping tables handling the many-to-many relationships, and one of which is particularly large. The WHERE criteria can come from any of the 5 data tables (not the mapping tables), or any combination of them.
I have been trying for a while to figure out how best to construct the queries I need. The easiest thing would be to use Seam's EntityQuery framework with restrictions. Like this excerpt:
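To illustrate the mechanism without pulling in Seam itself, here is a framework-free sketch of what restriction-based querying does: each restriction is ANDed onto the base EJBQL only when its bound criterion actually has a value. All names (`Person`, the property names) are hypothetical placeholders, not the real schema.

```java
import java.util.*;

// Framework-free sketch of Seam-style restrictions: each restriction
// is appended as an AND clause only when its criterion is non-null.
// Entity and property names here are hypothetical.
public class RestrictionSketch {
    static String buildQuery(String baseEjbql, Map<String, Object> criteria) {
        StringBuilder q = new StringBuilder(baseEjbql);
        String glue = " where ";
        for (Map.Entry<String, Object> e : criteria.entrySet()) {
            if (e.getValue() == null) continue; // unset criterion -> no clause
            q.append(glue).append("p.").append(e.getKey())
             .append(" = :").append(e.getKey());
            glue = " and ";
        }
        return q.toString();
    }

    public static void main(String[] args) {
        Map<String, Object> criteria = new LinkedHashMap<>();
        criteria.put("state", "TX");
        criteria.put("city", null); // unset, so it is omitted from the query
        criteria.put("zip", "78701");
        System.out.println(buildQuery("select p from Person p", criteria));
        // select p from Person p where p.state = :state and p.zip = :zip
    }
}
```

In real Seam the restrictions are EL expression strings on the `EntityQuery` subclass, but the null-skipping composition is the same idea.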
This actually works very well for smaller data sets. I haven't pinned down the exact threshold, but it works on a data set of 25,228 rows (all the data where state = 'AZ'). When I try it on a larger data set (e.g. all the data where state = 'TX', a bigger state), the query just times out.
When I originally wrote this app, I constructed a SQL query that selected only the ids of the entities I wanted, based on the search criteria, and then fetched the entities matching that id list. That worked very well, but the way I did it was hackish and contained a lot of duplicated code. I'm trying to improve it now, but I'm still unsure how: when I try to construct my own SQL, the algorithm for building the joins between the various tables eludes me.
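One way to tame the join-building is to keep a static map from each data table to the join fragment that reaches it through its mapping table, and append a fragment only when a criterion on that table is set. A minimal sketch, with entirely hypothetical table names (`item`, `item_tag`, `tag`, etc.) standing in for the real schema; literal values are inlined only for brevity, and real code should bind parameters instead:

```java
import java.util.*;

// Sketch of the id-only query strategy: join only the tables whose
// criteria are actually set, and select just item.id.
// All table and column names are hypothetical placeholders.
public class IdQueryBuilder {
    // data table -> join fragment that reaches it from the main table
    static final Map<String, String> JOINS = new LinkedHashMap<>();
    static {
        JOINS.put("tag",
            " join item_tag it on it.item_id = item.id"
          + " join tag on tag.id = it.tag_id");
        JOINS.put("category",
            " join item_category ic on ic.item_id = item.id"
          + " join category on category.id = ic.category_id");
    }

    // criteria keys are "table.column"; values should be bound as
    // parameters in real code, not concatenated like this
    static String buildIdQuery(Map<String, String> criteria) {
        StringBuilder sql = new StringBuilder("select distinct item.id from item");
        StringBuilder where = new StringBuilder();
        for (Map.Entry<String, String> c : criteria.entrySet()) {
            String table = c.getKey().split("\\.")[0];
            if (JOINS.containsKey(table)) {
                sql.append(JOINS.get(table)); // pull in the mapping-table join
            }
            where.append(where.length() == 0 ? " where " : " and ")
                 .append(c.getKey()).append(" = '").append(c.getValue()).append("'");
        }
        return sql.append(where).toString();
    }
}
```

With criteria on `item.state` and `tag.name`, only the `item_tag`/`tag` joins are emitted; the `category` tables never enter the query, which keeps the plan small when most criteria are unset.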
The easiest thing, as I said, would be to restrict the data set using the EntityQuery restrictions. Is there any way to handle such a large data set so that I can keep using this framework?
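For what it's worth, the second phase of the id-list approach described above has a practical wrinkle: many databases cap the size of an `IN` list (Oracle's limit is 1000 expressions), so the id list usually has to be fetched in batches. A small self-contained sketch of the batching step (the query string shown in the comment is a hypothetical example):

```java
import java.util.*;

// Sketch of splitting a large id list into batches, since many
// databases limit IN-list size (e.g. Oracle's 1000-expression cap).
public class IdBatcher {
    static <T> List<List<T>> chunk(List<T> ids, int size) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < ids.size(); i += size) {
            chunks.add(ids.subList(i, Math.min(i + size, ids.size())));
        }
        return chunks;
    }
    // each chunk would then feed a query like
    //   select e from Entity e where e.id in (:ids)
    // run once per batch and the results concatenated
}
```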