Possibility of using zip streams to resolve the problem.

 
reachme guru
Greenhorn
Posts: 12
Hello Experts and Friends,

I have a problem in my application:
The application has a feature to download data (based on the filter criteria in a JSF page) as an XML file to the client system; the developer used the XStream library for this.

The dynamically generated SQL has trouble with larger result sets, which causes timeouts. To avoid that, a limit of 100 results was put in place initially, and that works without any problem.

But now the data in the application has grown, and the end user needs to see the complete data.

In this case, is using zip streams an option for getting the complete data into an XML file?

Example: get the first 100 records into an XML file inside a zip stream, then fetch the next 100 records and append them to the same file, and so on until the end of the result set.
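To make it a bit more concrete, here is a very rough sketch of what I am imagining. The fetchNextBatch(...) call and the Record class are placeholders I made up; the real code would use our DAO and domain classes:

import com.thoughtworks.xstream.XStream;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZippedXmlExportSketch {

    private final XStream xstream = new XStream();

    // Streams the whole result set into one XML entry inside a zip,
    // fetching 100 records at a time instead of running one huge query.
    public void export(OutputStream response) throws Exception {
        ZipOutputStream zip = new ZipOutputStream(response);
        zip.putNextEntry(new ZipEntry("export.xml"));
        Writer writer = new OutputStreamWriter(zip, "UTF-8");

        writer.write("<records>\n");                   // open the root element by hand
        int offset = 0;
        List<Record> batch;
        while (!(batch = fetchNextBatch(offset, 100)).isEmpty()) {  // placeholder DAO call
            for (Record record : batch) {
                xstream.toXML(record, writer);         // append one record fragment
                writer.write("\n");
            }
            offset += 100;
        }
        writer.write("</records>\n");                  // close the root element
        writer.flush();

        zip.closeEntry();
        zip.finish();                                  // finish the zip without closing the response
    }

    private List<Record> fetchNextBatch(int offset, int limit) {
        // placeholder: run the filtered query limited to rows offset .. offset + limit
        throw new UnsupportedOperationException("not part of this sketch");
    }

    private static class Record { /* placeholder for one exported row */ }
}

The idea is that the zip entry gets written piece by piece, so no single query has to return the whole result set.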

A quick reply on this would be really appreciated. Please post if you have similar code.

Thanks a lot for your help in advance!!!
 
Paul Clapham
Marshal
Posts: 28193
If your server is taking too long before it returns any results, then compressing those results isn't going to stop the client from timing out.

And you can't send partial XML documents either, so the pagination idea is no good. Parsers can only deal with complete XML documents.

So, your best bet is to fix the server so it doesn't take so long to compute the results. Or to change the system so that it isn't asked to compute such large result sets.
 
reachme guru
Greenhorn
Posts: 12
Paul,

Thanks for your quick reply and for the detailed answer and suggestions.

However, since I am using the XStream library to serialize the objects into an XML file, couldn't I use a streaming parser such as XPP3 to handle the partial XML output with the pagination idea?
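Something along these lines is what I had in mind. This is only a sketch: I am assuming XStream's createObjectOutputStream(...) behaves the way I remember (one root element, objects appended one by one), and queryPage(...) is a made-up placeholder for the real paginated query:

import com.thoughtworks.xstream.XStream;
import com.thoughtworks.xstream.io.xml.XppDriver;
import java.io.ObjectOutputStream;
import java.io.Writer;
import java.util.List;

public class PagedXStreamSketch {

    // Writes any number of objects under a single root element, so the output
    // stays one well-formed XML document even though it is produced 100 records
    // at a time.
    public void writeAllPages(Writer out) throws Exception {
        XStream xstream = new XStream(new XppDriver());   // XPP3-backed driver
        ObjectOutputStream xmlOut = xstream.createObjectOutputStream(out, "results");

        int page = 0;
        List<?> records;
        while (!(records = queryPage(page, 100)).isEmpty()) {  // placeholder DAO call
            for (Object record : records) {
                xmlOut.writeObject(record);   // each object is appended under the same root
            }
            page++;
        }
        xmlOut.close();   // writes the closing tag of the root element
    }

    private List<?> queryPage(int page, int pageSize) {
        // placeholder: fetch rows page*pageSize .. (page+1)*pageSize - 1
        throw new UnsupportedOperationException("not part of this sketch");
    }
}

That way the file would still be a single complete XML document, just written in 100-record chunks.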

Please give me your view!!!
 
Paul Clapham
Marshal
Posts: 28193
Seems to me your main problem is that by the time the SQL has finished, the client has already timed out. Is that right?

If so, then nothing you do after the SQL has finished will make any difference. Compressing the data, paginating it, none of those will help. The only thing that will help is to reduce the time the SQL takes to run.
 
reachme guru
Greenhorn
Posts: 12
Currently the SQL output is limited to the first 100 rows. So no timeout there.
So what I wanted to do is:

min = 0;
max = 100;

while (there are more results) {
    // run the SQL limited to rows min..max
    // append that batch's output to the XML file
    min = max;
    max = max + 100;
}
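In case it helps, here is a rough JDBC sketch of the paging query in that loop. The table and column names are made up, and the LIMIT/OFFSET syntax would need adjusting for our actual database:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.ArrayList;
import java.util.List;

public class PagedQuerySketch {

    private static final int PAGE_SIZE = 100;

    // Fetches one page of rows; called repeatedly with a growing offset
    // until it returns an empty list.
    public List<String> fetchPage(Connection con, int offset) throws Exception {
        // made-up table and column names; LIMIT/OFFSET syntax varies by database
        String sql = "SELECT name FROM export_data ORDER BY id LIMIT ? OFFSET ?";
        List<String> rows = new ArrayList<String>();
        PreparedStatement ps = con.prepareStatement(sql);
        try {
            ps.setInt(1, PAGE_SIZE);
            ps.setInt(2, offset);
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
                rows.add(rs.getString("name"));
            }
            rs.close();
        } finally {
            ps.close();
        }
        return rows;
    }
}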
[ November 16, 2008: Message edited by: reachme guru ]
 