I'm trying to build a utility in which a user selects a file full of ID numbers. For each number, I need to do a lookup in a database to pull some relevant data. I wanted to build this as a web application because that makes it easy for my users to get to and use. The downside is that, should the file they select be particularly long, the request may time out before any response is given. With that in mind, here are the steps of my "solution":
1. The user selects a file for upload.
2. The user submits the file - rather than doing a normal form submission, I perform an AJAX submission using jQuery and the ajaxSubmit() method (from the form plug-in).
3. On the server, the file is received and processed.
4. After each record is processed, percent-completed and time-remaining values are stored in the session.
5. On the client, at regular intervals, I make an AJAX call to get the progress values from the session.
5a. The progress values received are used to update fields on the screen to let the user know the status of the submission.
6. When the processing is complete, the final results are e-mailed to the user from the server.
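The bookkeeping behind steps 4 and 5 boils down to something like this - a minimal sketch, not my actual code: the function and property names and the /getProgress endpoint are all invented for illustration.

```javascript
// Sketch of the progress bookkeeping in steps 4-5. All names here
// (progressStats, currentProgress, timeRemaining) are illustrative.
// After each record, the server can compute percent complete and an
// estimated time remaining from how long the records so far have taken.
function progressStats(done, total, elapsedMs) {
  var currentProgress = total > 0 ? Math.round((done / total) * 100) : 0;
  var msPerRecord = done > 0 ? elapsedMs / done : 0;
  var timeRemaining = Math.round(msPerRecord * (total - done));
  return { currentProgress: currentProgress, timeRemaining: timeRemaining };
}

// On the client (step 5), an object of the same shape comes back from
// the polling call and is written into the page; with jQuery it would
// look roughly like:
//
//   setInterval(function () {
//     $.getJSON('/getProgress', function (status) {
//       $('#percent').text(status.currentProgress + '%');
//       $('#remaining').text(status.timeRemaining + ' ms');
//     });
//   }, 2000);
```

So after 25 of 100 records in 5 seconds, progressStats(25, 100, 5000) reports 25% done with roughly 15 seconds (15000 ms) to go.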
This whole strategy was devised to keep large requests from timing out. I've got most of this working, but the part about updating the client display with the progress of the file processing is giving me trouble.
Here's what I find...
If I go to the page and submit the file, the file is successfully uploaded and processed - I get the e-mail with the results when it's done. However, the updateProgress method constantly gets back an object whose currentProgress and timeRemaining properties are null, as if they're never being set on the server. But if, after submitting one file, I refresh the page and submit a second file, it works fine. The fact that it doesn't work before a refresh but does work after one tells me there's got to be something funny going on with the request, but I really have no idea what it is.
Does anyone have any insight into this one? I sure hope I've explained it properly.
Okay, so a couple of my co-workers helped me come up with a solution, but we're not entirely certain why it works.
I had been focused on the client-side code, assuming the issue was there, but it turns out the problem was actually on the server. The first request sent to the server processes the file - this request can take quite some time. While it runs, it places updates into the session so that other threads can monitor the progress. Subsequent requests attempt to pull those attributes from the session and send them back to the client code. When I looked at the session IDs on the server side, I found that the first request was pointing to one session (we'll call it session 1) while the subsequent requests were pointing to a different session (session 2). The values were being set in one session while I was trying to read them from another, which obviously isn't going to work.
We don't completely understand what's happening, but our best guess is that the first request creates a session but, until that request completes (which can take quite some time), the session is not fully created. That means any other requests that come in looking for a session won't see an already-created session and will create their own. One particularly odd part is that, while the file processing is taking place, that thread points to session 1 while the update threads point to session 2 - as soon as the file processing completes, session 2 is dropped and all further update requests go to session 1. From the client side, the result is that you hang at 0% complete and then, when the file is done, jump to 100% complete. Subsequent file submissions work perfectly because, by that point, the session has been fully created.
So the solution...
I added an update request to the document.ready handler. It simply pings the server and asks for session variables that I know won't exist yet. The benefit is that it forces creation of the session, so when I go on to submit a file for processing and request updates on that process, all requests point to that single, fully created session.
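In code, the fix amounts to something like the following sketch - the /getProgress endpoint name is invented, and the ping is factored into a function just to make the wiring clear.

```javascript
// Sketch of the session-priming fix. The endpoint name /getProgress is
// invented for illustration. Firing one throwaway progress request on
// page load forces the server to create the session before the
// long-running file submission begins, so every later request joins
// that same session.
function primeSession(getJson) {
  getJson('/getProgress', function (data) {
    // data.currentProgress and data.timeRemaining will be null here;
    // the request exists only to force session creation.
  });
}

// Wired up in the page with jQuery:
if (typeof $ !== 'undefined') {
  $(document).ready(function () {
    primeSession($.getJSON);
  });
}
```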
If anyone has anything to add to this, I'd love to know whether this reasoning is correct. It seems to fit this scenario, but I can't be sure how accurate we are.
I'm developing a web-based application in which, in one scenario, I need to run some long-running queries,
and I'm getting a server timeout error.
I see that in your case you need to fetch data from files and then query the database to act on each record.
In my case no file upload is involved - I just pass some variables, and the server picks some values from one table and inserts them into another.
The issue is that for each record I need to execute roughly 20 to 25 queries, and I have to process around 700 records, so that's roughly 700 * 20 queries of load on the server.
Can you help me improve this, especially the Ajax or jQuery call?
I hope this is possible and I'm just not aware of how to do it.