JavaRanch » Java Forums » Java » Servlets

Handling Unexpected Double Requests?

Andreas Schildbach
Ranch Hand

Joined: Jan 22, 2003
Posts: 34
Hello everyone,
Sooner or later, every interactive web site has to deal with the problem of unexpected/unwanted double requests. These double (or triple, ...) requests occur when a user performs a Windows-style double click on a link or a submit button. They can also be caused by network problems, or by browsers/proxies that repeat the last request after a given timeout.
Double requests become a problem when they execute an action that changes the persistent state of the application. An action that deletes a mail from a folder will delete the mail on the first request and then throw an exception on the second, because the mail can no longer be found.
One common solution is to handle it in the business logic: if you try to delete a non-existent object, just return silently instead of throwing an exception. On the other hand, the business logic tier should not have to know about the shortcomings of the transport protocol (presentation tier).
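To make the "silently return" idea concrete, here is a minimal sketch of an idempotent delete. The MailFolder class and its method names are illustrative, not from any real framework; the point is only that deleting an already-deleted message is treated as a no-op rather than an error.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical mail store with an idempotent delete: removing a
// message that is already gone succeeds quietly, so a repeated
// request cannot trigger an exception.
class MailFolder {
    private final Map<String, String> messages = new ConcurrentHashMap<>();

    void add(String id, String body) {
        messages.put(id, body);
    }

    // Returns true if the message was present, false if it was
    // already deleted -- but never throws for a missing id.
    boolean delete(String id) {
        return messages.remove(id) != null;
    }
}
```

The return value still lets the caller distinguish the two cases if it cares, without forcing every duplicate request down an exception path.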
Is there any "best practices" solution?
I am thinking about implementing a servlet filter that works as follows: each action that is known to cause problems is given a special parameter, which should probably be pseudo-random but could also be a sequence number. The parameter is embedded in the href attribute of a link or as a hidden field in a form. The filter runs very early in the filter chain, detects the parameter, and blocks all other requests with the same random id until the first one returns. Blocked requests are not delegated down the filter chain; instead they return a copy of the first request's response. Additionally, the response is associated with the random id in a hashtable for at least 5 minutes, and each late-arriving request with an existing id also gets the pre-generated response.
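The token bookkeeping at the heart of such a filter could look roughly like this. The class and method names are made up for illustration, and the blocking-until-the-first-request-returns part and the 5-minute expiry are omitted for brevity; this only shows claiming a token and caching the response for late duplicates.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of the per-token state a duplicate-suppressing filter could
// keep (illustrative names, not a real API). The first request to
// present a token claims it; duplicates with the same token are
// served the cached response instead of re-running the action.
class DuplicateRequestGuard {
    // token -> cached response body; a placeholder marks a request
    // that has been claimed but has not finished yet.
    private static final String IN_PROGRESS = "";
    private final Map<String, String> responses = new ConcurrentHashMap<>();

    // Returns true only for the first request carrying this token.
    boolean claim(String token) {
        return responses.putIfAbsent(token, IN_PROGRESS) == null;
    }

    // The first request stores its response when it completes.
    void complete(String token, String responseBody) {
        responses.put(token, responseBody);
    }

    // Late duplicates read the cached response (empty while the
    // original request is still running). A real filter would also
    // expire entries after a few minutes.
    String cachedResponse(String token) {
        return responses.get(token);
    }
}
```

A Filter implementation would call claim() in doFilter(), pass first requests down the chain while capturing their output, and answer duplicates from cachedResponse().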
What do you think about it?
Has this already been implemented somewhere?
Jeffrey Spaulding
Ranch Hand

Joined: Jan 15, 2004
Posts: 149
We had an application with impatient users and a slow connection.
Can you believe there are people who manage to click a link 5 times a second? (Proven by log file entries - dreadful.)
We didn't want to handle this on the server side. Since all our links use JavaScript anyway, we changed the JS function that's called by the link.
The function changes the link's href to "#".
So only the first click fires; all subsequent clicks don't do any harm.
It has worked like a charm ever since.
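The client-side trick described above could be sketched like this (the function name and the navigate callback are illustrative; a real page would just assign window.location). The first click disarms the link by pointing its href at "#", so later clicks find the disarmed href and do nothing.

```javascript
// Fire a link's action at most once: the first call disarms the
// link, subsequent calls are swallowed. "navigate" stands in for
// whatever the real click handler does (e.g. setting
// window.location), so the guard itself stays testable.
function fireOnce(link, navigate) {
  if (link.href === '#') {
    return false;            // already fired: ignore this click
  }
  const target = link.href;
  link.href = '#';           // disarm before navigating
  navigate(target);          // perform the real action exactly once
  return true;
}
```

Note this only protects against client-side double clicks; repeats injected by proxies or network retries still reach the server.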