I was wondering how I can accomplish the following.
I have a web client application (but NOT a web browser), and I want to send a request from it, have the server receive it, and some time in the future (1, 2, ... 8 hours later) have the server send some data to this client, which will of course be received by it.
I thought it would be easy, but it turns out it is not. I tried some things with the wait/notify mechanism, but without success. I also tried storing the HttpServletResponse of each web client to use later, but that is not a good idea either. Can anyone give me some direction? If no standard servlet container can do this, could I adapt Jetty, or something similar, to do such a thing?
Any suggestion or pointer will be appreciated.
Thanks in advance!
What you are describing - a very long-term connection - is completely contrary to the spirit and design of HTTP, so it is time to back off and redesign.
1. Why don't you simply have the client "poll" occasionally to see if the data is ready?
2. Given that the client may not even be running when the result is ready, why not use a protocol designed for asynchronous communication:
a. email - send data in body or as attachment
b. Java Message Service
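To make option 1 concrete, here is a minimal client-side polling sketch in plain Java. `checkReady()` is a hypothetical stand-in for the real HTTP status request (which this thread does not specify), and in a real client the interval would be minutes, not milliseconds:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Minimal polling sketch: ask the server periodically instead of holding
// one connection open for hours.
public class PollingClient {
    private final AtomicInteger calls = new AtomicInteger();

    // Stand-in for a real HTTP "is my data ready?" request.
    // Here we pretend the result becomes available on the third poll.
    boolean checkReady() {
        return calls.incrementAndGet() >= 3;
    }

    // Poll at a fixed interval until the server reports the data is ready.
    // Returns the total number of polls made.
    public int pollUntilReady(long intervalMillis) {
        int attempts = 1;
        while (!checkReady()) {
            attempts++;
            try {
                Thread.sleep(intervalMillis); // real clients would wait minutes
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                break;
            }
        }
        return attempts;
    }
}
```

The point of the sketch is only the shape of the loop: no connection stays open between polls, so nothing on the server has to outlive a single request.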
The reason I do not want to use polling is that I want to reduce HTTP communication from my client as much as possible when there is no reason for it (for example, when the data is not ready).
Email is not a solution, and as for JMS, I am not sure it would work, since again I need the HttpServletResponse to stay idle for an unknown amount of time.
The business logic I tried to implement is to store the HttpServletResponse, when the request arrives at the servlet (server), in some storage (let's say somewhere in memory, like a HashMap), and after one hour to get its OutputStream and send some data to the client, whose HTTP connection is never closed... That is the idea, at least...
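Here is a plain-Java sketch of that "park the response, push later" idea, modeling the held connection as a CompletableFuture so it runs without a servlet container. The class and method names (`PendingResponses`, `register`, `push`) are mine, not from the post, and a real container would need something like Servlet 3.0's AsyncContext rather than a raw HttpServletResponse kept in a HashMap:

```java
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;

// Model of the idea: park each client's "response" until the server has
// data, then complete it hours later. In a real servlet container you
// would park an AsyncContext, not the HttpServletResponse itself.
public class PendingResponses {
    private final Map<String, CompletableFuture<String>> pending =
            new ConcurrentHashMap<>();

    // Called when a client's request arrives: register it and hand back
    // a handle that stays open until data is pushed.
    public CompletableFuture<String> register(String clientId) {
        CompletableFuture<String> f = new CompletableFuture<>();
        pending.put(clientId, f);
        return f;
    }

    // Called later, when the long job finishes: push data to the parked
    // client. Returns false if the client is unknown or already served.
    public boolean push(String clientId, String data) {
        CompletableFuture<String> f = pending.remove(clientId);
        if (f == null) return false;
        return f.complete(data);
    }
}
```

The model also shows why the raw-response version fails: the container, not your map, owns the connection's lifetime, and it will time the socket out long before the hour is up unless the request is explicitly put into asynchronous mode.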
Do you think that some modification of Jetty (which, I believe, is open source) might do such a thing?
I recently ran into this same problem for a video conversion use case, and finally solved the problem with help from an expert right here on coderanch.
Video conversion is a long running operation, and I had the problem that client side wouldn't receive any response at all because the session had timed out.
Worse, the client wouldn't even get an error condition - Tomcat silently dropped the connection, based on socket timeout because there was no response from server.
I tried all the usual configuration methods, like increasing timeout values, session timeouts, and so on. But they were never reliable solutions - there could always be a video which takes more time to convert.
I then shifted approach to handling this explicitly in code, and had these possible solutions:
1. While processing is going on, keep sending some dummy response text until the processing is complete, just to keep the connection alive. This was OK in my case because it was an Ajax request and the expected response was text. So I could send dummy text, then a delimiter, then the actual response. The client would strip off the dummy text and use the actual response.
While this did solve the problem, it felt "wrong", so I got rid of this solution... Seeing your processing time ranges (2 hrs, 8 hrs, etc!), it feels even more wrong.
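For completeness, that padding trick might look roughly like this. Everything here is illustrative: the `'|'` delimiter is an arbitrary choice, and `serverStream` stands in for what the server would actually write to the response over time:

```java
// Sketch of the keep-alive padding trick: the server streams filler text
// while working, then a delimiter, then the real payload; the client
// discards everything up to the last delimiter.
public class PaddedResponse {
    public static final char DELIMITER = '|';

    // What the server would have written by the end: one filler byte per
    // keep-alive "tick", then the delimiter, then the payload.
    public static String serverStream(int keepAliveTicks, String payload) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < keepAliveTicks; i++) {
            sb.append('.'); // filler just to keep the socket from timing out
        }
        return sb.append(DELIMITER).append(payload).toString();
    }

    // Client side: strip the filler, keep the actual response.
    public static String extractPayload(String raw) {
        int i = raw.lastIndexOf(DELIMITER);
        return i < 0 ? raw : raw.substring(i + 1);
    }
}
```

It works, but as noted above, a delimiter the payload itself could contain, plus a connection held open for hours, makes this fragile at the time scales in this thread.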
2. I thought about using the reverse Ajax (Comet) capability of Tomcat. This is one possibility for you, and I've read that even some high-traffic sites like Quora use this (but not on Tomcat - they have a complex architecture involving nginx and the like for scalability). In my case, this involved a high learning curve because I had never used Comet - so I dropped it. Also, in hindsight, dropping it seems a good idea for a second reason: there was going to be just one response - done or not done - and it seemed a waste to keep connections open for a single response, NIO connection or not. The user is intelligent, understands it may take time, and may have to log in later to check if it's done.
3. The solution I finally used was good old polling. On the server side, I implemented a kind of job manager that provides a token (just a number) to the client and then starts a job in the job queue. The client then polls every minute with that token and receives a simple 1-byte status. The client keeps the token in a cookie, and since my app also involved user logins, I could keep job statuses in the DB in case the client logged in later. Though it involved quite a bit of redesign work, I like this solution because it's a clean design, does not reduce availability for other users, and scales without any fuss.
Since yours is a web client app (desktop or mobile), you don't even have the problem of cookies and things like that.
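A rough plain-Java sketch of such a token-based job manager, to show the shape of it. The class and method names are mine, statuses live in memory here rather than in a DB, and the real polling endpoint would of course be an HTTP handler that returns the status byte:

```java
import java.util.Map;
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicLong;

// Sketch of the token-based job manager: submit() starts the long job and
// hands back a token; status(token) is the 1-byte reply the polling
// endpoint would return.
public class JobManager {
    public static final byte UNKNOWN = 0, RUNNING = 1, DONE = 2;

    private final ExecutorService pool = Executors.newFixedThreadPool(2);
    private final Map<Long, Future<?>> jobs = new ConcurrentHashMap<>();
    private final AtomicLong nextToken = new AtomicLong(1);

    // Start the long-running job and return the token the client polls with.
    public long submit(Runnable job) {
        long token = nextToken.getAndIncrement();
        jobs.put(token, pool.submit(job));
        return token;
    }

    // What the polling endpoint would answer for a given token.
    public byte status(long token) {
        Future<?> f = jobs.get(token);
        if (f == null) return UNKNOWN;
        return f.isDone() ? DONE : RUNNING;
    }

    // Convenience for demos: block until a job finishes or the timeout hits.
    public boolean await(long token, long timeoutMillis) {
        Future<?> f = jobs.get(token);
        if (f == null) return false;
        try {
            f.get(timeoutMillis, TimeUnit.MILLISECONDS);
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public void shutdown() { pool.shutdown(); }
}
```

Because each poll is an ordinary short request, nothing server-side has to hold a connection open, and persisting the token-to-status map to a DB (as in the post above) makes the design survive restarts and late logins.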
After this exercise came the realization: HTTP is simply not the right protocol for long-lived communication.
William Brogden's advice - "What you are describing - a very long-term connection - is completely contrary to the spirit and design of HTTP, so it is time to back off and redesign." - is really true, and I wish I'd gotten this advice before starting on my application. I hope you won't repeat my mistake.
Hi Bear and Karthik, that was my idea, and yes, my client is a mobile client; because of that, I do not want to do polling, since it would be too frequent!
The main problem is: how do I send some data from the server to a mobile device after a long-running job performed on the server?
I was thinking that maybe some not-too-hard changes to a lightweight servlet container like Jetty would accomplish my task. And what about the Comet approach you mentioned? Did you actually succeed with that implementation?
I never implemented the Comet way. I just considered it at the time and then dropped it, for the reasons mentioned earlier. Comet is a capability and a lot of web servers support it - Google tells me Jetty does too.
If you don't mind, can you tell us the nature of your mobile app? What does it do, roughly - does it require new data constantly (like Google Maps), or is it just a processing status query?
Is a user really likely to keep an app active on their battery-powered phone for hours, waiting?
If processing takes 2-8 hours, then I feel even a wide polling interval like 15 minutes is good enough, and additionally a button for users to fetch and refresh the status will satisfy the impatient ones.
I believe a connection kept open from a mobile client is charged even if there's no data transfer, because the carrier has to keep the AP open. Your users may not appreciate this much.