What way do you guys think would be the best to architect this J2EE application? Here is the requirement: build a web app that allows a user to upload a file containing a list of names to import into a database. There is significant (takes a while) processing and validation that needs to be performed on the file, which can be fairly large as well. The goal is to process the file in a separate process and show the progress on the page. Eventually, it will return results (e.g., which items didn't import and why) back to the user on the page. The idea is to allow this user to kick off the process and not have to wait for the next page to resolve before the process is done... it could take 10 minutes to complete, so the app will display a status/progress indicator on the webpage. The user should be able to surf the rest of the site while the app is processing the file. The user can access the status/progress page to see how far along it is or to see if it is done. How could J2EE be leveraged in this scenario?

Currently, the app accepts the file from the user and immediately kicks off a Thread that processes it. The user is forwarded to a progress page that shows the percentage complete. The Thread object constantly persists its status/progress to the database. The status/progress page queries this same data to show the progress to the user.

The app works fine the way it is implemented now and meets the requirements, but as I understand it, it is a risky design. The Thread is out of the control of the application server and will compete against the app server for CPU cycles. This becomes even more risky if the app is allowed to kick off MANY independent threads... the threads can use up all of the cycles, thus creating performance problems with the app server. I suspect Message-Driven Beans and JMS could be used in this scenario. The app server would then be in control of the thread management, thus guaranteeing stable, scalable performance?
If true, what would the design look like? What are some other options? Thanks!!
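For reference, the current design described above can be boiled down to something like the following sketch. All names here (`FileImportJob`, etc.) are invented for illustration, and progress is held in memory rather than persisted to the database as the real app does:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of the design described in the question: an unmanaged worker
// thread that tracks its own progress while the web tier polls it.
public class FileImportJob implements Runnable {
    private final int totalRecords;
    private final AtomicInteger processed = new AtomicInteger(0);

    public FileImportJob(int totalRecords) {
        this.totalRecords = totalRecords;
    }

    public void run() {
        for (int i = 0; i < totalRecords; i++) {
            // ... validate and insert one record here ...
            processed.incrementAndGet(); // the value the status page reads
        }
    }

    public int percentComplete() {
        return processed.get() * 100 / totalRecords;
    }

    public static void main(String[] args) throws InterruptedException {
        FileImportJob job = new FileImportJob(200);
        Thread worker = new Thread(job); // unmanaged thread, as in the question
        worker.start();
        worker.join();                   // the real app would poll, not join
        System.out.println(job.percentComplete());
    }
}
```

The risk the question raises is exactly the `new Thread(...)` line: the container has no visibility into, or control over, that thread.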
Competition is good. Seriously, since you can't save cycles in a jar for a rainy day, you generally want to get the maximum benefit from your hardware. After all, hardware costs money. Your OS contains load-balancing tools designed to make processes play nicely with each other in order to provide acceptable service levels. These vary from simple priority adjusters (e.g., the Unix "nice" program) to complex dynamic systems such as those used on IBM's z/OS. As long as you're not being sloppy (coding loops to do timing delays, for example), you can rely on these mechanisms for the majority of purposes. They're designed to balance the workload across the whole system, and therefore usually know more about the other tasks that are running at a given instant.

Of course, if you DO need some restraint, you can do something as simple as drive the backend engine from a work request queue. This engine might contain a tunable "max threads" parameter which would ensure that no more than a certain specified number of worker threads were running, waking up whenever a thread terminated to pull another request off the queue. There are all sorts of amusing variations possible on this theme.

I'd use JMS if requests were coming in via JMS channels to begin with. For servlet/JSP requests, I'd not go to that much work/overhead.
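The "work request queue with a tunable max threads parameter" idea can be sketched with the java.util.concurrent executor framework, which gives exactly that behavior out of the box: at most a fixed number of workers, each pulling the next request off an internal queue as soon as it finishes the previous one. The class and parameter names here are illustrative:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// A backend engine driven by a work request queue with a "max threads" cap.
public class ImportEngine {
    private static final int MAX_THREADS = 3;   // the tunable parameter
    private final ExecutorService pool =
            Executors.newFixedThreadPool(MAX_THREADS);
    private final AtomicInteger completed = new AtomicInteger(0);

    public void submitRequest(final String fileName) {
        pool.execute(new Runnable() {
            public void run() {
                // ... parse, validate, and import fileName here ...
                completed.incrementAndGet();
            }
        });
    }

    public int shutdownAndCount() throws InterruptedException {
        pool.shutdown();                              // drain queued work
        pool.awaitTermination(10, TimeUnit.SECONDS);
        return completed.get();
    }

    public static void main(String[] args) throws InterruptedException {
        ImportEngine engine = new ImportEngine();
        for (int i = 0; i < 10; i++) {
            engine.submitRequest("upload-" + i + ".csv");
        }
        System.out.println(engine.shutdownAndCount()); // prints 10
    }
}
```

Ten requests are accepted immediately, but no more than three run at once; the rest wait in the pool's queue, which is the throttling the post describes.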
An IDE is no substitute for an Intelligent Developer.
I'd use JMS if requests were coming in via JMS channels to begin with. For servlet/JSP requests, I'd not go to that much work/overhead.
Could you elaborate on this? I would write the work requests into a JMS queue even if the data comes over HTTP to a servlet. The reason being that you'll get persistence for the data to process and you can use easily configurable MDBs while letting the application server manage the resources used for the actual number crunching (instead of relying on "luck", even if the chance of a threading issue is close to negligible).
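For what it's worth, the shape of that design in J2EE-era APIs would be roughly the following skeleton. It cannot run outside a container; the JNDI names ("jms/ImportQueue", etc.) are illustrative, and the deployment descriptor that binds the MDB to the queue is omitted:

```java
// Servlet-side producer: hand the upload off to a persistent JMS queue.
Context ctx = new InitialContext();
QueueConnectionFactory qcf =
        (QueueConnectionFactory) ctx.lookup("jms/ImportConnectionFactory");
Queue queue = (Queue) ctx.lookup("jms/ImportQueue");

QueueConnection conn = qcf.createQueueConnection();
QueueSession session = conn.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
QueueSender sender = session.createSender(queue);

ObjectMessage msg = session.createObjectMessage(importRequest); // serializable payload
sender.send(msg, DeliveryMode.PERSISTENT,
        Message.DEFAULT_PRIORITY, Message.DEFAULT_TIME_TO_LIVE);
conn.close();

// Container-managed consumer: the app server pools and throttles these.
public class ImportProcessorBean implements MessageDrivenBean, MessageListener {
    private MessageDrivenContext mdc;

    public void setMessageDrivenContext(MessageDrivenContext mdc) { this.mdc = mdc; }
    public void ejbCreate() {}
    public void ejbRemove() {}

    public void onMessage(Message message) {
        try {
            ObjectMessage om = (ObjectMessage) message;
            // ... validate/import, persisting progress for the status page ...
        } catch (JMSException e) {
            mdc.setRollbackOnly(); // let the container redeliver the request
        }
    }
}
```

The persistence argument above shows up in the `DeliveryMode.PERSISTENT` flag: if the server dies mid-import, the request survives and is redelivered, which the raw-thread design cannot offer.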