We have a web application in which an applet uploads a file in chunks, transferring them sequentially to the server, where a servlet receives them. We are now looking to improve overall performance: the idea is to send multiple chunks in parallel and have the servlet (or some server-side component) assemble the chunks in the right order (assume the chunk order is passed along) and create the file on the server side. So my question is: is this approach feasible? If it is, what issues can arise with it? Please share your ideas. Appreciate your help.
Maybe I'm missing something, but it seems pretty straightforward. The individual threads, each uploading its own chunk, could write that chunk to the file system; as each one finishes, it checks whether it is the last and whether all the other chunks have been written. That last thread would then concatenate the chunk files into one and remove the individual chunk files when done.
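To make this concrete, here is a minimal sketch of that idea. The class name `ChunkAssembler`, the `chunk.N` file-naming scheme, and the use of an `AtomicInteger` to detect the last arrival are all my own assumptions, not part of your existing code; a real servlet would call `writeChunk()` from its `doPost()` with the index taken from a request parameter.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.concurrent.atomic.AtomicInteger;

/**
 * Hypothetical server-side assembler: each upload thread calls
 * writeChunk(); whichever thread stores the final chunk merges them all.
 */
public class ChunkAssembler {
    private final Path dir;          // temp directory holding chunk files
    private final Path target;       // final assembled file
    private final int totalChunks;
    private final AtomicInteger received = new AtomicInteger();

    public ChunkAssembler(Path dir, Path target, int totalChunks) {
        this.dir = dir;
        this.target = target;
        this.totalChunks = totalChunks;
    }

    /** Called once per chunk, possibly from many threads in parallel. */
    public void writeChunk(int index, byte[] data) throws IOException {
        Files.write(dir.resolve("chunk." + index), data);
        // The thread that stores the last chunk performs the merge.
        if (received.incrementAndGet() == totalChunks) {
            merge();
        }
    }

    private void merge() throws IOException {
        try (OutputStream out = Files.newOutputStream(target)) {
            for (int i = 0; i < totalChunks; i++) {
                Path chunk = dir.resolve("chunk." + i);
                Files.copy(chunk, out);  // append chunk i in index order
                Files.delete(chunk);     // remove the individual piece
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Path dir = Files.createTempDirectory("chunks");
        Path target = dir.resolve("assembled.txt");
        ChunkAssembler asm = new ChunkAssembler(dir, target, 3);
        // Simulate out-of-order parallel arrival.
        asm.writeChunk(2, "C".getBytes());
        asm.writeChunk(0, "A".getBytes());
        asm.writeChunk(1, "B".getBytes());
        System.out.println(new String(Files.readAllBytes(target)));
    }
}
```

Note the merge runs inside whichever request thread happens to finish last; in production you would also want the counter and merge state keyed per upload session, since multiple users may upload concurrently.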
1. We need to maintain chunk order so that the file is assembled correctly on the server side. What is the best way to identify a chunk number?
2. If a single chunk transfer fails, do we need to mark the whole transfer as failed and ask the end user to transfer again?
3. Performance: there is a lot of I/O involved, since each chunk is first written to disk and later read back again when writing the big final file.
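One way to address all three concerns at once is to skip the separate chunk files entirely and write each chunk directly at its byte offset in the final file. The sketch below is an assumption on my part (the class name `OffsetWriter` and fixed `chunkSize` are hypothetical): the chunk index (concern 1) determines the offset, a failed chunk can simply be re-sent and rewritten without restarting the whole transfer (concern 2), and each byte hits the disk only once (concern 3).

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.file.Path;

/**
 * Hypothetical alternative to chunk files: seek to the chunk's byte
 * offset and write it straight into the final file, so no second
 * concatenation pass is needed.
 */
public class OffsetWriter {
    private final Path target;
    private final int chunkSize;  // fixed size of every chunk except the last

    public OffsetWriter(Path target, int chunkSize) {
        this.target = target;
        this.chunkSize = chunkSize;
    }

    /** Safe to call from parallel threads: each writes a disjoint region. */
    public void writeChunk(int index, byte[] data) throws IOException {
        // "rw" opens (and creates, if absent) the target for random-access writes.
        try (RandomAccessFile raf = new RandomAccessFile(target.toFile(), "rw")) {
            raf.seek((long) index * chunkSize);
            raf.write(data);
        }
    }
}
```

Since every chunk writes a disjoint byte range, no locking is needed between the upload threads; you do still need some bookkeeping (e.g. a per-upload set of received indices) to know when the file is complete.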