
Fastest way of uploading large files to the server location.

 
Murali Adiraju
Greenhorn
Posts: 5
Hi,

I am working on an application that has file-upload functionality to upload files to a server location.

I am trying to find the fastest way of uploading files to the server location. File sizes can be up to 2GB.

I have used the FileChannel API (the transferTo() and transferFrom() methods) to perform the upload.
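
(For reference, a chunked copy with FileChannel.transferTo looks roughly like the sketch below. transferTo may copy fewer bytes than requested, so it has to be called in a loop; the method name and paths are only placeholders.)

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.channels.FileChannel;

// Copy a large file in chunks with FileChannel.transferTo.
// transferTo may transfer fewer bytes than requested, so loop until done.
public static void copyLarge(String src, String dst) throws IOException {
    try (FileChannel in = new FileInputStream(src).getChannel();
         FileChannel out = new FileOutputStream(dst).getChannel()) {
        long position = 0;
        long size = in.size();
        while (position < size) {
            position += in.transferTo(position, size - position, out);
        }
    }
}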

I am able to upload files up to 350MB in size.

I have used "org.apache.myfaces.custom.fileupload.UploadedFile" instance to represent the File the user has selected.

Please suggest an optimized way of uploading files of up to 1GB to the server location.

Thanks,
AVM
 
Joe Ess
Bartender
Posts: 9214
Murali Adiraju wrote: I am trying to find the fastest way of uploading files to the server location.


Buy a network or internet connection with lots of throughput.

Murali Adiraju wrote: I am able to upload files up to 350MB in size.

What happens when you try to upload a larger file? I'm not aware of any limit other than the filesystem, or something like the maximum upload size setting in the Apache Commons FileUpload library.
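
(For illustration, raising those limits with Commons FileUpload 1.x might look like the sketch below; the 2GB values are arbitrary. MyFaces Tomahawk's ExtensionsFilter has a similar uploadMaxFileSize init-param in web.xml.)

import java.util.List;
import javax.servlet.http.HttpServletRequest;
import org.apache.commons.fileupload.FileItem;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;

// Sketch: parse a multipart request with explicit size limits.
public static List<FileItem> parseUpload(HttpServletRequest request) throws Exception {
    ServletFileUpload upload = new ServletFileUpload(new DiskFileItemFactory());
    upload.setSizeMax(2L * 1024 * 1024 * 1024);     // limit for the whole request
    upload.setFileSizeMax(2L * 1024 * 1024 * 1024); // limit per uploaded file
    return upload.parseRequest(request);
}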
 
Lester Burnham
Rancher
Posts: 1337
Would FTP or a shared directory be an option? Both are likely to be faster than an HTTP file upload.
 
Tim Holloway
Saloon Keeper
Posts: 17627
Lester Burnham wrote:Would FTP or a shared directory be an option? Both are likely to be faster than an HTTP file upload.


HTTP is a text-based protocol (so is email), so binary objects often have to be encoded as text for transmission; Base64, for example, turns every 3 bytes into 4 characters, adding roughly a third to the upload byte count. FTP (or, better yet, a LAN share) has much less overhead. Compression (ideally, in hardware) is a big help.

The BitTorrent system works by distributing data among many different computers running more or less in parallel, but it's not always a viable solution. It works very well for some things, however, and I don't just mean illegal copies of movies.

One of the biggest limitations is the pipe coming into the machine. While my LAN runs at 100Mbps, my Internet link is ISDN, and that's a major choke point for me. I downloaded the latest Ubuntu image the other day, and it took something like six hours to pull 700MB.
 
Vikrama Sanjeeva
Ranch Hand
Posts: 756
Did you consider compressing the file (e.g., with zip) before uploading? What platform will this run on? Do you have security concerns about the file transmission? I think (S)FTP is normally the fastest method; HTTP certainly is not.
 
Murali Adiraju
Greenhorn
Posts: 5
I tried using the GZIPInputStream and GZIPOutputStream Java APIs to compress the files before uploading them to the server location.
The upload still fails if the files are larger than 400MB.
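
(For illustration, compressing a file with GZIPOutputStream before transfer might look like the sketch below; the paths and buffer size are placeholders. Note that gzip won't shrink files that are already compressed, such as media files or archives.)

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.zip.GZIPOutputStream;

// Sketch: gzip-compress a file on disk before sending it.
public static void gzipFile(String src, String dst) throws IOException {
    try (InputStream in = new BufferedInputStream(new FileInputStream(src));
         OutputStream out = new GZIPOutputStream(
                 new BufferedOutputStream(new FileOutputStream(dst)))) {
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
    }
}
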
I am trying to use FTP to upload the files to the server location.
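
(A minimal sketch of such an upload with Apache Commons Net's FTPClient, assuming that library is used; the host, credentials, and paths are placeholders. Binary mode matters here, since ASCII mode would corrupt a large binary file.)

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

// Sketch: upload one file over FTP in binary mode.
public static void ftpUpload(String localPath, String remoteName) throws Exception {
    FTPClient ftp = new FTPClient();
    ftp.connect("ftp.example.com");        // placeholder host
    try {
        ftp.login("user", "password");     // placeholder credentials
        ftp.setFileType(FTP.BINARY_FILE_TYPE);
        ftp.enterLocalPassiveMode();       // usually friendlier to firewalls
        try (InputStream in = new BufferedInputStream(new FileInputStream(localPath))) {
            ftp.storeFile(remoteName, in);
        }
        ftp.logout();
    } finally {
        ftp.disconnect();
    }
}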
 
Joe Ess
Bartender
Posts: 9214
Murali Adiraju wrote: The upload still fails if the files are larger than 400MB.

What fails? ItDoesntWorkIsUseless
As I said before, the only file-size limitation I've run into with file uploads is one dictated by the server-side framework. Have you checked your server logs? What is the client-side output?
 