Aside from the technical issues, though, this reminds me of a similar assignment I had way back in the earliest days of my career.
I was given the task of generating a multi-level report that would be run daily and emit about 2-1/2 boxes of IBM fanfold printout. A box held 2,000 pages at 60 lines per page, so that works out to roughly 5,000 pages, or 300,000 lines, every single day.
This was obviously nothing but a data dump, since no human being is going to read that much, period, to say nothing of reading it daily. We accepted the assignment knowing full well that it was going to tie up the printer, destroy trees, consume ink, wear out ribbons and type chains, and, more often than not, get discarded unread. We didn't have Earth Day back then, and it was all billable anyway.
There was little enough excuse for such things even back then, when data was mostly locked away on magnetic tapes. These days it's considered more or less trivial to hack out a data query and drill down for the information you actually need, instead of wading through reams of eye-watering raw output.
In fact, the next level up when you're talking masses of data is Business Intelligence. The Pentaho BI suite is available in a fully functional free Community Edition, and its report writer is pretty good (in fact, they used to use Jasper). Plus it can do OLAP and other things that are not only impressive buzzwords but actually very powerful business tools.
I use their Kettle tool extensively for extraction, transformation and loading (ETL) work: converting Excel spreadsheets to database tables, database tables to CSV or XML, and similar operations. One daily scheduled job unloads most of a database to CSV files and FTP-transfers them to an offsite Software-as-a-Service data analysis provider. All automatically.
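For anyone curious what that daily unload-and-ship job boils down to, here's a minimal sketch in plain Python rather than Kettle itself. It dumps every table in a database to its own CSV file and then uploads the files over FTP; the SQLite database, FTP host, and credentials are placeholders, not the actual setup.

```python
# Sketch of a daily "unload to CSV, ship offsite" job.
# Not Kettle -- just the same idea in standard-library Python.
# All paths, hostnames, and credentials below are hypothetical.
import csv
import sqlite3
from ftplib import FTP
from pathlib import Path


def unload_tables(db_path, out_dir):
    """Write every table in the SQLite database to its own CSV file.

    Returns the list of CSV file paths written.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    conn = sqlite3.connect(db_path)
    written = []
    try:
        tables = [row[0] for row in conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")]
        for table in tables:
            cur = conn.execute(f'SELECT * FROM "{table}"')
            path = out / f"{table}.csv"
            with path.open("w", newline="") as f:
                writer = csv.writer(f)
                # Header row from the cursor's column descriptions.
                writer.writerow([col[0] for col in cur.description])
                writer.writerows(cur)
            written.append(path)
        return written
    finally:
        conn.close()


def ship_offsite(files, host, user, password):
    """Upload the CSV files to the offsite analysis service via FTP."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        for path in files:
            with path.open("rb") as f:
                ftp.storbinary(f"STOR {path.name}", f)
```

Wire the two calls together in a cron job (or the scheduler of your choice) and you have the whole pipeline: `ship_offsite(unload_tables("sales.db", "export"), "ftp.example.com", "user", "secret")`.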
An IDE is no substitute for an Intelligent Developer.