I am looking for views on this:
We have about 10 TB (terabytes) of data stored across multiple disks. The metadata (data describing the data: filename, location, author, description, etc.) runs to gigabytes, say 5 GB. For a web-based application, should this metadata be stored in XML files or in a database such as Oracle or MySQL?
Since the data will keep growing, scalability is required. Which approach will give better performance?
The typical use case: a user searches for data matching particular criteria, e.g. all files generated between a given start date and end date; the application then extracts the matching data and analyses it to produce statistics, plots, etc. Results are generated at runtime, so the user should see good response times.
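For comparison, here is a minimal sketch of that date-range lookup done the database way, using Python's built-in sqlite3 as a stand-in for Oracle/MySQL. The table and column names are hypothetical; the point is that an index on the date column keeps this query fast as the metadata grows.

```python
import sqlite3

# Hypothetical metadata schema; table/column names are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE file_metadata (
        filename TEXT,
        location TEXT,
        author   TEXT,
        created  TEXT  -- ISO-8601 date string, e.g. '2012-01-15'
    )
""")
conn.execute("CREATE INDEX idx_created ON file_metadata (created)")
conn.executemany(
    "INSERT INTO file_metadata VALUES (?, ?, ?, ?)",
    [("a.dat", "/disk1/a.dat", "ashish", "2012-01-10"),
     ("b.dat", "/disk2/b.dat", "ashish", "2012-03-05"),
     ("c.dat", "/disk1/c.dat", "other",  "2012-06-20")],
)

# The date-range query: the index lets the engine seek straight to the range
# instead of scanning every row.
start, end = "2012-01-01", "2012-04-01"
rows = conn.execute(
    "SELECT filename, location FROM file_metadata "
    "WHERE created BETWEEN ? AND ? ORDER BY created",
    (start, end),
).fetchall()
print(rows)  # -> [('a.dat', '/disk1/a.dat'), ('b.dat', '/disk2/b.dat')]
```

With XML files there is no equivalent of that index: every query has to read the file(s) from the start.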
Since the XML file will be large, DOM is out of the question, but is a SAX parser scalable, and does it give good performance?
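To make the SAX question concrete, here is a minimal Python sketch of the same date-range filter done as a streaming SAX parse. The element and attribute names are made up for illustration. Memory use stays flat no matter how big the file is, but note that every query still reads the entire file top to bottom.

```python
import xml.sax

# Hypothetical metadata layout; element/attribute names are assumptions.
SAMPLE = b"""<metadata>
  <file name="a.dat" created="2012-01-10" location="/disk1/a.dat"/>
  <file name="b.dat" created="2012-03-05" location="/disk2/b.dat"/>
  <file name="c.dat" created="2012-06-20" location="/disk1/c.dat"/>
</metadata>"""

class DateRangeHandler(xml.sax.ContentHandler):
    """Collects locations of files whose 'created' date is in [start, end]."""
    def __init__(self, start, end):
        super().__init__()
        self.start, self.end = start, end
        self.matches = []

    def startElement(self, name, attrs):
        # ISO-8601 date strings compare correctly as plain strings.
        if name == "file" and self.start <= attrs["created"] <= self.end:
            self.matches.append(attrs["location"])

handler = DateRangeHandler("2012-01-01", "2012-04-01")
xml.sax.parseString(SAMPLE, handler)
print(handler.matches)  # -> ['/disk1/a.dat', '/disk2/b.dat']
```

So SAX scales in memory, but query time grows linearly with the file size, which is the part an indexed database avoids.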
Thanks
Ashish