Is R designed to meet the performance requirements of large-scale enterprise applications, in terms of high-speed and high-frequency processing, concurrent computing, and scalability, whether it is integrated with other languages like Java or C++ or used on its own? Does the book R in Action cover topics like these?
With the massive stores of data now being collected, this is an increasingly important question. R was originally designed to handle moderate to large amounts of data (in the megabyte to gigabyte range). It keeps data in memory, which makes for a zippy experience for interactive users but creates limits for very large datasets. Most users keep their data in external databases or data warehouses and access portions of it through R's extensive DBMS access routines.
Appendix G in "R in Action" describes working with large datasets.
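To make the database approach concrete, here is a minimal sketch of pulling a summarized slice of a large table into R rather than loading the whole dataset. It assumes the DBI and RSQLite packages are installed; the file name "warehouse.sqlite", the table "sales", and its columns are hypothetical stand-ins for whatever your warehouse actually holds.

```r
# Sketch only: DBI + RSQLite assumed installed; "warehouse.sqlite",
# the "sales" table, and its columns are hypothetical.
library(DBI)

con <- dbConnect(RSQLite::SQLite(), "warehouse.sqlite")

# Let the database do the heavy aggregation, and bring back
# only the small result set into R's memory.
big_customers <- dbGetQuery(con, "
  SELECT customer_id, SUM(amount) AS total
  FROM sales
  GROUP BY customer_id
  HAVING total > 10000
")

dbDisconnect(con)
```

The same DBI calls work against other backends (PostgreSQL, MySQL, etc.) by swapping the driver passed to dbConnect(), which is what makes this pattern practical for data that will never fit in RAM.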
Thanks for your reply, Robert.
I was thinking that R should offer some kind of scalability feature to every package, so that package developers and R users could focus on their scientific computing or application integration without having to worry about performance, particularly scalability. Maybe that is asking too much, since it is hard for any computing language.