Hi Larry,
I was a SAS programmer for 25 years before I ever looked at R. I started using it because it made certain tasks easy; in particular, it made understanding data that resided in external databases simple. R is very good at accessing all manner of DBMS.
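To give a flavor of what I mean, here is a minimal sketch using the DBI package with RSQLite as an example backend; the same dbConnect()/dbGetQuery() calls work with other drivers (RPostgres, RMariaDB, odbc, and so on), and the "sales" table here is made up for illustration:

```r
library(DBI)

# Connect to an in-memory SQLite database; swap in your own DBMS driver here
con <- dbConnect(RSQLite::SQLite(), ":memory:")

# Create a small example table so the query below has something to run against
dbWriteTable(con, "sales", data.frame(
  region = c("east", "west", "east"),
  amount = c(100, 250, 75)
))

# Let the database do the aggregation and pull the result into an R data frame
result <- dbGetQuery(con, "
  SELECT region, SUM(amount) AS total
  FROM sales
  GROUP BY region
")
print(result)

dbDisconnect(con)
```

The nice part is that once the data land in a data frame, every R summary and graphics tool applies, regardless of which DBMS they came from.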
The problem with massive amounts of data is how to understand it and discern the patterns that may be important.
R is frequently used to summarize data, create predictive models, uncover unusual observations (e.g., fraud, data entry errors, serendipitous findings), and create pictures that communicate complex information simply. It is particularly effective at creating graphs that describe change over time.
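As an example of that last point, a few lines of base R turn a series of measurements into a picture of change over time (the data here are simulated, not real):

```r
# Made-up monthly data; plot() is base R, no extra packages needed
set.seed(42)
months <- seq(as.Date("2010-01-01"), by = "month", length.out = 24)
values <- cumsum(rnorm(24))  # a random walk standing in for a real measurement

# A simple line chart showing how the variable evolves month by month
plot(months, values, type = "l",
     xlab = "Month", ylab = "Value",
     main = "A variable changing over time")
```

With real data you would substitute your own dates and measurements, but the shape of the code stays this small.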
One of the hot terms in business systems these days is predictive analytics, and I would argue that R is the bleeding-edge platform for it.
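To make that concrete, here is a toy predictive model in base R; the data are simulated purely for illustration, and a logistic regression stands in for whatever model the problem actually calls for:

```r
# Simulate 200 observations where the outcome depends on a single predictor
set.seed(1)
d <- data.frame(x = rnorm(200))
d$y <- rbinom(200, 1, plogis(1.5 * d$x))

# Fit a logistic regression with glm(), part of base R's stats package
fit <- glm(y ~ x, data = d, family = binomial)
summary(fit)

# Predict the probability of y = 1 for a new observation with x = 1
predict(fit, newdata = data.frame(x = 1), type = "response")
```

The same pattern (fit a model, inspect it, predict on new data) carries over to the many modeling packages on CRAN.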
Having said that, there are things that I would not use R for, including managing large databases (in the gigabyte-to-terabyte range) and massaging large amounts of text data. R can do these things, but there are much better tools for them (e.g., dedicated DBMS solutions, SQL, Perl, Python).