Paul Santa Maria wrote: I don't think there's any question that C++ was a dominant force throughout the MS Windows era of the 1990s and early 2000s.
No. AFAIK, from as early as 1994, the dominant languages for application development on Windows were VB, PowerBuilder, VFP, and Delphi. C++ was only used for systems-level Windows development then... and now.
Paul Santa Maria wrote: But with Java/J2EE entrenched in the enterprise, so much work being done in C#/.NET instead of C++ for Windows applications, Objective-C and Java dominating the smartphone/tablet space, and JavaScript everywhere imaginable...
How would you write an operating system with any of these? Or a compiler? Or a database engine? You are conflating application development (and very high-level application development at that) with systems development. The enterprise is just one sector of the software development market; it doesn't even constitute the majority of deployed systems out there.
Paul Santa Maria wrote: ... do you think there's any place left for a language where you can inadvertently wreak havoc if you forget to write a copy constructor when you use a list, or forget to create a virtual destructor for a derived class, or a million other subtleties that so many developers don't even have a clue about?
I think that's a loaded question, robbed of much of its objectivity. Many of those gotchas have to do with specific features required for both 1) systems development, and 2) interfacing with C libraries.
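For the record, both of the gotchas quoted above are real but well understood. Here is a minimal sketch of each (the Buffer class is hypothetical, just something that owns a raw array):

#include <list>

struct Buffer {
    int* data;
    Buffer() : data(new int[64]) {}
    ~Buffer() { delete[] data; }
    // Gotcha 1: no user-written copy constructor, so the compiler-generated
    // one just copies the pointer; two Buffers end up owning the same array.
};

struct Base {
    ~Base() {}          // Gotcha 2: destructor is not virtual
};

struct Derived : Base {
    Buffer b;           // never destroyed when deleted through a Base*
};

int main() {
    std::list<Buffer> l;
    l.push_back(Buffer());   // the temporary and the list element share one
                             // pointer: double delete once both are destroyed

    Base* p = new Derived();
    delete p;                // undefined behavior: only ~Base() is guaranteed
                             // to run, so Derived's Buffer leaks at best
}

Both have textbook fixes: follow the Rule of Three (define the copy constructor, copy assignment operator, and destructor together), and declare the base-class destructor virtual whenever a class is meant to be deleted polymorphically.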
Secondly, it's not like Java doesn't have its own gotchas (which makes the original assertion less objective). Man, I made a living for quite a while just cleaning up Java code left by *Java* developers who didn't know about them. A garbage collector is the greatest thing for application development, but unfortunately, way too many developers never learn how to write apps for one, completely oblivious to the fact that Java does not have proper destructors (finalizers are useless and downright dangerous). That leads to database connections and sockets being left in GC limbo until they get reclaimed, if at all.
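For contrast, here is a minimal C++ sketch of what "proper destructors" buy you (DbConnection is a hypothetical wrapper around some database handle): cleanup happens at a deterministic point, not whenever the collector gets around to it.

#include <cstdio>

// Hypothetical RAII wrapper around a database handle.
class DbConnection {
public:
    DbConnection()  { std::puts("connection opened"); }
    ~DbConnection() { std::puts("connection closed"); }     // always runs

    DbConnection(const DbConnection&) = delete;             // non-copyable, so
    DbConnection& operator=(const DbConnection&) = delete;  // no double-close
};

void runQuery() {
    DbConnection conn;
    // ... use the connection ...
}   // ~DbConnection() fires right here, even if an exception is thrown,
    // not "whenever the GC gets around to it, if at all"

int main() {
    runQuery();
}

This is the RAII idiom. The nearest Java equivalent is try-finally (or try-with-resources since Java 7), which works, but only if the developer remembers to use it.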
C++ has many subtleties that many developers don't have a clue about. And Java has many subtleties that many developers don't have a clue about either. But that's the fault of the developer, not the language. It always is.
So it does not make any sense to evaluate a language that way, since it is the developer's job to know about these things. Just as there are coding guidelines for Java (think the "Effective Java" and "Java Gotcha" books), the same exist for C++, and any developer worth his salt (be it in Java or C++) abides by them.
Paul Santa Maria wrote: Just asking if you think *other* languages, like Objective-C, JavaScript or - yes, Java - might be nudging C++ toward irrelevance.
One would hope that people are aware there is a lot more development going on outside of the enterprise/web arena. Operating systems, file systems, database engines, device drivers... not to mention that embedded systems running on PIC or AVR chips (think thermometers, microwave controllers, car computers, etc.) far outnumber enterprise/web deployments. So the answer is no: we are not seeing the demise of the "C++ era" (or the "C era", for that matter).
Why people completely forget this is strange to me. I'm not trying to be derisive, but I think it is extremely detrimental not to be aware of the entire landscape of computing and software engineering. It is true that C and C++ have not been used for widespread application development for the last decade or so. In fact, it would be crazy to do so, since we now have application-level languages that are much better suited for those kinds of tasks.
But to extrapolate the relevance (or lack thereof) of C++ from the point of view of (very high-level) application development does not make any sense.
Imagine an embedded developer who has done nothing but C programming on little 8-bit PIC CPUs (among the most widely deployed types of systems in the world) asking whether Java is relevant because he cannot use it to program those chips. It would be absolutely strange, if not nonsensical.
Same here.