
Dr Dobbs on functional languages

 
Pat Farrell
Rancher
Posts: 4803
In It's Time to Get Good at Functional Programming, Dr Dobbs argues that the future is multi-core, and that means functional languages such as Scala.
 
Marc Peabody
pie sneak
Posts: 4727
Thanks for the article. Good read.

I've heard folks say before that the need for better concurrency management will be the differentiator that makes Scala replace Java, much as Java replaced C++ on the strength of garbage collection.

I'm not completely convinced that most enterprise apps have a need for developers to manage their own concurrency.

Web apps all rely on the server container to manage each request as a separate thread. I only have to worry about my shared state, keeping that thread-safe. Not that big a deal.
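To make that concrete, here's a minimal sketch (a hypothetical servlet, not from this thread) of the only concurrency most web developers actually manage: a field shared by the container's request threads.

```java
import java.io.IOException;
import java.util.concurrent.atomic.AtomicLong;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Illustrative only: the container runs each request on its own thread,
// so the developer's job is limited to keeping shared state like this
// counter safe under concurrent access.
public class HitCounterServlet extends HttpServlet {
    // AtomicLong keeps the shared counter thread-safe without explicit locking.
    private final AtomicLong hits = new AtomicLong();

    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        long count = hits.incrementAndGet();   // safe under concurrent requests
        resp.getWriter().println("Hit number " + count);
    }
}
```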
 
Pat Farrell
Rancher
Posts: 4803

Originally posted by Marc Peabody:
I'm not completely convinced that most enterprise apps have a need for developers to manage their own concurrency.

Web apps all rely on the server container to manage each request as a separate thread. I only have to worry about my shared state, keeping that thread-safe. Not that big a deal.



I'm completely convinced that most good professional programmers cannot properly manage concurrency, and without that ability you can't write code that takes advantage of multi-CPU systems.

Container-managed threads work fine when you can simply spread individual servlet threads onto separate JVMs on separate CPUs. But with that approach you don't gain any value from otherwise unused cores. There are things like sorts that can be greatly sped up with multiple cores.
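For illustration only (a sketch I'm adding here, not from the original post): even a plain two-way split shows how a sort can use a second core, assuming the array is large enough to be worth the thread overhead.

```java
import java.util.Arrays;

// Sketch: sort two halves of an array on separate threads, then merge.
// On a multi-core machine the two sorts can run in parallel; on a single
// core this just adds thread overhead.
public class TwoThreadSort {
    public static void sort(final int[] data) throws InterruptedException {
        final int mid = data.length / 2;
        final int[] left = Arrays.copyOfRange(data, 0, mid);
        final int[] right = Arrays.copyOfRange(data, mid, data.length);

        Thread t1 = new Thread(new Runnable() {
            public void run() { Arrays.sort(left); }
        });
        Thread t2 = new Thread(new Runnable() {
            public void run() { Arrays.sort(right); }
        });
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        // Merge the two sorted halves back into the original array.
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) {
            data[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        while (i < left.length)  { data[k++] = left[i++]; }
        while (j < right.length) { data[k++] = right[j++]; }
    }
}
```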

I will grant that with today's quad-core systems it's not a huge deal. But within a year or so, expect 8- and 16-core processors, and servers with at least two of them, so 32-core systems are not far away, and 128-core systems will be here far sooner than we expect.
 
Marc Peabody
pie sneak
Posts: 4727
Granted, if you have 16 cores and only 8 requests at a given point in time, of course you'll have unused cores.

By my understanding, so long as you have at least as many incoming requests as you have cores, there's not much value in kicking off multiple threads/actors for any given request. The other cores would be busy processing other requests.

Like EFH says in this thread:

In essentially all modern JVM implementations, each Java "Thread" object corresponds to a separate scheduling primitive at the operating system level. If the operating system knows how to schedule threads on multiple cores, then that JVM implementation will automatically take advantage of the multiple cores when running multithreaded code. This is not just a Java 5 or 6 thing; it's been true since before consumer multicore processors existed. It's true on multiprocessor motherboards, too.
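As a rough illustration of that point (my own sketch, using plain java.util.concurrent, not something from the quoted post): a fixed pool sized to availableProcessors() is all it takes for the OS scheduler to spread CPU-bound tasks across cores.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Each task below runs on its own JVM Thread, and the OS scheduler is free
// to place those threads on different cores -- no extra code is needed.
public class CoreDemo {
    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        for (int i = 0; i < cores; i++) {
            final int id = i;
            pool.execute(new Runnable() {
                public void run() {
                    // CPU-bound busy work; with enough cores these run in parallel.
                    long sum = 0;
                    for (long n = 0; n < 100000000L; n++) { sum += n; }
                    System.out.println("task " + id + " done: " + sum);
                }
            });
        }
        pool.shutdown();
    }
}
```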



Now, if you're writing a Swing application, there's a good chance only one thread gets used for most processing. Obviously such an application could utilize more cores by multi-threading intensive processes.

It's not as if all Java applications will be damsels in distress waiting in multicore towers for concurrent programming to save them. A good number of applications are multithreaded without a developer even needing to think about it, simply because the application runs on a container that manages threads.
 
Rusty Shackleford
Ranch Hand
Posts: 490
Saying the future is multi-core is over-simplistic.

There are domains where ramping up to 8 or more cores will help greatly, even though it adds complexity for the writers of managed-code systems and compilers, and in many cases for average developers.

However, general-use desktop apps do not have much to gain from quad core, much less from 8- or 16-core systems. Many applications simply are not parallelizable. In data-driven applications it is certainly important and will become more so. Event-driven? Not so much.

Personally, I don't like functional languages; they are too rigid, and their proponents are insufferably arrogant and anal-retentive.
 
Pat Farrell
Rancher
Posts: 4803

Originally posted by Rusty Shackleford:
Saying the future is multi-core is over-simplistic.

There are domains where ramping up to 8 or more cores will help greatly, even though it adds complexity for the writers of managed-code systems and compilers, and in many cases for average developers.

However, general-use desktop apps do not have much to gain from quad core



No, that the future is multi-core is clear from Intel's and AMD's roadmaps, and it's a direct result of Moore's law. Every $500 desktop PC sold today has at least two cores.

I completely agree that "average developers" have zero chance of writing effective Java code that uses 8 cores. Your own statement about general-use desktop apps is nearly proof of that: it's hard to write good concurrent code.

While it was a few years ago, I believe my experience in grad school still holds: I was in an 800-level parallel processing class, that's PhD level. We studied all sorts of cool, specialized parallel hardware. The final class project was to do some fairly simple simulations using 4, 8, or 16 processors. Of the eight PhD students in the class, only two produced code that ran faster in parallel than sequentially. I was one of the six whose parallel code was slower.

The same arguments you make against functional programming were made in the move from machine code to assembly language, and from assembly language to systems languages such as Bliss and C. And it's partly true: the best genius developers can out-program most folks, even if the average folks have C and the geniuses have to toggle in opcodes.

The problem is that there are not enough genius programmers. So we need tools to make it accessible to the good professional developers, who are already smarter than the average bear.
 
Rusty Shackleford
Ranch Hand
Posts: 490
It is a far cry going from 2 cores on the desktop to 16 or more.

If anything, the future of multicore on the desktop will be one where the OS intelligently assigns applications to specific cores. For example, kernel stuff runs on core 1; since it is likely Windows we are talking about, core 2 can run all the AV/AS garbage, and cores 3 and 4 can run the apps the user is actually using. That will give you far more benefit than trying to turn sequential problems into parallel ones. But stretching that past 4 cores on the desktop? No real gain.

Moore's law is an observation, not an actual law or theory.

The ONLY reason Intel is pushing it is that they think they need to constantly improve their chips to make more money. No other reason. What they need to be doing is improving memory speeds, so L1 and L2 won't be so important, and optimizing L1 and L2, not searching for a solution to a non-existent problem. Don't forget, 10 GHz processors and Rambus (RDRAM) memory used to be on Intel's roadmap. A business plan doesn't create need.

It is not that it is hard to write parallel code; I don't think it is, and I have written programs that can run on 40+ processors, not overly complex ones, but whatever. The issue is that most desktop applications are event driven; they are simply not very parallelizable. Not every algorithm can be split up to run concurrently. Take a word processor: small tasks like spell checking can benefit a little, but hardly anything in this browser that I am typing this drivel into can.

The overhead of dealing with multiple caches that need to sync and communicate with each other (plus the communication between 'nodes' of the program running concurrently), not to mention L1, L2, and main memory, means that unless a problem really lends itself to parallelism, you are likely to get worse performance. If you do see a gain, it is often negligible, not at all worth the time and trouble of rewriting your program. For the most part, improved hardware has brought us sloppy, bloated code, not screaming-fast machines. Programmers have relied on "Moore's Law" to hide their poor programming. That time is quickly passing, thankfully.
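As a hedged illustration of that cache overhead (a hypothetical micro-benchmark I'm adding, not from the post; actual numbers depend heavily on the hardware): two threads that only touch their own counters can still slow each other down when those counters land on the same cache line.

```java
// Sketch of "false sharing": the two counters are adjacent static fields and
// will very likely share a cache line, so the cores keep bouncing that line
// back and forth even though the threads never touch each other's data.
public class FalseSharingSketch {
    static volatile long counterA = 0;
    static volatile long counterB = 0;

    public static void main(String[] args) throws InterruptedException {
        Thread a = new Thread(new Runnable() {
            public void run() { for (long i = 0; i < 100000000L; i++) { counterA++; } }
        });
        Thread b = new Thread(new Runnable() {
            public void run() { for (long i = 0; i < 100000000L; i++) { counterB++; } }
        });
        long start = System.nanoTime();
        a.start(); b.start();
        a.join();  b.join();
        // Often slower than you'd expect for two fully independent counters.
        System.out.println("elapsed ms: " + (System.nanoTime() - start) / 1000000);
    }
}
```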

Granted, in a purely functional language everything is trivially parallelizable, because function calls have no side effects, but good luck getting development shops to not only switch languages but also adopt a totally new paradigm, especially for little gain. The cost of that switch weighed against meager performance gains makes it unreasonable.
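A small sketch of that idea in Java terms (this assumes a much later JDK than this 2008 thread, since it uses parallel streams): when the mapped function is pure, the runtime can farm the calls out to however many cores it has without any locking by the developer.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Because square() has no side effects, the calls can be evaluated in any
// order, on any number of cores, with no synchronization needed.
public class PureParallel {
    static long square(long n) { return n * n; }   // pure: touches no shared state

    public static void main(String[] args) {
        List<Long> squares = Arrays.asList(1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L)
                .parallelStream()                  // splits the work across available cores
                .map(PureParallel::square)
                .collect(Collectors.toList());
        System.out.println(squares);
    }
}
```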

The problem is that in most domains, you don't gain anything by implementing some concurrency. It is not a silver bullet.

Hardware is so far ahead of software that it is pointless to chase after the nonsense Intel is doing to try to boost its bottom line.

Note that I am discussing desktop apps; there are plenty of cases where systems with many cores are valuable, but they are in areas that the average computer user will likely never hear about, nor need that amount of processing power.
 
Pat Farrell
Rancher
Posts: 4803

Originally posted by Rusty Shackleford:
Moore's law is an observation, not an actual law or theory.



Well, it has defined the actual progress in the industry for 30+ years.


The ONLY reason Intel is pushing it is that they think they need to constantly improve their chips to make more money.



Huh? Intel's only purpose in life is to make money. They are a corporation. And they make money by releasing ever-faster chips, which enable technology to grow. They can't make single CPUs much faster, so there would be no reason for folks to buy new ones. No revenue for Intel.

Multiple cores are the only solution that anyone can see.


It is not that it is hard to write parallel code; I don't think it is, and I have written programs that can run on 40+ processors, not overly complex ones, but whatever.



Then you are way smarter than me.


Note that I am discussing desktop apps; there are plenty of cases where systems with many cores are valuable, but they are in areas that the average computer user will likely never hear about, nor need that amount of processing power.



I was not writing about desktop apps. There are nearly no desktops. For the past decade I've written only webapps, and I see nothing to change that pattern. I believe, IMHO, that less and less will be done on "desktops" and everything will be done with GMail and the equivalent. I could be wrong, but that's where I see the industry going. A "computer" in a few years will be an iPhone using fast wireless networking.
 
Marc Peabody
pie sneak
Posts: 4727

Originally posted by Rusty Shackleford:
Take a word processor: small tasks like spell checking can benefit a little, but hardly anything in this browser that I am typing this drivel into can.


Funny you should mention this. I was talking to another developer last night who is returning very, very large reports to a browser. IE was taking close to four minutes. Chrome, which contains optimizations for multi-threading, takes 2 seconds.

But that difference isn't for the average developer to worry about. The Microsoft and Google geniuses get to worry about that.
 
Rusty Shackleford
Ranch Hand
Posts: 490

Huh? Intel's only purpose in life is to make money. They are a corporation. And they make money by releasing ever-faster chips, which enable technology to grow. They can't make single CPUs much faster, so there would be no reason for folks to buy new ones. No revenue for Intel.



But that doesn't mean that we should buy them.

I was not writing about desktop apps. There are nearly no desktops. For the past decade I've written only webapps, and I see nothing to change that pattern. I believe, IMHO, that less and less will be done on "desktops" and everything will be done with GMail and the equivalent. I could be wrong, but that's where I see the industry going. A "computer" in a few years will be an iPhone using fast wireless networking.



What displays those webapps, and where do they live? The average computer user is on the viewing end of those webapps, and in web programming, like everything else, the hard problems are done by people below you.

I struggle to write a simple compiler, and I consider compiler writers the top of the CS world (along with the OS and soft-computing guys) because of the problems they deal with: pipelining, out-of-order execution, multiple cores, etc. That is tough work; writing a concurrent program (assuming it is inherently parallel) is nothing like compiler writing.

There are nearly no desktops.





I very seriously doubt that I am smarter than you.
[ December 10, 2008: Message edited by: Rusty Shackleford ]
 
Pat Farrell
Rancher
Posts: 4803

Originally posted by Rusty Shackleford:
But that doesn't mean that we should buy them.



We don't have to; the users will. No one wants an "old" computer, even though a single-core processor at, say, ~2 GHz is fast enough for nearly anything. So Intel invents stuff. Even watching HDTV isn't really a challenge for ancient systems.

Except I end up buying ever-faster computers because I write ever-larger programs, and I hate waiting for the compile/link stage. Even more, I hate paying the engineers I've hired to wait for the compile/link stage. It doesn't take much of a developer-productivity improvement to justify buying a new machine every year or so for the engineers. You can always hand the engineers' old machines down to marketing and management.

What displays those webapps, and where do they live? The average computer user is on the viewing end of those webapps, and in web programming, like everything else, the hard problems are done by people below you.



All the webapps are running on the Browser-OS from Google-soft.
And while a portion of the code may be in Java, folks who care about that are not too likely to be hanging around Java Ranch.

We're much more likely to be mortals.

I struggle to write a simple compiler, and I consider compiler writers the top of the CS world (along with the OS and soft-computing guys) because of the problems they deal with: pipelining, out-of-order execution, multiple cores, etc. That is tough work; writing a concurrent program (assuming it is inherently parallel) is nothing like compiler writing.



Simple compilers are typically a one-semester course in grad school, or senior level at a hard school. Optimization is a lifetime. I dabbled in it for two semesters in grad school. Kinda fun. It convinced me that you need a program to write optimized code these days; branch delay slots and out-of-order execution are simply impossible to handle by hand.

I don't know, but I would not be surprised to find that something like GlassFish has more code than most operating systems -- if you consider just the kernel, drivers, and basic functions. Of course, there is a big jump from that to what we call an OS these days, with windowing, networking, etc. Realistically, GlassFish, JBoss, etc. are operating systems; they just run at a higher level.

Most programmers write applications because that's where the need is. There are not all that many folks writing OS and compilers, relative to the millions writing applications.

And if Scala or something else improves the programmer efficiency, then I'm all for it.
 
Greenhorn
Posts: 3

Originally posted by Rusty Shackleford:
Saying the future is multi-core is over-simplistic.

Personally, I don't like functional languages; they are too rigid, and their proponents are insufferably arrogant and anal-retentive.



Fortunately, Scala is an object-oriented language as well. And I'm a proponent who is quite sufferably arrogant.

I am curious what makes you think functional languages are restrictive.
 