JavaRanch » Java Forums » Java » Beginning Java

using final classes in real life applications

Peter Kovac
Ranch Hand

Joined: Aug 08, 2010
Posts: 42

Many best-practice guides recommend using final classes to prevent fragile class hierarchies, but is that actually followed in the applications you work on?
I know that there are other ways to prevent subclassing; I'm just curious about this particular usage.
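For reference, a minimal sketch of the two techniques mentioned here (class names are invented for illustration): the `final` modifier, and the alternative of making all constructors private so no subclass can call `super()`:

```java
// Sealed against extension with the final modifier.
final class Temperature {
    private final double celsius;

    Temperature(double celsius) { this.celsius = celsius; }

    double celsius() { return celsius; }
}

// Alternative: a non-final class whose constructors are all private
// cannot be subclassed either, because a subclass constructor would
// have no accessible super() to call.
class Account {
    private final long id;

    private Account(long id) { this.id = id; }

    // Callers go through a static factory instead of a constructor.
    static Account of(long id) { return new Account(id); }

    long id() { return id; }
}
```

The private-constructor form has the side effect of forcing a static factory, which some prefer anyway for naming flexibility.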

Jesper de Jong
Java Cowboy
Saloon Keeper

Joined: Aug 16, 2005
Posts: 15084

Peter Kovac wrote:... but is that actually followed in the applications you work on?

Certainly. Making a class final is my default choice - I only make it non-final if it's designed to be extended.
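A sketch of that convention, with invented class names: everything is final by default, and a class meant to be extended documents its extension point explicitly (here as a template method with a protected hook):

```java
import java.util.List;

// Default case: final, because it was not designed for extension.
final class CsvParser {
    List<String> parse(String line) {
        return List.of(line.split(","));
    }
}

// Deliberately extensible: the skeleton is fixed (final template
// method), and subclasses supply the body via a documented hook.
abstract class ReportGenerator {
    final String generate(String title) {
        return "== " + title + " ==\n" + body();
    }

    // Subclasses override this hook to provide the report content.
    protected abstract String body();
}

class SalesReport extends ReportGenerator {
    @Override
    protected String body() { return "sales figures go here"; }
}
```

This mirrors the "design and document for inheritance or else prohibit it" advice: the non-final class states exactly how it expects to be extended.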

Junilu Lacar

Joined: Feb 26, 2001
Posts: 6529

In my experience, very few actually follow that recommendation, even when they should. There need to be more developers like Jesper.

Paul Clapham

Joined: Oct 14, 2005
Posts: 19973

I have to say that I haven't followed that recommendation either.

However I haven't ever encountered the problem where somebody working on the same code base as me starts extending classes unnecessarily, either, so it hasn't mattered.

Let me also mention that "best practices" change over time. Back when I started writing Java (and code in other object-oriented languages) there was a "best practice" (although that phrase wasn't used back then) that an object should be responsible for everything about itself. So an object should be able to write itself to a database, render itself as HTML, and so on. Nowadays "best practice" says that rendering as HTML is the view's responsibility, not the object's, and that the database layer should be separated from the business layer.

It only took about a week for me to notice that having an object know how to render itself as HTML was a dumb idea; that happened as soon as I had two pages that needed to display different amounts of detail about the object. But I still have code in which objects know how to update themselves in the database. One day I'll have to try to redo that.
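The two-pages problem above is exactly what moving rendering into the view layer solves. A minimal sketch, with hypothetical names: the domain object knows nothing about HTML, and each page picks the view it needs:

```java
// The domain object carries data only; it knows nothing about HTML.
final class Product {
    final String name;
    final double price;

    Product(String name, double price) {
        this.name = name;
        this.price = price;
    }
}

// Rendering lives in the view layer, so two pages can show
// different amounts of detail without touching Product itself.
final class ProductViews {
    // Compact form for a listing page.
    static String summaryHtml(Product p) {
        return "<li>" + p.name + "</li>";
    }

    // Full form for a detail page.
    static String detailHtml(Product p) {
        return "<div><h1>" + p.name + "</h1><p>$" + p.price + "</p></div>";
    }
}
```

Adding a third page with yet another level of detail then means adding one more view method, not modifying (or subclassing) the domain class.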