This way, compute resources could be used much more economically.

Despite the ingenuity of the idea, it would take until 1981 for object-oriented programming to hit the mainstream.

Since then, however, it hasn't stopped attracting new and seasoned software developers alike.

Object-oriented programming is dead. Wait, really?

The market for object-oriented programmers is as busy as ever.

But in recent years, the decades-old paradigm has received more and more criticism.

Could it be that, four decades after object-oriented programming hit the masses, technology is outgrowing this paradigm?



Is coupling functions with data that stupid?

One can only interact with the contents of an object through messages, typically called getter and setter functions.
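A minimal sketch of that idea in Python (the class and method names are my own illustration, not from any particular codebase):

```python
class BankAccount:
    """Encapsulates a balance: outside code talks to it only via methods."""

    def __init__(self):
        self._balance = 0  # leading underscore: internal by convention

    def get_balance(self):
        """Getter: read access goes through a message, not the raw field."""
        return self._balance

    def set_balance(self, amount):
        """Setter: write access can validate before mutating the data."""
        if amount < 0:
            raise ValueError("balance cannot be negative")
        self._balance = amount

account = BankAccount()
account.set_balance(50)
print(account.get_balance())  # → 50
```

The point is that the data and the only functions allowed to touch it live in one place, so every mutation can be checked.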

Inheritance basically means that developers can define subclasses that have all the properties that their parent class has.

This wasn't introduced to object-oriented programming until 1976, a decade after its conception.

Polymorphism came to object-oriented programming another decade later.

In basic terms, it means that a method or an object can serve as a template for others.
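A quick Python sketch of that template idea (the shapes and their methods are invented for illustration): the base class promises an interface, and callers can treat any subclass uniformly.

```python
class Shape:
    """Template: every shape promises an area() method."""
    def area(self):
        raise NotImplementedError

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

class Circle(Shape):
    def __init__(self, radius):
        self.radius = radius
    def area(self):
        return 3.14159 * self.radius ** 2

# Polymorphism: one function handles every Shape the same way,
# without knowing which concrete class it receives.
def total_area(shapes):
    return sum(s.area() for s in shapes)

print(total_area([Square(2), Square(3)]))  # → 13
```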

This makes life easier for developers because they don't have to worry about dependencies at runtime.

It's worth mentioning that inheritance and polymorphism aren't exclusive to object-oriented programming.

The real differentiator is encapsulating pieces of data and the methods that belong to them.

At a time when compute resources were far scarcer than today, this was a genius idea.

What prevailed before the 1980s, procedural programming, was very machine-oriented.

Developers needed to know quite a bit about how computers work to write good code.

By encapsulating data and methods, object-oriented programming made software development more human-centered.

When inheritance came around, that was intuitive, too.

It makes perfect sense that Hyundai is a subgroup of car and shares the same properties, but PooTheBear does not.
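In Python, that intuition might look like this (a toy sketch; the attributes are invented for illustration):

```python
class Car:
    wheels = 4  # every car shares this property

    def drive(self):
        return "driving"

class Hyundai(Car):
    """A subclass: inherits wheels and drive() from Car for free."""
    brand = "Hyundai"

class PooTheBear:
    """Not a car, so it inherits none of Car's properties."""
    def hum(self):
        return "humming"

h = Hyundai()
print(h.wheels, h.drive())            # → 4 driving
print(isinstance(h, Car))             # → True
print(isinstance(PooTheBear(), Car))  # → False
```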

This sounds like powerful machinery.

It's like when people see nails everywhere because all they have is a hammer.

For example, you might reuse a class from an old project for your new one.

As Erlang creator Joe Armstrong famously put it: you wanted a banana, but what you got was a gorilla holding the banana, and the entire jungle.

That pretty much says it all.

It's fine to reuse classes; in fact, it can be a major virtue of object-oriented programming.

But dont take it to the extreme.

The fragile base class problem

Imagine you've successfully reused a class from another project for your new code.

What happens if the base class changes?

A change there can corrupt your entire codebase, even though you never touched your own code.

The more you use inheritance, the more maintenance you potentially have to do.
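The classic illustration of this, sketched here in Python with invented names: a subclass counts insertions, and then a seemingly harmless refactor of the base class makes it double-count.

```python
class Collection:
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)

    def add_all(self, items):
        # Base class, version 2: someone "cleaned this up" to reuse add().
        # In version 1 it appended directly to self.items.
        for item in items:
            self.add(item)

class CountingCollection(Collection):
    """Written against version 1 of the base class; never touched since."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def add(self, item):
        self.count += 1
        super().add(item)

    def add_all(self, items):
        self.count += len(items)
        super().add_all(items)

c = CountingCollection()
c.add_all([1, 2, 3])
print(c.count)  # → 6, not 3: the refactored base class now routes through
                # the subclass's add() as well, counting every item twice
```

The subclass is broken by a change it had no part in, which is exactly the fragility the name describes.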

The diamond problem

But what if you want to mix the properties of two different classes?

Well, you can't do it.

At least not in an elegant way.

Consider, for example, the class Copier.

A copier scans the content of a document and prints it on an empty sheet.

So should it be the subclass of Scanner, or of Printer?

There simply is no good answer.
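Python happens to allow multiple inheritance, so a sketch of the dilemma (class contents invented for illustration) looks deceptively easy; the usual workaround is composition, where a copier has a scanner and a printer instead of being either:

```python
class Scanner:
    def scan(self):
        return "page contents"

class Printer:
    def print_page(self, content):
        return f"printed: {content}"

# Option 1: multiple inheritance. It runs, but if Scanner and Printer
# ever define the same attribute, the method resolution order (MRO)
# decides which one wins, not your domain model.
class Copier(Scanner, Printer):
    def copy(self):
        return self.print_page(self.scan())

# Option 2: composition, which sidesteps the hierarchy question entirely.
class ComposedCopier:
    def __init__(self):
        self.scanner = Scanner()
        self.printer = Printer()

    def copy(self):
        return self.printer.print_page(self.scanner.scan())

print(Copier().copy())          # → printed: page contents
print(ComposedCopier().copy())  # → printed: page contents
```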

The hierarchy problem

In the diamond problem, the question was which class Copier is a subclass of.

But I lied to you: there is a neat solution.

Let Copier be the parent class, and Scanner and Printer be subclasses that only inherit a subset of its properties.

But what if your Copier is only black-and-white, and your Printer can handle color, too?

Isn't Printer in that sense a generalization of Copier?

What if Printer is connected to WiFi, but Copier is not?

The more properties you heap on a class, the more difficult it becomes to establish proper hierarchies.

The reference problem

You might say: alright, then we'll just do object-oriented programming without hierarchies.

Instead, we could use clusters of properties, and inherit, extend, or override properties as needed.

There's just one problem.

This doesn't work without strict hierarchies.

Consider what happens if an object A overrides the hierarchy by interacting with another object B.

It doesn't matter what relationship A has with B, except that B is not the direct parent class.

Then A must contain a private reference to B, because otherwise, it couldn't interact.

But if A contains information that the children of B also have, then that information can be modified in multiple places.

Therefore, the information about B isn't safe anymore, and encapsulation is broken.
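A small Python sketch of the failure mode (all names invented): once A holds a reference into B's mutable state, that state can be changed from outside B's own methods, so the encapsulation exists in name only.

```python
class B:
    def __init__(self):
        self._settings = {"color": "black"}  # meant to be encapsulated

    def settings(self):
        return self._settings  # hands out a live reference, not a copy

class A:
    """Not a child of B, but it interacts with B directly."""
    def __init__(self, b):
        self._b = b  # private reference that bypasses the hierarchy

    def tweak(self):
        # B's "private" data is now modified from a second place.
        self._b.settings()["color"] = "red"

b = B()
A(b).tweak()
print(b.settings()["color"])  # → red: B's state changed without B doing it
```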

Although many object-oriented programmers build programs with this kind of architecture, this isn't object-oriented programming.

It's just a mess.

These problems are just examples of a dogma taken too far.

Object-oriented programming isn't the only paradigm that can be overdone, though.

In pure functional programming, it's extremely difficult to process user input or print messages on a screen.

Object-oriented or procedural programming is much better for these purposes.

In cases like these, using another paradigm could easily reduce the code to a couple of readable lines.

Paradigms are a bit like religions.

They're good in moderation: arguably, Jesus, Mohamed, and Buddha all said some pretty cool stuff.

The same goes for programming paradigms.

It makes sense to get informed about new programming paradigms and use them when appropriate.

Functional and object-oriented programmers alike, stop treating your paradigms like a religion.

They're tools, and they all have their use somewhere.

What you use should only depend on what problems you are solving.

The big question: are we on the cusp of a new revolution?

More and more problems are coming up for which functional programming is the more efficient option.

Think data analysis, machine learning, and parallel programming.

The more you get into those fields, the more you'll love functional programming.

The most likely scenario is that object-oriented programming will stay around for another decade or so.

Sure, the avant-garde is functional, but that doesn't mean you should ditch object-oriented programming yet.

It's still incredibly good to have in your repertoire.

So don't throw object-oriented programming out of your toolbox in the next few years.

But make sure it's not the only tool you have.

This article was written by Ari Joury and was originally published on Towards Data Science.

