OOP: The worst thing that happened to programming
alexanderdanilov.dev
Absolutely is.
Modern OOP (not the original OOP envisioned by Alan Kay) is an anti-pattern for humans.
It commits the cardinal sin against easy-to-understand systems: it hides state and breaks data lineage.
In other words:
1. You cannot just go back up the stack to see if anyone has changed data you depend on. You also need to follow all parent and sibling branches.
2. And in the case of inheritance, you cannot reason about Child A without understanding Parents 1..N.
As a result, OOP systems quickly hit the limit of the context one developer can hold in their head while developing and debugging.
FP, on the other hand, encourages (and in some cases enforces) confining the inputs and outputs of your system to the arguments and return values of functions, making the system easy to reason about at any level.
Powerful composability and more thorough, easier testing are beautiful by-products.
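To make the contrast concrete, here is a minimal Java sketch (the Account/Balance names are hypothetical, not from the thread):

```java
// OOP style: state is hidden inside the object, and any method in the
// call graph may mutate it without the caller seeing it at the call site.
class Account {
    private long balanceCents; // hidden, mutable state

    void applyFee(long feeCents) {
        balanceCents -= feeCents; // data lineage is invisible to callers
    }
}

// FP style: all inputs arrive as arguments and the only output is the
// return value, so data lineage is visible at every call site.
record Balance(long cents) {
    static Balance applyFee(Balance b, long feeCents) {
        return new Balance(b.cents() - feeCents);
    }
}
```

In the second form, debugging means inspecting arguments and return values; there is no need to walk parent and sibling branches looking for mutations.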
Next on the list of worst things to happen to programming is Python's popularity as a CSC101 language, and its toehold in mathematics with the rise of ML.
The common interpretation of Alan Kay's view on OOP is that it's not the objects that are important, it's the messaging.
Which shows, once again, that naming things is hard. Should've called it Message-Oriented Programming.
> Next on the list of worst things to happen to programming is Python's popularity as a CSC101 language
My school kept track of computer science graduates, and the numbers dropped sharply after it copied MIT's example for its intro course. Predictably, the drop came four years after the change.
Some might call that "Gatekeeping" (though that's a more recent word in the vernacular), but I think it's more that 90% of the jobs were C/C++/Java back then, and a BS degree was meant to get a graduate a job in the real world.
Also, students dropping out of the computer science program wasn't a great look when requesting funds for servers and stuff.
Since you are all so enthusiastic about Kay's idea of object orientation, you should take a look at Wirth's Oberon language and operating system, which is indeed a message-based object system and uses a message-passing architecture rather than virtual method dispatch (in contrast to Smalltalk): https://www.projectoberon.net/
Spot on. OOP makes it too easy to pile up cyclomatic complexity unless you understand the domain beforehand, and that's often not the case.
I don't think this is a great article, but if you hit Google Scholar and look for papers concerning OOP, you'll be hard pressed to find any recent ones. Almost every programming language research paper is about functional programming. Recent practical crypto papers seem to use Go a lot, but that isn't OOP.
OOP was a dead end, and academia has moved on, if it was ever interested in the first place. It is strange that industry is 180 degrees out of phase here, even as it stresses the importance of "computer science fundamentals" like data structures and algorithms.
Worst thing that happened to programming, eh? Have you tried running a 2-year-old JavaScript/Node project that transpiles and gulps its three billion dependencies into something alien, if it works at all? Which it won't, because it hasn't been updated for 2 years.
What has this to do with FP vs OOP?
Nothing. It was explicitly a response to the claim about the scale of the issue.
Being extremely enthusiastic or extremely angry about OOP is so 1990s. Tell us, is Java the New COBOL? Is Visual C++ COM/OLE inherently bloated Microsoft Bob Windows Longhorn software?
The article presents OOP and FP as mutually exclusive paradigms where one must be entirely wrong. In reality, modern software development benefits from hybrid approaches. Most current languages support both paradigms, and experienced engineers choose appropriate tools for specific problems.
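As a small illustration (a hypothetical Java sketch, not from the article), a mainstream OO language routinely mixes both styles, with an immutable record and a pure stream pipeline living inside ordinary classes:

```java
import java.util.List;

// An immutable value type (FP-flavored) inside an otherwise OO design.
record Order(String id, long totalCents) {}

class OrderReport {
    // A pure, stream-based computation over immutable data.
    static long revenueCents(List<Order> orders) {
        return orders.stream()
                     .mapToLong(Order::totalCents)
                     .sum();
    }
}
```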
The interview question about static constructors and self-instantiating programs describes anti-patterns that professional OOP developers also avoid, not inherent OOP features. This is equivalent to judging FP by its worst callback-hell examples.
The ad hominem attack on prominent OOP authors doesn't improve the quality of the article, and dismissing patterns as "crutches" ignores that FP has equivalent patterns (monads, functors, lenses).
What's mutually exclusive is immutability, though.
There is an enormous difference when mutability is opt-in.
Most OOP languages seem to require mutability; I'm not sure it's possible to avoid it.
I guess Elixir could count as OOP, though not in the canonical sense, just by Alan Kay's definition.
Well, OCaml is immutable by default, isn't it? Rust has no inheritance, but at least supports polymorphism, and ordinary bindings are immutable. There are OO languages where values of an object can only be set at construction time, or objects can be constant, or types can be private (as e.g. in Ada), etc.
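Java records are a mainstream example of the construction-time-only idea (a minimal, hypothetical sketch):

```java
// All record fields are final and can only be set at construction time.
// "Updating" a value means constructing a new one, much like in
// immutable-by-default languages.
record Point(int x, int y) {
    Point withX(int newX) {
        return new Point(newX, y); // fresh value; the original is untouched
    }
}
```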
From my understanding, OCaml is considered an FP language; Rust I have no idea about, it's a very special case.
The problem with mutability by default is that immutability loses so many of its advantages...
Caml is/was an FP language. OCaml has an "O" prefix because it is an OO & FP multiparadigm language.
I thought that this was going to be a discussion about this old HN classic:
https://news.ycombinator.com/item?id=8420060
PS - don’t click the smashcompany link!!! The essay appears to have been replicated here:
https://medium.com/@jacobfriedman/object-oriented-programmin...
Is this a new way for Russia to undermine the West?
> why experienced Java (C#, C++, etc.) programmers can’t really be considered great engineers, and why code in Java cannot be considered good
...how was this written in 2025? This is like mid-2000s edgelord stuff.
Preaching to the choir with me :)
But I would add "so far"; AI and vibe coding could very well overtake OOP in a year or two.
No-one uses that original OOP at all, no-one sane anyway. The way it's used now is for dependency injection. All your logic is in services that are injectable and unit tested. All your data is in simple immutable DTOs.
All the OOP tricks (classes, instances, interfaces, polymorphism) are good for wiring up your logic and replacing bits at runtime. No-one actually models their domain with pure OOP. Urgh, that would be awful.
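A minimal sketch of that shape (hypothetical names; framework annotations omitted):

```java
// Data: a simple immutable DTO.
record UserDto(String id, String email) {}

// Logic: a service behind an interface, so the implementation can be
// swapped at wire-up time or replaced with a fake in unit tests.
interface UserService {
    UserDto findUser(String id);
}

class DbUserService implements UserService {
    @Override
    public UserDto findUser(String id) {
        return new UserDto(id, id + "@example.com"); // stubbed lookup
    }
}

// Wiring: the dependency is injected through the constructor.
class WelcomeEmailer {
    private final UserService users;

    WelcomeEmailer(UserService users) { // constructor injection
        this.users = users;
    }

    String welcomeLine(String id) {
        return "Welcome, " + users.findUser(id).email();
    }
}
```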
But also, to echo other commenters, this isn't an interesting insight...