But as the abstractions grew more sophisticated, there was a backlash. The industry was exploding in size, and with so many new people, a lot of programmers wanted things to be simpler and more independent. Leveraging abstractions requires learning and thinking, and that slows down programming.
So we started to see this turn towards fragmented technologies. Instead of putting your smarts all in one place, you would just scattershot the logic everywhere. Which, at least initially, was faster.
If you step back a bit, it is really about individual programmers. Do you want to slowly build on all of these deep, complicated technologies, or just chuck out crude stuff and claim success? Personal computers, the web, and mobile all strove for decentralization, which you leveraged with lots of tiny fragments. Then you only had to come up with a clever new fragment, and you were happy.
Ultimately, it is an organizing problem. A few fragments are fine, but once there are too many, the complexity has been so amplified by the sheer number of them that it is unmanageable. Doomed.
Once you have too many, you’ll never get it stable; you fix one fragment, and it breaks a couple of others. If you keep that up, eventually you cycle all the way back around again and start unfixing your earlier fixes. This is pretty much guaranteed at scale, because the twisted interconnections between all of the implicit contextual dependencies are a massive Gordian knot.
Get enough fragments, and it is over. Every time, guaranteed.
Oddly, the industry keeps heading directly into fragmentation, promoting it as the perfect solution, then watching it slowly blow up. After which it will admit there was a problem, switch to some other new fragmented technology, and do it all over again. And again.
I guess microservices have become a rather recent example.
We tried something similar in the early '90s, but it did not end well. A little past the turn of the century, that weed sprang up again.
People started running around saying that monoliths are bad. Which isn’t quite true: having all of your pieces together in one central place is good, but the cost of that is a limit on how far you can scale them.
The problem isn’t centralization itself, but rather that scaling isn’t and never will be infinite. The design for any piece of software constrains it to run well within just a particular range of scale. It’s essentially a mechanical problem dictated by the physics of our universe.
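To make that range concrete, here is a minimal sketch (the function names and numbers are mine, purely for illustration): the same duplicate-checking design is perfectly fine at a few hundred records and hopeless at a few million, and redesigning it doesn’t remove the limit, it only shifts the workable range.

```python
# A quadratic duplicate check: a perfectly reasonable design for a few
# hundred records.
def has_duplicates_quadratic(records):
    for i, a in enumerate(records):
        for b in records[i + 1:]:
            if a == b:
                return True
    return False

# At a million records, the loop above does roughly n*(n-1)/2, or about
# 5 * 10^11, comparisons. The design itself caps the workable range.
# A hashed design shifts the range upward, but the new range is still
# finite: it now has to fit the whole set into one machine's memory.
def has_duplicates_hashed(records):
    seen = set()
    for r in records:
        if r in seen:
            return True
        seen.add(r)
    return False
```

No amount of splitting that loop into services changes the arithmetic; the range comes from the design, not from how the pieces are deployed.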
Still, a movement spawned off that insisted that with microservices, you could achieve infinite scaling. And it was popular with programmers because they could build tiny things and throw them into this giant pot without having to coordinate their work with others. Suddenly, microservices are everywhere, and if you weren't doing them, you were doing it wrong. The fragmentation party is in full swing.
There was an old argument on the operating system side between monolithic kernels and microkernels. Strangely, most of the industry went with one big messy thing, but ironically, the difference was about encapsulation, not fragmentation. So what we ended up with was one big puddle of grossly fragmented modules, libraries, and binaries that we called a monolith, since that was the thing on top, instead of a more abstracted and encapsulated architecture that imposed tighter organizational constraints on the pieces below.
So it was weird that we abused the terminology to hide fragmentation, then countered a bit later with a fully fragmented ‘micro’ services approach with the opposite name. Software really is an inherently crazy industry if you watch it long enough.
These days, there seems to be a microservices backlash, which isn’t surprising given that it is possibly the worst thing you can do if you are intentionally building a medium-sized system. Most systems are medium-sized.
Whenever you try to simplify anything by throwing away any sort of organizing constraints, it does not end well. A ball of disorganized code, data, or configs is a dead man walking. Even if it sort of works today, it’s pretty much doomed long before it pays for itself. It is a waste of time, resources, and effort.
All in all, though, the issue is just about the pieces. If they are all together in one place, it is better. If they are together and wrapped up nicely with a bow, it is even better still.
If they are strewn everywhere, it is a mess, and what is always true about a mess is that if it keeps growing, it will eventually become so laborious to reverse its inherent badness that starting over again is a much better (though still bad) choice.
The right answer is to not make a mess in the first place, even if that is slower and involves coordinating your work with a lot of other people.
The best answer is still to get it all into reusable, composable pieces so that you can leverage it to solve larger and larger problems quickly and reliably. That has been and will always be the most efficient way forward. When we encapsulate, we contain the complexity. When we fragment, it acts as a complexity multiplier. Serious software isn’t about writing code; it is about controlling complexity. That has not changed in decades, even though people prefer to pretend that it has.
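As a rough sketch of that difference (illustrative names, not anyone’s real system): in the fragmented version the same rule is copied into every call site, so each copy is another thing that can drift and break; in the encapsulated version, one piece owns the rule and everything else composes with it.

```python
# Fragmented: the same discount rule is copied into every call site.
# Each copy is one more thing to find, test, and keep in sync.
def checkout_total(prices):
    total = sum(prices)
    if total > 100:        # copy #1 of the rule
        total *= 0.9
    return total

def quote_total(prices):
    total = sum(prices)
    if total >= 100:       # copy #2 has already drifted (>= vs >)
        total *= 0.9
    return total

# Encapsulated: one small piece owns the rule, and everything else
# composes with it. Changing the policy is a single edit.
def apply_discount(total, threshold=100.0, rate=0.10):
    return total * (1 - rate) if total > threshold else total

def checkout_total_v2(prices):
    return apply_discount(sum(prices))
```

The encapsulated version is slightly slower to write up front, which is exactly the trade-off made above: a little coordination now, instead of a knot of drifting copies later.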