When I was young, software development was not in the spotlight. We had quite a bit of time to get our work done. We would carefully craft things, focusing on the key issues.
It was the dawn of the World Wide Web, followed by the decadence of the Dot Com era, that changed all of that. Suddenly “first mover advantage” outweighed quality, correctness, and readability.
Modern coding is a high-speed game of chicken. It usually starts with a request to do some work in less than a third of the time you need to do a good job. If you balk at the lack of time, they’ll take their work elsewhere. So you might try to stretch the schedule a little, but then you agree.
When time is compressed, you inevitably end up taking a lot of shortcuts. Some programmers know enough to avoid many of them, but the industry tends to praise shortcuts anyway.
A shortcut is a tradeoff. You do something faster now, in the hopes that it will not blow up in your face later.
Some shortcuts never blow up; you get lucky.
Some are just incremental aggravations that, if they haven’t built up too deeply, will only slow you down a bit later. Just friction.
Some shortcuts, however, will implode or even explode, throwing the whole affair into the trash bin or flattening it forever. It’s been bad enough that I’ve actually seen code come out spectacularly fast, then spent half a decade slogging through near-hopeless bugs. The wrong series of really bad shortcuts can be devastating.
So every shortcut is a risk, but one that is hard to quantify, as there are usually aggravating factors that multiply the damage.
Given that you are inevitably pushed into taking some shortcuts, it’s best to take the least destructive ones. Those tend to be the ones higher up in the stack.
If you build code in a rational manner, you lay out the foundations first and then carefully stack a lot of reusable components on top of them. That is the minimum amount of work you need to do.
Bad low-level code propagates trouble upward; the stuff built on top has to counteract the awful behavior below. That tells us that the lower the shortcut, the riskier it is, the more code it affects, and the worse the consequences if the bet goes bad.
We see that all of the time.
Take, for example, those systems where the data was saved in some crazy fast way, and then far too much hacky code was piled on top to try to hide the mess. If the data had just been modeled cleanly, the tragically nested conditional nightmare above it, which ate huge amounts of time and spread a lot of pain, would never have been necessary. It is a very common example of a small set of shortcuts going rather horribly wrong.
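As a minimal sketch of that pattern, assuming a hypothetical “customer” record that got persisted in several ad hoc shapes over the years (all the field names here are invented for illustration), the difference looks something like this:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical records: the same "customer" persisted three different
# ways over the years, each shape the result of a storage shortcut.
raw_records = [
    {"name": "Ada Lovelace", "phone": "555-0100"},
    {"full_name": "Alan Turing", "contact": {"phone": "555-0101"}},
    {"first": "Grace", "last": "Hopper"},  # phone was never captured
]

# The shortcut's tax: without a clean model, every consumer of the data
# repeats some version of this nested-conditional untangling.
def phone_of(rec: dict) -> Optional[str]:
    if "phone" in rec:
        return rec["phone"]
    if "contact" in rec and "phone" in rec["contact"]:
        return rec["contact"]["phone"]
    return None

# The alternative: normalize once, at the persistence boundary, into one
# clean shape, so the code stacked on top never sees the mess.
@dataclass
class Customer:
    name: str
    phone: Optional[str]

def load_customer(rec: dict) -> Customer:
    name = (rec.get("name")
            or rec.get("full_name")
            or f"{rec.get('first', '')} {rec.get('last', '')}".strip())
    return Customer(name=name, phone=phone_of(rec))

customers = [load_customer(r) for r in raw_records]
```

Once the untangling is done in one place at the boundary, everything above it can rely on a single clean shape, which is exactly the work the original shortcut skipped.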
You see exceptionally bad persistence causing problems all over the place. It’s likely that at least half the code ever written is totally unnecessary.
What’s always true is that if you take too many risks and lose enough of them, the time saved by the shortcuts will be massively overwhelmed by the time lost dealing with them. Coming out of the gate far too fast will always cause a project to stumble and will often cause it to lose the race.
If you are forced to take risks, then it is worth learning how to evaluate them correctly. If you pick the right ones, you’ll lose a few but keep on going. It’s not how it should be, but it is pretty much how it is these days.