Saturday, April 19, 2014

What and How

The real work in programming comes from building up massive sets of instructions for the computer to follow blindly. To accomplish this, modern programmers rely on a huge number of ready-made instruction sets, including the operating system, frameworks, libraries and other large-scale infrastructure like databases and web servers. Each of these technologies comes with an endless number of APIs that provide all of the functional primitives required for interaction.

A modern programmer thus has to learn all sorts of different calling paradigms, protocols, DSLs, formats and other mechanisms in order to get their machines to respond as desired. Fortunately the world wide web provides a seemingly endless collection of examples, reference documents and discussions on utilizing all of this underlying functionality. All a programmer need do is read up on each different collection of functionality and then call it properly. But is that really enough?

Underneath all of these pieces are literally billions of lines of code, written over a 30-40 year period. We've piled it so high that it is easy to lose touch with the basic concepts. Programmers can follow examples and call the appropriate functions, but what happens when they don't work as expected? What happens during the corner cases or disruptions? And what happens when the base level of performance on offer isn't even close to enough?

In order to really get a system to be stable and perform optimally, it is not enough to just know 'what' to do with all of these calls and interfaces. You have to really understand what is going on beneath the covers. You can't write code that handles large amounts of memory if you don't understand virtual memory and paging in the operating system. You can't write caching if you don't understand the access frequency of the cached data and how to victimize old entries. You can't push the limits of an RDBMS if you don't get relations or query optimization. Nor can you do error handling properly if you don't understand transaction semantics.
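
To make the caching point concrete, here is a minimal sketch in Python of a cache that victimizes its least-recently-used entry. The class name and capacity parameter are just for illustration, and LRU is only one possible policy; whether it is the right one depends on exactly the kind of 'how' knowledge described above, namely how often the underlying data is actually accessed.

    from collections import OrderedDict

    class LRUCache:
        """A tiny cache that victimizes the least-recently-used
        entry once it grows past its capacity."""

        def __init__(self, capacity):
            self.capacity = capacity
            self._entries = OrderedDict()  # insertion order tracks recency

        def get(self, key):
            if key not in self._entries:
                return None
            self._entries.move_to_end(key)  # touching an entry renews it
            return self._entries[key]

        def put(self, key, value):
            if key in self._entries:
                self._entries.move_to_end(key)
            self._entries[key] = value
            if len(self._entries) > self.capacity:
                self._entries.popitem(last=False)  # victimize the oldest

With a capacity of two, for example, putting 'a' and 'b', reading 'a', and then putting 'c' victimizes 'b', because 'a' was touched more recently. That behaviour only helps when recently used entries really are the ones most likely to be reused; for scan-heavy access patterns a different victimization policy would be needed, and knowing that distinction is precisely the understanding that examples alone won't give you.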

Examples on the Web aren't going to help because they are generally lightweight. Nothing but a full sense of 'how' these things actually work will allow a programmer to correctly utilize these technologies. They can sling them around for small, simple programs, but the big serious jobs require considerable knowledge. 

It's much harder for younger programmers to get a real sense of 'how' things work these days. The technologies have become so complex and so obfuscated that unless you grew up watching them evolve, you can't grok their inner madness. The myths we have about younger programmers being better value for the cost are really predicated on an era of simpler technology. We have surpassed that point now in most of our domains. Slinging together code by essentially concatenating online examples may appear quick, but software is only really 'engineered' when it behaves precisely as the builder expects, not just on the good days, but also on the very bad ones. Understanding 'how' all this functionality works underneath is unavoidable if you need to build something that is resilient in the real world.

This complexity shift quietly underlies the problems that we are having in trying to effectively leverage our technologies. Newbie programmers might be able to build quickly, but if the secondary costs of maintaining the system are stratospheric, the value of the system could turn negative. In our rush to computerize, few paid attention to this; we were too busy heralding the possible benefits of building the next wave of systems, but it seems as if our tunnel vision is quickly catching up with us.

If computers are really doing what they can, life should be both easier and more precise. That is, our tools should take care of the simple and stupid problems, allowing us to focus on higher-level concerns. But as anybody with a lot of experience using modern software knows, instead we spend our days bogged down in an endless battle against trivial problems, leaping from silo to silo. Some simple problems have vanished, like copying bits from one machine to the next, but they've been replaced by far more silliness and a plague of partially usable tools.

If we are going to make good on the potential of computers, more people need to really understand 'how' they work under the covers. And as they understand this, they'll be able to craft better pieces that really will simplify our lives, rather than complicate them. Just being able to sling together a larger collection of instructions for a machine does not mean that those instructions are good, or even usable. The great mastery in programming is not in being able to code, but rather in knowing what code is helpful and what code causes harm.

Saturday, April 12, 2014

Collective Protection

Over the last few decades of my career, at least half of my working experiences have been really good. They've been great opportunities to solve complex problems, learn new domains and build things that make people happy. I've been proud of the things I've built, even though none of them are famous or made me any significant money. My favorite jobs have all been wrapped in a strong engineering culture; one that strives to always do the right thing, pay attention to the details, avoid politics and focus on getting the best quality possible.

The other half of my jobs, however, have not been great experiences. Most of them were dominated by horrible people or bad cultures. In those circumstances, you try to do what you can to improve things, but generally the deck is stacked against you. The only real option is to vote with your feet. To walk away. And I've had to do that many times, although sometimes I wasn't able to escape as early as I wished.

In a really bad environment you get pummelled by what I call 'unqualified decisions'. That is, bad choices made by someone in a position of authority who doesn't have the background or knowledge to make a good choice. Back in the dotcom boom, when software development became popular, it attracted a great many outsiders. It still does. These are people with little or no direct experience in serious software development. They know enough to be dangerous and they are usually over-confident enough to be convincing, at least to upper management. Once someone like that takes root in a high enough position, nothing a software developer says will prevent trouble. Since we've never really gelled into a recognized profession, a software developer has less credibility, as seen from above, than a non-technical egotistical jerk. And without credibility, any attempt to escalate problems, raise concerns or prevent outright disasters will fail miserably. There is just nothing you can do if your stakeholders don't take you seriously or don't want to listen.

What has often occurred in this type of scene is that an increasing number of bad decisions flow downhill, causing bigger problems, which usually spurs on more bad decisions. The troublemakers get aggressive if the engineers point to the real underlying causes, so the engineers just end up passive and do whatever they are told until they get their opportunity to leave. Often it can be really sad, as it amounts to the slow and steady death of some previous work. The antagonists don't even catch the blame for their mistakes, because they attribute the faults to departing staff. They quite often get away with it, and sometimes even have long-running careers hopping from one organization to the next, replaying the same sad scenario. Life is tough.

Besides walking away, I've often wondered if there was some other solution to this type of problem. Obviously, if software developers were respected as a profession, their advice and opinions would carry more weight during personality or directional conflicts. But we're an immature industry and we can't even agree on basic standards or titles, let alone some larger knowledge base that's respected as professional. Our industry track record is quite horrible right now. Security breaches, information theft and the rapid erosion of everyone's privacy are all tragedies that are laid at our feet. At its worst, software has killed people, brought down companies and wreaked havoc in the world markets. Most people's home computers are plagued with an endless series of nasty viruses, and the one time that we all pulled together to avert a technological calamity, Y2K, it turned out that the problem was grossly overrated. No wonder people don't trust us. Don't believe that we are professionals. And even think that the whole mess is just some random performance art where we hack away madly while hoping that it will somehow magically work well enough in the end.

As a 'profession' we are not taken seriously. There are pockets of goodness, but even in many software companies we are still the downtrodden. Crazy coders that whack out instructions that most of our handlers believe they could do trivially, if only they wanted to spend a little more time on it. That's why we lose so often in personality clashes with these people, and that's why so much of what we build these days is reckless crap thrown together in a fraction of the time that would have been needed to do it properly. We even have popular movements that twist our own values to allow for increased micro-management within ever-shortening spasms of coding panic. Ideas that might sound fun, but that any other sane profession would immediately see as demeaning and fight against. We, on the other hand, just keep getting sucked in again and again.

Maybe some day we'll mature into a real profession, but we've actually moved farther away from that during my career. The rapid growth of the dotcom age diluted our respect and the inevitable implosion at the end sealed the deal. Our latest security and privacy debacles aren't helping either. 

Our last-ditch idea, the Hail Mary pass of any oppressed workforce, is the one that 25 years ago I could never imagine myself suggesting. I'm still not sure I like it, but I don't see any viable alternatives. At the turn of the last century, profits and greed were out of control. Blue collar workers were at the mercy of their employers and the world for many was an unhappy place. Robert Reich, in his film Inequality for All, pointed to the decline of unions as one of the causes of a shrinking middle class in North America. Not explicitly mentioned, but I think worth noting, has been the shift from blue collar jobs to white collar ones. In the early days of this shift, though, an office job was still seen as higher status; it didn't seem to intrinsically need the same level of protection owed to those poor souls being exploited in the factories.

Much has changed, and computers have helped to thin the ranks of office workers, replacing them with vast armies of IT workers, on both the development and operations sides. If developers sometimes get it bad from crazy non-techies, operations people have it ten times worse. Together we're constructing vast piles of hopelessly tangled complexity, most of which could have been avoided if things weren't so disorganized and rushed.

Learning from the past, pretty much the easiest thing we can do to change our destinies is to unionize. Not for pay or seniority, but rather to protect ourselves from all of the bad people out there who want to compromise the work down to an unusable quality. If we did go this route, we could create a new IT union that espoused our values and helped us enforce a code of good conduct for the things that we are creating. If we're being pushed to do bad things, we could appeal to the union to intervene on our behalf. A neutral third party to arbitrate disputes. We'd have some way of preventing technical debt from swamping our projects. At minimum we'd have access to a broader perspective on what is really current practice, so we could tell right from wrong. Good practice from foolishness. We'd have some way of really standardizing things that wasn't controlled by specific vendors.

The dark side would be ensuring that control doesn't fall into the hands of the wrong people, either vendors or others with self-oriented agendas. We could mitigate that by voting for leadership and setting term limits to prevent people from settling in. Big companies would hate the idea and actively work against it, but if we united ourselves in order to stabilize the profession and build better stuff, even the public would find it hard to be against the idea right now. No doubt the fiercest opposition would come from within the ranks of programmers themselves. We're an uncontrollable herd of cats, setting off in every direction possible, so any attempt that is perceived to change that status quo is usually fought heavily. But the idea here is not to rein in coding practices; rather it is to clearly establish reasonable practices and then give the programmers some leverage with their stakeholders to follow those directions if desired. And to lend some type of real authority to inter-programmer disagreements. If one coder has gone way off the map, it would be useful for a project to be able to establish this with some certainty before it is too late.

Overall I'm still not certain that we don't have other alternatives, but I'm not really aware of any at the moment (comments are welcome). As our societies rely more heavily on computers, the pressure to write crappier stuff has increased. This is a dangerous trend that I know we have to reverse if we really want to leverage our technologies. If I felt our industry was maturing, this type of remedy wouldn't be necessary, but there is little trustworthy evidence that we've improved and considerably more evidence that we've actually degenerated in the face of our mounting complexity. Right now I don't trust most software, and I generally find myself dancing around an endless series of annoying bugs just to get simple things done. I've lost my belief that computers are actually helping us. We need to fix this before it becomes catastrophic.