The strongest process for building software focuses on getting the job done efficiently while maximizing the quality of the output.
The two objectives are not independent, although it is easy to miss their connection. In its simplest terms, efficiency is doing the least amount of work needed to get the software into production. Quality, then, might seem free to range anywhere.
However, at one end of the spectrum, if the quality is really bad, we know it sucks in tonnes of effort just to patch things up enough to barely work. So, really awful quality obviously kills efficiency.
At the other end, though, quality most often seems to sit on an exponential scale. That is, getting the software 1% better requires something like 10x more work. Perfection is effectively asymptotic, so dumping massive amounts of effort into trivial increases in quality isn't particularly efficient either.
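To make that concrete (this is just an illustrative model of the claim, not a measured law): if each additional point of quality $q$ multiplies the required effort by roughly ten, the cost curve looks like

$$E(q) \approx E(q_0) \cdot 10^{\,q - q_0},$$

so each successive point of quality costs ten times as much as the one before it, and the total effort explodes long before $q$ ever reaches 100%.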
Are they separated in the middle?
That too is unlikely, because quality seems to be an indirect byproduct of organization. If the team is really well-organized, their work will probably be of rather consistent quality. If they are disorganized, then at times the quality will be poor, which drags down the quality of the whole. Disorganization seems to breed inefficiency.
From practical experience, mostly I’ve seen that rampant inefficiency results in consistently poor quality. They go hand in hand. A development team not doing the right things at the right time will not accidentally produce the right product. And they certainly won’t do it efficiently.
How do you increase both?
One of the key problems with our industry is the assumption that ‘code’ is the only vital ingredient in software; that if there were only more ‘code’, it would fix everything. Code, however, is just a manifestation of the collected understanding of the problem to solve. It’s the final product, but certainly not the only place where quality and efficiency are noticeable.
Software development has five distinct stages: it goes from analysis to design, then into coding and testing, and finally the software is deployed. All five stages must function correctly for the effort to be efficient. Thus, even if the coding was fabulous, the software might still be crap.
Little or hasty analysis usually results in unexpected surprises. These not only eat into efficiency but also force short-cuts that drag down quality. If you don’t know what you are supposed to build or what data it will hold, then you’ll end up flailing around in circles, hoping to get it right.
Designing a system is mostly organization. A good design lays out strong lines between the pieces and ensures that each line of code is located properly. That helps in both construction and testing. It’s easier to test well-written code, and it’s inordinately cheaper to test code when you know the precise impact of any change.
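As a tiny, hypothetical sketch of what those ‘strong lines’ buy you (the names and the pricing rule here are invented purely for illustration), keeping a business rule behind its own boundary makes the impact of a change obvious and the test trivial:

```python
# Hypothetical example: the pricing rule lives behind one clear line,
# so a change to the tax logic can only land in this function.

def total_with_tax(subtotal: float, tax_rate: float) -> float:
    """Pure business rule: no I/O, trivially testable in isolation."""
    if subtotal < 0 or tax_rate < 0:
        raise ValueError("subtotal and tax_rate must be non-negative")
    return round(subtotal * (1.0 + tax_rate), 2)

def print_receipt(subtotal: float, tax_rate: float) -> None:
    """Thin presentation layer: delegates the calculation across the line."""
    print(f"Total due: {total_with_tax(subtotal, tax_rate):.2f}")

# The test pins the rule down without dragging in the presentation code.
assert total_with_tax(100.0, 0.05) == 105.00
```

The point isn’t this particular code; it’s that when the lines are drawn well, any edit has one obvious home, and the tests that cover it don’t have to pull in the rest of the system.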
On top of that, you need to rely on the right technologies to satisfy any engineering constraints. If you pick the wrong technologies, then no amount of effort will ever meet the specifications. That’s a key aspect of design.
If the operational requirements never make it into the analysis and design, the cost of supporting the system will be extraordinary, which will eventually eat away at the resources available for the other stages. A badly performing system rapidly drags down its own development.
Realistically, a focus on efficiency, across all of the stages, is the best way to ensure maximum quality. There are other factors, of course, but unless the software development process is smoothly humming along, they aren’t going to make a significant impact. You have to get the base software production flow correct first, then focus on refining it to produce high-quality output. It’s not just about coding. Generating more bad code is never going to help.