We are awash in an endless sea of data. Collecting it is easy. Storing it is gradually getting cheaper.
All of this information we gather serves us absolutely no purpose if we are unable to structure it and then summarize its meaning. With so much data, if we can't abstract our representation into some geometrically simple form of presentation, then we can never understand what it is that we have collected.
Crunching that much information is the job of a massive system. The type of behemoth that probably doesn't exist yet, but that most of us can imagine.
You can't build systems of that caliber by yourself. You can't even build them with a single small team. We have the technologies, but not the methodologies, for this type of work. Large systems require large groups of developers to come together.
Directing a big group of people towards a successful end is not an easy task. When that task involves building something complicated with lots of dependencies, it becomes even harder. Software, although virtual and seemingly 'lightweight', is plagued by a surprising amount of complexity. Hidden complexity that is usually underestimated.
By yourself, it may be easy to conceive of a simple design for some software program and over a few months or even a few years belt out the implementation. But the problem changes when you need to include other people, and it really changes when you include them right from the beginning.
To build big, you need a big group of people; there is no choice.
To get a group of programmers together, and to get them all building the same system, you need some form of control. You are unlikely -- for instance -- to get twenty great programmers; some of them will be more skilled than others. You are unlikely to get twenty mind readers; most of them require communication to direct them towards their work. You can't just fire the ones you don't like, or the ones that don't measure up to your standards; you'll be left by yourself, and you just aren't fast enough to build big systems alone.
So you need some way of aligning their goals towards a common vision. You need an architecture, and you need it in a form that is easily sharable. Blueprints. You need some type of blueprints that are:
- Simple, yet abstract.
- Small enough to understand.
- Unambiguous, so it gets done correctly.
Taking input from all of the programmers is not a bad thing, and shouldn't be discouraged, but 'design by committee' rarely produces elegant results. Where consistency matters -- at the various 'levels' in the design -- you want the same designer's influence. Where the pieces are genuinely independent, you can have multiple designers.
The quality of the design depends on its underlying consistency, and that consistency degrades with the number of 'cooks' involved. Too many cooks spoil the broth.
That is why for many buildings -- at least where they actually care about appearance -- there is an architect. It is that one 'vision' that binds it all together, in a way that is coherent.
The core of a blueprint is just enough abstract information that at least one person can hold it all in their mind and visualize it. The size of the abstraction is crucial: if it is too large, no one can easily validate it.
A good architecture is an abstract design for the thing to be built, and its blueprints are the graphical (and textual) instantiation of that abstraction. They need just enough detail to go forward in an unambiguous manner. They needn't contain every little detail, but those details that are absent should be constrained in some way, for example by well-known standards.
Having blueprints that you can share with the team is only the beginning. A good architecture does more than just enumerate the coding elements, it can also be used for:
- Iterative Development Support.
- Bug Reporting Triage.
In the architecture, you draw both horizontal and vertical 'lines'. The system is decomposed into layers and components. When these pieces are truly encapsulated from each other, you get stability in the project. Choosing to upgrade to a more professional resource management layer, for example, can be done all by itself with no effect on the rest of the code. For each iteration, you can pick another layer or component and upgrade its depth and scope, growing the system piece by piece. If you know where you want to go in the future, this is a reliable way of getting there, one step at a time.
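As a minimal sketch of what one of these 'lines' looks like in code (the names here are hypothetical, not from any particular system): the callers depend only on an abstract interface, so the resource management layer beneath it can be upgraded in isolation.

```python
from abc import ABC, abstractmethod

# The architectural 'line': everything above it depends only on this
# interface, never on a concrete resource manager.
class ResourceManager(ABC):
    @abstractmethod
    def acquire(self, name: str) -> str: ...

# Iteration 1: a simple implementation, good enough to start with.
class SimpleResourceManager(ResourceManager):
    def acquire(self, name: str) -> str:
        return f"simple:{name}"

# Iteration 2: a more 'professional' layer that caches resources.
# It is swapped in all by itself; nothing above the line changes.
class PooledResourceManager(ResourceManager):
    def __init__(self) -> None:
        self._pool: dict[str, str] = {}

    def acquire(self, name: str) -> str:
        return self._pool.setdefault(name, f"pooled:{name}")

def report(rm: ResourceManager) -> str:
    # Application code, written against the abstraction only.
    return rm.acquire("config")

print(report(SimpleResourceManager()))  # simple:config
print(report(PooledResourceManager()))  # pooled:config
```

The point isn't the mechanism (interfaces, in this case), it's that the line is explicit and enforced, so each iteration can deepen one piece without disturbing the others.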
As well, with discrete, well-understood 'lines', it becomes easier to quickly pinpoint the location of a bug. This attribute cuts down on support, as one person can accurately triage incoming bugs towards the right resources without having to get the whole team involved. A significant time saving.
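That triage step can be almost mechanical when the lines are clean. A hypothetical sketch (the component and team names are invented for illustration):

```python
# Because each component sits behind a clean architectural line, a bug's
# symptoms map to a component, and the component maps to its owning
# sub-team -- one person can route it without convening the whole team.
OWNERS = {
    "storage": "persistence-team",
    "ui": "frontend-team",
    "reports": "analytics-team",
}

def triage(component: str) -> str:
    # Anything that doesn't map cleanly goes to the architect for a
    # closer look -- an unmappable bug often means a blurred line.
    return OWNERS.get(component, "architect")

print(triage("storage"))  # persistence-team
print(triage("billing"))  # architect
```

With a 'big ball of mud', no such mapping exists, and every bug becomes a group discussion.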
The architectural lines affect testing as well, but I'll leave that as the subject of a follow-up post. It is too complex to fit into this entry.
When you lay down the blueprints for an architecture, you set a structure over the project. This type of framing gives stability, but it also allows you to delegate responsibility for the pieces to various sub-teams. If you've gone with a 'big ball of mud' for an architecture, the sub-teams will inevitably overlap with each other. That causes confusion and strife.
While we can envision huge systems capable of processing massive amounts of data, coordinating our efforts as a team towards a single development goal has always been a cultural problem for programmers. We can see the possibilities, but by not working together, all we get is frustration. Unlocking that potential comes from finding a way to communicate a consistent, unambiguous design to a team of eager developers. We don't want to be mindless drones, but when that little extra freedom we crave threatens to derail the project, it also brings with it the horrible stress of possibly failing.
Blueprints that really work would alleviate many of the anxieties. If we get past this problem, then we can surf the sea of data and do some really interesting work, instead of reinventing the same simple applications over and over again.