While there is a huge range of opinions, I think most software architects would agree that their position is primarily about defining broad strokes for the development of computer systems. Laying down a master plan, or an overview of some type. The development is then carried out by one or more teams of programmers.
The fastest way to pound out a big software system would be to lay out the instructions in the largest possible sets. Reusing highly repetitive sections helps with speed, but beyond that, adding structure is actually more work. Coding a big system into thousands of small functions is a significant effort, regardless of what's actually being built.
However, we break up the code into smaller pieces because it makes it easier to extend the system later. Most software development has shifted away from the idea that it is a huge one-shot deal, and accepted the fact that software is a continuous series of iterations over the lifetime of the code. In this case, the big problem is not writing the first version, it is all of the work happening over years and years to move that version forward into various different incarnations of itself.
VERTICAL AND HORIZONTAL LINES
There are two big things that affect the development of software: the technology and the problem domain (business logic). Philosophically, the two lay themselves out perpendicular to one another.
Domain logic problems are vertical. The user needs some functionality which cuts through the system, and ultimately results in some changes to some underlying data. It's a thin line from the user to the data, and then back to the user again. The system implements an update feature, for example, or a report or some other specific set of functions constrained by a specific set of data that the user (or system) triggers.
Technological problems are horizontal. The same problems repeat themselves over and over again across the whole system, no matter which functionality is being used. All Web applications, for example, have similar problems. All distributed systems need the same type of care and feeding. Relational databases work in a similar manner. The problems are all unrelated to the functionality (or at least unattached to it). For example, all systems that use transactions in relational databases have the same basic consistency problems.
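To make the two directions concrete, here is a rough sketch. The names, the schema and the transaction wrapper are purely illustrative, nothing here comes from a real system: the feature function is a vertical slice from the user down to the data and back, while the wrapper is a horizontal concern that every feature needs in exactly the same way.

```python
import sqlite3
from contextlib import contextmanager

@contextmanager
def with_transaction(conn):
    """Horizontal: the same consistency handling, repeated for every feature."""
    try:
        yield conn
        conn.commit()
    except Exception:
        conn.rollback()
        raise

def update_customer_address(conn, customer_id, new_address):
    """Vertical: one thin line of domain logic from the user to the data."""
    with with_transaction(conn):
        conn.execute(
            "UPDATE customer SET address = ? WHERE id = ?",
            (new_address, customer_id),
        )

# The vertical slice changes whenever the business does; the horizontal
# wrapper changes only when the technology underneath it does.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, address TEXT)")
conn.execute("INSERT INTO customer (id, address) VALUES (1, 'old street')")
update_customer_address(conn, 1, "new street")
```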
CONSISTENCY AND ONIONS
Easily, the greatest problem that most big systems have is consistency. It's not uncommon to see large software products where each sub-section in the system bears little resemblance to the others around it.
Aside from just looking messy, it makes for a poor user experience because it is harder to guess or anticipate how to utilize the system. If everything is different, then you have to learn it all in detail, rather than just being able to grok the basics and navigate around the system easily.
In small systems, if there is only one programmer, the functionality, interface, data, etc. are often consistent as a consequence. It's one of those early skills that good programmers learn. Picking a dozen different ways to do things simply makes it harder to maintain the code, and users hate it. A messy system is an unpopular one. A pretty, consistent one, even if it has bugs, is always appreciated.
Ideally, if you were deploying programming resources, the best approach would be to assign individual programmers to each section that needs to be consistent. An easy way to do this is by arranging them in an onion-like structure. A series of containing layers, each one fully enclosing the others.
In this approach you would have a database programmer create one big consistent schema, and all of the database triggers and logic. Another programmer would be responsible for getting that model out of its persistent state and into a suitable form usable by the application, and perhaps augmenting that with some bigger calculations. The application/interface programmer would then find consistent ways to tie user elements to functionality. In this scenario, the inconsistencies between the styles of the different programmers would mostly go unnoticed by the users.
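A rough sketch of that onion, with made-up names, might look something like this. Each layer fully encloses the one inside it, and each could be owned by a single programmer for consistency:

```python
class CustomerStore:
    """Innermost layer: one consistent schema and persistence style."""
    def __init__(self):
        self._rows = {}  # stand-in for the real database

    def load(self, customer_id):
        return self._rows.get(customer_id, {"id": customer_id, "balance": 0.0})

    def save(self, row):
        self._rows[row["id"]] = row


class CustomerService:
    """Middle layer: pulls the model out of persistence, adds the bigger calculations."""
    def __init__(self, store):
        self._store = store

    def apply_interest(self, customer_id, rate):
        row = self._store.load(customer_id)
        row["balance"] = round(row["balance"] * (1 + rate), 2)
        self._store.save(row)
        return row


class CustomerScreen:
    """Outer layer: ties user elements to functionality in one consistent way."""
    def __init__(self, service):
        self._service = service

    def show_interest_update(self, customer_id):
        row = self._service.apply_interest(customer_id, 0.05)
        print(f"Customer {row['id']} now has a balance of {row['balance']}")


# Each layer talks only to the one directly inside it, so any style
# differences between the three programmers stay hidden from the user.
CustomerScreen(CustomerService(CustomerStore())).show_interest_update(42)
```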
It's probably because of the consistency that most of the really big systems out there actually started as smaller projects. Growing something into a bigger size, while mostly maintaining the consistency, is far easier than trying to enforce it initially. Big projects tend to fail.
STRUCTURE AND TEAMS
Deploying big teams for large software projects is a huge problem.
The domain logic runs through the system vertically. If you're building a large system, and you partition the work out to various groups based on which parts of the system they'll develop, it is a vertical partition. Of course, we know that arrangement will generally result in massive inconsistencies along the functionality lines. It practically guarantees inconsistencies.
If you orient the developers horizontally, there are two main problems. Generally, technology is piled high for most systems, so each horizontal group is dependent on the one below, and affects the one above. The work cannot all start at the same time; it has to be staggered, or excess useless work gets done. In general, you tend to have teams standing around, waiting for one another, or building the wrong things.
The other big problem is that the developers are experts on, and focus on the technology, so they tend not to see the obvious problems with the domain data. That can lead to significant effort going off into useless directions that have to be scrapped. Developers that don't understand the domain always write the most convenient code from their perspective. That is, unless they really have a great grip on real domain problems -- which makes for some brutally complex coding issues -- they will choose to interpret the problem in the fashion that is the most convenient for making the code simple, not the functionality. This leads to functional, but highly convoluted systems.
The choice for dividing up the teams, along either set of lines, is a choice that heavily affects the architecture of the system. One is tied to the other.
MANAGEMENT, TECHNOLOGY AND BUSINESS
An architect's main job is to lay down a set of broad strokes for a group of developers or an enterprise. Horizontally or vertically, these strokes must match existing technological or domain logic lines. That is, the architects don't really create the lines, they just choose to highlight and follow a specific subset of them. If they ignore this and choose to make up their own lines, the cross-over points are assured to contain significant errors, and they often become sinks for wasted effort.
Architects, then, need both a strong understanding of the underlying technologies and a strong understanding of the user domains. You cannot architect a system for a bank, for example, if you do not have a strong understanding of both finance and most of the technologies used in the system.
I've certainly seen millions of dollars get flushed away because the architects didn't understand the technology, or because they just didn't get the real underlying problems in the business domain. Guessing in either corner is a serious flaw.
Often, in practice, older architects will have great and vast business knowledge, but are weak on the more modern technologies. That's not uncommon, and it doesn't have to be fatal. Two architects can combine forces, one vertical, the other horizontal, if they are both wise enough to know not to step on each other's turf. Neither issue really takes precedence; they both are critical.
The biggest thing to understand is that the arrangement, and partition of the work into various teams, is fundamentally an architectural issue. Management of the teams, and how they interact, which is reflected in the code, is also an architectural issue. So, an architect inheriting four or five cranky, unrelated development teams, needs to be very wary of how they are deployed.
APPROACHING CONSISTENCY
The simplest way to build a big system is to start small and grow it. Careful extensions, backed by lots of refactoring and cleanup, can keep the code neat along both the horizontal and vertical lines, but it is a lot of work to keep up with it.
In time, when it needs more effort, or if the original project is time-dependent, organizing big development teams will directly affect the code-base. In a very large system, it is just not possible or practical to have one and only one programmer spanning all of the things that are needed for consistency. But consistency is still critical to success.
Like water or electricity, people always flow towards the easiest path. It is just that different people have widely different ideas about what easy really means.
In general, for most of humanity, it is easier to be inconsistent than not. That's why, for example, building construction has a large number of different standards that have to be followed. But just standardizing something itself is not enough.
Not reading or following a standard is far easier than doing it right. So, there needs to be some extra level of incentive given to push people into abiding by the rules. The only way that really works is by separating the concerns, and giving space for someone to be an expert in the consistency of a very small domain. That is, if you want people to follow some consistent subset of rules, you need to appoint an inspector (with the power of enforcement) to look after that consistency.
This, then, brings the issue back into the domain of a single person, although their actual depth in the work is very light.
So, what I am getting at is that in a very large project, one with a couple of architects and several teams of developers, to enforce consistency you also need teams of inspectors. Each individual is responsible for a small subset of rules, but one in which they can apply their efforts consistently. Where the architects lay down the broad strokes, the inspectors examine various pieces at appropriate times in the development, and point out infringements. The software is ready when it has reached a suitably low infringement state.
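Part of an inspector's narrow rule set can even be mechanized. The sketch below is purely hypothetical -- the rule (only modules under a db/ directory may touch the database directly) and the names are invented -- but it shows the shape of the job: one small rule, applied consistently across the whole code base, with infringements reported rather than quietly tolerated.

```python
import os
import re

# Hypothetical rule: only code under a "db" directory may import the
# database driver directly. Everything else must go through that layer.
FORBIDDEN = re.compile(r"\bimport\s+sqlite3\b")

def find_infringements(root):
    """Walk the source tree and report every file that breaks the rule."""
    infringements = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".py"):
                continue
            path = os.path.join(dirpath, name)
            if os.sep + "db" + os.sep in path:
                continue  # the database layer itself is allowed
            with open(path, encoding="utf-8") as handle:
                if FORBIDDEN.search(handle.read()):
                    infringements.append(path)
    return infringements

if __name__ == "__main__":
    problems = find_infringements(".")
    for path in problems:
        print(f"infringement: {path} accesses the database directly")
    # "Ready" in the sense above: a suitably low infringement count.
    raise SystemExit(1 if problems else 0)
```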
If you just have an architect laying out a grand scheme, it's unlikely to get followed closely enough to be able to justify the work. Normally, no matter how it starts, over time the meta-team structure disintegrates, and the various groups shift their focus away from commonality and towards just doing what needs to be done. From each of their individual perspectives, it seems like a reasonable approach, but for the project as a whole, it is a fatal step towards a death march. The breakdown in development structure, mirrored by a similar one in the architecture, opens up gaping holes that become endless time sinks.
Besides consistency, the inspectors also become the force that tries to stop each team from solving their own little time sinks by throwing them at the other teams. If the lines are clear, then there shouldn't be any political issues about where the problems lie. If the lines are blurry, as things get worse, everything becomes political.
AND FINALLY
There are a lot of software architects out there that grew up in the ranks of programmers. From this view, they want architecture to focus primarily on either technology or domain issues, depending on their own personal background. The big problem with that desire is that as just a meta-programmer (a coder at a higher level), without taking into account the environment, a huge chunk of the possibility of success or failure rests on someone else's shoulders. If you leave all of the personnel arrangements, politics and other development dynamics to an unrelated set of managers, you are unable to control their huge effect on the overall work.
The environment we build things in affects how we build them. It can also be the biggest factor between success and failure. If these things represent big issues that can sway the results of an architecture, then the architect needs to be in control of them. If you can't break up and restructure the development teams as needed, then you're not really in control of a major aspect of the work, are you?
Although the architect's level is higher and more generalized, they do need the possibility of exerting total control over any and all aspects that could derail their efforts. That includes management and politics.
Software is a static list of instructions, which we are constantly changing.
Friday, May 22, 2009
Driven by Work
Recently I returned from spending several weeks backpacking in China. I stomped on the Great Wall, gawked at the Terracotta warriors and even managed to plunge lightly into some of the mysteries of Tibet. All great locations in a land filled with a long and complex history.
Traveling always renews my excitement for life and reawakens my sense of curiosity. I find that if I'm stuck too long without some type of grand adventure, I tend towards pessimism. I think getting caught in a rut just clouds my mood.
Every so often I need to break free of the constraints of modern living and lead a more nomadic existence. I need to get away from the routine, and react more dynamically to the world around me. I just need to break free of all those bits of life that just keep piling up around us. I need to get back into the real world, not just my little corner of it.
LANGUAGE AND OCCUPATION
There are a lot of theories, such as the Sapir-Whorf hypothesis, that suggest that languages influence how we think and see the world around us. For many modern languages, it may seem to some to be a stretch, but I seem to remember a reference to an ancient Chinese "water" language with only 400 written characters (although I couldn't find an Internet reference, so you'll have to trust my memory). With such a limited vocabulary it would certainly be difficult, if not impossible, to craft an effective rant on most subjects. The words just aren't available. If you don't have the words, you can't express things easily. Even if you could create long, complex sentences, the lack of brevity starts to become an impediment to the ideas.
That's probably why so many professions end up with their own huge technical dictionaries. Short precise terms that have large significant meaning. Words that encapsulate complex ideas. We always need to be able to communicate larger ideas with less bandwidth.
More obvious than language, one's occupation surely has an effect on how we perceive the world around us. You can't spend 40 to 60 hours a week engaged in something, year after year and not expect it to affect you in some way.
Lawyers probably argue more than most, doctors seem preoccupied with their health (although I know several that smoke and drink) and accountants tend to pinch pennies. If you keep looking at the same problems in the same ways, it's hard to prevent that from spilling over into other aspects of your life.
I'm sure everyone will have counter examples where they know of someone that breaks the mold -- there are always exceptions -- but I think it's still pretty likely that the way we live our lives directly affects how we see the world around us.
It certainly shows up heavily with programmers and other techies. You see it clearly in their ideas and interactions. The web is plagued with examples.
THREE CHANGES
Over the years I've noticed several changes in the way I see things. Some changes are purely a result of age or education, but there are definitely some that have come directly from how I am spending huge chunks of my time. Influences from a lifetime of pounding out software.
Three changes in particular appear over and over again. I've become more detail-oriented, I tend to view the world more often in a black and white perspective, and I'm often more disappointed when things are not easily deterministic.
It makes sense. Each of these attributes helps significantly in developing software.
The key to getting any big project designed, coded and into production is to make sure all of the details are strictly and carefully attended to. No matter how you look at the big picture, the details always either act as time sinks or have to be carefully managed.
Also, as a software developer I always have to tightly constrain the world around me into a strict, yet limited black and white perspective. Dynamic shades of grey make for horribly inconsistent software.
And I've learned to stick to the things that I know are strictly deterministic in nature. If it is going to work, it needs to work every time, consistently. A program that works sometimes is all but useless.
Spending my days and often nights in pursuit of big functioning systems gradually takes a toll on the way I see the world. On the way I want to interact with it. For each new domain I enter, and each new product I create, I have to break everything down into relatively simple, straight-forward chunks.
While these changes have helped me build bigger and better things in my career, I've often mistakenly applied the same back to the world around me, to a negative effect.
Software straddles the fence between the purity of mathematics and the messiness of the real world. The real world doesn't work the same way that a computable one does. You can't debug a personal relationship, for example, and some things such as weather are just too chaotic in nature to be predictable.
While often we can understand the underlying rules, that doesn't mean we understand the results. You can see how the stock market works for example, but still be unable to profit from it. Those attributes that help me create programs also set me at odds with the world around me. The harder I work at it, the more they seem to influence my perspective.
I think that's why traveling restores my sense of balance, and reminds me that not everything should be simple, consistent, rational or even clear. It's often a breath of fresh air, in an otherwise stuffy environment. A subtle reminder to not get too caught up in myself.
I do notice that besides myself, often others in the community are highly afflicted by a computationally-constrained perspective as well. One that probably makes their lives more difficult. There are a huge number of examples, but the web itself becomes a great historian for studying people who have become far too rigid.
THE DEVIL'S IN THE DETAILS
While the small details are very important in getting large complex projects to work, being too attentive to them can lead to missing the big picture. Given the choice, getting the broad strokes wrong is far worse than missing any of the details. In the latter case, the distance to fix the problem is often far shorter.
Focusing too hard on too many small things leads to tunnel vision. A perspective of a world where the importance of nearly trivial details is often blown out of proportion.
It's not uncommon, for example, to see techies highly irritated by spelling mistakes and typos.
Language is the container for ideas, and while good spelling and grammar make reading easier, it is rare that they introduce real ambiguities into the text. As such, they are really minor annoyances, yet some people seem so put off by even the simplest of them.
A well-edited piece, quite truly, is easier to read, yet often it is symptomatic of the work being less raw and more contrived. A polished effort. You'd expect good writing in a magazine for example, but you always know that the polishing 'twists' the content. You are not getting the real thoughts of the author, but instead you're getting their packaged version.
Ultimately, what's important is the substance of what people say, not how they say it. Missing that, leads to dismissing possibly correct ideas for the wrong reasons. The truth is true no matter how badly it is packaged, while falsehoods tend towards better packaging to make them more appetizing.
BLACK AND WHITE
You can't just toss anything into a computer and expect it to work. Computers help users compile data, which can be used to automate work, but ultimately everything in a computer is a shadow of the real world. A symbolic projection into a mathematical space.
We can strive for a lot of complexity, but we can never match the real world in depth, nor would we want to. To make our systems work, we have to reduce a world full of ambiguity and grey into a simplified black and white version. If it's not easily explainable, then it's hardly usable.
This can be a huge problem.
Too often you see programmers or developers trying to pound the real world into overtly simplistic boxes of good and bad. Right or wrong. Left or right.
That type of poor deconstruction, driven by a Star Wars mentality, leads to many of the stupid flame wars that are so popular, where two sides pound on each other assuming the other is wrong.
You know: my programming language is good, yours is bad. My OS is better than yours. My hardware is right, yours is inferior. That type of nonsense.
Clearly, all technologies have their good and bad points, but that quickly becomes ignored in a black and white world. Everything gets reduced into two overly simplified categories, whether or not it makes any sense. A stiff, rigid viewpoint fits well inside of a machine, but isn't even close to working with the world outside.
If you're oriented, for example, to see all of your fellow employees as only either good or bad, then because of that limited perspective you are missing out on a broad (and ultimately more successful) view of the world around you. People have such a wide range of strengths and weaknesses, that assigning them to one list or the other misses out on their potential. You'll end up relying on weak people at the wrong times, while passing up some well-suited resources.
Black and white works well for Hollywood or comic book plots, but it forces us to miss the depth of the world around us.
A LACK OF DETERMINISM
A programmer lays out a pre-determined series of steps for a computer to follow. Even with today's overly complex technologies, the computer still precisely follows these instructions. It does exactly what it is told to do. It does so in a deterministic and predictable manner. It is logical and rational in its behavior.
We get used to this behavior from the machines, so it's not uncommon that programmers start to expect this from other parts of the real world as well.
This becomes most obvious when you see technical people discussing business.
Management consultants lay out huge numbers of theories for business to follow, but hopefully they don't really believe that it works out that simply. Business, like weather, has some simple driving factors underneath, but at the practical level it is chaotic enough that it should be considered irrational.
If there existed some simple rules on how to succeed in business, eventually enough smart people would figure them out, to a significant enough degree that their exploiting the behavior would change the system. That is, if too many people know how to exceed the average, then they become the new average. If everybody wins, then nobody wins.
The markets are intrinsically irrational (from a distant perspective), yet that doesn't stop techies from applying bad logic to predict or explain their behavior. It's epidemic: examples abound of people explaining why a superior technology will dominate the market, how being nice to the customers will increase business, how careful listening will get the requirements, or any other assumption that presumes that the underlying behavior is deterministic, logical or predictable.
This works well for dealing with the internals of software systems, but for business and politics most programmers would do well to accept that they are just irrational and unpredictable.
SUMMARY
If you know that you're biased, it is far easier to compensate for it. However, if you're walking around in an imaginary world, you tend to find that it's an uphill fight in the real one. You keep bumping into invisible walls.
The best way around this is to try very hard to separate these two worlds into two distinct perspectives. We don't have to unify our working perspective with the real one, but we do have to be aware when one is corrupting the other. We can keep them separate, and have separate expectations for each. They don't need to be combined.
When I travel, it reminds me of the greyness, the people and the irrationality of the world around me. Although I can often break things down into simple bits for the system development work, I always need to be reminded that that is just a mere subset of the reality surrounding me.
Computers should be predictable devices that are easily explainable, at least at a high usability level. The real world, outside of the machines, on the other hand, is an infinitely deep messy collection of exceptions to each and every rule. If you think you've figured it out, then you haven't gone deep enough yet.
It brings nothing but pain and frustration to expect that the real world will work on the same principles as a computational one, but still it's very common to see people caught in this mental trap. It certainly is something worth avoiding. Even if you have to occasionally trek around on the other side of the planet to do it.