Sunday, August 17, 2008

Social Disorder Revisited

It tanked. I'm not sure why I'm finding this so funny; generally, if my posts don't do well at the various news sites, I, like most mildly narcissistic bloggers, feel somewhat depressed. This time, however, it's different.

The post in disgrace is about the analysis of the very thing at which it is now failing: news sites. This one even tanked on DZone, a place where I've often eked out at least a couple of friendly positive votes. The sites implicitly ignoring a post about themselves is quite possibly a telling statement about their essential qualities.

Or is it? Less dramatically, it might just be that the title sucked and my description was lame. Perhaps my timing was off? Who really knows? Can you even really know? That's the subject of this follow-up to my earlier post on social sites:

http://theprogrammersparadox.blogspot.com/2008/08/social-disorder.html

The sense that you can't really understand why this most recent post failed is, in all ways, the underlying essence of my point about analysis. You can have all of the facts readily at hand, yet still be surprised because there is some previously unknown variable that has come into play. I can look at the fact that the post didn't do well, but I don't actually know if the problem is packaging, content or timing. I can guess, but I am just guessing.

The inherent nature of people in organizations is often irrational and unpredictable. Facts can help you construct a model, but the conclusions you draw from those facts can vary widely.

A lot is written about these issues and their supposedly easy solutions, but if it isn't based on the underlying nature being non-deterministic, then it is unlikely to be usable; wishful thinking more than serious analysis. Predicting the future is hard; predicting chaotic systems is nearly impossible. People don't seem to realize whether they are basing things on facts or on predictions. Understanding that distinction is vital.

For this post I just wanted to fall back onto some seemingly disconnected points about analysis that I missed in my original writings. Small things, but relevant. I've tied most of these back to other people or their statements; it's just easier that way. Everything needs a structure, even if it seems arbitrary (which it never is).


STEVEY

In one of his recent posts, Steve Yegge talks about gathering requirements being too little, too late:

http://steve-yegge.blogspot.com/2008/08/business-requirements-are-bullshit.html

He's right, I think, but for the wrong reasons. Analysis and programming are nearly quantifiable pursuits; they are founded on transferable skills. Not that everyone can do them, but if a person can handle it, you can teach them to do it, even if it takes some time. It might be hard to explain, at times, but it is explainable. There is a method to the madness.

But even with all the analysis in the world, it still doesn't help you envision the right tool for the job. You may know what the users are doing, and what they need, but you'll never know the side-effects of switching to a new tool until they try it.

In that, there are millions of variables, but we ignore most of them, and focus on the fewer than twenty that our minds can cope with. The significance of the things we don't know is not possible to gauge, but hopefully it's not huge.

There is something more there that supersedes the raw facts of the matter. Understanding how sites work doesn't change the reaction to the content of a post. The system, and the individual behavior at some point are not the same thing. You may know the rules, but not understand when and why they are applied.

More to the point, like a lot of things in life, if it were easy to predict what would work, more people would do it, and that would disrupt the predictions. Some volatile systems consistently maintain their volatility as an intrinsic property, one that is constantly changing. Like many complex systems, such as stock markets, prediction is impossible because the existence of the prediction disrupts the system causing it to change.

Steve's point about the requirements is close to the mark; you are using the requirements to drive the analysis, which isn't going to find anything because you don't know where to look. It's a needle in a haystack search. In his description, people are actually looking for meta-requirements, not the base ones.

Analysis, one type of output of which is requirements, is how you go about validating your understanding. But you have to have that understanding first, in order to direct the analysis in some meaningful direction, otherwise you're just randomly searching.

If you do know what you are going to build, then gathering requirements is a reasonable next step. It's not that the "requirements" are the problem, it's where and when in the process you are looking for them that is off. I'll get back to that a little later.


JOHN

While discussing my original post in email, a friend of mine, John Siegrist, asked "Would anyone want to use a system that couldn't be gamed?" Which, in its very essence, is a deep, fundamental question.

This question examines the system as well as the consumers and producers using it. The weakness of the current systems allows people to game them, but is that an essential quality in drawing in producers, many of whom, on this type of site, are amateurs? Does this give them the incentive to publish? Is the game an essential quality?

My guess was that the early adopters looking at these types of systems are drawn to the gaming aspects, but the later ones, who possibly still haven't discovered them yet, will use them more as tools of convenience.

As the market matures, a different crowd of people will get drawn in, and their behavior and expectations will be significantly different. Strangely, while the producers later on will be more conservative, I suspect that the consumers will be more forgiving.

Any which way, looking at all of the back-door features and communication methods, gaming the system is currently an important part of the "whole" system functionality.


PHIL

My friend and classmate Philip Haine has an interesting view of analysis and design in his blog:

http://stealthisidea.com/articles/design-pyramid

While I like his perspective, I think that we could improve on it by making it a bit simpler; possibly just vision -> analysis -> design as the 'three' layers in a pyramid. I see his use of "understanding" as just a side-effect of analysis (with a mix of vision) and I see "requirements" as the way analysis is documented, making these two faces of the same coin.

Following from what Steve Yegge says, you have to mostly know where you are going before you start doing the analysis to back up your beliefs and assumptions. Requirements can't take you there, they can only confirm what you know (or don't). In that sense, "understanding" in Phil's diagram is really partly "vision". You need to know in what direction you're headed before you can travel to get there.

Once you know what you are going to build, then you have to nail down the specifics. A set of requirements is just one of many different forms of documenting the analysis. We don't need requirements per se, but we need the analysis to be tangible (although some small projects just leave the results in the programmer's head).

What makes this all so interesting is that "vision" is not a quantifiable or teachable skill. You can teach analysis, and you can teach programming, but vision is really just being able to predict the future. Not only can you not teach it, it is more or less based on some strange cross between instinct and luck.

It is not a repeatable skill either. It's not uncommon to see someone get it right one day, only to miss by a mile on another. It's not all luck, but being lucky helps.


NICK

On another strangely related note, Nicholas Carr was wondering if we are getting stupider:

http://www.roughtype.com/archives/2008/08/is_google_makin.php

Fast access to junk info-lets leads to a steady diet of fast knowledge. This type of diet does little to enhance the strength of one's thinking, and a lot to make it sluggish.

This also fits into the overall theme, because the news sites are the magnets for the dispensation of information, and the new way to do that is to make it entertaining. Quality of content is not really as important as a good title or easily understandable platitudes. Consequently, our choices in this world are getting made based on poorer and poorer models of our surrounding circumstances. We're sacrificing quality for instant popularity.

The news sites are transforming the way we see information. What was once significant and very carefully constructed is now just whipped out. The freedom of the masses in publishing means the degeneration of the publications. Overall, we trade quality for quantity.


ME

I wrap this up with my being a producer, able at the amateur level to easily publish my ideas, along with millions of other people. In the past I would have needed either a scholarly journal or a magazine to get noticed. Now my crazy ideas are getting out there easily.

The problem is that I want to be more than info-tainment. I think that my ideas are fairly strong, and I dislike the idea that the medium is making them wishy-washy.

I suspect that some of Reg Braithwaite's decision to quit blogging may have been partly driven by the futility of publishing in this modern age. He implied that he had reached a point where he had nothing left to say. But if we're not talking, is anybody going back to the "old" posts to read them again? Do they just stop listening? Many of the news sites won't allow existing URLs to be reposted; few of them easily retain your reading history. Yesterday's news is old, not worth reading.

Yet, for myself, and I'm sure a large number of other bloggers, we have aspirations of filling our posts with more than timely tidbits. The things I say, rightly or wrongly, are meant to stay around for a while.

Many other software development bloggers are the same. We truly mean to change the world, to take an immature discipline and inject some new, vibrant ideas into its core. Once you've been doing it long enough, software development is frustrating because there is so little true progress. But for all of the effort that people are putting into their blogs, the masses are passing us by, or just reading the works to help them smile. It's hard to judge the effects of my efforts.

That's the irony of my last post: its title was unlikely to be 'entertaining' enough for it to have survived on the various sites. A premature death for a reasonable understanding of the very phenomena that killed it. Funny, that.


ANALYSIS REVISITED

Given what I said about the unpredictability of the future, it may seem to be impossible to apply analysis, and use it in a practical way to succeed. It is essentially random after all. But oddly, that is not the case.

The only way to know if something works is to try it. It depends on luck. But I love the expression "hard work generates luck" because it is true, and very true in this case.

You can't tell if something is going to work just by thinking about it; you can't correctly account for all of the variables. But that doesn't mean you can't devise an algorithm to walk through the whole thing and find success. The only thing you can't control is how long it will take.

If you set forth some metric, say sales, and then float a simple, inexpensive version of the tool, you can determine on a small scale whether there is demand. With a little accounting, you can difference the normal baseline growth from the effect of any new product changes, to see if they are helping or hurting. The different "causes" blend together, but they are still there, and measurable.
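To make that differencing idea concrete, here is a minimal Python sketch. The numbers and the estimate_change_effect function are hypothetical, purely for illustration: it extrapolates the pre-change growth trend and subtracts it from the observed post-change figures, so whatever is left over is a rough estimate of the change's effect.

def estimate_change_effect(pre_sales, post_sales):
    """Estimate the per-period effect of a change by comparing post-change
    sales against the growth trend extrapolated from pre-change sales."""
    if len(pre_sales) < 2 or not post_sales:
        raise ValueError("need at least two pre-change and one post-change period")

    # Baseline: average per-period growth before the change.
    growth = (pre_sales[-1] - pre_sales[0]) / (len(pre_sales) - 1)

    # Extrapolate that trend over the post-change periods.
    expected = [pre_sales[-1] + growth * (i + 1) for i in range(len(post_sales))]

    # The residual is the part the baseline trend doesn't explain.
    residuals = [actual - exp for actual, exp in zip(post_sales, expected)]
    return sum(residuals) / len(residuals)

# Hypothetical example: steady growth before, a bump after the change.
effect = estimate_change_effect([100, 104, 108, 112], [125, 131, 136])
print(f"average lift per period: {effect:+.1f}")  # positive means the change is helping

Of course, real data is noisier than this, but the principle is the same: the causes blend together, yet with a baseline to difference against, you can still pull the signal apart.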

In a less deterministic way, this is what entrepreneurs do all of the time. The philosophy is to try a million things and see what works. Sometimes you just have to keep testing it over and over again until you blunder into an intuitive sense that it is the right direction. There is no quantifiable skill here; it is just luck, instinct and some strange, rare ability to take a huge number of variables, ignore the right ones, and tweak the remaining ones. This form of creativity, since it involves selectively ignoring reality, is not teachable or trainable. Some people focus in on the right aspects; most people confuse or blind themselves.

And it is worth noting that analysis, if it is not based on emotion, prediction, instinct, etc., is just a way of applying a structure over a large series of facts and observations. Once you've found a problem and analyzed it, you can make a few predictions about its future behavior and use all of that to create a tool that should, mostly, help users solve their problems.

And most real problems are neither that hard, that different, nor that volatile, so a little experience in the same domain goes a long way towards being able to make better predictions.

In a simple analogy, analysis is like searching for a specific point in a very large field. The larger the area, the harder the search and the less likely you are to find what you are looking for. Shrink the area and the odds of the search get considerably better. In software you predicate the type of solution you are looking for, and then use analysis as the search to find it. On a good day, if you stick with it, you'll probably get lucky. But only if you're reasonable.

4 comments:

  1. Hi, Paul!

    Thanks for your comments. Nice to be conversing with you again after so many years.

    Just to clarify, the Design Pyramid establishes what elements a design process needs to account for. It doesn't prescribe a specific process per se. The layers of the Pyramid are nouns, not verbs.

    Process-wise, I agree with your point that you have to start somewhere, with a hypothesis of what might be a worthy problem to solve. This is the initial vision. It doesn't come out of thin air, but is inspired by some prior understanding.

    Smart organizations then conduct research and analysis to validate and refine the vision. This activity corresponds to the Understanding layer below the Vision layer.

    (Less smart organizations skip this step, plunge forward solving the wrong problem, and get into trouble down the road.)

    A properly refined vision is easily translated into specific requirements that serve as design criteria.

    The sequence in this case is initial vision -> research & analysis -> refine vision -> translate into requirements -> design

Nowadays, with the costs of prototyping so low, it's becoming common to jump rapidly between the layers of the Design Pyramid. You might start with a rough idea (Vision level), talk to a few potential customers to make sure you're sane (Understanding level), whip up a quick & dirty prototype (Design level), and use that to test not just the design but the concept itself (back down at the Understanding level).

    In this scenario the sequence is: initial vision -> research & analysis -> a bit of design & prototyping -> try it with users to test both the design and the vision -> refine the vision and design -> rinse, repeat.

The key lessons of the Design Pyramid model are: 1. the vision is its own thing, separate from the design; 2. you need to get the vision right, otherwise the design is doomed; and 3. the vision can only be as good as the understanding upon which it rests, so you'd better get good understanding!

    Best,
    - Philip

  2. hey, i enjoyed your social disorder piece. i only got to it today, after it had sat for a week in an unread tab (linked from HN).

    possible reasons it didn't get lots of points:

    1) it was not immediately obvious what it was about
    2) it was long

  3. @Philip,

    Thanks for the reply, I hope life is still going well down south :-)

Separating out vision from analysis seems to be the core point. What I find interesting is Steve Yegge implying that vision is localized, that is, you really don't have it unless it's based on personal experience. You can't just poke your head into another domain and get lucky. I'm not sure I agree with that, but vision is definitely a very rare skill set, indeed.

    @timb,

    Thanks for the comments. Yes, it does seem that poor titles and long text make it harder to get an audience. Still, I resist calling my post "4 points you absolutely must know about social networks" and then sticking in five bullet points because I am hoping to get beyond just being info-tainment. A victim of my own hubris? For sure, but at least my eyes are open to my own self-inflicted behavior. Although a mansion might actually be nice?


    Paul.

"you really don't have it unless it's based on personal experience."

    There is no doubt that the closer we already are to the domain, the easier it is to provide vision. But people pay me to poke into their job and come up with visions, so I think it is possible.

    I suppose the central trick/skill/talent is to invoke empathy -- a simulation of the experience of others. There are tools and processes to help with that...

    - Philip


Thanks for the Feedback!