Way back when I was a co-op student, I got a job at a small company that built analytics applications for statistics. My previous co-op employer had laid off the whole division, so I had to find a new gig. There were lots of choices back then, so I picked the company because I would get experience programming in C. It seemed like a good idea at the time.
At first the job was great. I got to work on an interesting little problem for a product about to be released. But as soon as that dried up, the trouble began.
One of the senior statisticians had been asking for resources so he could get a pet project done. He sat way back in a corner office; I hadn't really noticed him before. In hindsight, the location of his office should have been enough of a clue.
The re-assignment started out OK. He had a bunch of little programs he wanted written in C. Nothing complex, not too hard, and the only big challenge for me was using the language well enough. I had experience programming in Pascal, but C was a little trickier.
My problems started when he began explaining his 'big idea'. Variables, he said, were really hard to create and name when coding. It takes time to think up the names, and then you have to remember the names you thought up. Instead, he proclaimed, the secret to great coding is to declare a bunch of global arrays at the top of each program, one for each data-type, e.g. int, float, double, char *, etc. Then all you have to do is index these to get to the correct variable. That is, if you have an integer variable that holds the number of iterations, all you need to do is remember that it was your seventh variable and type in 'integer[7]' and you've got it. No need to remember some complex name; you just need to know the type and when you created it. Simple.
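To make the scheme concrete, here is a rough sketch of what his style would have looked like next to ordinary code. The array names, sizes, and index assignments are my own invention for illustration; they are not from his actual programs.

/* His scheme, roughly: one global array per type, with every variable
   referred to only by its index. Names and sizes are made up here. */
int integer[100];      /* all of the program's ints live in here    */
double real[100];      /* all of the program's doubles live in here */

double index_scheme(void)
{
    int i;
    integer[7] = 10;    /* you must remember: slot 7 is the loop count */
    real[3] = 0.0;      /* ...and slot 3 is the running total          */
    for (i = 0; i < integer[7]; i++)
        real[3] += 2.0;
    return real[3];
}

/* The same logic with ordinary named, tightly scoped variables. */
double named_variables(void)
{
    int iterations = 10;
    double total = 0.0;
    int i;
    for (i = 0; i < iterations; i++)
        total += 2.0;
    return total;
}

Even in this tiny example the second version says what it is doing; the first requires a mental map of which slot means what.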
He didn't really understand why I recoiled in horror. This was, of course, the most absurd thing I'd ever heard in my short little career. I understood just enough to know why this wasn't just a step backwards, but rather a quantum leap backwards. It was 'off the charts' crazy as far as I was concerned.
I tried to explain why this would render the code completely unreadable (and probably full of bugs), but I was just a student programmer, so he dismissed it. I ignored his directive to implement it this way and wrote the code normally, you know, with real variable names and scoping as tight as possible. But I wasn't used to C, and I used some of its rope to hang myself (my code was worse than flaky). And so my failure reaffirmed his belief that he was right. The situation grew worse.
Fortunately I was saved by the co-op work term coming to an end. It was a horrific experience, but on the bright side I did end up with C on my resume, which led to my next job (where I apprenticed with someone who actually taught me how to code properly).
I learned a lot of different things from that experience, and even after more than two decades I'm still learning new things. But the most important of them was how easy it was for someone to get it horribly wrong. Although he had written far more code than I had at the time, he hadn't written nearly enough to really understand what he was doing. A big idea is a long way from actual understanding, and without the latter, the former is unlikely to be reasonable. And unfortunately for us, software development appears simple to those who haven't plumbed its depths. Thus we get a lot of big ideas …