Collateral Damage

We are all perfectionists. We want everything to be perfect before we share it. We are paralyzed by a need for perfection, and it keeps us from getting anything done.

We should be more like the big tech companies. Start with a big idea. Spend a little bit of time getting the basic idea formed. Then, release it. Get feedback on it. Refine it as you go along. Your customers will tell you what’s working and what’s not working. They will show you where the innovative and interesting pieces are, and you can devote more time and energy to that.

This guy uses the metaphor “Ready, Fire, Aim”:

To quote:

Most people know the phrase like “ready, aim, fire.” You get your gun, you figure out. You make sure it’s in the right spot, and then you fire. But I like to do “ready, fire, aim,” where it’s like I think of what it is I want to do, I put a couple hours… into it. I fire. And then after, I make adjustments.

Let’s set aside the horrific metaphor, and the irresponsibility of firing a weapon indiscriminately with the idea of maybe accidentally hitting the target. If we ever enact common sense firearms legislation, maybe one of the provisions can be that this guy can’t have a gun.

What if I’m doing this as a software designer? Using this philosophy, I write a basic application without spending very much time or thought on it. After all, doing some design and planning seems like actual work, and I don’t have the time or motivation for that. I send my half-baked application out into the world with a ton of marketing hype promising that it does the stuff I thought about but didn’t actually implement. If people start using it, they quickly notice that it sucks, and they tell me. Now, I try to patch some holes and make it a little better so it doesn’t suck so much. I release an update, and I get more feedback, and we keep going through this cycle. The approach is to try to spend as little time and energy as possible, just enough to get people to use it without complaining so much. If there are problems with performance or security, we can just blame those on other hardware or software. If there are features that don’t work, we can just say it’s a project in active development. If there are bugs or data gets corrupted, it must be a compatibility problem.

The result is software that barely works and has to be constantly updated. We’ll make the end user responsible for that too, and create a culture where people are afraid to not install updates. We’ll write a license agreement that disclaims any liability for the software or any damage it may cause.

I’m tired of being the collateral damage. I’m tired of beta testing everyone’s software. I’m tired of being the bad guy for not installing every update and security patch the minute it’s released. Maybe we should be a little less trigger-happy about firing, and spend a few seconds aiming first.

Video credit: Rob Dial on YouTube.

Coding != Career

Thirty years ago, when computers were still new, we didn’t know what to do with them. There was a sense — in that half generation between the widespread availability of computers and the advent of the Internet age — that the world was changing in fundamental ways. No one was sure what was really going on, but this was big. Really big.

[Image: Apple II Plus]

Schools bought computers because they thought children should be computer literate. They didn’t really know what that meant. They were struggling to prepare students for a world that no one understood. So they scraped together money from the media budget and grants and the PTA and they bought a few computers that they set up in a lab. The digital age was here.

As students, we played a little Oregon Trail and Lemonade Stand. We practiced our math facts. And then, we learned BASIC programming.  Programming seemed… important. This was a skill we would need. Everyone would have a computer when we became adults. Everyone would need to know how to program it. Plus, the computers came with programming software. It was one of the few things we could do without buying additional software, and those costs weren’t anticipated when they bought the computers.

I learned BASIC in fifth grade, and by sixth I had forgotten it. I learned LOGO, but just the turtle parts of LOGO, not the really cool list handling stuff that made it a useful language. I learned to type on a typewriter, which I then used through high school. Computers didn’t help me learn other things. They were a subject all to themselves. Our school didn’t have enough of them to teach students much of anything about them. And they didn’t know what to teach us anyway. So after sixth grade, I didn’t use a computer again until I was a senior in high school, when I took programming (in Pascal, this time) for fun.

Half a decade later, I found myself with a minor in Systems Analysis and half a dozen different programming languages under my belt. I was teaching a middle school computer applications class. My predecessor had spent about 80% of the course teaching programming. Having no actual curriculum to follow, I scrapped all of it, and focused on applications instead. I felt that students needed to know more about word processing and spreadsheets and presentation software than BASIC.  A year later, I was emphasizing Internet research, evaluating online resources, documenting sources, and using the Internet to disseminate content. These were things my middle school students needed to know. They’re still things they need to know.

[Image: source code]

At the dawn of the new millennium, I was teaching a high school programming course. It was an elective. It was a neat class for students interested in programming. But students don’t need to take auto mechanics to drive a car. They don’t have to study structural engineering to work or live in a high-rise building. They don’t need a degree in economics to work at the bank. The course I taught was an introduction meant to spark interest in the field. I never felt like I was teaching a necessary (or even marketable) skill.

The early 2000s confirmed that. Remember The World is Flat? Friedman talked about going to the drive-through at a fast food restaurant and talking to someone in India or China to order your food, which you then pick up at the window. Auto companies were making tail light assemblies in the US, shipping them to Malaysia for cheap, unskilled labor to put the light bulbs in, and then shipping them back to go into new cars. Software companies were focusing their systems design efforts in the US, but outsourcing the routine coding to India.

Programming is an entry-level skill. There’s nothing wrong with that. But it leads to the kind of position that is more “job” than “career”. Sure, there’s a bubble right now, and programming skills are in demand. There are also some good reasons to teach programming, because it helps students learn logic, reasoning, and problem solving. But if schools are reacting to the media hype around coding by teaching programming to a generation of would-be programmers, they’re preparing students for a future of unemployment.

Image sources: Wikimedia Commons, Pixabay.