Coding != Career

Thirty years ago, when computers were still new, we didn’t know what to do with them. There was a sense — in that half generation between the widespread availability of computers and the advent of the Internet age — that the world was changing in fundamental ways. No one was sure what was really going on, but this was big. Really big.

Schools bought computers because they thought children should be computer literate. They didn’t really know what that meant. They were struggling to prepare students for a world that no one understood. So they scraped together money from the media budget and grants and the PTA and they bought a few computers that they set up in a lab. The digital age was here.

As students, we played a little Oregon Trail and Lemonade Stand. We practiced our math facts. And then, we learned BASIC programming. Programming seemed… important. This was a skill we would need. Everyone would have a computer when we became adults. Everyone would need to know how to program it. Plus, the computers came with programming software. It was one of the few things we could do without buying additional software, whose cost no one had anticipated when the computers were purchased.

I learned BASIC in fifth grade, and by sixth I had forgotten it. I learned LOGO, but just the turtle parts of LOGO, not the really cool list handling stuff that made it a useful language. I learned to type on a typewriter, which I then used through high school. Computers didn’t help me learn other things. They were a subject all to themselves. Our school didn’t have enough of them to teach students much of anything about them. And they didn’t know what to teach us anyway. So after sixth grade, I didn’t use a computer again until I was a senior in high school, when I took programming (in Pascal, this time) for fun.

Half a decade later, I found myself with a minor in Systems Analysis and half a dozen different programming languages under my belt. I was teaching a middle school computer applications class. My predecessor had spent about 80% of the course teaching programming. Having no actual curriculum to follow, I scrapped all of it, and focused on applications instead. I felt that students needed to know more about word processing and spreadsheets and presentation software than BASIC.  A year later, I was emphasizing Internet research, evaluating online resources, documenting sources, and using the Internet to disseminate content. These were things my middle school students needed to know. They’re still things they need to know.

At the dawn of the new millennium, I was teaching a high school programming course. It was an elective. It was a neat class for students interested in programming. But students don’t need to take auto mechanics to drive a car. They don’t have to study structural engineering to work or live in a high-rise building. They don’t need a degree in economics to work at the bank. The course I taught was an introduction meant to spark interest in the field. I never felt like I was teaching a necessary (or even marketable) skill.

The early 2000s confirmed that. Remember The World is Flat? Friedman described pulling up to the drive-through at a fast-food restaurant and placing your order with someone in India or China, then picking up the food at the window. Auto companies were making tail light assemblies in the US, shipping them to Malaysia for cheap, unskilled labor to put the light bulbs in, and then shipping them back to go into new cars. Software companies were keeping their systems design efforts in the US, but outsourcing the routine coding to India.

Programming is an entry-level skill. There’s nothing wrong with that. But it’s the kind of position that is more “job” than “career.” Sure, there’s a bubble right now, and programming skills are in demand. There are also some good reasons to teach programming: it helps students learn logic, reasoning, and problem solving. But if schools are reacting to the media hype around coding by teaching programming to a generation of would-be programmers, they’re preparing students for a future of unemployment.

Image sources: Wikimedia Commons, Pixabay.


21st Century Workers

I learned early to play the “school game.” Do what is expected of you. Please the teacher. Don’t make trouble. Don’t ask questions. They will give you information. Then, they’ll ask you to give it back. If you follow the rules, you’ll be rewarded with a good grade. Good grades will get you into a good college, which will get you a good job. It’s all good. We have it all figured out. Just do what you’re told.

But we don’t really have it figured out anymore. While college graduates are roughly twice as employable as those without a degree, some are suggesting that the cost of higher education no longer makes it a worthwhile investment. And with the unemployment rate among recent college grads at its highest level on record, a whole lot of grads are going to have trouble paying off those student loans.

At the same time, it’s becoming clear that the job market is demanding skills that schools are not providing. NPR’s Planet Money team took a look at a factory in South Carolina this week. The series examines the changing workforce, and the changing demands placed on current and prospective workers. Reporter Adam Davidson asked if he could get a job there:

“No,” he said. “The risk of having you being able to come up to speed with training would be a risk I wouldn’t be willing to take.”

To become [a good worker], I’d have to learn the machine’s computer language. I’d have to learn the strengths of various metals and their resistance to various blades. And then there’s something I don’t believe I’d ever be able to achieve: the ability to picture dozens of moving parts in my head. Half the people… trained over the years just never were able to get that skill.

The company can teach the knowledge needed to do the job. That’s not a problem. What they can’t teach is the ability to visualize what’s going on. They can’t teach the innovative thinking and problem solving skills. Their workers have to come in with those skills.

Where are they going to get those skills? In the K-12 world, we’re still focused on imparting knowledge. Our teachers are content specialists. They’re experts at teaching information. And while there’s been a push toward higher level thinking skills in education for longer than I’ve been part of the field, there’s still not a lot of it going on in our schools.

In a different NPR piece this week, Emily Hanford examined flipped physics classrooms at the college level. Harvard has determined that lecturing is not an effective teaching technique. And with information easily and freely available to anyone, anytime, anywhere, some would argue that spending class time to impart information is now irresponsible. In physics, they found that students could memorize the formulas and plug them in to get the correct answers to problems, but that doesn’t mean they had an understanding of the underlying concepts. Harvard professor Eric Mazur has changed his approach to teaching physics. Students are expected to do background reading to get the “information” before class. Then, in class, they focus on making sense of that information. Class time is devoted to application of the concepts, not memorization of the facts.

We’re starting to see this in K-12 as well. More and more teachers are changing their approaches to embrace next generation skills. There are plenty of reliable sources for explaining the information. We don’t need teachers to do that anymore. We need teachers to help students make sense of the information, to draw connections between the things they’ve learned, to apply their understanding of concepts in order to solve challenging problems.

So our students will leave with knowledge. Sure. Yes. Of course. But they’ll also know how to learn. They’ll know how to connect ideas. They’ll know how to apply their understanding of one concept to different situations. They’ll be ready to face the challenges of their generation.

Photo credit: Avram Cheaney on Flickr.