Coding != Career

Thirty years ago, when computers were still new, we didn’t know what to do with them. There was a sense — in that half generation between the widespread availability of computers and the advent of the Internet age — that the world was changing in fundamental ways. No one was sure what was really going on, but this was big. Really big.

Schools bought computers because they thought children should be computer literate. They didn’t really know what that meant. They were struggling to prepare students for a world that no one understood. So they scraped together money from the media budget and grants and the PTA and they bought a few computers that they set up in a lab. The digital age was here.

As students, we played a little Oregon Trail and Lemonade Stand. We practiced our math facts. And then, we learned BASIC programming. Programming seemed… important. This was a skill we would need. Everyone would have a computer when we became adults. Everyone would need to know how to program it. Plus, the computers came with programming software. It was one of the few things we could do without buying additional software, a cost no one had anticipated when the computers were purchased.

I learned BASIC in fifth grade, and by sixth I had forgotten it. I learned LOGO, but just the turtle parts of LOGO, not the really cool list handling stuff that made it a useful language. I learned to type on a typewriter, which I then used through high school. Computers didn’t help me learn other things. They were a subject all to themselves. Our school didn’t have enough of them to teach students much of anything about them. And they didn’t know what to teach us anyway. So after sixth grade, I didn’t use a computer again until I was a senior in high school, when I took programming (in Pascal, this time) for fun.

Half a decade later, I found myself with a minor in Systems Analysis and half a dozen different programming languages under my belt. I was teaching a middle school computer applications class. My predecessor had spent about 80% of the course teaching programming. Having no actual curriculum to follow, I scrapped all of it, and focused on applications instead. I felt that students needed to know more about word processing and spreadsheets and presentation software than BASIC.  A year later, I was emphasizing Internet research, evaluating online resources, documenting sources, and using the Internet to disseminate content. These were things my middle school students needed to know. They’re still things they need to know.

At the dawn of the new millennium, I was teaching a high school programming course. It was an elective. It was a neat class for students interested in programming. But students don’t need to take auto mechanics to drive a car. They don’t have to study structural engineering to work or live in a high-rise building. They don’t need a degree in economics to work at the bank. The course I taught was an introduction meant to spark interest in the field. I never felt like I was teaching a necessary (or even marketable) skill.

The early 2000s confirmed that. Remember The World is Flat? Friedman talked about pulling up to a fast-food drive-through and placing your order with someone in India or China, then picking up your food at the window. Auto companies were making tail light assemblies in the US, shipping them to Malaysia for cheap, unskilled labor to put the light bulbs in, and then shipping them back to go into new cars. Software companies were focusing their systems design efforts in the US, but outsourcing the routine coding to India.

Programming is an entry-level skill. There’s nothing wrong with that. But it’s the kind of position that is more “job” than “career”.  Sure. There’s a bubble right now, and programming skills are in demand. There are also some good reasons to teach programming, because it helps students learn logic, reasoning, and problem solving. But if schools are reacting to the media hype around coding by teaching programming to a generation of would-be programmers, they’re preparing students for a future of unemployment.

Image sources: Wikimedia Commons, Pixabay.



Address Space

One of the problems faced by the designers of the Internet was how computers would find one another. If a global network of computers were to function in a decentralized way, there needed to be a way for any computer to send information to any other. In the late 1970s, an addressing scheme called the Internet Protocol was created to solve this problem. With this system, each computer on the Internet is given a unique address, called an IP address. This address is a 32-bit number, divided into four octets. Each octet narrows the location further, allowing traffic to be efficiently routed between any two computers.
Usually, the octets are written as a sequence of four numbers, each ranging from 0-255. The server at Miami where Mrs. Schinker and I met had one of these addresses, for example, and so does our web server at school. Theoretically, there are about 4 billion of these addresses, making it possible for the Internet to route traffic among 4 billion computers.
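
To make that concrete, here’s a minimal Python sketch using the standard-library ipaddress module. The address below is made up, pulled from a range reserved for documentation rather than any real server:

    import ipaddress

    # An IPv4 address is just a 32-bit number, usually written as four octets (0-255).
    # 192.0.2.10 comes from a block reserved for documentation and examples.
    addr = ipaddress.IPv4Address("192.0.2.10")

    print(addr)                # 192.0.2.10 -- the familiar dotted-quad form
    print(int(addr))           # 3221225994 -- the same address as one 32-bit integer
    print(addr.packed.hex())   # c000020a   -- its four octets, one byte each
    print(2 ** 32)             # 4294967296 -- roughly 4 billion possible addresses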

In 1983, when this system first started seeing wide adoption, this was plenty. There were about two million computers in the United States, and almost none of them were connected to this network. At that time, there were about 4 billion people on the planet. The idea that every person on earth would have a computer on this network was inconceivable.

These days, of course, that 4 billion number is looking smaller and smaller. With more cell phones on the planet than people, some estimates indicate that there are as many as 9 billion devices online now. We exhausted that 4 billion address space a long time ago. The thing that has kept the Internet from grinding to a halt for the last decade is a workaround called “network address translation” (NAT). If you compare an IP address to the street addresses we’re all familiar with, then NAT is like assigning apartment numbers. In this online city, though, almost every building is a high-rise apartment.

Take, for example, my school district. We have about 6,000 devices in our schools, almost all of which are connected to the Internet. But because we use NAT, every one of those computers appears to the outside world to share the same public IP address. So, from the Internet’s perspective, our whole school district just looks like one computer. That’s how we can get 9 billion devices with only 4 billion addresses. But that only goes so far. At some point, we are going to need more addresses.
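
For the curious, here’s a rough Python sketch of the idea behind that sharing. It’s a toy illustration, not how our routers or any real NAT implementation actually behave, and every address in it is invented:

    # Toy illustration of NAT port translation: many private (address, port)
    # pairs share one public address by being handed distinct public ports.
    # 198.51.100.7 is from a documentation range; the 10.x addresses are private.

    PUBLIC_ADDRESS = "198.51.100.7"   # the one address the outside world sees
    translation_table = {}            # (private_ip, private_port) -> public_port
    next_public_port = 40000

    def translate_outbound(private_ip, private_port):
        """Map an internal connection onto the shared public address."""
        global next_public_port
        key = (private_ip, private_port)
        if key not in translation_table:
            translation_table[key] = next_public_port
            next_public_port += 1
        return PUBLIC_ADDRESS, translation_table[key]

    # Two different internal devices appear as the same address, on different ports.
    print(translate_outbound("10.0.5.23", 51000))    # ('198.51.100.7', 40000)
    print(translate_outbound("10.0.9.114", 51000))   # ('198.51.100.7', 40001)

Replies that arrive at a given public port are looked up in the same table and forwarded to the right internal device, which is how thousands of school computers can hide behind a single address.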

The smart people who keep the Internet working saw this problem coming twenty years ago. They created a new address scheme, called IPv6, which uses 128 bits instead of 32. That means there are 340 undecillion possible addresses. That’s 3.4 x 10^38 addresses.
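
The arithmetic is easy to check. Here’s a quick sketch; the sample address comes from a block reserved for documentation:

    import ipaddress

    print(2 ** 32)    # IPv4: 4,294,967,296 addresses (~4 billion)
    print(2 ** 128)   # IPv6: ~3.4 x 10^38 addresses (340 undecillion)

    # A 128-bit address is written as eight groups of 16 bits in hexadecimal.
    addr = ipaddress.IPv6Address("2001:db8::1")   # 2001:db8::/32 is reserved for documentation
    print(addr.exploded)    # 2001:0db8:0000:0000:0000:0000:0000:0001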

The bad news is that much of the current Internet still doesn’t support IPv6. Android devices have a lot of problems. Mac OS X 10.7–10.10 tries not to use it. Windows didn’t support it until Vista. Cisco switches didn’t support it until version 15. Hardware manufacturers were slow to adopt it because nobody was using it. Nobody was using it because, well, no devices supported it.

But the time has come. With no more old addresses available, we have to transition. It’s going to be painful and expensive. It’s going to take a lot of time. Hopefully, most people won’t even notice.


The good news is that we will only ever have to do this once. The new system has plenty of addresses. 340 undecillion is a really big number. It’s enough to give everyone on the planet their own private Internet without even scratching the surface. It’s enough for every grain of sand on earth to have as many addresses as there are grains of sand on the earth. If the old address scheme supported an Internet the size of a golf ball, the new one would be the size of the sun. If every atom on the surface of the Earth had an address, there would be enough left over for 100 more planets. I think you get the point. There are lots of addresses.

I can’t imagine a world where we could ever possibly use that many. Just like they couldn’t in 1983.

Photo credit: Penn State University



A More Perfect History

Last week, the College Board released a new version of the AP U.S. History Course and Exam Description. This document, last revised in 2014, outlines the content that should constitute an Advanced Placement American History course. Ideally, students taking the course pass the end-of-year exam, earning college credit for their achievement.

The United States does not have a national curriculum for American History. The Common Core standards, an effort to unify the curriculum taught in American schools, only cover reading and math. The AP guidelines are the closest thing we have to a national standard for how this subject should be approached in high school.

The new standards come a year after strong opposition to the 2014 version. That revision emphasized comprehension, interpretation, and synthesis of history instead of merely recalling the names and dates of important milestones. Critics claimed that it undermined the idea of American exceptionalism, and fostered a view of American history that is too negative and political. Several states moved to ban the course from being taught in their schools.

In academic circles, we call this shift toward analysis, synthesis, and application an increase in academic rigor. As we continue to move into the age of information abundance, it becomes increasingly important for students to evaluate the information they’re getting, make connections among content from diverse sources, assess bias and frame of reference, and draw their own conclusions. They apply this deeper understanding of history in new contexts, ostensibly to keep from repeating it.

Unfortunately, some of those conclusions don’t necessarily paint the United States in a positive light. After looking at the facts, one might conclude, for example, that the Boston Tea Party was actually an act of terrorism. Or, maybe, the strained relations between Europeans and native tribes had more to do with the Europeans dismissing them as savages, taking and destroying their resources, and constantly breaking treaties than with the natives acting unreasonably hostile toward white settlers. It’s quite possible that rounding up Japanese Americans, most of whom were United States citizens, and locking them up in internment camps after confiscating their homes and property was a heinous violation of their civil and human rights. One might conclude that detaining 780 people in the aftermath of 9/11 without charge or trial, and then systematically torturing them over the course of a decade stands in stark contrast to the certain inalienable rights endowed to them by their creator.

Fortunately, the new version of the course re-instills those patriotic American ideals that make our citizens believe that this is the greatest country in the history of the world. Our nation is founded on the ideals of liberty, citizenship, and self-governance. Just don’t get too caught up in that definition of “liberty,” and be careful about that “self-governance” thing if you’re black or female or poor. George Washington, Benjamin Franklin, and Thomas Jefferson are fearless leaders to be revered, and have more than earned their places on our currency. Let’s set aside Washington’s blundering that would have lost the Revolutionary War if the French hadn’t conveniently saved the day, Franklin’s inability to keep his hands to himself, and Jefferson’s substantial bi-racial posterity. The Declaration of Independence and the Constitution should be revered as sacred documents, unless you take that bit about being created equal too seriously, or unless the unelected Supreme Court issues a ruling you don’t agree with. We were certainly the determining factor in ending both world wars, and the U.S. is the only country that realized that the Cold War could be ended by simply telling Mr. Gorbachev to “tear down this wall.” Let’s conveniently omit the fact that the United States, 70 years later, is still the only country to have actually used a nuclear weapon. Don’t get too caught up in the details. We’re awesome, and we know it.

A generation ago, I took this AP American History course. We skipped most of the dates and facts. The textbook spent most of the year in the bottom of my locker. The units focused on essential questions that were primarily answered through the examination of primary sources. We learned to interpret history for ourselves. We learned to assess bias. We learned about different kinds of oral and written accounts, and how to determine why they were created, by whom, and when. One of the units focused on the cause of the civil war. Slavery was a contributing factor. But it wasn’t the only factor, and it probably wasn’t the driving force. Slavery as a human rights issue was certainly not as important as slavery as an economic issue. But we didn’t blindly read an over-processed, committee-driven, negotiated account in a textbook about why there was a civil war. We explored the topic ourselves.

We didn’t cover most of the course. We glossed over almost all of the dates and names. I don’t think the teacher was overly concerned with our exam scores. In fact, we didn’t have any assessments or grading at all, apart from the final exam. We were intrinsically motivated, and the subject was made interesting by the approach taken by the teacher. It was certainly a time before high-stakes accountability.

I scored well enough to earn six college credits and was exempted from taking Western Civilization as a college freshman. I don’t remember much about the exam, except that in the essay, I argued that affirmative action programs were discriminatory. I’m pretty sure I criticized Lincoln in the same essay.

I love my country. There are videos and photos all over the Internet of me waving flags and singing patriotic songs. I know most of the words to the Pledge of Allegiance (even though I think it’s a really creepy nod to fascism). I sing the words to the Armed Forces Medley and Stars and Stripes Forever every time I hear them. But I think our country can be better. There’s lots of room for improvement. And we don’t get better by ignoring the inconvenient misdeeds of our past. Our students need to study all of American history, not just the parts that make us look good. They need to draw conclusions, identify and acknowledge misdeeds, and resolve to prevent their leaders from walking down those same paths.

Maybe that’s what the critics are afraid of.