No Moore

Moore’s Law is dead. Intel co-founder Gordon Moore observed in 1965 that the number of components on an integrated circuit was doubling every year, and he predicted that this growth would continue for another decade. In 1975, he revised the forecast to doubling every two years.

In simpler terms: computing power doubles about every two years, while the cost stays the same. Incredibly, this exponential growth held true for more than 40 years. For the most part, you could expect the computers available at any given time to be about twice as powerful as those from two years prior, at about the same cost. All told, computing chips available now are more than 2 billion times as powerful as those available in 1965, while the cost is about the same.
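That “2 billion times” figure is just compound doubling. As a rough check, here’s a sketch using the cadence above (the 2017 endpoint is an assumption, since the post isn’t dated):

```python
# Rough check of the growth claim: components doubled every year from
# 1965 to 1975, then every two years after that. The 2017 vantage
# point is an assumption; the post itself isn't dated.
yearly_doublings = 1975 - 1965            # 10 doublings
biennial_doublings = (2017 - 1975) // 2   # 21 more doublings
growth_factor = 2 ** (yearly_doublings + biennial_doublings)
print(f"{growth_factor:,}")               # 2,147,483,648 -- just over 2 billion
```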

Our entire understanding of how technology works is based on this model. If it’s more than a couple of years old, it’s probably time to replace it. In schools, we use Moore’s Law as a guideline for everything from planning replacement schedules to estimating depreciation. We replace desktop computers every six years, the point at which they’ve lost about 90% of their value. It was predictable. We could plan for it. We could budget for it.
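The six-year figure falls out of the same cadence: if a machine’s relative value halves every two years, three halvings leave 12.5% of the original value, which is roughly 90% gone. A minimal sketch (the halving model is an assumption for illustration, not an actual depreciation schedule):

```python
# If relative value halves every two years, how much remains after N years?
# (Assumption: value tracks the two-year doubling cadence of Moore's Law.)
def remaining_value(years, halving_period=2):
    return 0.5 ** (years / halving_period)

for years in (2, 4, 6):
    print(f"after {years} years: {remaining_value(years):.1%} remaining")
# after 6 years: 12.5% remaining -- about 90% of the value is gone
```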

I got my first smartphone in 2010. I replaced it in 2012. That one would have been replaced in 2014 if I hadn’t dropped it in 2013. I’m making this third one last an extra year to get back on track, but it’s definitely showing its age now.

But Moore’s Law is dead. Even the chip manufacturers have acknowledged as much. We’re still going to see growth in computing power, but that growth is going to be slower and less predictable. What’s that going to mean for schools? Lots of things.

We will keep computers longer.
We have to get over the idea that computers that are a few years old are too out of date to do anything useful. Most of the computers in our classrooms are now eight years old, and it’ll probably be another year before we replace them. It’s not that we’re trying to be cheap or that we don’t want to be cutting edge. The reality is that they still do most of what we need them to do, and we’d rather use technology resources to improve access for students.

We have to worry about durability.
When we bought the Acer netbooks in 2012, we expected to keep them for three years. We knew there would be problems with broken keyboards and cracked screens, but they were half the cost of desktop computers. If we could get three years out of them, we could leverage their mobility and still come out ahead.

But as we head into year five, those weak batteries and poorly designed keyboards are becoming more of a problem. While any life we continue to get out of them is just icing on the cake at this point, it’s a shame to throw them away when they still work fairly well. We’ll take the worst ones out of commission and use them for parts, and we’ll limp along for another year before retiring them. But as we move forward, we need to think about holding on to these things for more years. Build quality and durability will become more important, and we will have less patience for planned obsolescence.

Software will have to be more efficient.
In the early days, computer engineers were all about efficiency. They were working with some pretty tough constraints, and they would spend a lot of time working through performance and resource challenges. That art has largely been lost in the last generation. Applications use more memory, processing power, and storage space to do the same things, because those resources have been effectively unlimited for so long. There’s no reason why I should need 8 GB of RAM in my computer to run a web browser and a terminal session. But I do, because the software is developed without any consideration for hardware limitations.

When Windows Vista came out, there weren’t any computers on the market that could run it. In fact, Microsoft changed the certification from “Vista Ready” to “Vista Capable” so they could actually certify computers to run the new operating system. Within a few months, the hardware caught up, and soon just about everything could run Vista (whether they wanted to or not). The same thing is happening right now with the Oculus Rift. Very few computers meet the system requirements. So the early adopters have to buy new hardware, while everyone else will just have to wait for the industry to catch up.

But in a world where those hardware upgrades are NOT just around the corner, the software developers are going to have to find better ways to improve their products without boosting the system requirements beyond reach.

Costs are going to rise.
We count on costs going down. That’s how we fuel sustainability. In 2011, we had 3 computing devices for every 7 students. Today, we have 11 computing devices for every 7 students. Over the same period, the school district has received no increase in funding. We’re not bringing in more money, and we’re not spending more money. We have moved things around a little: we’re spending a little less on textbooks and a little more on computers. But for the most part, the financial side has been pretty flat.

The difference has been the cost of computing devices. When we started buying classroom sets of laptops, we were paying about half as much as we were for desktops. When we started buying Chromebooks, we halved that cost again. We didn’t spend less money, but we bought a lot more devices and improved student access to technology considerably. That, in turn, allowed teachers to better leverage technology to design instruction that meets the individual needs of each learner.

But this year, when I placed my order for Chromebooks for the incoming sixth graders, two things surprised me. First, the specs on the new devices are identical to last year’s. There’s no more memory or storage or processing power; the device is exactly the same. Second, the price hasn’t dropped. Usually, if I buy something for $350 one year, I expect it to be $240 the next. Not this year. The new devices are a few dollars cheaper, but innovation has stalled and pricing is staying the same. The whole industry is slowing down. While the longer life cycles are going to help us keep technology longer, the price stagnation is going to offset any savings we might have had.
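That $350-to-$240 expectation is consistent with the halving model: if price per unit of capability halves every two years, one year should cut it to about 71%. A sketch (assuming a smooth exponential decline, which is a simplification):

```python
# Expected price after one year, if cost halves every two years.
# The smooth exponential decline is an assumption for illustration.
price = 350
one_year_factor = 0.5 ** (1 / 2)       # about 0.707
print(round(price * one_year_factor))  # 247 -- in the ballpark of $240
```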

The biggest change is going to be the mindset. The promise that there’s something newer and better right around the corner is a myth. We have to get over our fascination with the new and shiny, and focus a little more on doing great things with the amazing technology we already have.

Photo credits:
Moore’s law chart by shigeru23 from WikiMedia Commons
Silicon Photonics 300mm wafer by Ehsanshahoseini from WikiMedia Commons



Let’s Eat

The doors for lunch opened at noon. We were standing outside in a very crowded hallway, waiting to get in. When the doors opened, there were dozens of volunteers waving flags and welcoming us. We quickly found a table and sat down.

It was a reasonably formal lunch. There were cloth napkins and bread plates and dessert forks. The salads were already on the table, and as soon as everyone was seated, we began eating. There were ten people seated at our round table. As I looked around, I noticed that the room was set up with 15 tables across and five tables deep. Behind that was a wide aisle, and then three more identical sections. That’s enough seating for 3,000 people.

As soon as we finished our salads, the plates disappeared. Entrees came next: grilled chicken, mashed potatoes, and a vegetable medley. A few people asked for — and immediately received — vegetarian or gluten-free alternatives.

It’s fun to watch people’s table manners in situations like this. You don’t eat until everyone at the table has been served. The rolls and butter (and, really, everything else) are passed to the right. Your glasses are on the right. Your bread plate is on the left. Salt and pepper get passed together. The other diners didn’t necessarily know the rules, but the servers all did: always serve from the left, and clear from the right.

Once again, the plates disappeared as soon as we put down our forks. Coffee was served. Desserts were already on the table. Just as we finished dessert, the speaker took the stage and the program began. It was 12:30.

Let’s recap: this is a sit-down, three course lunch for 3,000 people that was served and cleared in half an hour. The food was delicious (and hot). The service was impeccable. The entire experience was top notch. The next morning, I saw the manager working out details for that day’s lunch. I thanked him for his work and told him how impressed I was. “That’s nothing,” he replied. “We do this every week.” He told me that they had done similar meals with up to 8,000 people. “THAT,” he said, “is a challenge.”

So what are the logistics that go into something like this? How do they pull off this impossible feat of feeding 3,000 people in 30 minutes? Over the course of the three days we were there, I paid attention. Here’s how they do it:

Step One: Standardize. It doesn’t matter where you sit. It doesn’t matter what you want. Everyone gets the same thing. If we’re having chicken and mashed potatoes, everyone is getting chicken and mashed potatoes (or the one alternative dish for special diets). There’s one salad dressing choice: you can have salad dressing or not have it. What would you like to drink? There’s iced tea and water on the table, and we’ll serve coffee with dessert. Those are your choices.

Step Two: Everyone Has a Job. Teams of servers worked together on groups of tables. Watch the tables and clear the salads as soon as they’re done. Bring out the entrees and uncover them. Serve entire tables quickly. Everyone knows what they’re responsible for, and everyone works together to make sure the job gets done as quickly as possible.

Step Three: Alleviate the Bottlenecks. Those volunteers who were waving flags weren’t just trying to be friendly. They knew that the biggest bottleneck to a quick, efficient lunch is getting people to pick a table and sit down. Their job was to get people into their seats as quickly as possible.

Step Four: Maximize Efficiency. When the trays come out from the kitchen, they’re piled high with covered entrees. One person uncovers them while another clears salad plates and serves entrees. The same trays go back to the kitchen with dirty dishes on them. The whole thing happens in one fluid, choreographed motion. The same thing happens when the entrees are cleared and the coffee is served.

Step Five: Anticipate Problems and Resolve them Quickly. Someone dropped a plate. There’s another one right here. We need another vegan option. It’s right there on the tray. A water glass was knocked over. It was immediately cleaned up. Those things are going to happen. We can’t let them derail the whole event.

Of course, in educational technology, we have our own impossible feats. At the moment, I’m responsible for a technology infrastructure that includes 6,000 computers and tablets, more than 100 printers, about 600 network devices, 250 projectors and interactive whiteboards, and dozens of software applications. My team of five manages mission-critical technologies that handle everything from taking attendance to supporting instruction to selling lunch.

If we ran schools like a business, I could easily justify a staff of 25 people just to manage this infrastructure reactively. We’re not talking about high-tech companies with cutting-edge technologies requiring proactive support; regular mid-sized companies generally have one IT staff member for every 250-300 devices.

My own technology plan, which is now nearly three years old, called for IT staffing levels at 25% of industry norms. That plan included three more people than I have right now, and it anticipated that we would have 1,000 fewer devices. But I’m not complaining. We have it covered. While it would be very bad to lose people, I’m not asking for additional staffing. How do we do it?
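Those staffing numbers hang together arithmetically. A quick sketch using only the figures in the post (6,000 devices, one staffer per 250-300 devices as the industry norm, and a plan at 25% of norms):

```python
# Staffing math from the figures above (assumptions: the 250-300
# devices-per-staffer industry norm and the 25%-of-norms plan).
devices = 6000
norm_low = devices / 300    # 20 staff at the lean end of the norm
norm_high = devices / 250   # 24 staff at the generous end
plan_low, plan_high = norm_low * 0.25, norm_high * 0.25
print(f"industry norm: {norm_low:.0f}-{norm_high:.0f} staff")
print(f"at 25% of norms: {plan_low:.0f}-{plan_high:.0f} staff")  # 5-6: a team of five is on plan
```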

Step One: Standardize. For the past 15 years, we have chosen a single desktop computer and a single laptop model. That’s what we buy. Everyone has the same thing. We get really good at supporting that one thing. Every year, we buy about a thousand laptops. They’re all exactly the same. That saves us an enormous amount of time.

Step Two: Everyone Has a Job. The team is divided geographically, but each of my team members also has specializations. If you’re having a problem with a printer or copier, Ryan is the go-to guy. If there’s an issue with a projector or Smart Board, Rick is your man. They rely on one another when they get in over their heads.

Step Three: Alleviate the Bottlenecks. We try to avoid tasks that take a lot of time without much benefit. For example, it might take hours to diagnose a malware problem and clean up a computer; it’s often easier to re-image it, loading a fresh copy of Windows and the applications. We also push configuration changes, updates, and other routine tasks to computers while they’re idle during the day. That minimizes the amount of repetitive work we have to do on each machine.

Step Four: Maximize Efficiency. The biggest waste of time for us is moving people around. With eight buildings and three technicians, there’s always a problem somewhere we don’t have a person on site. The key is to visit each building each day, but also to know what that building needs before we get there. Centralized help ticket management plays a big role in that. The techs also use remote diagnostics to resolve problems quickly, or at least determine the cause of the trouble, before venturing out to the building.

Step Five: Anticipate Problems and Resolve them Quickly. We know the busiest days of the year for us are the first three days the teachers are back in August. We plan ahead for that. We know what they’re going to need, and we try to provide that extra help or network cable or power strip before they even know they need it. We monitor network activity and frequently know when computers are having problems even before the teachers and students do. There are many times each week when we see problems and resolve them before anyone even knows what’s happening.

An efficient operation is fun to watch, whether it’s in the education business or the hospitality industry. One of the benefits of technology is that it’s supposed to help us take care of low-level, repetitive tasks more easily. That allows us to get more done on the work that requires higher-level thinking, reasoning, and problem-solving skills, without increasing staff.

Maybe we could apply some of these lessons to classroom instruction, too. What could we do with the cognitive surplus if the routine aspects of teaching and learning were handled more efficiently?

Photo credit: me.

Reflecting on #OETC12

For the first time in recent memory, I didn’t present at the eTech Ohio Educational Technology Conference this year. Last summer, I decided to take a year off from conference presentations, and instead focus on some of the big questions surrounding the future of education. In the intervening months, I have done a lot of reading, participated in several online networks, and had countless conversations with smart people about public education. It’s clear to me that our test-driven, standards-based, knowledge-transfer approach is not meeting our students’ needs anymore, and we are struggling as public schools to remain relevant.

It was through this lens that I approached the conference this year. I’m looking for ways in which schools are reinventing themselves. While technology plays a role in that change, it cannot be the driver of it. Maybe I’ve been in this job too long. For the last 13 years, I’ve purchased computers, upgraded networks, replaced servers, and improved our technology infrastructure. I’ve fought for funding to replace computers, only to fight the same battle six years later. I’ve been responsible for both the purchase and the disposal of hundreds of thousands of dollars’ worth of technology. And while we’ve seen progress in our use of technology in the schools, we haven’t had the transformational change that we need.

So I stayed away from the gadgets. I didn’t go into the exhibit hall. I didn’t meet with vendors. I wasn’t interested in buying anyone’s amazing solution in search of a problem to solve. For the most part, I also stayed away from the gadget-focused breakout sessions. Instead, I tried to focus on sessions that showcased bold initiatives for changing how teaching and learning are done in our schools.

The Keynotes
The keynotes didn’t help much. On Monday, Dr. Michio Kaku managed to win over a skeptical audience by personally taking credit on behalf of the physics profession for every major technological advance of the last 50 years. He then boldly predicted that technology is going to become more powerful, smaller, and less expensive in the years to come. He referred to wearable technology, like computer chips embedded in contact lenses. He described augmented reality, where a computer gives you real-time information about the things you’re looking at. He talked about hand-held MRI scanners and other amazing medical devices. But he failed to connect these technologies to education. While he advocated for more teaching about technology, he didn’t really address how these new advances are going to revolutionize teaching and learning, or what we, as educators, can do to prepare for it.

Tuesday’s keynote wasn’t much better. In a colossal miscalculation, Sascha Meinrath misjudged his audience. He gave a speech, not a presentation. He read from his stapled manuscript, often pausing mid-sentence to turn the page. He never strayed from the podium, and used no visual aids in his presentation. While the content of his speech was excellent, it was lost in the presentation style. Looking at the Twitter backchannel, for every message about the content of his speech, there were eight messages about the presentation style. Sadly, his points about the struggle between protecting and securing our rights and liberties online were lost on this audience.

On Wednesday, it was Brené Brown’s chance to take a shot at an engaging keynote. It wasn’t looking good: her career has centered on research about vulnerability and shame. But she immediately connected with her audience, and used a unique blend of humor, self-deprecation, and thoughtful insight to tie the ideas of inadequacy, vulnerability, and fear to teaching, learning, and technology. As she talked, she eloquently wove together the themes of embracing failure, continually striving for improvement, and engaging in learning communities. Given her research interests, her presentation was surprisingly inspirational. It left me with some hope that we really are going to figure out this next-generation education business.

Leading Change
This optimism was echoed in the Leadership 2.0 breakout session led by Eric Sheninger. As a principal in New Jersey, Eric began forming a personal learning network a few years ago. He has embraced social networking tools, and transformed his school. He now leads a school where students take responsibility for their own learning. He affects school culture by including his teachers in the decision-making process and empowering them to set policies that are in the best interests of their learners. He shared many strategies for using social media in academic, professional development, and public relations contexts.

This is a stark contrast to the policy discussion held in the technology coordinators’ meeting. Megan Greulich, an attorney for OSBA, shared a number of changes that districts should be making to their policies to reduce liability and comply with state law. Among these changes are the inclusion of cyberbullying prevention language and updates to acceptable use policies to accommodate new E-Rate requirements. Many of the attendees expressed frustration that curriculum is being driven through policy, and most seem to be caught between the “low liability” position of attorneys and school boards and the “high usability” needs of teachers and students. It’s difficult to find middle ground that will allow access to needed resources while minimizing the potential for costly litigation.

On the technology side, Joe Bires shared some insights on technology evaluations. Looking at school technology from a technical perspective, Bires specializes in evaluating technology initiatives in schools. He recommends gathering data from as many different perspectives as possible, and looking for the underlying reasons behind surface observations. It is important to consider the project in the context of the school culture, and to recommend a plan of action for improvement once conclusions have been drawn.

Blended Learning
One idea that is finally getting quite a bit of traction is blended learning. A group of teachers from Berea City Schools shared their experience going “paperless” in their classes by using a variety of tools. Essentially, they’re using blended learning resources like Moodle and Google Apps to minimize the amount of printing they’re doing. Our school district has similar goals, and it’s interesting to see that Berea is giving their teachers the flexibility to choose the tools that best fit their needs. David Hamman, a science teacher at Medina High School, shared a similar project. He advocates the use of focus groups and pilot projects to get started with blended learning programs.

I thought that I would see a similar program from Michael Pennington and Garth Holman. These two teachers work in very different schools, 30 miles apart. Their students collaborate on a digital wiki-textbook for their class. In this case, the focus isn’t on creating a textbook for future students to use as much as creating a lasting legacy of the students’ experience in the class. Over the last few years, students have revised, expanded, and improved the textbook to the point where it is used as a class text by other schools. But the primary goal is for the students to share what they’ve learned by creating an authentic, useful product. I was impressed by the teachers’ attitudes toward this project, and the sense of ownership they have been able to inspire in their students.

Google and Education
Despite my aversion to sessions that focus on the tools, I did attend three presentations that specifically addressed the use of Google Apps in education. As a district that heavily uses Google’s tools, it is always interesting to see the innovative things other schools are doing with the same resources. Senior Google Education Evangelist Jaime Casap was a featured speaker, and his presentation focused on the “why” question more than the “how” question. Why should we be using technology in education? What does technology buy us that can’t be done as easily without it? Casap focused on a need for innovation, adaptability, problem solving, and collaboration. He observed that about 10% of the world’s knowledge is available online, and we are acting like we have the world’s information at our fingertips. Our students are going to need a whole new set of literacies to thrive in a world where 70% of the sum of all human knowledge is freely available online, a milestone we’ll reach within the next generation. While Google’s tools help schools do this, his message focused less on what Google Apps can do, and more on the need to change the focus of education.

A group of teachers from Ross Local Schools also shared their use of Google Apps in a breakout session. Their session focused more on the logistics of operating a school than on the actual teaching and learning, but they shared some valuable perspectives on using shared calendars, Google documents, Forms, and Sites to communicate with parents, schedule shared resources like labs and conference rooms, and distribute web site management responsibilities across the school. North Canton’s Eric Curts went into more detail about Forms in his session, illustrating how they can be used for everything from self-grading short cycle assessments to kindergarten registration.

Better Design
One of the areas I struggle with is design. I can create a functional web site. I can evaluate tools and resources and gadgets based on their usefulness, and I can do a cost-benefit analysis. But I’m not so good at creating and evaluating elegance. I have a hard time designing web sites that are as intuitive and visually appealing as they are useful. Zach Vander Veen’s session on Principles of Design helped with that. He provided some great ideas for organizing and presenting content in a way that is visually appealing and easy to follow. His focus was on organizing resources for online or blended learning, which fits well with some of our district goals. Alvin Trusty also addressed visual literacy in an entertaining presentation that explained how to create and use images to convey a particular message.

Final Thoughts
Overall, the conference is struggling with the same identity issues that schools are. We have discredited the lecture as the primary and most effective means of teaching, but nearly all of the conference’s sessions perpetuate the model of a speaker standing in front of a group of audience members and talking for 45 minutes. In many sessions, we still focus on the tools, the shiny gadgets that everyone swears are going to make a difference in education. I talked with two of the conference leaders about this, and encouraged them to solicit sessions that are more interactive and participatory. I suggested taking a page from the Educon book, and specifically asking, as part of the proposal form, how the presenter intends to engage the audience in a conversation, rather than just talking for an hour. Hopefully, they’ll encourage presenters to start doing that in future conferences.

It was also disheartening to see that the unconference flavor of last year’s conference was gone. Due to the weather, last year’s conference saw many sessions cancelled, and the attendees who were left started creating their own sessions. It was pretty easy to add a topic to the agenda, find an open room, and hold an impromptu session. This year, I wasn’t aware of any of these spontaneous sessions.

While I still consider the conference to be time well spent, I think, like public schools, it’s time for eTech to rethink the purpose for the conference and work to meet the attendees’ changing needs.