Insecurities

Sometimes, the world isn’t a very nice place.

When the Internet was invented, it was a space for collaboration. The technical challenge of connecting disparate computer systems in remote locations was daunting. The goal was to allow researchers at the various locations to work together, sharing data, analyses, and perspectives.

The idea that some members of the community would try to exploit the system to gain access to information or resources that didn’t belong to them was inconceivable. The researchers and engineers designing the protocols and tools that eventually became the Internet were focused on getting the system to work. They weren’t worried about security.

That oversight is a common thread in innovation. We often underestimate how new technologies will be misused. Einstein famously regretted signing the letter that urged the development of the atomic bomb. Kalashnikov was horrified that his rifle was used by so many to cause so much terror. Sometimes, we fail to consider the worst consequences of our best ideas. We’re so focused on making the impossible practical that we don’t spend much time considering whether impossible is such a bad thing.

The Internet has struggled with its underlying insecurity for decades. We have replacements for telnet and ftp that encrypt communications to keep anyone from eavesdropping on them. We have https to allow encrypted web traffic. We use WPA to protect wireless traffic. We can even encrypt email if we have to, but almost no one does. Security is still an afterthought. It’s bolted onto a product or protocol after it already works. Because they’re so much simpler, the insecure versions are always faster, more reliable, more efficient, and more convenient. We often prioritize those things ahead of security, and we continue to use technologies that we know will get us into trouble eventually.
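The gap is visible in just a few lines of code. Here’s a minimal sketch in Python (standard library only, using example.com purely as a placeholder host) that fetches the same page over plain HTTP and over HTTPS. The first request crosses the network as readable text that anyone along the path can capture; the second is wrapped in TLS, so an eavesdropper can see that you talked to the site but not what was said.

```python
import urllib.request

# Plain HTTP: the request and response travel as readable text, so anyone on
# the network path (a shared Wi-Fi network, for example) can capture them
# with an ordinary packet sniffer. example.com is just a placeholder host.
with urllib.request.urlopen("http://example.com/") as response:
    print("http:", response.status, len(response.read()), "bytes")

# HTTPS: the same exchange wrapped in TLS. The difference here is one
# character in the URL, but for years the encrypted option was the slower,
# fussier, bolted-on afterthought.
with urllib.request.urlopen("https://example.com/") as response:
    print("https:", response.status, len(response.read()), "bytes")
```

That, in miniature, is the pattern: the insecure path works out of the box, and the secure one is the thing you have to remember to ask for.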

The tech industry didn’t learn from the development of the Internet. Operating systems, too, were designed for a single user with total access to everything, as were phones and tablets. The idea that the computer might be connected to other computers, and that other software and users might exploit that access, is often ignored. Even today, we run into a lot of software that won’t work without complete control over the entire computer and everything on it.

On the network side, the system requirements for just about every software package we use demand that we eliminate all aspects of security. They often call for firewall and filtering exceptions that make our systems more vulnerable. When we point this out, we hit a brick wall: if we can’t prove that we’ve followed the requirements to the letter, the vendor won’t help with any problems we may have.


When you’re developing software, if you design it to work first and then try to add security later, it doesn’t work right. You end up in a cycle where you try to make it more secure, but those efforts break some critical functionality. When you fix those bugs, you introduce more security problems. The result is a program that constantly needs to be updated, but that never really reaches a point where it’s both secure and reliable.

This cycle used to be hidden from most people by the beta testing process. Back in the ’90s, it was cool to get betas of new software. You could try it out early in exchange for providing feedback to the developers to help them fix bugs and get the product ready for the general public. I remember eagerly installing new beta versions of web browsers. It was an exciting time when you could get a glimpse of what was next.

As we’ve moved along, though, it seems like ALL software is beta software now. Each update comes with that wonderful anticipation of the new problems we’re sure to have. The industry constantly tells us we have to keep all of our software updated, but every time we do, something breaks. That’s okay. There’s a new version next week to fix that major problem. And the update next month will fix the security vulnerabilities introduced by this fix.

We’re living in a world where software doesn’t have to work reliably or securely. It just has to be “good enough” for now. Ship new versions quickly and regularly, and don’t worry too much about it. Every time I start up my phone or my computer or my tablet or my Chromebook, I have a nice new collection of crappy software to install.

So what’s the solution? How do we move away from this endless cycle? I think it comes down to the license agreement. You know, those terms you agree to without reading every time software tries to install or update? In Google’s case, the relevant parts are sections 13 and 14 (some of which I’ve left out). They put it in all caps so you know it’s important:

13.3 IN PARTICULAR, GOOGLE, ITS SUBSIDIARIES AND AFFILIATES, AND ITS LICENSORS DO NOT REPRESENT OR WARRANT TO YOU THAT:
(A) YOUR USE OF THE SERVICES WILL MEET YOUR REQUIREMENTS,
(B) YOUR USE OF THE SERVICES WILL BE UNINTERRUPTED, TIMELY, SECURE OR FREE FROM ERROR,
(D) THAT DEFECTS IN THE OPERATION OR FUNCTIONALITY OF ANY SOFTWARE PROVIDED TO YOU AS PART OF THE SERVICES WILL BE CORRECTED.

13.6 GOOGLE FURTHER EXPRESSLY DISCLAIMS ALL WARRANTIES AND CONDITIONS OF ANY KIND, WHETHER EXPRESS OR IMPLIED, INCLUDING, BUT NOT LIMITED TO THE IMPLIED WARRANTIES AND CONDITIONS OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.

Translation: I don’t know what you think this software is going to do, or if you’ve bought into all of our marketing hype, but no matter how low your expectations are, you should lower them more. 

14. LIMITATION OF LIABILITY

14.1 SUBJECT TO OVERALL PROVISION IN PARAGRAPH 13.1 ABOVE, YOU EXPRESSLY UNDERSTAND AND AGREE THAT GOOGLE, ITS SUBSIDIARIES AND AFFILIATES, AND ITS LICENSORS SHALL NOT BE LIABLE TO YOU FOR:

(A) ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL CONSEQUENTIAL OR EXEMPLARY DAMAGES WHICH MAY BE INCURRED BY YOU, HOWEVER CAUSED AND UNDER ANY THEORY OF LIABILITY. THIS SHALL INCLUDE, BUT NOT BE LIMITED TO, ANY LOSS OF PROFIT (WHETHER INCURRED DIRECTLY OR INDIRECTLY), ANY LOSS OF GOODWILL OR BUSINESS REPUTATION, ANY LOSS OF DATA SUFFERED, COST OF PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES, OR OTHER INTANGIBLE LOSS;

(B) ANY LOSS OR DAMAGE WHICH MAY BE INCURRED BY YOU, INCLUDING BUT NOT LIMITED TO LOSS OR DAMAGE AS A RESULT OF:

(I) ANY RELIANCE PLACED BY YOU ON THE COMPLETENESS, ACCURACY OR EXISTENCE OF ANY ADVERTISING, OR AS A RESULT OF ANY RELATIONSHIP OR TRANSACTION BETWEEN YOU AND ANY ADVERTISER OR SPONSOR WHOSE ADVERTISING APPEARS ON THE SERVICES;

(II) ANY CHANGES WHICH GOOGLE MAY MAKE TO THE SERVICES, OR FOR ANY PERMANENT OR TEMPORARY CESSATION IN THE PROVISION OF THE SERVICES (OR ANY FEATURES WITHIN THE SERVICES);

(III) THE DELETION OF, CORRUPTION OF, OR FAILURE TO STORE, ANY CONTENT AND OTHER COMMUNICATIONS DATA MAINTAINED OR TRANSMITTED BY OR THROUGH YOUR USE OF THE SERVICES;

14.2 THE LIMITATIONS ON GOOGLE’S LIABILITY TO YOU IN PARAGRAPH 14.1 ABOVE SHALL APPLY WHETHER OR NOT GOOGLE HAS BEEN ADVISED OF OR SHOULD HAVE BEEN AWARE OF THE POSSIBILITY OF ANY SUCH LOSSES ARISING.

Translation: whatever happens, it’s not our fault. Even if we do it on purpose.

The software companies have created conditions of use that eliminate any sense of accountability on their part. They won’t guarantee that their product will do anything, and they won’t be responsible for any damage created by it. Even if they willfully cause problems or data loss, lie to you about the product, and interfere with other technologies you’re using, they have no liability.

I keep waiting for the courts to throw these things out. End users are clicking through these agreements without reading them because they have no choice. They’re not making informed decisions to give away their rights. They’re not so excited to try out new software that they’re setting up test environments with no important data or real work at stake. They’re just trying to get to the Internet, to check their email, to open a PDF file, and to get some work done. Where’s the stable, reliable software product that helps them do that?

Without any incentive to ship reliable, stable, secure code, we’re going to continue to be inundated with updates. Every time there’s a security breach or an Internet outage or a loss of data, we’re going to blame the end user. “We told you not to trust our software.” “Why don’t you have a backup?” “What do you MEAN you’re still using that horrible old software from next month?” “Don’t you dare delay this update.”

So until something changes, we’ll keep installing updates, and then update the updates. And then reboot to find that there’s a bug fix for the update.

Photo credit: Michael Sarver on Flickr (https://www.flickr.com/photos/michaelsarver/62771138)


In Pursuit of Tech Standards

The Ohio Department of Education is soliciting feedback through December 31 on their new Technology Learning Standards. [Update (1/12/16): many of the links are now broken, but the new standards are here.]

This is frustrating. And, largely, meaningless.

The new standards are a revision of the 2003 Academic Content Standards for Technology (another link here, since the ODE one is going to break soon). The 2003 standards were a monumental exercise in compromise. They were essentially based on the original ISTE Standards from 1998. But because “technology” has many different definitions, and because at the time there was a lot of money available to schools wanting to improve the ways they used technology, everyone wanted in on the action. The industrial technology folks pushed to include more standards related to CAD, CNC, robotics, and manufacturing. The digital arts proponents wanted standards for digital and graphic design. The business department saw technology through the productivity lens, and lobbied for so-called computer literacy and productivity standards. The media folks saw the information literacy train coming, and jumped on board. The result was a 360-page behemoth that was impossible to implement.

It was interesting, at the time, that we had technology standards at all. Beginning with the SchoolNet Plus program in 1995, the state’s goal was to integrate technology into the classroom. Every conversation focused on instructional integration. Funding was provided for CLASSROOM technology, and schools were emphatically discouraged from building computer labs and other structures that separated the technology from classroom instruction. So it was odd that these technology standards came along at the end of the Academic Content Standards era. Tech hadn’t been included in the Language Arts standards, the Math standards, or the Science standards. Despite the stated objective of integrating tech, the people writing those standards didn’t actually do it.

So while we were unwrapping standards and mapping assessments and trying to come to terms with the idea that all schools should be teaching the same thing in fifth grade math, technology was added on. These standards were tied to funding through the e-rate process. Schools were forced to explain in their technology plans how technology integration was being accomplished at every grade level in every subject area. Our district may have included some snarky responses in this area, especially when explaining how we were integrating technology into middle school physical education (we weren’t) and elementary foreign language (a subject we don’t teach).

To say the least, it was an uphill battle. Schools were rated based on reading and math scores. Those tests measured student achievement of the standards in those areas, which did not include technology. We were assured that technology would be fully integrated into the next generation of academic content standards. In the meantime, we would do what we could to prepare our students for success in our technology-rich culture without actually requiring anyone to use any technology anywhere.

So when the new content standards came along in 2010, they should have finally included the technology components that we’ve been talking about integrating into the core subject areas for a generation now. Finally, we’re going to make sure our students have the technology skills they need. We’re going to embed them in the core subjects, and we’re going to use them to help differentiate instruction, improve student collaboration and communication skills, and increase the academic rigor of our academic programs. Our students will increasingly analyze and synthesize content from multiple sources, and will use their critical thinking skills to combine knowledge in new ways to solve complex problems.

Except that they may have forgotten to include technology in the new standards. Again.

So, here we are. We’re developing new tech standards. We’re basing those standards on Ohio’s ridiculous 2003 standards instead of building on the national and international work that’s been done in the last 20 years. We’ve reduced the number of standards to make them easier to implement. We’re making them intentionally vague so schools can claim to be implementing them without actually having to change anything they’re already doing. We will push these standards out to the schools with no support and no directive to use them. And we’ll wonder why there’s not a common set of technology standards implemented across the state with fidelity. We’ll complain that our students don’t have the skills they need to compete internationally. We’ll talk about how irrelevant our schools are. All of the kids in school now will grow up and graduate, and they’ll be replaced with a new set of students looking to us to prepare them for their future.

Then, we’ll do it again.

Photo Credit: Lupuca on Flickr.
Photo Credit: JISC and Matt Lincoln.

A More Perfect History

Last week, the College Board released a new version of the AP U.S. History Course and Exam Description. This document, last revised in 2014, outlines the content that should constitute an Advanced Placement American History course. Ideally, students taking the course pass the exam at the end of the year, which entitles them to college credit for their achievement.

The United States does not have a national curriculum for American History. The Common Core standards, an effort to unify the curriculum taught in American schools, only cover reading and math. The AP guidelines are the closest thing we have to a national standard for how this subject should be approached in high school.

The new standards come a year after strong opposition to the 2014 version. That revision emphasized comprehension, interpretation, and synthesis of history instead of merely recalling the names and dates of important milestones. Critics claimed that it undermined the idea of American exceptionalism and fostered a view of American history that was too negative and political. Several states moved to ban the course from being taught in their schools.

In academic circles, we call this shift toward analysis, synthesis, and application an increase in academic rigor. As we continue to move into the age of information abundance, it becomes increasingly important for students to evaluate the information they’re getting, make connections among content from diverse sources, assess bias and frame of reference, and draw their own conclusions. They apply this deeper understanding of history in new contexts, ostensibly to keep from repeating it.

Unfortunately, some of those conclusions don’t necessarily paint the United States in a positive light. After looking at the facts, one might conclude, for example, that the Boston Tea Party was actually an act of terrorism. Or, maybe, the strained relations between Europeans and native tribes had more to do with the Europeans dismissing them as savages, taking and destroying their resources, and constantly breaking treaties than with the natives acting unreasonably hostile toward white settlers. It’s quite possible that rounding up Japanese Americans, most of whom were United States citizens, and locking them up in internment camps after confiscating their homes and property was a heinous violation of their civil and human rights. One might conclude that detaining 780 people in the aftermath of 9/11 without charge or trial, and then systematically torturing them over the course of a decade, poses a stark contrast to the certain inalienable rights endowed to them by their creator.

Fortunately, the new version of the course re-instills those patriotic American ideals that make our citizens believe that this is the greatest country in the history of the world. Our nation is founded on the ideals of liberty, citizenship, and self-governance. Just don’t get too caught up in that definition of “liberty,” and be careful about that “self-governance” thing if you’re black or female or poor. George Washington, Benjamin Franklin, and Thomas Jefferson are fearless leaders to be revered, and have more than earned their places on our currency. Let’s set aside Washington’s blundering that would have lost the revolutionary war if the French hadn’t conveniently saved the day, Franklin’s inability to keep his hands to himself, and Jefferson’s substantial bi-racial posterity. The Declaration of Independence and the Constitution should be revered as sacred documents, unless you take that bit about being created equal too seriously, or unless the unelected Supreme Court issues a ruling you don’t agree with. We were certainly the determining factor in ending both world wars, and the U.S. is the only country that realized the cold war could be ended by simply telling Mr. Gorbachev to “tear down this wall.” Let’s conveniently omit the fact that the United States, 70 years later, is still the only country to have actually used a nuclear weapon. Don’t get too caught up in the details. We’re awesome, and we know it.

A generation ago, I took this AP American History course. We skipped most of the dates and facts. The textbook spent most of the year in the bottom of my locker. The units focused on essential questions that were primarily answered through the examination of primary sources. We learned to interpret history for ourselves. We learned to assess bias. We learned about different kinds of oral and written accounts, and how to determine why they were created, by whom, and when. One of the units focused on the cause of the civil war. Slavery was a contributing factor. But it wasn’t the only factor, and it probably wasn’t the driving force. Slavery as a human rights issue was certainly not as important as slavery as an economic issue. But we didn’t blindly read an over-processed, committee-driven, negotiated account in a textbook about why there was a civil war. We explored the topic ourselves.

We didn’t cover most of the course. We glossed over almost all of the dates and names. I don’t think the teacher was overly concerned with our exam scores. In fact, we didn’t have any assessments or grading at all, apart from the final exam. We were intrinsically motivated, and the subject was made interesting by the approach taken by the teacher. It was certainly a time before high-stakes accountability.

I scored well enough to earn six college credits and was exempted from taking Western Civilization as a college freshman. I don’t remember much about the exam, except that in the essay, I argued that affirmative action programs were discriminatory. I’m pretty sure I criticized Lincoln in the same essay.

I love my country. There are videos and photos all over the Internet of me waving flags and singing patriotic songs. I know most of the words to the Pledge of Allegiance (even though I think it’s a really creepy nod to fascism). I sing the words to the Armed Forces Medley and Stars and Stripes Forever every time I hear them. But I think our country can be better. There’s lots of room for improvement. And we don’t get better by ignoring the inconvenient misdeeds of our past. Our students need to study all of American history, not just the parts that make us look good. They need to draw conclusions, identify and acknowledge misdeeds, and resolve to prevent their leaders from walking down those same paths.

Maybe that’s what the critics are afraid of.