Faking It

When the web was new, we were very worried about the reliability of online content. We were moving from an environment where the means of publication were controlled. There were gatekeepers who controlled what content got published. They ensured that the information the public consumed was accurate and reliable. At least, that was the idea.

With the web, that changed because everyone suddenly had the ability to publish content. Anyone could make a web page. So we had to figure out how to assess the credibility of a web site. I remember, when working on my Master’s degree in the late 90s, that information literacy was just starting to become a thing. We were worried that our students might believe everything they read online. So we tried to teach them to look critically at information resources. That work continues now, nearly a generation later.

But things have become more difficult. With the advent of Photoshop and other image editing software, it’s pretty easy to edit pictures to enhance or omit details. Sometimes, this is done for reasons of vanity, but it’s often done for political reasons as well. So now, in addition to assessing the reliability of web sites and news stories, we have to question the legitimacy of photographs, too. It’s okay. We’re getting better at it. We’re becoming more skeptical. Hopefully, we’re asking questions and citing sources and applying deductive reasoning and the scientific method to separate fact from fiction. I mean, it’s not like we’re just throwing up our hands and saying everything we don’t like is fake, right?

But here we go, making things harder again. Last year, Adobe showed a demo of its new VoCo product. With a 20-minute sample of a speaker’s voice, you can quickly and easily edit the audio and make the speaker say anything you want.

This isn’t out yet, but it’s coming in a future version of Adobe Creative Cloud, a widely used graphic arts package that includes Photoshop, InDesign, and other “standard” tools used by professionals and amateurs alike to edit digital work.

So now, you can take an audio recording and edit it as easily as a word processing document to make the speaker say anything you want. That’s really cool, but also terrifying. But wait, there’s more. Check out this research project at Stanford:

See what they’re doing there? Using nothing more complicated than a webcam, they’re mapping facial features onto an existing video. If you pair these two technologies together, you can create a video that makes any public figure say anything you want.

Sure, it’s not perfect. This is still complicated software. It’s cumbersome to use, especially when you’re trying to put all the pieces together. And the results aren’t great. You can tell from this video that the technology is not quite at the point where it’s going to fool most people.

But our job just got harder. On one level, it’s not too bad that we have to teach our students to think critically about video and audio. We really should have seen that coming. And we’re teaching students to think critically about information, regardless of the form. They just need to be aware that video and audio, like pictures and text, can be manipulated. Information has meta information. HOW do you know? What is the source for the position you’re taking? Why do you trust that source? We need to challenge our students and each other to make the information about the information just as important as the content itself.

But the real problem is the plausible deniability. We can no longer prove, beyond a shadow of a doubt, that someone said something or did something. You have video of me holding up a convenience store? Prove that it’s me and that it hasn’t been altered. You claim you have an audio recording of a public figure making misogynist / racist / anti-Semitic / anti-American comments? Prove that it hasn’t been doctored. Because it’s easy to fabricate these things now, we can use the technology as a scapegoat to disavow responsibility for our words and actions.

Information literacy includes the skills of selecting and curating information, assessing reliability and credibility, and then using that information in responsible ways. I’m not convinced that it’s possible to do that anymore. And you can’t prove that I’m wrong.

 

Acknowledgment: Almost all of this came from the RadioLab Story “Breaking News.” Those guys do fantastic work. You should go listen.

Also, I have no idea where the Lincoln photo originally came from. It’s literally all over the place. No, I don’t have permission to use it.


Facts and Feelings

We are living in an age when information is no longer scarce. The Internet gave everyone access to information. It was sold to us as an information superhighway. Think of all of the wonderful resources you have right at your fingertips with this fantastic, revolutionary technology. Then, interactive web tools came along and made it really easy for anyone to post content online. We moved away from broadcast media, where a single entity informs the masses, to a system where everyone has a voice. It’s a democracy of information. Finally, mobile technologies became practical, so those tools are now available to us wherever we are.

Information is free, in both senses of the word. Questions no longer go unanswered, opinions no longer go unshared. It’s truly a wonderful and amazing time to be living.

But there’s also a problem. We are overwhelmed by content. When I was in school, we used to struggle to find enough information to write cohesive research papers. Now, finding enough information is as easy as a Google search. We have to be able to filter that information to find the most relevant content, evaluate the accuracy and reliability of the content we’re finding from disparate sources, and build on that knowledge to spark new ideas and new solutions to complex problems.

You’re probably still with me at this point. If you’re working in higher education, you have an anecdote to insert here about kids these days thinking that “Google” and “Research” are synonyms. Many in K-12 are thinking I’m rehashing old ideas, because we’ve been doing all of these things for years and talking about 21st Century Skills since the 21st century started. If we’re sitting in a room having this conversation, this is the point at which someone will disparagingly refer to Wikipedia. After all, anyone can change it. How reliable can that be? Once we’ve made that turn, we’re off on a track that leads to me ranting about how Wikipedia is actually a pretty reliable source because of their insistence on citations and their transparency about where the information comes from. My challenge to Wikipedia haters is to change a basic fact on the site to be wrong, and see how long that lasts before someone fixes it.

But that’s not where we’re going today. I want to talk about something more important than whether your ninth grade English teacher will let you cite Wikipedia as a source.

What if you want to mislead people? Everyone on the Internet has a megaphone. Everyone can be a content creator. Everyone can be a publisher. Let’s say I want to convince people of something crazy. Maybe I want people to think that the Earth is flat. How would I do that?

I could start by referencing the ancient Greeks, who believed the Earth to be flat until Pythagoras came along to cause trouble. I could also refer to ancient Indians (prior to the year 300), American aboriginal traditions, or China up until the 17th century. These were smart people, philosophers and scientists, and they wrote about the world being flat all the time.

I could reference 19th century literature by the likes of Washington Irving, whose romanticized history of Christopher Columbus includes the idea that 15th century Europeans thought he would fall off the edge of the Earth. Or, I could write about the work of Samuel Rowbotham, whose “scientific” work in Zetetic Astronomy proved the Earth is flat in 1849. Moving to the modern age, I can reference the Flat Earth Society, which has been advocating for a flat Earth model since the days of Sputnik. Finally, I can top it off with 21st century author Thomas Friedman, by taking his best-selling book’s metaphor completely out of context.

Maybe I’ve convinced you. Maybe I haven’t. Now it’s time to fire up the social media machine. I start tweeting about the Earth being flat. I post conspiracy theories on Facebook, and make catchy memes about it. People tell me I’m crazy. They start arguing in the comments. They bait the troll. I shoot back. Now, I start focusing on the buzz. People are talking about whether the Earth is flat. Look at all these conversations on the Internet about the flat Earth. Why is big media assuming that the Earth is round? Where’s our equal time? Where’s our fair and balanced?

At this point, it’s time to discredit our own strategy. Anyone can put anything on the Internet. You can’t trust what you read online. We’ve been burned so many times by misleading and biased content that we’re quick to agree with the cynical view that everyone has an agenda. Everyone is against us. Those fact checkers who say the Earth is round? They have an agenda. They’re out to get us. The impartial media? They’re not so impartial after all. They only tell one side of the story. This so-called science that proves the Earth is round? Well, we all know what they say about statistics. You can make the numbers say anything you want.

Now, this is the part that’s new. It’s time to change the story. Many people believe the world is flat. Lots of people are talking about the flat Earth. The news reports the facts. The politicians cite the facts. The fact-checkers check the facts. But the facts have changed. Did you catch the subtle shift? People are talking. That is a fact. People believe. That is a fact. This politician said. That is a fact. The Earth is flat. It doesn’t matter if that’s a fact. It’s just the object of the talking and believing and feeling. So you can say things like “Lots of people think the world is flat” and “Flat Earth proponents feel like they’re underrepresented in media.” Both of those statements are true. But that doesn’t mean they’re going to fall off the edge of the planet.

Distinguishing between fact and opinion is a lot harder than it used to be. We have to teach our children (and our parents, and our peers) to recognize those triggers of “feel”, “believe”, and “think”. Opinions are valuable. Beliefs matter. They shape our view of the world, and our actions in it. But people can be wrong. If one wrong person convinces 99 others, then we have 100 wrong people. The fact that there are 100 of them doesn’t make them less wrong, even if they feel like they’re not being heard. It’s a lot easier now for a few people to use “feelings” to mislead others. Part of being an informed digital citizen is recognizing when that’s being done to us.

 


Post script: Did you notice that almost all of those links about the flat Earth go to the SAME Wikipedia article? The links may make the text look more reliable, but cited sources are only as good as the person checking them.

Photo credit: JooJoo41 on Pixabay.

 

A More Perfect History

Last week, the College Board released a new version of the AP U.S. History Course and Exam Description. This document, last revised in 2014, outlines the content that should constitute an Advanced Placement American History course. Ideally, students taking this course pass an exam at the end of the year that entitles them to college credit for their achievement.

The United States does not have a national curriculum for American History. The Common Core standards, an effort to unify the curriculum taught in American schools, only cover reading and math. The AP guidelines are the closest thing we have to a national standard for how this subject should be approached in high school.

The new standards come a year after strong opposition to the 2014 version. That revision emphasized comprehension, interpretation, and synthesis of history instead of merely recalling the names and dates of important milestones. Critics claimed that it undermined the idea of American exceptionalism, and fostered a view of American history that is too negative and political. Several states moved to ban the course from being taught in their schools.

In academic circles, we call this shift toward analysis, synthesis, and application an increase in academic rigor. As we continue to move into the age of information abundance, it becomes increasingly important for students to evaluate the information they’re getting, make connections among content from diverse sources, assess bias and frame of reference, and draw their own conclusions. They apply this deeper understanding of history in new contexts, ostensibly to keep from repeating it.

Unfortunately, some of those conclusions don’t necessarily paint the United States in a positive light. After looking at the facts, one might conclude, for example, that the Boston Tea Party was actually an act of terrorism. Or, maybe, the strained relations between Europeans and native tribes had more to do with the Europeans dismissing them as savages, taking and destroying their resources, and constantly breaking treaties than with the natives acting unreasonably hostile toward white settlers. It’s quite possible that rounding up Japanese Americans, most of whom were United States citizens, and locking them up in internment camps after confiscating their homes and property was a heinous violation of their civil and human rights. One might conclude that detaining 780 people in the aftermath of 9/11 without charge or trial, and then systematically torturing them over the course of a decade poses a stark contrast to the certain unalienable rights endowed to them by their creator.

Fortunately, the new version of the course re-instills those patriotic American ideals that make our citizens believe that this is the greatest country in the history of the world. Our nation is founded on the ideals of liberty, citizenship, and self-governance. Just don’t get too caught up in that definition of “liberty,” and be careful about that “self-governance” thing if you’re black or female or poor. George Washington, Benjamin Franklin, and Thomas Jefferson are fearless leaders to be revered, and have more than earned their places on our currency. Let’s set aside Washington’s blundering that would have lost the Revolutionary War if the French hadn’t conveniently saved the day, Franklin’s inability to keep his hands to himself, and Jefferson’s substantial bi-racial posterity. The Declaration of Independence and the Constitution should be revered as sacred documents, unless you take that bit about being created equal too seriously, or unless the unelected Supreme Court issues a ruling you don’t agree with. We were certainly the determining factor in ending both world wars, and the U.S. is the only country that realized the Cold War could be ended by simply telling Mr. Gorbachev to “tear down this wall.” Let’s conveniently omit the fact that the United States, 70 years later, is still the only country to have actually used a nuclear weapon. Don’t get too caught up in the details. We’re awesome, and we know it.

A generation ago, I took this AP American History course. We skipped most of the dates and facts. The textbook spent most of the year in the bottom of my locker. The units focused on essential questions that were primarily answered through the examination of primary sources. We learned to interpret history for ourselves. We learned to assess bias. We learned about different kinds of oral and written accounts, and how to determine why they were created, by whom, and when. One of the units focused on the causes of the Civil War. Slavery was a contributing factor. But it wasn’t the only factor, and it probably wasn’t the driving force. Slavery as a human rights issue was certainly not as important as slavery as an economic issue. But we didn’t blindly read an over-processed, committee-driven, negotiated account in a textbook about why there was a civil war. We explored the topic ourselves.

We didn’t cover most of the course. We glossed over almost all of the dates and names. I don’t think the teacher was overly concerned with our exam scores. In fact, we didn’t have any assessments or grading at all, apart from the final exam. We were intrinsically motivated, and the subject was made interesting by the approach taken by the teacher. It was certainly a time before high-stakes accountability.

I scored well enough to earn six college credits and was exempted from taking Western Civilization as a college freshman. I don’t remember much about the exam, except that in the essay, I argued that affirmative action programs were discriminatory. I’m pretty sure I criticized Lincoln in the same essay.

I love my country. There are videos and photos all over the Internet of me waving flags and singing patriotic songs. I know most of the words to the Pledge of Allegiance (even though I think it’s a really creepy nod to fascism). I sing the words to the Armed Forces Medley and Stars and Stripes Forever every time I hear them. But I think our country can be better. There’s lots of room for improvement. And we don’t get better by ignoring the inconvenient misdeeds of our past. Our students need to study all of American history, not just the parts that make us look good. They need to draw conclusions, identify and acknowledge misdeeds, and resolve to prevent their leaders from walking down those same paths.

Maybe that’s what the critics are afraid of.