Ubiquitous social media platforms—including Facebook, Twitter and Instagram—have created a venue for people to share and connect with others. We use these services by clicking “I Agree” on Terms of Service screens, trading off some of our private and personal data for seemingly free services. While these services say data collection helps create a better user experience, that data is also potentially exploitable.
The news about how third parties obtain and use Facebook users’ data to wage political campaigns, along with mounting evidence of election interference, has shined a spotlight on just how insecure our data can be when we share it online. Educating youth about data security can fall under the larger umbrella of digital citizenship, which covers topics such as social media use and misuse and learning how not to embarrass or endanger oneself while using the internet. But few resources compare to actually experiencing a data and privacy breach.
To ensure that students learn about online privacy and data security, high school English language arts teachers John Fallon in Connecticut and Paul Darvasi (who also reports for MindShift) in Toronto co-created Blind Protocol, an alternate reality game. ARGs blend fiction with the real world by creating narratives and puzzles that take participants deeper into the story by way of their actions. Fallon and Darvasi’s goal for the ARG was not to teach students how to actually hack or spy; rather, they use game tactics to teach students how vulnerable their data is.
“Every decision and click you make is being recorded and scraped by somebody who doesn’t have your privacy and interests at heart,” Fallon says to his students. “Think carefully about whether you want your cookie crumbs to be spread.”
HOW ALTERNATE REALITY BEGINS
The ARG unit starts with the viewing of several privacy-focused films, including the Edward Snowden documentary “Citizenfour,” PBS Frontline’s “United States of Secrets,” which is about the National Security Agency, and the film “Terms and Conditions May Apply.”
When the teachers are ready to begin the ARG — Fallon in Connecticut with his Fairfield Country Day School students and Darvasi in Toronto with his Royal St. George’s College pupils — students start out by viewing a TED Talk about online privacy and data surveillance. (The two classes are experiencing the ARG separately and the students are unaware of each other’s existence, until they eventually interact halfway through the four-week unit.)
“All of a sudden, I get a phone call,” Darvasi said. Fallon gets the same fake phone call, too, as each follows the same setup. Each teacher then steps outside his classroom, leaving the students alone. Then the video restarts, seemingly gets hacked, and a voice urges students to check their email. Students find an email from a mysterious entity named HORUS, sent from an address on the school’s domain. The message from HORUS contains a video with instructions for the ARG.
Students are then given a series of clues that unlock more clues as the game progresses. For example, clues in the email lead students to four canopic jars containing USB drives. Details on the jars unlock access to the contents of the password-protected USB drives. The clues within the drives lead students to a game manual buried somewhere on campus that allows them to unlock more clues.
In the second week, students create user profiles on a PDF that include four details — a self-selected image, nickname, symbol and motto — and turn them in to their teacher, who acts as a conduit for HORUS. Several days later, much to their shock, according to the teachers, the students find a stash of profiles delivered by HORUS that include photos, nicknames, symbols and mottos — but the profiles are not their own. They are surprised to discover that, somewhere else in the world, HORUS has clearly led another group of students through the same steps. The question is: Who are they and where are they?
The students’ game goal is to uncover the location and identities of their newly discovered counterparts. The process of uncovering this data is the win condition of the game, and the central mechanic that drives student engagement and learning.
“John and I play dumb,” said Darvasi, explaining that it’s up to the students to solve the game while the teachers act as intermediaries. “We tell the students we know a little more than you do. Obviously, they know we’re pulling the wool over their eyes and we’re in on it, but they still happily play along.”
By uncovering data about the other group from just those four details and a few additional tools, students learn how much data people, especially teens, reveal about themselves online — and how little information it takes to identify someone.
Through an additional series of clues, students are led to another important tool to unlock the game: a catalog of 20 protocols. Inspired by the NSA ANT catalog that detailed the types of protocols that can be launched against a target for cyber surveillance (with names such as GOPHERSET and COTTONMOUTH-1), Darvasi and Fallon created their own catalog from which students can purchase protocols with faux cryptocurrency they’re given at the start of the game. No student has enough to buy a protocol on their own, so students have to pool their money and make selections strategically as a group.
For example, Darvasi’s students in Toronto can pool together 55 faux bitcoins to purchase and launch the BOTTING protocol against an opponent. The student targeted at Fallon’s school in Connecticut would then have 48 hours to record audio of 10 words of Darvasi’s students’ choosing and send it back to them through an intermediary (Darvasi or Fallon). For a higher price of 65 faux bitcoins, students can launch MORPHLING, which gives the opponent 48 hours to record a one-minute video explaining three ways to stay safe while using Facebook, while making their school mascot (or a close approximation thereof) appear in the video for the entire minute.
Ultimately, the students on the receiving end of the protocol are trying to comply with the request while revealing as little information as possible. The goal is to avoid having their true identities revealed.
In an example of how snippets of data can reveal a bigger picture, students launched a desktop protocol, which required the opponent to take a screenshot of his own computer desktop. The student who submitted the screenshot had left his first name visible on one file and his last name on another. His opponents searched for his name, identified his Facebook profile — where he was wearing his school colors — and won.
MAKING LEARNING REAL
Running the game with two different groups imbues students with the sensation of online vulnerability without actually putting anyone’s real-life data at risk. The two teachers run the game together, but are exploring playing with more classes around the world.
Ultimately, the teachers’ learning goal is to drive home a deeper understanding of what it takes to maintain good online security and privacy practices. Beyond the how, students learn why they should be careful about what they post on social media. “Students learn why they must change passwords, and why they should be careful about their digital footprints,” Fallon said.
Fallon and Darvasi carefully mediate the entire experience, pulling the game’s strings and levers in the background, as students play in class. “The game is metaphorical, not real—but the impact is,” said Fallon, who now teaches at a different school. Students know they are in a game and that their actual identities are safe. “If a group of strangers from another country only needed a street sign and your school colors to figure out where you are, think about how vulnerable you are online.”