
Rights and Responsibilities in an Age of Surveillance

As surveillance becomes more ubiquitous and “normalized” in our society, I’m wondering whether the parable of the boiling frog may apply, and how these dynamics should fit into conversations at school with students, teachers, and parents about digital citizenship. I’m not arguing we should abandon all the trappings of 21st-century modernity and head up to a retreat in the mountains to live entirely off-grid. That has short-term appeal at vacation time, but I’m definitely not a technology Luddite. In this post I’ll elaborate on why it’s important that we talk about surveillance, data security, and information sharing with students, and provide resources for additional learning, conversation, and advocacy about these topics.

“Internet Surveillance” (CC BY 2.0) by Mike Licht, NotionsCapital.com, on Flickr

About a year ago, in November 2016, I shared my second TEDx talk, this time on the theme “Digital Citizenship in the Surveillance State.” The presentation synthesized and reflected on a variety of articles, blog posts, and videos which Jason Neiffer (@techsavvyteach) and I had discussed the previous year on our weekly webshow and podcast, “The EdTech Situation Room” (@edtechSR). All of those resources remain linked at edtechsr.com/nsa. Of the collection, the top article I recommend you read thoroughly (and then discuss with others) is “If You’re Not Paranoid, You’re Crazy” by Walter Kirn (@walterkirn). It was published in Wired Magazine in November 2015, and it is still prescient as well as eye-opening two years later.

The parable of the boiling frog is the story of a frog who is put into a pot on the stove filled with room temperature water. When the stove burner is turned on, the frog is allegedly able to gradually adjust its body temperature to the warming water, but uses valuable energy in the process. As a result, when the water gets too hot and is about to boil, the frog finds that it can’t jump out any longer.

“frog in a pot 5” (CC BY 2.0) by jronaldlee, on Flickr

A common interpretation of this parable (used by TEDx presenters and by Al Gore in his first documentary on climate change, among others) is that it was not the boiling water that killed the frog, but rather its indecision, its inability to decide to jump out of the pot when things first started warming up. The analogy to our modern surveillance state is that we may have more agency and options to resist in these formative years than we will in subsequent decades, when levels of surveillance are even higher and more intrusive than they are today.

Before advancing this metaphor further, it’s ironic and important to note (especially from the perspectives of digital literacy, media literacy, and digital citizenship) that “the parable of the boiled frog” is considered a myth or legend. The story reportedly comes from experiments performed in the 19th century, but in modern re-creations of the experiment frogs consistently jump out of gradually warming pots of water and are entirely unable to escape pots that are already boiling. See these articles from the University of Washington’s Conservation magazine in 2011, this 2013 Forbes op-ed, Gizmodo in 2014, and The Atlantic in 2006 for citations on this myth busting.

Critical thinking and the ability to verify the validity of any claim remain as important today as they have always been, but we arguably face greater challenges because of the sheer volume of information presented to and available to us daily. Keeping the parable’s mythical status in mind, it is still valuable to consider its lesson and apply it to our modern surveillance state.

Surveillance, and perceived surveillance, is on the rise in our society and culture.

According to a Pew Research Center study conducted in February 2017:

Seven-in-ten U.S. adults say it is at least somewhat likely that their own phone calls and emails are being monitored by the government, including 37% who believe that this type of surveillance is “very likely.”

According to Forbes in September 2017, a Latin American country (most likely Mexico) recently paid an Israeli company $5 million for interception technology built on Signalling System No. 7 (SS7). The system utilizes ULIN (ULtimate LIstening) technology, which permits a user (ostensibly limited to nation-state security forces) to intercept and monitor any cell phone’s communications, as well as its location, using only the phone number or the phone’s IMEI number. This two-minute marketing video summarizes the capability. This July 2017 Cyberscoop article and this May 2016 Forbes article provide more background on the Israeli company behind ULIN, Ability Inc.

News headlines, articles, and videos like these highlight the fear and concern many in our society feel about the rising power of the surveillance state. Moore’s Law and the continuing exponential advancement of computing and telecommunications capabilities are (arguably) drawing us inexorably closer to the singularity. While these technological changes empower transformative and unprecedented global digital sharing, they also empower the surveillance state and manipulative corporations like Cambridge Analytica (allegedly a prime mover behind the electoral surprises of Brexit in June 2016 and the Trump presidential victory in November 2016), entities in our cultural landscape which we can ill afford to ignore.

This brings us to the oft-cited slogan from Spider-Man: “With great power comes great responsibility.”

“With Great Power Comes Great Responsibility” (CC BY 2.0) by Wesley Fryer, on Flickr

Applied to our current context in a rising surveillance state, what are our rights and responsibilities as citizens, voters, educators, and parents? I attempted to articulate this question in a conference proposal I submitted last night for the April 2018 ATLIS conference in Washington, D.C., which I titled “Rights and Responsibilities in an Age of Surveillance.” The description (under 500 words) is:

How should our current climate of surveillance, scrutiny, and data sharing affect the lessons we share with students about citizenship and digital citizenship? How can our decisions about the information we choose to share electronically affect our ability to travel, our job prospects, and our daily lives? What obligations do we have to advocate for and defend privacy rights? What responsibilities, as well as rights, do we have as citizens in our age of surveillance?

The “learning goals” for the session are to:

  1. Encourage participants to expand their notions of “digital citizenship” to include rights as well as responsibilities
  2. Highlight the importance of discussing ethics and values with students as surveillance, data sharing, and malicious hacks of organizations are normalized

Additional supporting sentences explain:

This session will present current / timely statistics, events, and stories relating to privacy, surveillance, data sharing as well as hacking. The facilitator will encourage participants to reflect and share perceptions about these topics as they relate to students and digital citizenship education. This session will highlight the importance of discussing info security, identity theft, & surveillance in the classroom in practical ways, engaging students in discussions about ethics & values.

In addition to the possibility of sharing this session at ATLIS in April, I’m scheduled to keynote the Ohio state educational technology conference in February 2018 on the topic of digital citizenship, especially as it applies to privacy, data, and surveillance. As I continue to wrestle with these issues and consider how they can and should translate into conversations about digital citizenship with students, teachers, and parents, here are a few of my conclusions.

We need to move beyond fear

While it’s easy to share “Chicken Little” / “number of the beast” / SkyNet-themed articles that generate excitement, hand-wringing, and worry, we need to separate advancing technological capabilities from the underlying values and policy issues which surround them. In addition to discussing governmental policies, we need to focus on our own personal decisions about the technologies and platforms we adopt on our phones, tablets, and laptops and in our homes.

Personal Information Sharing with Awareness and Intention

How much information will we choose to share openly and give away? How will, and should, our knowledge of the ways our collected personal information is used by corporations and political entities to influence our purchasing and voting decisions affect what we share? The “Note to Self” (@notetoself) Privacy Paradox project highlighted a number of practical resources which can help us answer these questions individually; they are summarized in the Privacy Paradox Tip Sheet. ProPublica’s episode “What Facebook Knows About You,” from their “Breaking the Black Box” series, was one of the most eye-opening resources I learned about from Note to Self in the past year. It features the free Google Chrome extension “What Facebook Thinks You Like,” and my cursory use of that extension:

  1. Changed how I use Facebook (I “like” far fewer places and organizations, even as Facebook gets more aggressive and persistent in asking me to do so when I tag a location)
  2. Changed what I had saved in Facebook as “likes” in the past (I deleted almost all of my past “likes” after seeing what this tool revealed about Facebook and its opinions of me)

In addition to sharing on social media with awareness and intentionality, it’s also vital we understand why sharing our cell phone number and/or email address publicly online is a very bad idea. Adults familiar with the way credit agencies work understand the importance of safeguarding Social Security numbers. Now, however, for many people their cell number has become just as valuable an identifier. Sharing your cell phone number and/or email address openly can:

  1. Open you up to an even higher volume of spam calls (robocalls) and text message spam
  2. Open you up to more spam email, including phishing emails which contain malware or lead to the theft of your identity
  3. Allow advertisers and companies aggregating private information for resale to “connect the dots” (correctly or falsely) between various pieces of personal information you’ve shared previously, as the sketch below illustrates
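
To make that third point concrete, here is a minimal sketch in Python (with entirely made-up records and a hypothetical link_by_phone helper) of how a single shared identifier like a phone number lets two otherwise unrelated data sets be merged into a fuller personal profile:

```python
# Minimal sketch with hypothetical data: how a shared phone number lets two
# otherwise unrelated data sets be joined into a fuller personal profile.

loyalty_records = [
    {"phone": "405-555-0142", "name": "J. Doe", "purchases": ["allergy meds", "diapers"]},
]

marketing_list = [
    {"phone": "405-555-0142", "email": "jdoe@example.com", "home_zip": "73101"},
]

def link_by_phone(first, second):
    """Join two record sets on their phone number field."""
    index = {rec["phone"]: rec for rec in second}
    profiles = []
    for rec in first:
        match = index.get(rec["phone"])
        if match:
            # One shared number ties both records together.
            profiles.append({**rec, **match})
    return profiles

print(link_by_phone(loyalty_records, marketing_list))
# One merged profile: name, purchases, email, and ZIP code, linked only by the number.
```

Data brokers work at vastly larger scale, but the underlying join really is this simple, which is why a phone number or email address handed out at a checkout counter travels so far.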

Whenever a cashier at checkout asks you to share your cell phone number or email address, this array of unwelcome outcomes and undesirable consequences should be front and center in your mind. Unfortunately, today that is not the case for most folks.

“scream and shout” (CC BY 2.0) by mdanys, on Flickr

IoT (Internet of Things) Adoption at Home

Do you have a personal assistant in your home? On your smartphone? I have an iPhone and an iPad, so I have Siri and use her every day for different tasks. Amazon now offers eight different versions in its Amazon Echo product line, featuring Alexa. Google has a product announcement event next week, and we’re sure to hear more about improvements to Google Home, which I’ve used a few times thanks to its free iOS app. The race is on between Amazon, Google, Apple, and Microsoft to develop and dominate the market for artificial intelligence (AI) powered smart assistants for home, work, and life. The development of these technologies relies heavily on the quantity of information fed into each proprietary system. The decisions we make individually and as a society among these AI players will help choose economic winners and losers, but what price will we pay from the standpoints of security and privacy?

Advocacy for the Right to Privacy

We need everyone to understand that privacy is an essential and unalienable right.* Saying “It’s OK if the government monitors and records all my phone calls, texts, emails, and other communications because I’m not a terrorist or engaging in criminal behavior” metaphorically lays another pavestone on the road to authoritarianism and tyranny. Every civil rights movement in history, from the fight against slavery to the campaign for women’s suffrage to the civil rights movement of the 1960s in the United States, required that its leaders (whose cause was eventually advanced and championed) be able to hold confidential conversations beyond the suspicious and prying ears of government officials.

* Breitbart, a source I’ve never linked or cited here until now, offers clarification on “unalienable” as meaning given by God rather than by government. Breitbart’s ties to the Trump administration and to the aforementioned Cambridge Analytica campaign activities in 2016 should be common knowledge among voters and citizens today, but unfortunately they remain mostly hidden because of a lack of mainstream media coverage.

Advocacy for Open AI

As previously mentioned, a multi-billion dollar struggle for dominance over the market for artificial intelligence (AI) is underway; Amazon’s latest salvo came last week. Companies want us to “feed their machine” with tons of data, and they promise greater convenience as well as technological functionality in our devices and homes as a reward. Consumer benefits are not all that is on the minds of companies developing and integrating AI, however. AI and machine learning are playing an increasingly powerful role in technologies like facial recognition and video surveillance, as Bloomberg’s September 28, 2017 article, “Moscow Deploys Facial Recognition to Spy on Citizens in Streets,” reveals. The article cites a 2013 report indicating that at that time (four years ago) London had 70,000 public security cameras in use. By contrast, the article claims, Moscow today utilizes a “network of 170,000 surveillance cameras…” making it the world’s largest “centralized surveillance network.” Human eyes are far too slow to interpret the massive quantity of information recorded and archived by these systems. AI and machine learning, focused on facial recognition technologies similar to those in the newly announced iPhone X, are the algorithmic keys opening the door ever wider to a ubiquitous surveillance state.
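
To get a feel for that scale, here is a quick back-of-envelope sketch in Python using the article’s 170,000-camera figure; the eight-hour shift and the sixteen-feeds-per-operator video wall are my own illustrative assumptions, not numbers from the article:

```python
# Back-of-envelope sketch using the article's 170,000-camera figure.
# The shift length and feeds-per-operator values are assumptions chosen
# only to illustrate the scale of the monitoring problem.

cameras = 170_000
camera_hours_per_day = cameras * 24       # footage produced every day
shift_hours = 8                           # assumed operator shift
feeds_per_operator = 16                   # assumed video wall per operator

operators_needed = camera_hours_per_day / (shift_hours * feeds_per_operator)
print(f"{camera_hours_per_day:,} camera-hours of footage per day")
print(f"~{operators_needed:,.0f} full-time operators just to watch it live")
# Roughly 31,875 operators per day, which is why machine vision rather than
# human eyes ends up doing the watching.
```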

“Caméra de vidéo-surveillance” (CC BY 2.0) by zigazou76, on Flickr

In July 2017, Elon Musk (@elonmusk) reiterated his common theme that proprietary, unregulated artificial intelligence (AI) poses a bigger risk to our civilization than anything else. As a co-founder of PayPal and the current CEO of Tesla and SpaceX, Musk is a technological and economic mastermind whose opinions should not be cast aside offhandedly without careful consideration. Musk co-founded the OpenAI research initiative and is committed to ensuring that access to powerful AI is not limited to corporations or governments, whether they be legitimate (internationally recognized) or non-state / rogue actors.

If smart people like Elon Musk think AI poses the greatest risk to our civilization, beyond nuclear proliferation or climate change, shouldn’t we be discussing this with students today? Policy and policymakers will always lag behind the coders and the latest technologies, but we need informed leaders as well as voters to help direct our shared political, economic, and cultural course. Advocacy for, as well as financial support of, organizations promoting transparency and openness in AI technologies should be highlighted and discussed as part of digital citizenship conversations.

Financial Commitment to Support the Defenders

Lobbying groups are powerful around the world, but especially in the United States. The role of non-governmental advocacy organizations like Amnesty International and Greenpeace has been consequential in many situations. Note to Self’s Privacy Paradox Tip Sheet identifies the Electronic Privacy Information Center, the World Wide Web Foundation, and Access Now as non-partisan groups working on privacy issues with different leaders in government as well as with partner organizations. The Electronic Frontier Foundation (@eff) works not only to highlight issues related to state surveillance and advocate for victims, but also to proactively empower others with “Surveillance Self-Defense” tools and resources. The work and resources of these groups should be discussed as the rights and responsibilities of digital citizens are highlighted in school.

Anything Can Be Hacked

Just as there is no such thing as “complete security” for a home or other physical building, total security does not exist for anything in the digital world either. Security consultants and companies often talk about “defense in depth” strategies, which provide multiple layers of defense against both external and internal threats, aiming to prevent malicious actors from harming people, property, or information and to facilitate the identification and capture of those whose attacks succeed. The recent Equifax hack, which affected approximately half of all U.S. consumers, is the latest reminder that as we increasingly rely on digital resources for commerce, work, and leisure, we are increasingly at risk. As citizens we need not only to be aware of these risks, but also to be knowledgeable about proactive steps we can take to protect ourselves and our organizations. If we experience a hack or identity theft, we need to know trusted organizations, individuals, and information sources so we can seek constructive advice about how to respond and proceed after an attack.
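
As a rough illustration of what “defense in depth” looks like in code, here is a minimal sketch in Python (all names, tokens, and network ranges are hypothetical) in which a request for sensitive data must clear several independent layers, with every attempt logged so even a successful intruder leaves a trail:

```python
# Illustrative "defense in depth" sketch: several independent layers must each
# pass before a request reaches sensitive data, and every attempt is logged.
# All identifiers and values here are hypothetical.

import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("audit")

ALLOWED_NETWORKS = {"10.0.1."}             # layer 1: network allow-list
VALID_TOKENS = {"token-abc123": "alice"}   # layer 2: authentication
AUTHORIZED_USERS = {"alice"}               # layer 3: authorization

def handle_request(source_ip: str, token: str, record_id: str) -> str:
    log.info("attempt: ip=%s record=%s", source_ip, record_id)  # layer 4: audit trail
    if not any(source_ip.startswith(net) for net in ALLOWED_NETWORKS):
        return "blocked at network layer"
    user = VALID_TOKENS.get(token)
    if user is None:
        return "blocked at authentication layer"
    if user not in AUTHORIZED_USERS:
        return "blocked at authorization layer"
    return f"record {record_id} released to {user}"

print(handle_request("203.0.113.5", "token-abc123", "r-42"))  # stopped at layer 1
print(handle_request("10.0.1.9", "token-abc123", "r-42"))     # clears every layer
```

No single layer is assumed to be perfect; the point is that an attacker has to defeat all of them, and the audit log helps identify whoever eventually does.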

“Borderline Biennale 2011 – Hacking/TAZ/U” (CC BY 2.0) by Abode of Chaos, on Flickr

Ignorance of, and Inability to Appeal, Inaccurate Personal Information in Private Databases

The collection of personal information by different groups for diverse purposes poses a variety of concerns. These include:

  1. Search engines and other companies selling that data to advertisers
  2. Nation states attempting to influence election results in other countries (Russia)
  3. Hackers engaging in bank fraud, identity theft and extortion schemes via ransomware

One of the big issues we face as citizens in this environment is digital opacity: in most countries, citizens have no right to know what information companies have collected about them online, and no procedure for contesting inaccurate information. Credit bureaus operate in many countries (including, of course, the United States) and have provisions for consumers to obtain copies of their credit reports and to contest incorrect data. Search engines like Google and social media companies like Facebook provide neither this kind of transparency nor proactive options for consumers to change the information in their corporate databases.

Some academics (@holden) are calling for taxation of the storage and collection of personal information. While that sounds like a radical departure from the status quo for our personal digital information and is unlikely to happen in the near term, it is indisputable that our personal information is valuable to hackers as well as to advertisers and to those wanting to influence our political perceptions, and thereby our behavior at the ballot box. The opacity with which companies and other groups (licit and illicit) can gather and sell our information needs to be addressed, and it should certainly be discussed with students.

Customs Information Seizure and Travel Habits

Starting in mid-October 2017, “the Department of Homeland Security (DHS) plans to collect social and internet search data on U.S. immigrants, including naturalized citizens and those with a green card.” Customs agents in the United States and in other countries have, at times, reportedly demanded that travelers give up the passcodes to their smartphones before being granted entry, and even the passwords to their social media accounts. Customs officials (including those in the U.S.) have broad powers of search and seizure, and some authorities recommend deleting (wiping) all data from your smartphone before crossing an international border as the most viable strategy for protecting the personal information it contains.

Again, many people likely think, “I’m not a criminal, I’m not a terrorist, so I have nothing to hide,” and might not think twice about having the entire contents of their smartphone copied by customs agents. As previously explained, however, ANY computer system is hackable. That means even if you have faith in the goodwill and benevolent intentions of the country whose customs agents are copying your smartphone’s contents, there is no guarantee that country will securely maintain control over that data forever. It is entirely possible your personal information (all of it, from your smartphone) could be exposed in a malicious hack and wind up in the hands of anyone willing to pay or hack for it. The EFF’s March 2017 publication, “Digital Privacy at the U.S. Border: Protecting the Data On Your Devices and In the Cloud,” includes a variety of specific recommendations and tips for international travelers. Depending on the context of your school, your students may or may not be engaging in international travel today. For those who are, or will at some point, the potential consequences of carrying a “fully loaded smartphone” (from a personal information standpoint) across international borders can be significant. As I plan for international travel to Egypt in two months, these issues are definitely on my mind.

Staying Informed: Personal Agency over our News Feeds

Opacity is not just common when it comes to the information companies gather about our online and offline behavior; opacity also characterizes “the news feed” in Facebook. Coders working for Mark Zuckerberg now wield tremendous global power because they control the secret algorithms which determine what posts you see, and what posts are hidden, every time you open Facebook.

This is a big topic and may warrant its own post down the road, but I’ll summarize my relevant contention here: as critical thinkers following in the footsteps of pre-Internet scholar and media critic Neil Postman, we should each assume greater agency over our personal information feeds. Rather than acquiescing to be “the sheep of Facebook,” passively accepting whatever articles its wizards behind the coding curtain determine we should see, WE should be active agents creating and tweaking our own algorithms for filtering and presenting information to ourselves. See my presentation resources for “Discovering Useful Ideas” for specifics on doing this with tools like Flipboard and Nuzzel, Twitter lists, and other strategies.
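
To show how modest that kind of personal agency can be in practice, here is a minimal sketch in Python of a feed filter whose ranking you control yourself; the headlines and the interest weights are made up for illustration:

```python
# Minimal sketch of owning your own filter: rank incoming headlines by a
# keyword weighting YOU control, instead of accepting a platform's hidden
# ranking. Headlines and weights are invented for illustration.

interests = {"privacy": 3, "surveillance": 3, "ai": 2, "education": 2, "celebrity": -5}

headlines = [
    "New report on school surveillance cameras and student privacy",
    "Celebrity chef opens a new restaurant",
    "How AI is changing education policy debates",
]

def score(headline: str) -> int:
    """Sum the weights of any interest keywords appearing in the headline."""
    words = headline.lower().split()
    return sum(weight for term, weight in interests.items() if term in words)

for item in sorted(headlines, key=score, reverse=True):
    print(f"{score(item):+d}  {item}")
# Adjusting the interests dictionary reorders the feed: the algorithm is yours to tweak.
```

Tools like Flipboard, Nuzzel, and Twitter lists give you far richer versions of the same control, but the essential shift is the same: you define the weights rather than a hidden corporate algorithm.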

If you’ve stayed with me for the duration of this lengthy post, thanks and congratulations. These issues are not simple and do not fit into a single sound bite, but they ARE of vital importance to our personal security as well as our free society. If you have feedback or other related thoughts you’d like to share with me, please reach out on Twitter (@wfryer) or with a comment below. We live in an incredible age (some might even term it “magical”) which offers tremendous opportunities, but it also poses great risks which impart responsibilities to us as citizens and leaders. It’s my firm conviction that we need to be discussing more of these issues with other teachers, our students, and parents at school.

If you enjoyed this post and found it useful, subscribe to Wes’ free newsletter. Check out Wes’ video tutorial library, “Playing with Media.” Information about more ways to learn with Dr. Wesley Fryer is available at wesfryer.com/after.
