Going forward, I'm going to switch from doing the round-ups (although I will continue to use #datamongering as a hashtag on Twitter), and instead I will be writing up my notes on Shoshana Zuboff's latest book, The Age of Surveillance Capitalism. I read the book earlier this summer and I thought it was remarkable. I wish every educator would read this book, but it's a book (a big book...) and it is not directly focused on education, so I can also understand that it won't be on everybody's reading list.
So, I'm going to read the book a second time now, sharing my notes chapter by chapter here, commenting from my perspective as a long-time online educator. I know it will be helpful for me to record my thoughts about the book here, and I hope it might be useful to others! For today, here is Chapter 1, where I see a close parallel between the evolution of the "smart home" that Zuboff presents right here in the introduction and the changes we now see coming to the LMS as it shifts from providing us with webspace to extracting our data. Thoughts? You can comment here or connect on Twitter. :-) @OnlineCrsLady
Chapter 1. Home or Exile in the Digital Future
Zuboff starts with a series of definitions of surveillance capitalism; you can see the elements I highlighted as being of greatest concern to me:
That is the definition, and in this introductory chapter, Zuboff argues that what is at stake is the future of humanity itself:
Just as industrial civilization flourished at the expense of nature and now threatens to cost us the Earth, an information civilization shaped by surveillance capitalism and its new instrumentarian power will thrive at the expense of human nature and will threaten to cost us our humanity.
To begin her story of surveillance capitalism, Zuboff goes back not so long ago, to the year 2000 (about the same time that the LMSes were beginning to take shape), when researchers at Georgia Tech launched an experiment called the "Aware Home," a data-driven symbiosis of people and their home. Yes, there was data, lots of it, but our assumptions about that data were different back in the year 2000:
It was assumed that the rights to that new knowledge and the power to use it to improve one’s life would belong exclusively to the people who live in the house. [...] All of this was expressed in the engineering plan. It emphasized trust, simplicity, the sovereignty of the individual, and the inviolability of the home as a private domain.
Zuboff then contrasts that with the smart-home technologies that exist today, 20 years later. Instead of giving occupants control over the distribution of the information gathered about them in their homes, today's smart-home devices extract that data for companies like Google to use in developing their predictive products.
I see the same contrast between the LMS of the year 2000 and the new LMS of today. In the past, the LMS gathered data, yes, but it limited itself to sharing that data back with students, teachers, and schools for us to use for our own purposes. Now, ed tech companies like Instructure are gathering data about us, amassing that data in huge databases, and then using that data to power their own machine learning experiments. It's a huge shift, and it has taken place just over the past two decades.
Here is how Zuboff describes the shift with reference to the "Aware Home," but her observations apply just as well to the LMS, I think:
In the year 2000 this vision naturally assumed an unwavering commitment to the privacy of individual experience: should an individual choose to render her experience digitally, then she would exercise exclusive rights to the knowledge garnered from such data, as well as exclusive rights to decide how such knowledge might be put to use. [...] Today these rights to privacy, knowledge, and application have been usurped by a bold market venture powered by unilateral claims to others’ experience and the knowledge that flows from it. Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data.
In my discussions with people who work at Instructure, I have had a hard time getting them to understand that my concern is not just about privacy; instead, it's about the unilateral claims they are making about the use of that data, just as Zuboff describes here. Even if they successfully "de-identify" my data, it is still my data, data about me and my life, and I maintain that I should have the right to decide how my data will be used.
Of course, for Instructure, this new use of our data is a business imperative; surveillance capitalism is a force that is sweeping through their entire industry. As their new CEO has stated, it is "incumbent" upon them to exploit the market value of the data they have extracted in order to develop "predictive modeling using AI and ML" (Instructure investor conference, March 2019). That is just what Zuboff describes for all these new data-based businesses:
The competitive dynamics of these new markets drive surveillance capitalists to acquire ever-more-predictive sources of behavioral surplus: our voices, personalities, and emotions.
Companies like Instructure extract data about us not simply to record and monitor our behaviors, but also to predict and control what we do, turning behavioral data into a commercial product. This is what Zuboff calls instrumentarian power:
Instrumentarian power knows and shapes human behavior toward others’ ends. [...] Just as industrial capitalism was driven to the continuous intensification of the means of production, so surveillance capitalists and their market players are now locked into the continuous intensification of the means of behavioral modification and the gathering might of instrumentarian power.
How did this "gathering might" happen without us noticing? Part of the problem is that this is something truly unprecedented; we could not see what was coming because we had never seen anything quite like it before. As Zuboff explains:
When we encounter something unprecedented, we automatically interpret it through the lenses of familiar categories, thereby rendering invisible precisely that which is unprecedented.
That is what happened to me: because I continued to interpret the LMS in terms of familiar categories from the past, I failed to see what the LMS was becoming. Back when I started teaching online in 2002, the LMS was just a place where teachers put content for students to read and watch, a place where students could take quizzes or engage in discussions, a place to record the grades. That's all. There was nothing really remarkable about that old LMS; it was just an online reproduction, awkward and clunky, of what we already did in the classroom. In our classroom.
But now, Instructure has decided to invade that classroom, gathering up every bit of data that it can about what happens there, using that data to expand its business (doubling its potential market, as CEO Dan Goldsmith has claimed) and also to alter what is happening in the classroom itself, "personalizing" student learning based on Instructure's own predictive models and pedagogical assumptions:
Much of this new work is accomplished under the banner of “personalization,” a camouflage for aggressive extraction operations that mine the intimate depths of everyday life.
This so-called personalization is actually de-personalization, a process of automation that alienates teachers and students, and even alienates students from themselves, as machine-learning algorithms claim to know more about us than we do ourselves. Intruding into our classrooms, these algorithms threaten what Zuboff calls our right to the future tense:
Finally, I consider surveillance capitalism’s operations as a challenge to the elemental right to the future tense, which accounts for the individual’s ability to imagine, intend, promise, and construct a future.

As an educator, I really connect with these words — our ability to imagine, intend, promise, and construct a future. To me, that is what education is all about, and that is why I object to the use of my data, de-identified or not, in Instructure's attempt to predict that future, and control it, with their algorithms.
This theme of our right to the future is something Zuboff will develop in great detail throughout the book, and it figures already in the book's subtitle: The Fight for a Human Future at the New Frontier of Power.
So, for now, these are my notes on the first chapter, and I'll be back next week with Chapter 2; I'll tag these posts as a group: Zuboff.
UPDATE: Here are the notes on Chapter 2. Setting the Stage for Surveillance Capitalism
P.S. For those of you interested in reading the book, I can highly recommend the audiobook too; the reader is excellent! That's actually another reason why I am keen on reading the book a second time and sharing my notes: the audiobook made a strong impression on me as I listened, but audiobooks don't lend themselves to note-taking.