
October 27, 2019

Zuboff, Chapter 7: Reality Business / School Business

Yep, it's been a month (ugh, not a great month), but I realized during the Can*Innovate conference last week (I presented on randomizers... UN-prediction!) that I need to get back to work on summarizing Zuboff. I know that most people are not going to read her book, but maybe they will read these notes and think more critically about not just the LMS but the way Learning Analytics is now driving the LMS. This week's chapter from Zuboff addresses that directly, too, since this is the chapter where she begins the discussion about behavior modification.

The fact that behaviorist assumptions run deep in education is a big reason why, I suspect, many people uncritically accept the claims of learning analytics and the behavior modification agenda that goes with them. For teachers who approach education as a behavior modification project, learning analytics are just what they need. But for teachers who approach education with a belief in human freedom, we need to be aware of how the LMS constrains our students' freedom, and our freedom as teachers too.

And let there be no mistake about it: right now the emphasis is on learning analytics to monitor and control students, but that is just the beginning; there will be teaching analytics also to monitor and control teachers. So keep those thoughts in mind regarding the Internet of Things, which is the topic of Zuboff's seventh chapter: The Reality Business.


1. The Prediction Imperative

We've been reading more and more in the education news about the data that schools are now seeking to collect about their students in order to create better predictive algorithms. Not just attendance in class, but going to the library, etc. That is what this chapter is about: the need to gather more and more data to create new predictive products:
Even the most sophisticated process of converting behavioral surplus into products that accurately forecast the future is only as good as the raw material available. [...] Surveillance capitalists therefore must ask this: what forms of surplus enable the fabrication of prediction products that most reliably foretell the future? This question marks a critical turning point in the trial-and-error elaboration of surveillance capitalism. It crystallizes a second economic imperative—the prediction imperative—and reveals the intense pressure that it exerts on surveillance capitalist revenues. [...] Compelled to improve predictions, surveillance capitalists such as Google understood that they had to widen and diversify their extraction architectures to accommodate new sources of surplus and new supply operations.
Zuboff presents the data-gathering grab as two different processes: extension and depth. Extension is about reach:
Extension wants your bloodstream and your bed, your breakfast conversation, your commute, your run, your refrigerator, your parking space, your living room.
So, in education, that means not just what is happening in the classroom, but in the dorm room, the library, dining halls, etc.

Then, there is depth:
The idea here is that highly predictive, and therefore highly lucrative, behavioral surplus would be plumbed from intimate patterns of the self. These supply operations are aimed at your personality, moods, and emotions, your lies and vulnerabilities.
In this context, think about "sentiment analysis" and other data-mining that schools want to run on LMS discussion boards or students' social media, their Internet search history, etc. 

Beyond the data gathering, broad and deep, is the behavior modification; this is what Zuboff calls economies of action:
Behavioral surplus must be vast and varied, but the surest way to predict behavior is to intervene at its source and shape it. The processes invented to achieve this goal are what I call economies of action. [...] These interventions are designed to enhance certainty by doing things: they nudge, tune, herd, manipulate, and modify behavior in specific directions.
Yep, all the nudges. Educators will claim that they are seeking to modify student behaviors only in positive directions, for positive outcomes. That is why one of the main questions we all need to ask ourselves is how we see our role as educators. Is behavior modification at the heart of our teaching project? Or do we have other ideas about our roles as teachers? One good way to address that question is to ask yourself how you would feel being continuously monitored and nudged to change your behavior as a teacher. For more on that, see this powerful new essay by Alfie Kohn: How Not to Get a Standing Ovation at a Teachers’ Conference.

2. The Tender Conquest of Unrestrained Animals

This section of the chapter is really eye-opening: Zuboff looks at the telemetry used by scientists to monitor animals and climate, gathering data that could never be collected in a zoo or replicated in a laboratory:
It was a time when scientists reckoned with the obstinacy of free-roaming animals and concluded that surveillance was the necessary price of knowledge. Locking these creatures in a zoo would only eliminate the very behavior that scientists wanted to study, but how were they to be surveilled? [...] The key principle was that his telematics operated outside an animal’s awareness.
One such scientist was R. Stuart MacKay:
MacKay’s inventions enabled scientists to render animals as information even when they believed themselves to be free, wandering and resting, unaware of the incursion into their once-mysterious landscapes.  
One of the recurring themes throughout this chapter is the tension between scientific curiosity and capitalist exploitation:
MacKay yearned for discovery, but today’s “experimenters” yearn for certainty as they translate our lives into calculations. [...] Now, the un-self-conscious, easy freedom enjoyed by the human animal—the sense of being unrestrained that thrives in the mystery of distant places and intimate spaces—is simply friction on the path toward surveillance revenues.
That "easy freedom" is something that I am prepared to fight for, as an educator.

3. Human Herds

In this section, Zuboff focuses on work by Joseph Paradiso and his colleagues at the MIT Media Lab, with their quest to build something like a browser for reality itself, a browser not for an Internet of webpages but for the Internet of things... all the things.
Just as browsers like Netscape first “gave us access to the mass of data contained on the internet, so will software browsers enable us to make sense of the flood of sensor data that is on the way.” [...] Paradiso is confident that “a proper interface to this artificial sensoria promises to produce… a digital omniscience… a pervasive everywhere augmented reality environment… that can be intuitively browsed” just as web browsers opened up the data contained on the internet.
Again, though, we cannot let this sense of scientific challenge obscure the business ramifications:
For all their brilliance, these creative scientists appear to be unaware of the restless economic order eager to commandeer their achievements under the flag of surveillance revenues.
That is my fear also: yes, there might be things I am curious to know about my students, and things it might even be useful for me to know, but not at the risk of empowering data-gathering processes and markets that extend far beyond my classroom, real or virtual.

4. Surveillance Capitalism’s Realpolitik

In this section, Zuboff shifts from that sense of scientific curiosity to the real business projects based on converting reality into a data stream, with a focus on IBM’s $3 billion investment in the “internet of things,” a project led by Harriet Green. For these projects to succeed, there cannot be "dark data," data that is out of reach:
Because the apparatus of connected things is intended to be everything, any behavior of human or thing absent from this push for universal inclusion is dark: menacing, untamed, rebellious, rogue, out of control. [...] The tension is that no thing counts until it is rendered as behavior, translated into electronic data flows, and channeled into the light as observable data. Everything must be illuminated for counting and herding. [quoting Harriet Green] “You know the amount of data being created on a daily basis—much of which will go to waste unless it is utilized. This so-called dark data represents a phenomenal opportunity… the ability to use sensors for everything in the world to basically be a computer, whether it’s your contact lens, your hospital bed, or a railway track.”
At the same time that ed-tech seeks to gather all the data of a student's life, they are also de-contextualizing that data, rendering everything as behavior, objectifying everything and everyone:
Each rendered bit is liberated from its life in the social, no longer inconveniently encumbered by moral reasoning, politics, social norms, rights, values, relationships, feelings, contexts, and situations. In the flatness of this flow, data are data, and behavior is behavior. [...] All things animate and inanimate share the same existential status in this blended confection, each reborn as an objective and measurable, indexable, browsable, searchable “it.” [...] His washing machine, her car’s accelerator, and your intestinal flora are collapsed into a single dimension of equivalency as information assets that can be disaggregated, reconstituted, indexed, browsed, manipulated, analyzed, reaggregated, predicted, productized, bought, and sold: anywhere, anytime.
It used to be that student "surveillance" consisted of teachers taking attendance and giving tests. The world of ed-tech surveillance has changed that into something profoundly different, and profoundly alienating for both students and teachers. Our classroom is not our classroom any longer.

5. Certainty for Profit

This section focuses on the way that predictive products fundamentally change the nature of a business like insurance, which is no longer about communities and shared risk, but individualization based on data analytics and predictive algorithms. Does anybody know of a good write-up on how the same process could undermine education? Traditionally, education was a community project, but it seems to me that, by analogy, the predictive analytics that are fundamentally changing the insurance business will change the education business in the same way.

Here are some of Zuboff's comments about telematics in the auto insurance world:
This leads to demutualization and a focus on predicting and managing individual risks rather than communities. [...] Telematics are not intended merely to know but also to do (economies of action). They are hammers; they are muscular; they enforce. Behavioral underwriting promises to reduce risk through machine processes designed to modify behavior in the direction of maximum profitability. [...] Telematics announce a new day of behavioral control.
Another ominous education parallel is the use of gamification (think ClassDojo); when people push back on these metrics as an invasion of privacy, the insurance companies respond with fun gamification:
If price inducements don’t work, insurers are counseled to present behavioral monitoring as “fun,” “interactive,” “competitive,” and “gratifying,” rewarding drivers for improvements on their past record and “relative to the broader policy holder pool.” [...] In this approach, known as “gamification,” drivers can be engaged to participate in “performance based contests” and “incentive based challenges.”
Of course, gamification does not have to work this way... but it can. And for how that is playing out in education, see Ben Williamson on ClassDojo here: Killer Apps for the Classroom? Developing Critical Perspectives on ClassDojo and the ‘Ed-tech’ Industry

6. Executing the Uncontract

In this section, Zuboff discusses how what is today the stuff of marketing hype would once have been considered a dystopian nightmare.
Yet now that same nightmare is rendered as an enthusiastic progress report on surveillance capitalism’s latest triumphs. [...] How has the nightmare become banal? Where is our sense of astonishment and outrage?
To answer this question, Zuboff proposes the idea of an uncontract, which has rendered us as passive agents:
The uncontract is not a space of contractual relations but rather a unilateral execution that makes those relations unnecessary. The uncontract desocializes the contract, manufacturing certainty through the substitution of automated procedures for promises, dialogue, shared meaning, problem solving, dispute resolution, and trust: the expressions of solidarity and human agency that have been gradually institutionalized in the notion of “contract” over the course of millennia. [...] The uncontract bypasses all that social work in favor of compulsion.
What Zuboff calls the "substitution of machine work for social work" is an enormous threat in education today, with the most vulnerable populations the most likely to have their agency taken away.

7. Inevitabilism

The nightmare has not just become normalized; it has become inevitable.
Among high-tech leaders, within the specialist literature, and among expert professionals there appears to be universal agreement on the idea that everything will be connected, knowable, and actionable in the near future: ubiquity and its consequences in total information are an article of faith. [...] Paradiso’s conception of a “digital omniscience” is taken for granted, with little discussion of politics, power, markets, or governments. As in most accounts of the apparatus, questions of individual autonomy, moral reasoning, social norms and values, privacy, decision rights, politics, and law take the form of afterthoughts and genuflections that can be solved with the correct protocols or addressed with still more technology solutions.
Are data analytics inevitable? The folks at Instructure think so (Instructure CEO Dan Goldsmith: "So when you think about adaptive and personalized learning I think it's inevitable"), but Zuboff reminds us about the three essential questions we must ask, questions whose answers are not inevitable.
What if I don’t want my life streaming through your senses? Who knows? Who decides? Who decides who decides?
There then follows one of the most interesting parts of this chapter: Zuboff talked to Silicon Valley engineers to find out what they thought about inevitabilism. Answer: these insiders at the heart of the "inevitability" know better.
Nearly every interviewee regarded inevitability rhetoric as a Trojan horse for powerful economic imperatives.
Here is a quote from one of those interviewees:
“There’s all that dumb real estate out there and we’ve got to turn it into revenue. The ‘internet of things’ is all push, not pull. Most consumers do not feel a need for these devices. You can say ‘exponential’ and ‘inevitable’ as much as you want. The bottom line is that the Valley has decided that this has to be the next big thing so that firms here can grow.”
Push, not pull: that to me is very much what is happening with analytics in the LMS. And when you push back and say you do not want them, lo and behold, you cannot turn them off. I just want the ability to opt out, but I am growing less and less hopeful about that. And I still can't turn off the (wrong) Canvas Gradebook labeling of my students.

8. Men Made It

The title of this subchapter comes from Steinbeck's Grapes of Wrath, where it refers to the banking system, made by men but beyond their control: "The bank is something more than men, I tell you. It's the monster. Men made it, but they can't control it."

Thus the bitter paradox of using our agency to build systems that deprive us of agency:
Every doctrine of inevitability carries a weaponized virus of moral nihilism programmed to target human agency and delete resistance and creativity from the text of human possibility.
Zuboff insists it does not have to be this way; it is NOT inevitable.
We know that there can be alternative paths to a robust information capitalism that produces genuine solutions for a third modernity. [...] Inevitabilism precludes choice and voluntary participation. It leaves no room for human will as the author of the future. [...] Will inevitabilism’s utopian declarations summon new forms of coercion designed to quiet restless populations unable to quell their hankering for a future of their choice?
And I return again and again to the most distinctive feature in Canvas LMS: there is no choice. You cannot build a course in Canvas predicated on the idea that students will choose to do things, or not to do things. The learning management system turns student agency into compliance.

And data collection.

9. To the Ground Campaign

The final section is about Google's Sidewalk Labs and the creation of "Google Cities." Mutatis mutandis, you can see the same thing happening to universities as they allow themselves to be rendered as data. Ironically, Sidewalk Labs presents itself as a way to combat digital inequality, just as some proponents of learning analytics insist that they, too, want to help students:
Sidewalk Labs’ first public undertaking was the installation of several hundred free internet-enabled kiosks in New York City, ostensibly to combat the problem of “digital inequality.” [...] Sidewalk’s data flows combine public and private assets for sale in dynamic, real-time virtual markets that extract maximum fees from citizens and leave municipal governments dependent upon Sidewalk’s proprietary information.
So, yes, this is what I thought about at Can*Innovate: while people cheer on the ability to track student views of LMS Pages, the real discussions are happening offstage:
The realpolitik of commercial surveillance operations is concealed offstage while the chorus of actors singing and dancing under the spotlights holds our attention and sometimes even our enthusiasm. [...quoting Google's Eric Schmidt] “The genesis of the thinking for Sidewalk Labs came from Google’s founders getting excited thinking of ‘all the things you could do if someone would just give us a city and put us in charge.’”
Do we really want to put the LMS more and more in charge of the education we deliver online? I certainly do not, which is why I am still (still...) hoping for the ability to opt out of Instructure's use of data from my courses for its machine learning experiments and the development of its predictive algorithms.

I don't want to predict my students' futures. I want them to choose their futures, and I will do my best to then help them get there.