
May 5, 2019

Data Mongering (4): Canvas, Capitalism, Nudges and Smacks

This is my fourth of these round-ups; you can see them all here: Data-Mongering Round-Ups. I'm also annotating as I go in Diigo, so you can see the Diigo files here: Data-Mongering articles. The editorial comments there are just copied-and-pasted from the blog posts. I'll be out of town this coming week with only intermittent Internet access... so there won't be a post next weekend, but I'll be back on May 19 with more datamongering to report. The fun never stops in the world of Big Data. :-)

The most important discovery I made this week was an article in the University of British Columbia's student newspaper calling out Instructure's data takeover: Canvas is tracking your data. What is UBC doing with it? by Zak Vescera. I would love to see some reporting like this by our student newspaper at OU! UBC recently adopted Canvas, and a journalism student there, Bryan Short, is expressing concerns (which I share) about the way local student data is now part of Instructure's global operations and product development, quote: "But there’s technically little stopping Instructure or Amazon from using the de-aggregated data to improve their own services — and that’s what has Short worried. ‘This is stuff that’s private personal information that you’re sharing in the context of your education at your university, but this information is now accessible to a company in the United States,’ argued Short."

Platform Capitalism and the Governance of Knowledge Infrastructure by Leslie Chan. Powerful stuff here; I'll let the abstract speak for itself; quote: "The dominant academic publishers are busy positioning themselves to monetize not only on content, but increasingly on data analytics and predictive products on research assessment and funding trends. Their growing investment and control over the entire knowledge production workflow, from article submissions, to metrics to reputation management and global rankings means that researchers and their institutions are increasingly locked into the publishers’ “value chain”. I will discuss some of the implications of this growing form of “surveillance capitalism” in the higher education sector and what it means in terms of the autonomy of the researchers and the academy. The intent is to call attention to the need to support community-governed infrastructure and to rethink our understanding of “openness” in terms of consent and social values."

Code Acts in Education: Learning from Surveillance Capitalism by Ben Williamson. These are his thoughts on Zuboff's book. I hope to finish the book on my travels this week, so I will have some thoughts of my own to share. Williamson has three main takeaways from this book for education, and I am most interested in this one: quote "3) Programmable policies. A third line of inquiry would be into the idea of ‘policies’. Education policy studies have long engaged critically with the ways government policies circumscribe ‘correct’ forms of educational activity, progress, and behaviour. With the advance of AI-based technologies into schools and universities, policy researchers may need to start interrogating the policies encoded in the software as well as the policies inscribed in government texts. These new programmable policies potentially have a much more direct influence on ‘correct’ behaviours and maximum outcomes by instrumenting and orchestrating activities, tasks and behaviours in educational institutions." For related resources, check out the Twitter thread.

Predictive Analytics: Nudging, Shoving, and Smacking Behaviors in Higher Education by Kevin Desouza and Kendra Smith. This is a piece from 2016, but I just learned about it now. I definitely use "nudges" in my classes, lots of them in fact, but they are not automated; each one depends on what I know about the individual student (Small Data and Microassignments). I am not interested in automating that process of nudging, and as for shoves and smacks, no, thank you. And that's even if everybody's intentions are pure, as the authors are willing to suppose, quote: "The answer is not simple. Perhaps the deepest concern lies in the definition of the problem and in who decides the direction of nudges. Nudging can easily become shoving or smacking. Obviously, the intentions behind most higher education practices are pure, but with new technologies, we need to know more about the intentions and remain vigilant so that the resulting practices don’t become abusive. The unintended consequences of automating, depersonalizing, and behavioral exploitation are real. We must think critically about what is most important: the means or the end."

CWiC Courseware in Context. This is a framework for evaluating courseware that I learned about this week (details: Student Agency: The Latest Casualty in the Marcomm War for Control of Online Learning) so I thought I would comment on that here. The framework does contain some basic privacy criteria (FERPA compliance certification; US / EU Safe Harbor certification; Ability to ensure that data will not reside in foreign data centers), but there is no discussion of a user opt-out, which is the key criterion I am looking for — and I am still hoping for some kind of opt-out in Instructure's new turn towards predictive products.

Maybe Universities Shouldn't Be Putting Amazon Echos in Student Dorms by Eric Stoller. I remember being appalled that Instructure was promoting an Alexa "skills" Canvas interface ("Alexa, tell me what my grade is" etc.), and there are schools that have jumped on the Alexa bandwagon, which is to say, Alexa surveillance. Even Eric Stoller now has some doubts: "Yes, I'm still an advocate for experimentation with technologies that can enhance the student experience. However, there's a cautiousness that's been creeping into my consciousness. When it was disclosed that Amazon employs thousands of people who listen to what you say (directly or indirectly) to Alexa, warning bells went off inside my head. The reason for this invasive monitoring, according to Amazon, is to make the product better for users. My guess is that an always-on microphone-laden device serves as an excellent surveillance instrument to feed the Amazon marketing/data machine."

And while it is not on the subject of datamongering directly, this great piece from Jess Mitchell, Age of Awareness - The Damage We Do: Assessment, gives us a lot to think about when we ponder all that data and what people are going to do with it. Quote: "When we are confronted with complexity and uncertainty, we lean back on simplicity and completeness and sameness. We approach the world with a transactional check-book accounting expectation — if we document our inputs and outputs, it should all reconcile neatly in the end. Tell me how I’ll be measured and only then will I know how to perform. Tell me the desired outputs and I’ll manhandle the inputs to make them conform."

With thanks to Bonni Stachowiak for the tweet, which is how I learned about the article.