
June 2, 2019

Data Mongering (7): Say it with a nudge...

This is my seventh round-up; you can see them all here: Data-Mongering Round-Ups. I'm also annotating as I go in Diigo, so you can see the Diigo files here: Data-Mongering articles. (The editorial comments there in Diigo are just copied-and-pasted from the blog posts.)

I want to start with two really important items from last week, one at EdSurge and one at EdWeek.


Inside a Student’s Hunt for His Own Learning Data by Sydney Johnson at EdSurge (podcast with transcript). You might remember Zak Vescera's reporting on Canvas at UBC and Bryan Short's story there (Canvas is tracking your data. What is UBC doing with it?), and now here at EdSurge is a detailed interview with Bryan, very much worth reading; Bryan wanted to opt out, but the Canvas system provides no accommodations of any kind, and that is my concern also. If a student refuses to use TurnItIn (as I think they should!), that's easy to accommodate; the teacher just has to read the paper on their own without machine assistance. But with the LMS, it's hard to come up with a do-it-yourself opt-out. Here's what happened to Bryan: "I decided not to opt into Canvas because I was unhappy with the way that my data was being collected. I was unhappy with the way that I had to go about accessing it. I was proposing the creation of a bill of rights around student data, a policy at the university [for using student data]. So, I didn’t opt into the use of it and it really caused quite a bit of tension. It put a huge burden on the instructors who relied upon this technology to conduct their courses because they would have to email me things separately rather than just blasting stuff out to a class. I couldn’t participate in discussions that were taking place online through the learning management system. Ultimately, I think it probably hurt my grades in certain circumstances." ... I hope that Instructure will make an effort to develop some opt-out options, a need that is going to become even more urgent as they carry out their own expansion into AI and machine learning, and also as Instructure partners with other ed-tech companies who are in the business of surveilling students. Based on a conversation I had with some folks at Instructure a couple of weeks ago, I am hoping something will appear at the Instructure and/or Canvas blog about data and opt-outs (fingers crossed).

And to learn more about the student-surveillance business, here is the article from EdWeek: Schools Are Deploying Massive Digital Surveillance Systems. The Results Are Alarming by Benjamin Herold. There are lots of important issues in this article, and in particular I want to highlight the use of sentiment analysis: "When Securly launched in 2013, its lone offering was a web filter to block students’ access to obscene and harmful content. A year later, though, Securly also began offering “sentiment analysis” of students’ social media posts, looking for signs they might be victims of cyberbullying or self-harm. In 2016, the company expanded that analysis to students’ school email accounts, monitoring all messages sent over district networks. It also created an “emotionally intelligent” app that sends parents weekly reports and automated push notifications detailing their children’s internet searches and browsing histories."


Just speaking for myself as a teacher, I am very uncomfortable with the idea of a company mining my students' schoolwork for the purposes of sentiment analysis, and it seems almost inevitable (?) that as LMS companies get into the AI/ML business, they are going to start mining the discussion boards for sentiment data to use in their predictive algorithms. That would transform the discussion boards from merely boring into something far more sinister IMO.

To see what happens when you add facial recognition into the mix, be sure to read Yujie Xue's Camera Above the Classroom (I keep sharing this article again and again because it is one of the best I've read, and I really appreciate the inclusion of student voices). And for more on research and tech development in China, check out UnConstrained College Students and Duke MTMC via the MegaPixels Project. From the Duke MTMC story: "Since its publication in 2016, more than twice as many research citations originated in China as in the United States. Among these citations were papers linked to the Chinese military and several of the companies known to provide Chinese authorities with the oppressive surveillance technology used to monitor millions of Uighur Muslims." (Thanks to @tanbob for these two items!)

And via @readywriting here's an item from a couple years ago: Hiding from artificial intelligence in the age of total surveillance by Victoria Zavyalova. It's about make-up designed by Grigory Bakunov to thwart facial recognition algorithms, although I suppose in the fast-paced game of cat-and-mouse (where we are the mice, naturally), the cats have probably got a work-around already...?


Meanwhile, on the related topic of OER-mongering: last week, I shared some of the negative reaction to Knowledge Unlatched, and here is another item for that file: The Open Research Library: Centralisation without Openness by Marcel Knöchelmann writing at the London School of Economics Impact blog. And for some thoughts about developments at Lumen, see: The Business of Fettering OER by Billy Meinke-Lau.


But don't despair: here is a really informative and encouraging post from Erin Glass at HASTAC: Ten weird tricks for resisting surveillance capitalism in and through the classroom.


And an image for this week is supplied by none other than the Twitter algorithm itself! Someone DMed me this hilarious promoted tweet that showed up in their stream, prompted (presumably) by my tweet about nudges: it's all about the new behaviorism. In the spirit of B. F. Skinner's conditioned pigeons, along with all the good dogs out there, just "say it with a nudge!" (For more about ed-tech and nudges, see Predictive Analytics: Nudging, Shoving, and Smacking Behaviors in Higher Education by Kevin Desouza and Kendra Smith at Educause.)