June 22, 2019

Data Mongering (10): Don't fence me in!

This is my tenth round-up; you can see them all here: Data-Mongering Round-Ups. I've been out of town most of this week and not on Twitter much, but there's still plenty to report on... of course.

I'll start out with something lovely and uplifting: Maren Deepwell shared both the slides and the text of her ETUG keynote from Kamloops this week: You are more than a data point. Weaving a tapestry of hope and humanity. The presentation covers much more than just data, but she has some great remarks about data and its dangers, as you can already guess from the title. quote: "As a woman I learn each and every day about the limits of technology and the reality of privilege, prejudice and power. Whether it’s inherent algorithmic biases, gender data gaps in the datasets used to train AIs or mobile phones that aren’t designed to fit in my ‘smaller than the average man’s’ palms, all of these examples and countless others highlight how important it is to question how technology works, to interrogate what’s behind the dashboards and predictive models, the disruptive technology that is hailed as the next big thing."

And here is a project, both hopeful and useful, that I learned about this week: tosdr.org offers annotated terms of service. You can also follow them on Twitter: @tosdr.

And for something less hopeful, an item from IHE: GPS to Track Student Attendance. This Cal Poly San Luis Obispo professor requires his students to check in using an app he created that accesses their phones' GPS data: quote "Once students enter this radius, a geofence, they push a button on the app noting that they’ve arrived for class."

[Graphic: geofencing, from the app's website]
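For anyone curious about the mechanics, a geofence check of this kind boils down to a simple distance test: compare the phone's reported coordinates against a target point and a radius. Here is a minimal sketch in Python; the coordinates, radius, and function names are my own invented illustration, not anything taken from the professor's actual app:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (latitude, longitude) points."""
    earth_radius_m = 6371000  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * earth_radius_m * math.asin(math.sqrt(a))

def inside_geofence(phone_lat, phone_lon, room_lat, room_lon, radius_m=50):
    """True if the phone's reported GPS fix falls within radius_m of the classroom."""
    return haversine_m(phone_lat, phone_lon, room_lat, room_lon) <= radius_m

# Hypothetical classroom coordinates and a 50-meter fence (made-up numbers):
print(inside_geofence(35.3003, -120.6625, 35.3005, -120.6624))  # True: inside the fence
print(inside_geofence(35.3100, -120.6625, 35.3005, -120.6624))  # False: about a kilometer north
```

That yes/no answer about a circle on a map is all the geofence can really report.
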
Instead of gathering data on his students (and, I suppose, docking their grades based on attendance?), it seems to me this professor could be asking his students why they do, or don't, show up for class. Geofencing and GPS data are not telling him what he needs to know in order to improve the class, but getting feedback about the class, both from students who attend and those who choose not to attend, could actually be helpful. And for a great piece on feedback from students, see this piece in Edutopia: Can 360-Degree Feedback Empower Students and Teachers? quote "perhaps the most important effect of this collaboration is the relationship building and personal understanding between teachers and students. Those strengthened bonds offer a stronger foundation of cultural sensitivity across the community" — the complete opposite of the way that surveillance technology undermines mutual trust between students and teachers.

This next piece comes from social work, but there is lots here for educators to ponder: Stuck On Algorithms by Sean Erreger. He notes the importance of the right to contest algorithm errors: quote "Also important to social workers should be the Right To Contest. That if one of these common blindspots are found, there is means to reconcile this. Is there enough transparency in the algorithm to fix “the problem”? This is important when thinking about empowering the individuals and families we serve."

So too for the students we serve, and also for ourselves if, indeed, our schools are going to start evaluating our work by surveilling us and using analytics. On that subject, here's an IHE piece from a couple of years ago: Refusing to Be Evaluated by a Formula, and more from David Hughes of Rutgers here: Academic Analytics: Action Requested.

Meanwhile, you can find out more about "right to contest" and other AI pitfalls in this great graphic from MIT Media Lab: AI Blindspot.

Also not about education directly, but with profound (and frightening) implications for education, is this ACLU report: The Dawn of Robot Surveillance: AI, Video Analytics, and Privacy (download the report from the link). quote "Analyzing video is going to become just as cheap as collecting it. While no company or government agency will hire the armies of expensive and distractible humans that would be required to monitor all the video now being collected, AI agents — which are cheap and scalable — will be available to perform the same tasks. And that will usher in something entirely new in the history of humanity: a society where everyone’s public movements and behavior are subject to constant and comprehensive evaluation and judgment by agents of authority — in short, a society where everyone is watched."

In particular, this report shows why we need to hear from LMS companies about limits to the data they will collect, limits to the data they will keep, and limits to the ways they will use that data. We cannot let those limits be (re)defined by the ever-cheaper technology of surveillance and analysis; just because they can afford to gather and analyze the data does not mean that they should. See, for example, the gung-ho big data argument by Vince Kellen at Educause, 21st-Century Analytics: New Technologies and New Rules, insisting that cheap technology in and of itself justifies collecting all the data: quote "We try to bring in all the data that we can find in any given stream, whether we think we will use the data or not." I disagree; just because the data can be collected does not mean that it should be collected! And on the need for setting those limits, a hopeful counterpoint from New York state: Legislation to suspend facial recognition in schools passes state Assembly.

Finally, on the unintended consequences of too much data, I learned a new word, orthosomnia (perfectionism about sleep induced by sleep-tracking apps), from this article: That Sleep Tracker Could Make Your Insomnia Worse by Karen Zraick and Sarah Mervosh (NYTimes). quote "Sleep specialists caution that these apps and devices may provide inaccurate data and can even exacerbate symptoms of insomnia. Fiddling with your phone in bed, after all, is bad sleep hygiene. And for some, worrying about sleep goals can make bedtime anxiety even worse. There’s a name for an unhealthy obsession with achieving perfect sleep: orthosomnia."

Perfectionism is already a huge problem in education; we don't need to feed that problem with big data, especially superficial and inaccurate data.

And for a closing graphic this week, here's a reminder about Maha Bali's event this Monday, June 24: The other side of student empowerment in a digital world #FOEcast. I'll be traveling on Monday, but there's lots to explore in the blog post; see the post for links and lots to read and ponder.