
June 9, 2019

Data Mongering (8): Surveilling Students' Social Media

This is my eighth round-up; you can see them all here: Data-Mongering Round-Ups. And despite the usual bad news (see below), today is a good day: after writing this post, I'll be heading down to Durham for Domains19, where surveillance is one of the themes — all kinds of good stuff will be going on! You can see the Schedule here, follow the #Domains19 hashtag, and join in with Virtually Connecting too. The Tuesday VC includes Chris Gilliard and Tim Maughan, both of whom have shown up in previous data-mongering round-ups here at this blog. I am excited about getting to meet them in person!


And now... time for the data-mongering:

An important item this week was the Northumbria-Civitas plan for mental health services based on surveilling students: Northumbria University to lead transformation in how the Higher Education sector identifies mental health issues in students. Their commercial partner in this surveillance project: Civitas Learning. It's all about scale, of course: quote "Dr. Mark Milliron, Chief Learning Officer and Co-Founder of Civitas Learning, said: “We help Higher Education institutions make the most of their learning data so that they know what is working for their students and can better personalise and scale student supports.”" Personalise here means the opposite of what it used to mean: impersonal automation instead of person-to-person care and support. Meanwhile... ka-ching goes the cash register, as Civitas will have all that student data to use to build the algorithm products that they can then market to other schools that want to "scale" (automate) student support services.

Coverage also in the Telegraph newspaper: Universities to trawl through students’ social media to look for suicide risk. quote "The university has been running a project for the past two years where a team monitor students’ library use, lecture attendance and academic performance. They use this information to “nudge” students when their engagement drops off. Under the new OfS-backed scheme, the data collected on each student would extend to monitoring social media posts, conversations they have with individual members of staff and information held by their accommodation provider." So, as if the other monitoring were not bad enough, now it will include social media... and on surveilling students without their consent, see Adrian Short.


There was lots of good commentary at Twitter from Donna Lanclos, among others.


More on student surveillance by Jim Shultz at the New York Times: Spying on Children Won’t Keep Them Safe. quote "I have a 16-year-old daughter, and like every parent in the United States today, I worry about her safety when she’s in school. But here in Western New York’s Lockport City School District, those fears have led to a wasteful and dangerous experiment. This week the district’s eight public schools began testing a system called Aegis, which includes facial recognition technology, that could eventually be used to track and map student movements in our schools. How that happened is a cautionary tale for other schools across the country."

In contrast, here's an article about investing in people, not in surveillance and algorithms: With growing calls for more mental health services, states tackle school counselor caseloads by Linda Jacobson at Education Dive. quote "Research shows California schools are now relying more on counselors in order to improve outcomes for students in areas such as attendance and graduation. A report released last year points to how districts have used the flexibility under a revised funding formula to hire counselors and social workers to serve low-income students, English learners, and foster youth." In other words: human support, not surveillance and bots.

An item from earlier this year that I just noticed this week: Aiha Nguyen and Alexandra Mateescu writing at Data & Society: Explainer: Algorithmic Management in the Workplace (PDF link). Not directly about education, but obviously very relevant as we see more and more algorithms deployed in education: quote "The authors outline existing research on the ways that algorithmic management is manifesting across various labor industries, shifting workplace power dynamics, and putting workers at a disadvantage. It can enable increased surveillance and control while removing transparency."


And here's a piece about the standardized testing industry and student guinea pigs by Valerie Strauss at the Washington Post: Millions of kids take standardized tests simply to help testing companies make better tests. (Really.) Like all the other humans whose labor is required behind the scenes for the "magic" to work, these students are being made to build the data system, and it's uncompensated labor, of course.

Plus more on that human labor to make the machines go: The AI gig economy is coming for you by Karen Hao at MIT Technology Review. This is an interview with Mary Gray, co-author with Siddharth Suri of Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. quote "Human workers don’t just label the data that makes AI work. Sometimes human workers are the artificial intelligence. Behind Facebook’s content-moderating AI are thousands of content moderators; behind Amazon Alexa is a global team of transcribers; and behind Google Duplex are sometimes very human callers mimicking the AI that mimics humans. Artificial intelligence doesn’t run on magic pixie dust. It runs on invisible laborers who train algorithms relentlessly until they’ve automated their own jobs away."

I added my own comment at Twitter as well.



There continues to be important discussion about Knowledge Unlatched, like this post by Eileen Joy at Punctum Books: The Enclosure of Scholarly Infrastructures, Open Access Books & the Necessity of Community ... and here's the latest Elsevier debacle via Colleen Cressman: quote "Elsevier's new authoring platform, Elsa, has deeply troubling terms of service. Whereas most tools that enable user-created content slap a restrictive (to the user and end users) license on the content, Elsevier says users fork over their rights under (C)."


And Colleen shared another screenshot that is Elsa-specific.


On the fighting-back front, here's something wonderful from Chris Friend and #DHSI19: Balancing Issues of Critical Digital Pedagogy, which contains an Ethics section, including a page on LMS Surveillance. quote "Subverting Surveillance. In critically assessing who de facto benefits from the surveillance in Learning Management Systems and in what ways, while also considering who is thought to benefit from surveillance, we can create architectures that promote a culture of consent by using digital platforms that liberate rather than monitor, surveil and assess."


And for the it's-not-data-it's-marcomm files: What 10,000 Steps Will Really Get You by Amanda Mull at the Atlantic. Not that walking isn't good for you... but 10,000 is a marcomm thing, not a data thing. quote "I-Min Lee, a professor of epidemiology, began looking into the step rule because she was curious about where it came from. “It turns out the original basis for this 10,000-step guideline was really a marketing strategy,” she explains. “In 1965, a Japanese company was selling pedometers, and they gave it a name that, in Japanese, means ‘the 10,000-step meter.’” Lee believes that name was chosen for the product because the character for “10,000” looks sort of like a man walking. As far as she knows, the actual health merits of that number have never been validated by research."


And for more medical marcomm and also data-mongering, check out the write-up about 23andMe in Forbes: Live Long And Prosper: How Anne Wojcicki’s 23andMe Will Mine Its Giant DNA Database For Health And Wealth. Plus a nightmare article from Harvard Business Review: How Bots Will Change the Doctor-Patient Relationship by David A. Asch, Sean Nicholson and Marc L. Berger. And teachers are presumably the bank tellers of education who will be replaced by ATMs.

Finally, for this week's graphic here's a gif from Twitter: you can try to nudge your dog to eat more slowly with an intervention... but the dog is still going to do their own thing! Now when I read about algorithms that nudge people this way or that, I am going to think about this dog. Go, Dog!