April 28, 2019

Data Mongering (3): Bullshit, Sabotage, and Conscientious Objectors

This is my third of these round-ups; you can see them all here: Data-Mongering Round-Ups. I'm also annotating as I go in Diigo, so you can see the Diigo files here: Data-Mongering articles. The editorial comments there are just copied-and-pasted from the blog posts.

Calling Bullshit: Data Reasoning in a Digital World. Fabulous syllabus from Carl T. Bergstrom and Jevin West. Check out Week 7: "Big data. When does any old algorithm work given enough data, and when is it garbage in, garbage out? Use and abuse of machine learning. Misleading metrics. Goodhart's law."


10 Ways Data Can Sabotage Your Teaching by Terry Heick. I really like the teacher's perspective in this great list; see the article for examples and insights for each item:
1. The assessments are imprecise.
2. The inferences based on assessment results are limited or erroneous.
3. Assessment is infrequent.
4. The assessment is poorly timed.
5. Data is dated.
6. 'Depth of Knowledge' isn't factored.
7. Data is not transparent or accessible to others.
8. Data sources are not diverse.
9. Inflexible curriculum resists data 'absorption'.
10. There is too much data.


There has been lots of reporting this week on Brian Goegan speaking out at Arizona State University about the mandated use of courseware, which also means surveillance, as @LibSkrat points out here:


Unethical numbers? A meta-analysis of library impact studies by M. Brooke Robertshaw and A. Asher. The next battleground: libraries. From the paper's abstract: "This paper presents the results of a meta-analysis of learning analytics studies in libraries that examine the effects of library use on measures of student success. Based on the aggregate results, we argue that outcomes of these studies have not produced findings that justify the loss of privacy and risk borne by students. Moreover, we argue that basing high-impact decisions on studies with no, or low, effect sizes, and weak correlation or regression values, has the potential to harm students, particularly those in already vulnerable populations."

EDUCAUSE Horizon Report: 2019 Higher Education Edition. All the hype is here of course: machine learning, AI, predictive analytics, etc. There is the occasional acknowledgment that there might be ethical concerns, but as you would expect, the hype is strong. Very strong. One very useful feature of the report is the abundance of hyperlinks for further reading.

I Used to Work for Google. I Am a Conscientious Objector. by Jack Poulson (in the NYTimes Privacy Project). This piece expresses exactly why I think it is so important for people to speak out: "The time has passed when tech companies can simply build tools, write algorithms and amass data without regard to who uses the technology and for what purpose."

How much are we sacrificing for automation? by S. A. Applin. This one is also not about education directly, but there are lots of warnings here for us about educational Taylorism too: "Using counting, metrics, and implementation of outcomes from extreme data analysis to inform policies for humans is a threat to our well-being, and results in the stories we are hearing about in the warehouse, and in other areas of our lives, where humans are too often forfeiting their agency to algorithms and machines." And here's more from the article via Donna Lanclos:


And on the subject of mongering, rather than data, check out this post from Dr. Chuck: Why do People Like Sakai, given the Market Share? ... in particular, the P.P.S.: "P.P.S. Instructure spent $135M last year on marketing and sales. They took this money from the pockets of higher education and used it to convince more schools to give them more money."