
Saturday, March 23, 2019

Audrey Watters and Getting Back into the Groove

So, I'm back home after a very long week in Austin (but a good week: more about my dad and his xylophone here)... and I did have the luxury of lots of reading time while traveling, especially on the way home with a long layover in DFW. Outbound, I read Cathy O'Neil's Weapons of Math Destruction, which was fantastic (more on that later), and then on the way home I found the perfect antidote to my blogging blues: I read all four volumes of Audrey Watters's Monsters: The Monsters of Education Technology, The Revenge of the Monsters of Education Technology, The Curse of the Monsters of Education Technology, and The Monsters of Education Technology 4. And WOW: that was exactly what I needed to read.


For those of you who haven't seen this series, for the past several years Audrey has curated and published a collection of her keynotes, covering a huge range of issues related to both education and technology. She also publishes the keynotes at her blog (with all the graphics too), but it's a very different experience to read them collected like this, especially all four books in sequence, seeing her main themes develop in different ways while new themes also come into focus.

And a big dose of Audrey was exactly what I needed to get back into the groove of blogging here. I spent the past two years doing my best to make a useful contribution to the Canvas Community, publishing a couple hundred blog posts there, along with developing various open Canvas resource-courses. I was persuaded to do that because Canvas, unlike D2L and Blackboard, has an open architecture with real URLs that allow people to open up their courses so that others can look and link (although, sadly, almost nobody does that; you can see my own courses at Myth.MythFolklore.net and India.MythFolklore.net), and Canvas also makes it possible to use web technologies like RSS and javascript and iframe to bring external content into the Canvas system (so I built a Canvas Widget Warehouse full of javascripts for anyone to use, plus tutorials like Twitter4Canvas, etc.).
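(For anyone who wants to try the widget trick, the basic pattern is simple: Canvas strips script tags out of pages you build in its editor, but it does allow iframes, so the javascript lives in an ordinary web page hosted outside Canvas, and the Canvas page just embeds that page. Here is a minimal sketch of the idea; the file name, URL, and content are made-up placeholders, not an actual Widget Warehouse widget.)

```html
<!-- random-proverb.html: a plain page hosted anywhere outside Canvas.
     (The file name, URL, and proverbs here are all hypothetical.) -->
<!DOCTYPE html>
<html>
<body>
  <p id="proverb"></p>
  <script>
    // Show a different item each time the page (and thus the iframe) loads.
    var proverbs = [
      "Practice makes perfect.",
      "Little by little, one travels far.",
      "A good beginning makes a good ending."
    ];
    document.getElementById("proverb").textContent =
      proverbs[Math.floor(Math.random() * proverbs.length)];
  </script>
</body>
</html>

<!-- And this one snippet is all that gets pasted into the Canvas page: -->
<iframe src="https://example.com/widgets/random-proverb.html"
        width="100%" height="100" title="Random proverb widget"></iframe>
```

The same iframe snippet works just as well in a blog sidebar or any other webpage, which is the real beauty of the approach: the widget lives on the open web, and Canvas is just one of the places where it can show up.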

Writing there at the Community, I really felt like I had a contribution to make, bringing what I had learned about open education practices to the community of people using Canvas (often required by their schools to use Canvas and nothing-but-Canvas), sharing my Canvas hacks and also learning from others at the Community. It's telling, I think, that Instructure does not use Canvas for this learning community because Canvas has no good community features, which has always been one of my complaints: even though Canvas does have some features that make it different from D2L and Blackboard, it is fundamentally a quizzing-and-grading system, and Instructure has devoted the lion's share of development resources over the past several years to the quizzes and gradebook. As a space for social learning, Canvas is terrible, and one of my big blogging projects at the Community was documenting how I use a blog network both for all the work my students create and share, and also for all the content that I develop. I've always said it was a big mistake for instructors to try to rely on the content development tools inside Canvas (the content tools are incredibly limited), and there are no real content development tools for students at all in Canvas, just the very tired, very old discussion board.

Despite the fact that I had made these criticisms of Canvas openly and often during my two years participating at the Community, my contributions were still welcome, and Instructure even brought me to the InstructureCon conference last summer as a kind of community booster, someone trying to help people connect with the network of learners using the Canvas Community space. But then, ten days ago, when I criticized the recent announcement by the new CEO Dan Goldsmith about the advent of AI, machine learning, and algorithms, the Community Managers decided that I had crossed a line, violating the rule that everything we post at the Community must be "uplifting" ... or else. That's why I'm now blogging here again after a long hiatus, and I posted a much more frank post about Goldsmith's vision for Instructure here at this blog: My Soylent-Green Moment: Instructure and the Future of Canvas.

So, as I was reading Audrey's books yesterday, this statement really jumped out at me — well, LOTS of statements jumped out at me, but this is what helped me frame what had happened to me at the Canvas Community and their rule of always-uplifting-all-the-time (bold emphasis is mine):
These are rarely topics addressed at education technology conferences; even the topics that might be most directly pertinent to the field – information security, institutional sustainability, for example – are brushed aside, in part I would argue, because education technology has worked quite hard to repress the trauma and anxiety associated with the adoption of new technologies and more broadly with the conditions, the precarity, of everyday life. Education technology has become so bound up in arguments about the necessity of technology, that it’s forgotten how to be anything other than the most loyal of servants to ideologies of machines, efficiencies, capitalism. It’s always sunny in education technology. Its tools are golden and transformative; its transformation, its disruption is always profitable and progressive. Better. Faster. Cheaper. Shinier.
You can read more at Audrey's blog; the quote is from this piece: Re·Con·Figures: The Pigeons of Ed-tech. And I highly recommend that you do read the whole thing; I think this is one of her very best pieces. 

So, I did my best to be sunny at the Canvas Community, contributing to the uplift, but at a certain point, there's just too much trade-off, too much compromise, too much important stuff that gets swept under the rug or shoved into the closet. It's time to talk about the trauma and anxiety, facing it honestly and figuring out what we can do to fight back. 

Stay tuned. :-)




Thursday, March 21, 2019

Digital Resources and Analog Music

I haven't been able to blog this week because it's our Spring Break and I'm at my dad's in Austin -- yes, we are now in Year Two of what was supposed to be just a few weeks of hospice care for lung cancer (he was diagnosed in October 2017 and went on hospice care in January 2018), and the hospice miracle continues. I come to Austin every month, and something really cool happened this visit which I wanted to write about here: my dad is learning to play the xylophone! At age 91!


Even though he loves music, he never learned anything about music in school or as a hobby. He can't/won't sing, and every time I suggested that he get some kind of instrument to play, he was completely negative about it. "I have no musical talent," he insisted.

Well, of course I know that this whole "musical talent" thing is just a fixed mindset myth: every human being is born ready to be musical, but sadly that musical impulse needs support and encouragement, and it can easily be squashed by unkind words or shaming, or simply by neglect.

What happened this week was that the hospice chaplain brought her mandolin to play some music for him, and he was fascinated by watching her play. And what a beautiful instrument! I had never seen a mandolin up close before myself. One of the songs she played was John Denver's Country Roads, and that's a special favorite for my dad.

So, even though my dad insisted that he didn't want an instrument and said that he was disgusted by the whole idea, I made an executive decision and ordered him a xylophone anyway! And................


.............. it worked! He is absolutely delighted by the thing and is already learning two songs: Mary Had a Little Lamb (which is all CDE and G) and Row Row Row Your Boat (which is CDE and F and G). We can't get him to sing along while he plays, which makes it harder, but no matter: there's no wrong way to get started, and there is no wrong way to continue... because he's MAKING music.

His xylophone is a very simple 13-note set of keys with no sharps and flats, and I also ordered a xylophone for his main caregiver (who loves music and sings beautifully) which has a full two octaves, plus sharps and flats.


And it was so cool to see how he is really curious about her xylophone too: he can see that it is similar to his because it has the same white keys with the same letters (plus a few more), and then it also has all those black keys. This morning he played around on it, just listening to the sounds to hear what he could hear. And he accidentally started playing Frere Jacques on his own, and he recognized that's what he was playing! How cool is that???!

Plus, he could also see from her xylophone that this is like playing the piano, but without any finger work: the black and white keys are indeed like keys on a piano. Of course I told him that if he wanted to learn to play a piano keyboard later, we could get him one of those too, and then he could play any melody he wants.

One of the joys of the xylophone, though, is that it is not electric at all. This is good old analog music, and that analog experience means you hear changes in the resonance of the key based on how you hit it, how you hold the mallet, etc. When you get a good hit, the sound really is lovely, like wind chimes.

And, as promised in the title of this post, there is a digital part to this analog music story too. My dad's xylophone is meant for kids and it came with some simple children's songs written out with the lyrics plus the letter notes, easy to read and follow along (you can see the music cards in the picture above).

But there's no reason he should be limited to children's songs! These thirteen notes should be good for all kinds of music, right? So I went looking online for a source of music for beginners without any musical notation because I would never dream of trying to teach him to read sheet music. I wanted just the lyrics and the letter notes, and I found this wonderful website: NoobNotes.net. Here's how it presents a song, just the letters, with a little dot to indicate the lower octave and a ^ caret for the higher octave:


Not only does it have a wide selection of all kinds of songs, but it also has a feature that allows you to transpose, which meant I was able to go through the songs, adjusting each one to see if I could find a version with a range between low G and high E and no sharps or flats: in other words, a tune that he could play on the little xylophone. When I found one that would work, I printed it out to add to our music collection.
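(If you're curious, the search I was doing by hand amounts to a simple little algorithm: shift every note of a song up or down by some number of semitones, and check whether the result lands entirely on the xylophone's thirteen white keys, from low G up to high E. Here is a rough sketch in javascript, using a simplified note encoding of my own invention rather than NoobNotes' dot-and-caret notation:)

```javascript
// Sketch: can a melody be transposed to fit a 13-key diatonic xylophone
// running from low G up to high E (white keys only, no sharps or flats)?
// Notes are [name, octave] pairs like ["G", 3]; the octave numbers are
// just arbitrary anchors, not a claim about the instrument's true pitch.
var NAMES = ["C","C#","D","D#","E","F","F#","G","G#","A","A#","B"];
var WHITE = ["C","D","E","F","G","A","B"];

function toSemis(note) { return note[1] * 12 + NAMES.indexOf(note[0]); }
function nameOf(s)     { return NAMES[((s % 12) + 12) % 12]; }

var LOW  = toSemis(["G", 3]);  // lowest key on the little xylophone
var HIGH = toSemis(["E", 5]);  // highest key: G up to E is 13 white keys

function fitsAfterShift(melody, shift) {
  return melody.every(function (note) {
    var s = toSemis(note) + shift;
    // In range and on a white key? Then this note is playable.
    return s >= LOW && s <= HIGH && WHITE.indexOf(nameOf(s)) !== -1;
  });
}

// Try every shift within an octave either way; return the first that works.
function findFit(melody) {
  for (var shift = -12; shift <= 12; shift++) {
    if (fitsAfterShift(melody, shift)) return shift;
  }
  return null; // no transposition puts this tune on the 13 keys
}

// Example: the opening phrase of Mary Had a Little Lamb in C major.
var mary = [["E",4],["D",4],["C",4],["D",4],["E",4],["E",4],["E",4]];
console.log(findFit(mary)); // -5: down a fourth, so the lowest note is low G
```

The transpose button at NoobNotes does all of this for you, of course; the sketch is just to show that there is no magic involved.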


What a treasure trove! I found some of his favorite songs like Always on My Mind, Can't Help Falling in Love with You, Love Me Tender, and Wonderful World, plus many more. All conjured up with just 13 notes! In total, I found 23 songs at the site that are favorites of his and fall in that 13-note range. Including... John Denver's Country Roads. So I alerted the chaplain that if she didn't mind playing Country Roads in his key, they could play together.

And that is the power of digital, letting me search for songs and transpose, with a great visual presentation that immediately alerted me to the presence of sharps and flats. Digital search in the service of analog experience!

So, I am really happy with how this trip has gone: I thought my main task this time was going to be to do my dad's taxes. And yes, I did the taxes (ugh), but this musical breakthrough has turned out to be the real success story. My dad has been given an incredible chance to really put things in his life right over the past year, and I am so glad that making his own music is a part of that miracle story.

Here is one of my favorite songs that you can play with just those 13 notes: Morning Has Broken.


Make music, people!


Saturday, March 16, 2019

My Soylent-Green Moment: Instructure and the Future of Canvas

This past week ranks as one of the worst weeks of my professional life: I learned that Instructure is going to be using (is already using?) the data collected about students in Canvas for machine learning and algorithms. I'm still completely shocked. If you haven't read the statements by Instructure CEO Dan Goldsmith in this report by Phil Hill, here is the article:
Instructure: Plans to expand beyond Canvas LMS into machine learning and AI

It's a kind of "Soylent Green" moment for me, realizing that a company and a product in which I had put a lot of faith and trust is going to be pursuing an agenda which I cannot endorse and in which I will not participate.



In this blog post, I'll explain my understanding of the situation, and then close with three main concerns that I have. There will be many more posts to come, and I hope those who know more than I do about machine learning in education will chime in and help me further my own education about this grim topic.

The Now: Canvas Data for Classes and Schools

I've not been impressed by the current Instructure data analytics since their approach is based only on surface behaviors, with no attempt to ask students the "why" for those behaviors (for example, a short time spent on a content page: because the student is bored? because they are confused? because it was the wrong page? because they have limited time available? because they got distracted by something else? etc.). Yes, Instructure collects a lot of data from students (all those eyeballs! all those clicks!), but just because they have a lot of data does not mean it is meaningful or useful. Speaking for myself, I get no benefit of any kind from the "Analytics" page for each student in my class that the Canvas LMS wants to show me:



I know that some schools also use the data from Canvas on an institutional level, but that's not something I know a lot about, and I also know there are commercial products, like Dropout Detective, that help schools extend their use of the data in Canvas. Just how a school tracks and uses the data it gathers about its students is for each school to decide.

At my school, for example, there is a strong presumption of student privacy when it comes to enrollment and grading data, as you would expect from FERPA. As an instructor, I use my students' ID numbers to report the students' grades (I am required to do that at the end of the semester; I am urged, but not required, to report midsemester grades), and that is all I can do. I cannot find out what other courses a student is enrolled in or has enrolled in, nor can I find out a student's grades or GPA.

And that is how it should be: it is not my business. Yes, that data exists. And yes, in some cases that data might also be helpful to me in working with a student. But just because the data exists and might be helpful does not mean that I can use it. The student starts with a fundamental right to privacy about their enrollment and grades, and it is up to the school to make decisions about how that data is shared beyond the classroom: advisors, for example, can look at a student's courses and grades overall, and there is aggregate analysis too, like the way the university publicly reports on the aggregate GPA of student athletes.

The Future: Instructure Robo-Tutor in the Sky

So, while my students' performance in their other classes is not my business, Instructure has decided to make it their business. In fact, they have decided to make it the future of their business. Goldsmith is emphatic: the Instructure database is no longer about data reports shared with instructors and with schools. Instead, it is about AI and machine learning. Instructure is going to be using my students' data (my students, your students, all the students) in order to teach its machine to predict what students will do, and then the system will act on those predictions. Quoting Instructure CEO Dan Goldsmith (from Phil's article, and yes, if they do have "the most comprehensive database on the educational experience in the globe," well, that's because we gave them all our data):


Welcome to your worst education nightmare: they are going to lump together all the data across all the schools, predict outcomes, and modify our behavior accordingly... thus sayeth Dan Goldsmith:


In future posts, I'll write in more detail about why this is bound to fail. The hubris here is really alarming; it's as if the executive team at Instructure learned nothing from the costly failures of other edtech machine-learning solutionists during the late, not-great era of the MOOCs. Back in February 2019, Michael Feldstein had speculated that this kind of hype might be subsiding (Is Ed Tech Hype in Remission?), but here we are just a few weeks later, and the hype is strong. Very strong.

Three Concerns

For now, I have three concerns I want to focus on:

1. What exactly did I agree to? To my shame, I put a lot of trust in Instructure, so it is indeed true that I clicked a checkbox somewhere at some point without reading the privacy policy and related legal policies. My students clicked such a checkbox too. At the Instructure website there is a Privacy Policy that relates to personal identifying information (you can access that from the Canvas Dashboard), and once you get to the Instructure website, you can also find an Acceptable Use Policy, but it seems primarily focused on shielding Instructure from liability for wrongdoing by users (illegal content, objectionable content, etc.). I'm not a lawyer, but I guess it all hinges on this: "Instructure reserves all rights not granted in the AUP Guidelines." That sounds like they can use all the non-personally-identifying data, as delimited in the separate privacy policy, in any way they want. Is that right?

They do state that they "respect the intellectual property of others and ask that you do too," but it's not clear at all whether they regard all the content we create inside the system (assignments submitted, quizzes created and taken, discussion board posts, etc.) as our intellectual property that they should respect and not exploit without our permission. Hopefully someone who knows more than I do can figure out how this AUP compares to the kind of terms-of-service being used by a company like, say, Coursera, which from the start was committed to machine learning and to exploiting user content in the system.

I don't know what the Coursera terms-of-service looks like now, but back when they first got started, they were very explicit about reusing our content to build their machine-learning system, as I wrote about when I took a first-generation Coursera course back in 2012: Coursera TOS: All your essay are belong to us. See that blog post for language like this: "you grant Coursera and the Participating Institutions a fully transferable, worldwide, perpetual, royalty-free and non-exclusive license to use, distribute, sublicense, reproduce, modify, adapt, publicly perform and publicly display such User Content," etc. I didn't see that kind of language in the Instructure policies, but I'm honestly not sure where to look.

Instructure does have a "Privacy Portal" with a cutesy graphic (visit the page to see the curtain being drawn and clouds of steam arising from behind the shower curtain). I thought the bold text items beside the graphic were links leading to more information, but they are not links. There's a privacy policy, an acceptable use policy, and a data processing policy linked across the top of the page, but I don't see anything labeled "terms of service" like what Coursera had in place. The shower curtain is labeled "privacy shield." Yeah, right.


2. What about opting out? Without an opt-out, Instructure is putting us in an impossible situation, way worse than with TurnItIn, for example. If a student insists that they will not use TurnItIn (as I think every student should do: just say no!), then it's easy to find work-arounds; teachers would just have to read the student's work for themselves without robo-assistance. But if a student says, no, they will not use Canvas because they do not want their data to be exploited for corporate profit, then that puts the teacher in a really awkward position. If you put all your content and course activities and assessments inside Canvas and a student does not want Instructure to use their data, what can the teacher do? It seems to me that Instructure needs, at a minimum, an opt-out for people who do not want their data to be used in this way by our new corporate overlords. Even better: it could all be opt-in, so that instead of assuming students and teachers all want to give their data to Instructure without compensation, you start with the assumption that we do not want to do that, and then Instructure can persuade us to opt in after all.

3. What about FERPA? Right now instructors at my school can put grades in Canvas for institutional reporting purposes (although I actually put mine directly into the SIS instead because the Canvas grading schemes can't accommodate my course design). My school then controls very strictly how that grade data is used, as I explained above. Now, however, it looks like that grade data is something that Instructure is going to be mining, at the course level and at the assessment level, so that its machine-learning engine will track a student's performance both within classes and also from course to course, analyzing their grades and their related data to create the algorithms. To me, that seems like a violation of privacy. In legal terms, perhaps it is not a problem because they are anonymizing the data, but just because it is legal does not make it right. We are apparently giving Instructure extraordinary freedom to take our students' grades and supporting work in order to exploit that not just beyond courses at an institutional level but, as Goldsmith stated (see above), across institutions in ways that will be totally beyond our control. It's like TurnItIn profiting from our students' work (to the tune of 1.7 billion dollars, also in this week's news) without any form of compensation to the students, but way worse. WAY worse. It's not just the students' essays now. It's... everything. Every eyeball. Every click. Teachers and students alike.

Of course, I know Instructure, just like TurnItIn, will hire the lawyers they need to make sure they can get away with this. But how sad is that? I never thought I would write a sentence that says "Instructure, just like TurnItIn" ... and yes, I'm angry about it. Angry at Instructure for squandering money, time, and people's trust on what will turn out to be hype rather than reality (but more on that in a separate post). I'm also angry at myself for having put so much trust in Instructure. When I expressed my anger at the Canvas Community this week, I was told that my opinions violated the Community Guidelines, which require that everything we post there be "uplifting," so that is why I am back blogging here again after blogging for a couple of years at the Community. I have nothing uplifting to say about the new turn Instructure is taking, and I need a blog space where I am free to say that I am angry about this.

Human Learning

But every cloud (including a SaaS cloud) has a silver lining. I am now going to take my casual layperson's knowledge of machine learning and predictive algorithms in education (mostly gleaned from reading about robograding of student writing) and learn more about that. If the machines are learning, we'd better get to work on our own learning too! And hey, perfect timing: it's Spring Break and I'll be spending two days in airports. Which means two days of reading.

I'm going to start with Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O'Neil.

Then what should I read next? Let me know here or at Twitter (@OnlineCrsLady).