Sunday, March 31, 2019

Samhita Arni's The Prince and Juggernaut Books

Today is a day of two blog posts; this one is a HAPPY post where I get to talk about a marvelous new book by Samhita Arni and a fascinating publishing platform in India, Juggernaut Books (the other post is an unhappy reflection on student data-mining: AI Overreach). I did the posts in order this way so that I could end on a happy note!

I'll start with some notes about Samhita's wonderful book, and it's one I can recommend to any and all readers (no knowledge of Indian literature required!). Then I'll say a few words about Juggernaut, which is both an ebook platform and a writing platform for budding authors.


Samhita Arni's The Prince

I first learned about Samhita Arni's books when I was collecting resources for my course in the Epics of Ancient India; here's the author's page about her at my class blog: Featured Author — Samhita Arni. In my school's library, we have her illustrated Mahabharata (Samhita is a wonderful artist as well as being a wonderful writer), and we also have a graphic novel based on the Ramayana for which Samhita wrote the story, with art by Moyna Chitrakar: Sita's Ramayana. Both of those books are excellent ways to start exploring the world of the Indian epics. My favorite of Samhita's books (well, my favorite ... until now) is her novel The Missing Queen, which is an ingenious retelling of the Ramayana that upturns everything you thought you knew about that epic, and about the hero Rama in particular. But I don't want to give anything away here. The Missing Queen is available as a Kindle book, which makes it easy for my students to purchase and read; having books available on Kindle has been really important for me because it means students can choose their own reading, rather than me having to order a textbook for the whole class. (More on that below.)

I think I first connected with Samhita at Twitter (@samarni) when Missing Queen came out, and that means I've been following her progress on this new novel, The Prince, over the past few years. I knew she was working on a Tamil epic, something about an anklet. And, to my shame, I really didn't get why she would be doing that; I'd never heard of the Silappadikaram, much less read it... but I figured that if Samhita was working on it, then it would certainly be worthwhile.

And, oh my gosh: IT IS SO WORTHWHILE. I had ordered a copy of the book from a reseller via Amazon since it does not have a U.S. publisher yet (hey, publishers! look at this book!), not knowing that I could simply grab it at Juggernaut (more about that below). So, the book arrived this past Monday:

I began reading it immediately, and finished on Thursday in the waiting room at my dentist. It was one of those oh-my-gosh moments, and since I was reading an actual hard copy I was able to wave it around in the waiting room and tell people how good it was. I mean, it was just SO GOOD: I had to tell people. Right away!

I'm going to write up a proper review to share at Amazon and at the Juggernaut site, and for here I'm just going to indulge in two very personal reactions.

First of all, it was so exciting to read something totally new, where I had no idea at all what was going to happen next. And the plot of the story is so unpredictable — and so un-Hollywood, so un-fairy-tale: be warned that there are grim things that happen, bad things that happen to good people, bad things that good people do, not even realizing what they are doing. Most of the novels from India that I read are remixes of and riffs on traditional epics and mythology. They are full of surprises, to be sure, like Samhita's own Missing Queen, but they do not have that walking-on-the-edge-of-a-cliff sense of the unknown that this novel had for me. Samhita does a fantastic job of bringing each character into focus and fully to life very quickly, and it was never overwhelming to keep track of who was who and what they were doing (don't let the list of characters at the front of the book intimidate you). So, that feeling of newness was very exciting. And now I am also excited to read the novel a second time, knowing what is about to happen and being able to frame the characters in that new way.

Second, it was so affirming to read a novel that is not afraid of the world's pain, and which even provides an attempt at a response to that pain, an authentic response to real pain. I'll just include one quote here, out of context to avoid any spoilers; just some wise words to ponder:
We believe we live in a cruel, wretched world, and that moves us to cruel, wretched acts. The world is beautiful. The truth is that the darkness lies within us. You cannot force out darkness, you cannot cut it out or carve it out. How can you cut your own shadow from under your feet? It is impossible. Each one of us must learn to accept the darkness within us, those things we consider too horrible to reveal. And yet, once we learn this, we discover that this same darkness abides in every other soul, hidden away and in secret — and this moves us to an even greater realization — we find our own selves reflected in everyone we meet, we find our own soul in them.
So, don't you want to read the book now? Of course you do! It is BEAUTIFUL.

And............ you can easily snag a copy at Juggernaut Books, either in your browser or using the Juggernaut phone app.

More about Juggernaut

Juggernaut Books (Juggernaut.in) is the publisher of The Prince, and it is available from them as a hardback book; I ordered a copy from a book reseller at Amazon, thinking that I could not use the Juggernaut app in the U.S. The printed book itself is labeled "for sale in the Indian subcontinent only," and I had assumed (wrongly!) that Juggernaut ebooks were not available internationally.

But... I am so glad to have found out I was wrong about that: readers in the U.S. can indeed use Juggernaut ebooks! After reading The Prince I really wanted to tell my students how they could get a copy of this book and read it, so I decided to explore the Juggernaut Books app just to see what would happen. I installed the app on my phone, and, lo and behold, I was indeed able to purchase an ebook copy of The Prince. I used a credit card, and my credit card company pinged me for an extra security code that I had to use (presumably because it was an international purchase?), but it went through just fine, and I had The Prince on my phone. Even better: I then logged on to Juggernaut Books in my browser, and was able to access my account and read the book in my browser, not having to use my phone (with my eyesight, I do better with big fonts on a big screen).

And now... a whole world of delights awaits me at Juggernaut. I am especially excited to find a series of book-shorts by Arshia Sattar that I did not even know about, retellings of episodes from the epics and Indian mythology that would be wonderful reading for my students. Some of her books are available as Kindle books, but there is a whole series of shorts at Juggernaut that I had not seen before:


It's just a few weeks until summer begins for me and, as always, summer means READING... and I am excited about exploring this new treasure trove of books to read at Juggernaut. And honestly, I could not resist, so I bought just one of Arshia Sattar's shorts just now (my credit card required a security code again, so that is one extra step when purchasing at Juggernaut, but it's easy).


For the summer, I think I will need to subscribe to their Readers Club so that I can just read and read. Yay for summer!

Juggernaut... for Writers!

I've been following Juggernaut Books at Twitter for a while now (@JuggernautBooks) because I was interested in the way they are encouraging and recruiting writers, as you can see here at their About Juggernaut page. Their goals are to get more Indians to read books, to encourage more Indians to write, and to make books less intimidating.


Isn't that exciting? So, I am really glad that Samhita's book was my gateway to landing here at Juggernaut where I will explore and learn more. I'm actually thinking I might try to publish something there. For years, I've thought about doing an anthology of Indian and European fairy tales in order to explore the famous "Indian origins" hypothesis about the diffusion of Indian fairy tale motifs throughout Europe. Regardless of where you come down on that debate, reading Indian fairy tales side by side with European fairy tales is a mind-bending adventure, and maybe I could create an anthology like that to publish on the Juggernaut writer platform.


And if you're wondering about the name, yes, Juggernaut is a word that comes to English from India! More about that here: Sanskrit Word in English: Juggernaut.

So, Happy Sunday, everybody! I'm going to go read some more! :-)




AI Overreach in Education: Teachers and Students Must Speak Out

Two weeks ago I wrote here about the comments by Instructure's CEO Dan Goldsmith in which we learned that Instructure is now going to be mining student data and using machine learning to create intrusive algorithms, telling students what and when and how to learn: My Soylent-Green Moment: Instructure and the Future of Canvas. Even if Instructure could make good on that promise, I still think it is wrong for them to take and use our data for that project. Just as importantly, I also think that project is doomed to fail, and that it will be squandering valuable resources in the process.

So, in this blog post, I want to say something about both of those topics: first, I will explain why I have zero expectation that Instructure will be able to build useful algorithms; then, I will share an article I read this week about student data-gathering in China that shows what could happen if teachers and students do not speak out now to protest these moves by Instructure and other ed-tech harvesters of student data.

Canvas and the Assumption of Sameness

Anyone who's used Canvas knows that it is committed to an assumption of sameness. You cannot even modify your course menus without installing an external tool that then "redirects" menu items, even when the "redirect" is going to a page in your own course. Moreover, you have to install a new instance of the redirect tool app from the app store for every change you make. 


I have seven separate instances of the redirect tool installed in my Canvas courses (you can see here: Myth.MythFolklore.net), and no matter how much I want to rename the "Grades" menu item, there is nothing I can do to change it. When we used D2L I was able to change the name to "Declarations" (which suits my class better), but Canvas adamantly refuses to let me change the menu labels.

Why must everything be the same, named the same, look the same, etc.? Apparently students will be "confused" if things are not exactly the same, or so I've been told again and again by Canvas enthusiasts. That paternalistic, condescending assumption is something that has always bothered me about Canvas; I think it should be up to students and teachers to make choices about what works and what doesn't. Based on my 20 years of teaching experience, I don't think students are so easily confused as Canvas assumes that they are. Learning how to navigate difference is something students need to do in school, and minimizing difference is not necessarily doing anybody a favor. In any case, both students and teachers need freedom to customize their own learning spaces in the ways they think are best. 

But Canvas is all about minimizing difference. Sure, you can use the redirect tool if you want, and you can use other plug-ins and tools to bring external content into Canvas (that is the method I rely on most), but there is an abiding assumption of sameness in Canvas, and there always has been. It's not a bug; it's a feature.

Now, of course, as Instructure launches its data mining and machine learning efforts, it is going to be all the more important to keep things the same. Not so that the students will not be confused, but so that the computer will not be confused. Because if you are going to try to bring together all the data, across schools and across curriculums as Dan Goldsmith claims, then you really need to make sure things are the same across those schools and curriculums. That's what will allow Instructure to combine the data to create "the most comprehensive database on the educational experience in the globe" (again quoting Goldsmith).

But here's the thing: not all courses have the same design. Not all teachers have the same approach. Not all students learn the same way; they are not white lab rats genetically engineered to respond in the exact same way to the exact same stimuli in the exact same conditions, the way a computer would. Nothing in human learning is ever exactly the same.

Canvas, however, wants to make everything the same. Sometimes the results of that sameness are just annoying, like having to add the redirect tool app again and again to a course. At other times, though, the results are dangerous, like when Canvas decided to start labeling my students' work, incorrectly, with big red labels in the Gradebook. LATE said Canvas, and MISSING said Canvas, even though my students' assignments are not late and not missing. Here's a real screenshot from a real student's Gradebook display... and none, not one, of these labels is appropriate:


Canvas started adding these labels as part of the new Gradebook. I documented the problem extensively at the Canvas Community back in Fall 2017, at which point they rolled the feature back, but then, like a good zombie, it returned from the dead again this semester when my school made everyone switch to the new Gradebook. Nightmare. I'm not going to repeat all the details here, but you can see my posts with the #red ink tag documenting the story at the Canvas Community (that last post about the future of Canvas is the one that prompted the Community Managers to prohibit my future blogging at the Community).

Suffice to say, Canvas has no idea how my courses are designed, and they have no right to be labeling my students' work that way in the Gradebook. There is no place for punitive, deficit-driven grading in my classes (more about that here), and that means there is no place for Canvas's red labels. Those labels are just wrong, completely and absolutely wrong. And my students were, understandably, upset when the red ink missing and late labels showed up out of nowhere. I don't mind apologizing to students for mistakes that I make, but I sure don't like apologizing to students for mistakes that the computer software is making. Especially when it's computer software that I am required to use by my school for grading purposes because of FERPA.

This is just one example from one teacher, but I know that we could document thousands upon thousands of examples of ways in which individual teachers and students are doing work that Instructure's "machine" simply cannot understand. And because the machine cannot grasp all the different ways we do things, that means it cannot actually learn from what we are doing. Even worse, it will learn wrongly and do the wrong things, like putting all the wrong labels on my students.

In short, it seems to me that anyone who believes in Goldsmith's claims about "the most comprehensive database on the educational experience in the globe" does not really understand what educational experiences (PLURAL) are all about.

From Bad to Worse: Really Big Data in China

If you have not read this horrifying/fascinating report about student surveillance and data-gathering in China, then you need to do that now; yes, it's long, and it's worth reading from start to finish: Camera Above the Classroom. You can find the author at Twitter: @YujieXuett.


Power of hashtags: I love the way the investigation begins with a hashtag: #ThankGodIGraduatedAlready. A hashtag, and also a government plan:
In July 2017, China’s highest governmental body, the State Council, released an ambitious policy initiative called the Next Generation Artificial Intelligence Development Plan (NGAIDP). The 20,000-word blueprint outlines China’s strategy to become the leading AI power in both research and deployment by 2030 by building a domestic AI industry worth nearly $150 billion. It advocates incorporating AI in virtually all aspects of life, including medicine, law, transportation, environmental protection, and what it calls “intelligent education.”
So, in the great data-arms race to have "the most comprehensive database on the educational experience in the globe," Instructure is really going to have to up their game. Just clicks and eyeballs are not going to be enough. Now they are going to need our faces too.

Much like the MOOC-boosters of yesteryear, this data collection is presented as something that is really for the students, for their benefit, for their own good:
“Do you know the two types of students teachers pay the most attention to?” Zhang asks. “The smartest and the naughtiest.” Hanwang’s CCS technology was born from the desire to care for every kid in the classroom, even the “often-ignored, average students,” he adds.
The desire to care. Uh-huh. "Class Care" is the name of the system, CCS. But is this how students want to be cared for? As the article documents, the students are not happy about this surveillance:
Back in the classroom, my questions about the cameras evoke curiosity among the boys. Jason tells them everything he knows. There is a gasp, followed by silence. “I want to smash it,” one boy says. “Shhh!” Another boy shakes a warning glance at Hanwang’s camera behind us. “What if the camera just captured everything?”
Some students are indeed disabling the cameras in their classrooms, despite their justified fears. Brave students: bravo!

Privacy? The Chinese ed-tech vendor Hanwang is using the same excuse that no doubt Instructure will use: privacy is being respected because data is only being used "in-house" and not shared with a third party:
CCS doesn’t violate the students’ privacy. We don’t share the reports with third parties, and you see that on the in-class pictures we send to the parents, all the faces other than their child’s are blurred out.
It begins innocently enough, as you can see in this exchange with a principal at a school that has installed the system:
Niulanshan’s principal, Wang Peidong, who has over 40 years of teaching experience, is also dismissive of CCS. “It’s not very useful,” he says. “You think a teacher standing on a podium needs AI to tell her if a student is sleeping in class?”
“Then why is it still used in your classrooms?” I ask.
“Zhang Haopeng is an alumnus of our school. He wants to do experiments here, so we let him. We don’t need to pay for it anyway,” he says, already walking away to get to his next meeting.
Don't think it can happen here? Think again: check out this eerily similar graphic for a rah-rah-data article in the Chronicle of Higher Education:


Labels in the Gradebook. Labels in the classroom. 

Wrong labels. Punitive labels. 

My sense of despair here is enormous, and I am glad that I will be retired from teaching before we have handed everything of importance over to our new data-driven robo-overlords.

Don't get me wrong. I believe in using data: small data, data that fits the educational experience: The Value of SMALL Data and Microassignments. We need to let the educational experience drive the data first. Otherwise, it's just wrong data. And big wrong data is dangerous data. We cannot let that data drive our education.


And speak out. Money talks, and money is ultimately what is driving all of these conversations. So, we cannot just talk: we have to shout! I'm going to keep on shouting. I admire the students in China who are disconnecting the surveillance cameras in their classrooms in protest, and I am going to keep on protesting the overreach of AI in Canvas and their (mis)use of my students' data (even if that conversation cannot happen in the Canvas Community itself).

More to come.



Wednesday, March 27, 2019

The Value of SMALL Data and Microassignments

I'm taking a few minutes this morning to write a post inspired by a Twitter convo yesterday with Yang Fan.


What I want to do here is report on my use of small data to help me in my teaching. I don't have any big data, and I don't want any big data.


Instead, I design my classes so that I get the good, small data I need to help my students make sure they are making progress so that they can pass the course.


It's not rocket science; in fact, it's very simple. That's the point! Here's how it works:

Microassignments. My system depends on using microassignments in my classes so that I can see how things are going from week to week and help students who are not making good progress. Everything in the Canvas LMS works against using microassignments (that's a topic for a separate post), but I am committed to microassignments; this approach is good for the students, and it's good for me too. I've written about this in my Canvas Community blog, and I've copied over two of those blog posts here:
Microassignments and Completion-Based Grading
Microassignments and Data Analytics

Weekly Data. So here's what my data analysis looks like right now, based on manually transferring my students' points week by week into a spreadsheet (because god forbid that the Canvas Gradebook, a faux spreadsheet, actually let me conduct my own analysis of my own data); this is at the end of Week 9, sorting from high to low for the number of weeks a student has been failing the class:


That little snapshot of data tells me a lot! It tells me that right now, out of 90 students in the class, there are 9 students who are not passing, and that has been steady most of the semester, with around 7, 8, or 9 students in trouble in any given week. For me, that's important; I know there are always going to be students who are struggling in a class. If I were seeing a lot more students struggling, I would need to think about some major class redesign, but at this level, I am more focused on working with the individual students rather than some larger course redesign effort.

Something else you can see there is that the cohort of struggling students has changed over the semester. There is one group of students who have been struggling from the very start, another set of students who have just recently fallen behind, along with a set of students who found a good routine for the class and are no longer in trouble. That's what I learn from sorting on the number of weeks students have been in the danger zone.
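For anyone curious about the mechanics, the weekly check described above boils down to a simple threshold computation: 30 points possible per week, passing means at least 70% of the points available so far. Here is a minimal sketch (the student names and point totals are invented for illustration):

```python
# Minimal sketch of the weekly passing check described above.
# 30 points possible per week; passing = at least 70% of points so far.
# Student names and totals below are invented for illustration.

POINTS_PER_WEEK = 30

def passing_threshold(week):
    """Minimum cumulative points needed to be passing after a given week."""
    return 70 * POINTS_PER_WEEK * week // 100  # integer math: 70% of points so far

def students_in_danger(totals, week):
    """Students below the threshold, sorted from lowest to highest points."""
    cutoff = passing_threshold(week)
    at_risk = {name: pts for name, pts in totals.items() if pts < cutoff}
    return sorted(at_risk, key=at_risk.get)

week9_totals = {"Student A": 260, "Student B": 150, "Student C": 190}
print(passing_threshold(9))              # 189 points needed after Week 9
print(students_in_danger(week9_totals, 9))
```

Sorting the at-risk students low to high mirrors the low-to-high sort in the Gradebook that the manual transcription starts from.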

Data for communication. But the way I usually look at the chart is what you see below, focusing on the students who are currently having trouble by sorting on the current week's column:


Each week I communicate with the students who are currently in danger of not passing the class, but how I do that varies from student to student. I can eyeball the chart to see if they made good progress from the previous week, and if so I can send them an email of encouragement to keep on doing more of the same. By this point in the semester, I also know each student's story, so I can also take that into account when I write to them (e.g., overwhelmed by school, by work, by health problems, by life problems; every student has their own story). I have other columns in the spreadsheet that make it easy to get a fuller understanding of what is going on with each student because I have links to their class blog, their course project, along with their email addresses (I cannot stand the Canvas Messaging system and never use it unless I have to; that is also a subject for a separate post).

Data. Real Data. For me, this system works great. Every piece of work the students do for class is reflected here: their reading, their writing, their class project, their participation at other students' blogs and websites; all of their work for the course leaves a digital trail that is reflected here in those numbers. The numbers are not the work itself, but they are a good enough proxy, and I can then use the links right here in the spreadsheet to access the students' blogs and websites, seeing the work itself with a single click. For example, here is a link to my blog for the class as a student; the different types of assignments have their own labels so it's easy to browse the blog by type of assignment and/or by week of the semester. (I need to write up a post here sometime about my #TotalCoLearner experiment where I take one of my classes each semester as a student, doing all the work just like a regular student; it's so helpful!)

By contrast, the Canvas Gradebook is useless to me, so in another post I will chronicle in more detail the failure of Canvas to help me track the data I need, along with the uselessness of the data that Canvas provides me with instead. Sure, Canvas has data about my students (all those clicks! all those eyeballs!), but the data that Canvas collects is meaningless because it understands absolutely nothing about my course design. Not to mention that it knows nothing about my students and their stories.

But more on that later. For now, I just wanted to sing the praises of small data and how it makes it possible for me to keep up with my 90 students and their progress at a cost of just 10 minutes or so of my time each week.


Microassignments and Data Analytics


I've copied and pasted this old post from my Canvas Community blog in order to reference it here.

Microassignments. Each week of my class students have a set of 6 core assignments plus up to 8 different kinds of extra credit assignments they can complete (here's what a week looks like). Each assignment is worth 2 or 4 points depending on whether it is something quick (maybe 10-15 minutes) or something that takes more time (maybe 30-60 minutes). I ask students to spend approx. 5-6 hours per week on the class, which is the equivalent of 3 classroom hours plus 2-3 hours outside the classroom; the difference is that we have no classroom time since I teach fully online, so all the time that the students spend doing work for the class is active work: reading, writing, and interacting with each other. There are 30 points every week over a semester of 15 weeks. Students can decide whether they want to complete points for an A, a B, or a C (here's how I explain that to them), but I don't get into any of that; I have literally no idea how many students in my class right now are headed for an A or B or C; the students record their points in the Gradebook themselves (here's how that works). My only goal is that everybody should pass the class with at least a grade of C, so that is the only thing I track, and in this blog post I want to show how easy it is for me to do that using a simple spreadsheet.
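The semester points arithmetic can be sketched in a few lines. Note that the A/B/C cutoffs shown here are hypothetical, assuming conventional 90/80/70 percentages; the actual explanation students receive is the one linked above, not reproduced here:

```python
# Sketch of the semester points arithmetic: 30 points per week for 15 weeks.
# The A/B/C cutoffs are hypothetical (conventional 90/80/70 percentages),
# used only to illustrate how students could budget their points.

POINTS_PER_WEEK = 30
WEEKS = 15
TOTAL = POINTS_PER_WEEK * WEEKS  # 450 points possible in the semester

cutoffs = {grade: pct * TOTAL // 100
           for grade, pct in [("A", 90), ("B", 80), ("C", 70)]}
print(TOTAL)    # 450
print(cutoffs)  # {'A': 405, 'B': 360, 'C': 315}
```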

Analytics. So, on Monday afternoon, after each week is over, I go into the Canvas Gradebook and sort the total points from low to high. I manually transcribe the names and points of any student who has less than a passing grade (70%) for each of the three classes that I teach based on the total points so far. So, for example, last Monday was the end of Week 13. There were 390 points possible thus far, so I recorded the name and points of any student with fewer than 273 points. It takes literally just a couple of minutes to copy out the names and points, and then I paste that into my spreadsheet. You can see the result here; this shows all the students who were failing the class at any point during the semester, with their points in the weeks column, plus one column that tracks automatically how many weeks they were failing. That's literally all the data I need in a single screenshot to show how I measure course progress. The completely steady schedule plus the fine-grained microassignments make this a reliable set of measures. As you can see, some very stable patterns emerge here:


Here's what I see in that data:

Number of students failing. This is the total number of students who, in that given week (column) did not have a passing grade. As you can see, there is a large group of students there at the end of Week 2; a total of 16 students. These are the students who didn't understand at first that they really have to do work for the class every week. They slacked off in Week 2, and they could see immediately the results: not good! Of those students who were failing at the end of Week 2, 6 of them got on track and basically did not have any more serious trouble. Then the total number of students who were failing was pretty steady (between 8 and 11 students) every week up until Week 12. At that point, when they could see the end of the semester approaching, a few students really got their act together, and now I am down to just 5 students who are in real danger of not passing. That's out of a total of 90 students.

Failing weeks per student. This is another way to look at that same data in terms of the number of weeks in which students were failing the class. There is a group of 9 students that just spent 1 or 2 weeks in the danger zone, and of that group only 1 of them is in real danger now (they only recently started having trouble with the class). There are 2 students who spent 5 or 6 weeks in the danger zone, but they both pulled themselves out of trouble by Week 8 and did not have any trouble for the rest of the semester. Then there is the group of 8 students who have been in trouble for more than 10 weeks, and 4 of those students find themselves still in danger of failing now in the last few days of the semester.
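The "failing weeks" column that the spreadsheet tracks automatically amounts to counting, for each student, the weeks in which their running total was below the 70% threshold. A sketch, with invented data:

```python
# Sketch of the "failing weeks" column: count the weeks in which a student's
# cumulative total fell below 70% of the points possible so far
# (30 points possible per week). The weekly totals below are invented.

def weeks_failing(cumulative_points):
    """Number of weeks a student's running total was below the 70% threshold."""
    count = 0
    for week, pts in enumerate(cumulative_points, start=1):
        if pts < 70 * 30 * week // 100:
            count += 1
    return count

# A student who slacked off in Week 2 and then recovered:
print(weeks_failing([30, 32, 70, 100, 130]))  # 1 failing week
```

Sorting students on this count is what separates the slacked-off-once group from the struggling-all-semester group described above.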

These are the only students that I communicate with about their class progress. I send various kinds of assignment-based reminders to the class, especially at the start of the semester when people are developing their class routines, but the only students I communicate with about their overall course progress are the students who show up here on this spreadsheet, the students who are not currently passing. Every Monday after I transcribe the points to the spreadsheet, I send an email to each student who is failing. Sometimes it's a generic email that I send to the students via BCC, but sometimes it's an email to the individual student. As I get to know more about them and the problems they are facing (lack of time, procrastination, personal troubles, etc. -- each student's story is different), I can try to use what I know to write useful and encouraging emails that are forward-focused on what they can do to get on track for the class.

Note that this is just a small percentage of my students overall in the class. There are 90 students total, so that is fewer than 20% of the students who even appear here on the spreadsheet at any time, and fewer than 10% who have been seriously struggling. I'm still optimistic that all of them will pass, which is my goal for the semester. Last semester everybody passed, so that was a good semester for me. I'll post an update here on Friday when I see how this semester turns out.

UPDATE: WHOO-HOO! EVERYBODY PASSED! That makes me very happy. I also wrote up a post about end-of-semester evaluation comments from the students, including comments re: grading, here:
End-of-Semester Evals: Grading and Creativity


Microassignments and Completion-Based Grading


I've copied and pasted this old post from my Canvas Community blog in order to reference it here.

In my first posts in this series I gave an overview of my rejection of punitive "bad" grades and also my rejection of so-called "good" grades. In this post, I will provide an overview of my solution to these problems, where I give my students feedback about their work, but they do the grading.

I sometimes call this approach "all-feedback-no-grading" because, from my perspective, that is how it works: I give lots of feedback, but I never put a grade on anything. Students decide what grade they will get, not assignment by assignment but through their overall work in the class.

Again, this is just an overview, and I'll get more specific in later posts. Meanwhile, please feel free to ask questions, and that will help me know what to address in those future posts! I've been using this system for so long now (over 15 years) that it is totally familiar to me, and it's sometimes hard to gauge just what is self-explanatory and what actually needs explaining.

Microassignments

I use microassignments in my classes. I made up the term microassignment to convey the idea that there are no big, high-stakes assignments of any kind. Some of these little assignments take just 10 or 15 minutes to complete; others might take as much as an hour, but not more than an hour — unless, of course, the student gets excited and wants to spend more time; sometimes they do, especially when they are working on their class project.

I advise students to budget a total of 5-6 hours to spend on my class each week; how they schedule that time is totally up to them. Because the assignments are small, students can work on them in short snatches of time, or they can sit down and complete several assignments in a longer study session; it's all up to them. I love teaching online for just that reason: I am glad to take advantage of any time the students can find for this class.

Success is the sum of small efforts,
repeated day in and day out.

Gradebook Declarations

As students complete each microassignment, they record the completed assignment in the Gradebook using a "Gradebook Declaration," which is actually just a true-false quiz. The quiz contains a checklist of all the requirements for the assignment to be complete, which is more or less detailed depending on how complex the assignment is.

I include the checklist text in the assignment instructions, as you can see here: Week 1: Storybook Favorites

The Declaration checklists are also a good reminder to them about exactly what they need to have done for the assignment to be complete. Students answer "true" to indicate the assignment is complete and, presto, the points go into the Gradebook.

There is no partial credit; students get credit for completed assignments only. If they start an assignment, but do not have time to finish it, they can finish it the next week; everything rolls forward that way, so no work is lost.

The students do all this grade-work themselves. As they complete each and every assignment, they do a Gradebook Declaration. They find it a little strange at first, but they quickly get used to it. Admittedly, getting the students to slow down and read the Declaration sometimes takes a little work on my part at first; a few students start out treating the checklists as a kind of terms-of-service which they agree to without reading, but when I follow up with them about that, there are no further problems. Because every assignment leaves a digital trail at their blog or at their website, there is clear accountability. They know that; I know that. During the Orientation week, I watch all the blog posts carefully to make sure students understand how the system works.

Student Choice

There are many assignments students can choose to complete each week. Take a look here for a typical week:
Week 3: Myth-Folklore and Indian Epics.
(I have the same assignments in both classes; it's just the content that is different.)

As you can see there, each week has six "core" assignments which provide a week-long workflow: two reading assignments, a storytelling assignment based on the reading, blog commenting on other students' stories, a semester-long project, plus feedback on other students' projects. Most students complete most of the core assignments. Those assignments add up to a total of 30 points each week.

In addition there are eight "extra" assignments, and those add up to 20 points each week. Students can use those to make up any of the core assignments they missed. They can do extra assignments if they want to do more of something (more reading, more commenting, more technology, etc.). Students can also use the extra assignments as a way to accumulate points if they want/need to finish the class early. It's all good.

So, there are 50 possible points each week, but there is no expectation that students would do all that work. The idea is that they CHOOSE what assignments to do. They can focus on the core assignments and only on the core if they want, or they can mix in extra assignments. They can also work ahead if they want. It's all up to them.

Class Progress

As the points accumulate week by week, students can see if they are on target to reach their desired grade. Some of my students want/need to get an A. Some of them just need to pass the class with a C to graduate. Some of them can even take a D and have the class count for graduation; I always tell them to check with their advisor about that, though, because Ds do not always work for Gen. Ed. credit or for financial aid. Here's the chart they can use to track their progress: Progress Chart

My own goal is just that every student should pass the class. As far as I am concerned, this is a P/NP class. Whether a student wants to get an A or B or C makes literally no difference to me, and I do not know until I check the Gradebook at the end of the semester who is getting what grade; I only monitor the Gradebook for students who are in danger of not passing the class. More about my DIY data analytics here:
Microassignments and Data Analytics

Yes, It Works!

Yes, this is all kind of weird... but the students really like it. Here is every comment students have made about grading in my end-of-semester evaluations since we went digital back in 2010: Grading: What Students Say

I've been using this system, largely unchanged, since I started teaching online in 2002. The reason I haven't changed it much is exactly because of that student feedback: it works. Students like it. Students REALLY like it. And they like it for the reasons that are important to me: they feel in control of their grade, they are not stressed, it encourages them to be creative and learn a lot.

When I introduce this admittedly strange grading system to the students in their very first assignment for the semester, I include a link to those student comments. I can go on (and on and on) about the advantages of this system, but the most powerful words come from the students themselves! Here's how I introduce all this on the first day of class:
Designing Your Course

Thoughts from a Brand-New Student...

And since some of my students have started already for Spring 2019 (flexible schedule also means starting early if they want), I will share this screenshot of a blog post that popped up literally just a few minutes ago. I think this says it all; one of the Orientation Week assignments is for the students to let me know if this all makes sense and what they think. Here's what one of the students is thinking right now, at this moment. And it sounds good to me! This student understands not just how the course works but why it is set up this way, and I am excited to see what she will do with the reading and writing as she moves on to Week 2, and I'll see how that goes right there in her blog.


And maybe that should be the subject for my next post: how blogging and other visible student work is an important part of the shift from grading to feedback! More on that tomorrow. :-)

Sunday, March 24, 2019

The Paradox of Canvas's "Big Data" and Lack of Search

If you haven't read Phil Hill's piece about Instructure's new pursuit of machine learning, start there:
Instructure: Plans to expand beyond Canvas LMS into machine learning and AI

In particular, note Goldsmith's claim about the Instructure database: he says it is "the most comprehensive SaaS database on the educational experience." I'll set aside for now the very depressing view of "educational experience" which Goldsmith is promoting here (but more on that later), and I'll also set aside the claim about "most comprehensive" except to note that it sounds scarily similar to the boasting of Jose Ferreira, Knewton’s founder and former CEO: “We literally know everything about what you know and how you learn best, everything” (more at Marketplace).

What I want to emphasize here is the stark contrast between what Instructure is doing with data for its own purposes and the data it denies to teachers and students who are trying to use Canvas LMS: specifically, the fact that you cannot search the content of a Canvas course.

That's weird, right? 

It's very weird. Read on to find out more.

Seek and Ye Shall Not Find: No Search in Canvas

One of the biggest advantages of digital content is that it is searchable. So, you would think that if teachers go to the trouble of using Canvas tools to create content, then they would be able to search the content they create, and the students would also be able to search the content.

But... you cannot search your content in Canvas. You can create Pages, sure, and you can use the "rich content editor" in order to do that. But you cannot search that content, and your students cannot search the content either. 

Here's what the Pages look like to a student. No search box. No searching.


Here's what the Pages look like to the teacher: again, no search box. Add a new Page? Yes. Edit? Yes. Delete? Yes. Duplicate? Yes. But... Search? Nope. Thou shalt not search.


You have to remember everything yourself. Memorize. With your own brain. Because the Canvas database, big though it may be, is not going to help you here.

Given all the other problems with Canvas Pages (no folders, broken links if you try to rename a page, etc.), I cannot imagine why anyone would actually choose to use Pages to develop content, and I feel really bad for all the teachers who, because of school policies, are required to use Pages for their content.

Back when Canvas was just a scrappy LMS built on a shoestring budget, sure, I guess it sort of maybe kind of made sense that they skipped the search feature of the Pages area. Although even that is still very strange IMO. 

But Instructure's CEO is now making claims about how big their database is and all the data mining they are going to do... while teachers and students still cannot even search their own course content.

Project Khaki Did Not Deliver

This problem was supposed to be fixed after Project Khaki back in 2017, when search was one of the features voted up and supposedly made a priority for engineering resources. (Disclosure: I participated in Project Khaki that year.)

But that Project Khaki commitment led nowhere. The engineers scoped the project as "global search," and then they decided that they did not have the resources available to implement global search. Did they rescope the project? Like maybe the ability to search course content in Canvas Pages? Nope. They did not rescope the search project. It's just... deferred. And is there any timeline about the availability of search coming soon? So far as I know, there is no such timeline. Which means it is now the year 2019, and neither teachers nor students can search their own Canvas Pages to find something they are looking for.

Of course, if you can pay...

Given that search is a real need, perhaps it is not surprising that, yes, there is a third-party vendor, Atomic Jolt, that is willing to provide a Canvas search feature for you. Just look at all the happy Atomic searchers: laughing and smiling. Atomic Search users can search for content, they can save time... except they have to pay more money to do that. 


The existence of this paid service does prove one thing: schools really want to have a search option. So much so that they are willing to pony up additional money to pay another company simply in order to be able to search their course content and allow their students to do the same. (My school does not pay for this service.)

The Google Work-Around

The irony is that you can use a Google work-around: just open up your Canvas course — and, yes, this is my favorite thing about Canvas: you really can open up your course with real URLs, linkable and searchable. Once you do that, Google will be glad to index your content and return results via Google search. So, because I keep all my Canvas spaces fully open, I can use Google as my search engine even if Canvas will not let me search. 

Each course has its own distinctive subdirectory, so I just need to add site:canvas.ou.edu/courses/54178/ to search my Widget Warehouse course site for any term. For example, if I am looking for cats, I can search like this:



You can also do this across a school. So, for example, adding the search delimiter site:canvas.ou.edu/courses to a Google search will search all the open course content at my school. The problem is that very few instructors (how many? I don't know) choose to open up their courses. All the cat results you get this way are still my cats, except that now these are results for all the cats in my Canvas Pages across all my OU Canvas course spaces:
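Constructing these site-scoped queries is mechanical enough to script. Here's a minimal sketch; the domain and course ID are just the examples from this post, and the helper function is my own, not any Canvas or Google API:

```python
from urllib.parse import quote_plus

def course_search_url(term: str,
                      course_path: str = "canvas.ou.edu/courses/54178/") -> str:
    """Build a Google query URL that restricts results to one open Canvas course.

    The default course_path is just the example from this post; pass
    "canvas.ou.edu/courses" instead to search all open courses at the school.
    """
    query = f"{term} site:{course_path}"
    return "https://www.google.com/search?q=" + quote_plus(query)

print(course_search_url("cats"))
# https://www.google.com/search?q=cats+site%3Acanvas.ou.edu%2Fcourses%2F54178%2F
```

This only works, of course, for courses that have been opened up so that Google can index them; content behind the Canvas login never makes it into the index.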



Yes, Google is also extracting its own value from this search to build its advertising empire, but at least it is also returning some value back to me in the ability to perform my own searches.

Instructure, meanwhile, is extracting value from our course content as part of its machine-learning pipe dream, but they are not even letting us perform our own searches of that data.

Plus, Instructure is also missing out on one of the best possible data sources as a result: if they did let us search, they could learn a lot and share what they learn back with us.  Assuming that Instructure is willing to let us make use of our own data, in this case our own search history. Which is a very big assumption. And one I am feeling completely pessimistic about at this time.

LMS: Undermining Digital Literacy

One last point: I've argued before that the LMS is bad for digital literacy, and this lack of search is a perfect example. For students to become skilled users of digital tools, they need to use real tools, and the lack of search in Canvas shows how it fails as a real web tool. Search is one of the key components of digital literacy, but Canvas doesn't allow students to search, which further means that Canvas does not help students to learn how to search well.

So, while Instructure is busy mining our data supposedly in order to further our education, it is at the same time depriving us of one of the key educational tools that we need.

The year 2018 went by without a search feature in Canvas Pages.
I wonder where we will find ourselves at the end of 2019...?
I promise to update this post if/when news is available.

Meanwhile, you can learn lots more about web literacy, and about search in particular, at this nifty resource from the Mozilla Foundation: Web Literacy: Search


Plus there's a Latin LOLCat who knows all about the power of search... and, yes, I found this cat at my Latin LOLCats blog by using the search feature there. :-) 

Quaerendo invenietis.
By seeking you will find.





Saturday, March 23, 2019

Audrey Watters and Getting Back into the Groove

So, I'm back home after a very long week in Austin (but a good week: more about my dad and his xylophone here)... and I did have the luxury of lots of reading time while traveling, especially on the way home with a long layover in DFW. Outbound, I read Cathy O'Neil's Weapons of Math Destruction, which was fantastic (more on that later), and then on the way home I found the perfect antidote to my blogging blues: I read all four volumes of Audrey Watters's Monsters series: The Monsters of Education Technology, The Revenge of the Monsters of Education Technology, The Curse of the Monsters of Education Technology, and The Monsters of Education Technology 4. And WOW: that was exactly what I needed to read.


For those of you who haven't seen this series, for the past several years Audrey has curated and published a collection of her keynotes, covering a huge range of issues related to both education and technology. She also publishes the keynotes at her blog (with all the graphics too), but it's a very different experience to read them all together like this, especially all the books together, seeing her main themes develop in different ways while new themes also come into focus.

And a big dose of Audrey was exactly what I needed to get back into the groove of blogging here. I spent the past two years doing my best to make a useful contribution to the Canvas Community, publishing a couple hundred blog posts there, along with developing various open Canvas resource-courses. I was persuaded to do that because Canvas, unlike D2L and Blackboard, has an open architecture with real URLs that allows people to open up their courses so that others can look and link (although, sadly, almost nobody does that; you can see my own courses at Myth.MythFolklore.net and India.MythFolklore.net). Canvas also makes it possible to use web technologies like RSS, javascript, and iframes to bring external content into the system, so I built a Canvas Widget Warehouse full of javascripts for anyone to use, plus tutorials like Twitter4Canvas, etc.

Writing there at the Community, I really felt like I had a contribution to make, bringing what I had learned about open education practices to the community of people using Canvas (often required by their schools to use Canvas and nothing-but-Canvas), sharing my Canvas hacks and also learning from others at the Community. It's telling, I think, that Instructure does not use Canvas for this learning community because Canvas has no good community features, which has always been one of my complaints: even though Canvas does have some features that make it different from D2L and Blackboard, it is fundamentally a quizzing-and-grading system, and Instructure has devoted the lion's share of development resources over the past several years to the quizzes and gradebook. As a space for social learning, Canvas is terrible, and one of my big blogging projects at the Community was documenting how I use a blog network both for all the work my students create and share, and also for all the content that I develop. I've always said it was a big mistake for instructors to try to rely on the content development tools inside Canvas (the content tools are incredibly limited), and there are no real content development tools for students at all in Canvas, just the very tired, very old discussion board.

Despite the fact that I had made these criticisms of Canvas openly and often during my two years participating at the Community, my contributions were still welcome, and Instructure even brought me to the InstructureCon conference last summer as a kind of community booster, someone trying to help people connect with the network of learners using the Canvas Community space. But then, ten days ago, when I criticized the recent announcement by the new CEO Dan Goldsmith about the advent of AI, machine learning, and algorithms, the Community Managers decided that I had crossed a line, violating the rule that everything we post at the Community must be "uplifting" ... or else. That's why I'm now blogging here again after a long hiatus, and I posted a much more frank post about Goldsmith's vision for Instructure here at this blog: My Soylent-Green Moment: Instructure and the Future of Canvas.

So, as I was reading Audrey's books yesterday, this statement really jumped out at me — well, LOTS of statements jumped out at me, but this is what helped me frame what had happened to me at the Canvas Community and their rule of always-uplifting-all-the-time (bold emphasis is mine):
These are rarely topics addressed at education technology conferences; even the topics that might be most directly pertinent to the field – information security, institutional sustainability, for example – are brushed aside, in part I would argue, because education technology has worked quite hard to repress the trauma and anxiety associated with the adoption of new technologies and more broadly with the conditions, the precarity, of everyday life. Education technology has become so bound up in arguments about the necessity of technology, that it’s forgotten how to be anything other than the most loyal of servants to ideologies of machines, efficiencies, capitalism. It’s always sunny in education technology. Its tools are golden and transformative; its transformation, its disruption is always profitable and progressive. Better. Faster. Cheaper. Shinier.
You can read more at Audrey's blog; the quote is from this piece: Re·Con·Figures: The Pigeons of Ed-tech. And I highly recommend that you do read the whole thing; I think this is one of her very best pieces. 

So, I did my best to be sunny at the Canvas Community, contributing to the uplift, but at a certain point, there's just too much trade-off, too much compromise, too much important stuff that gets swept under the rug or shoved into the closet. It's time to talk about the trauma and anxiety, facing it honestly and figuring out what we can do to fight back. 

Stay tuned. :-)




Thursday, March 21, 2019

Digital Resources and Analog Music

I haven't been able to blog this week because it's our Spring Break and I'm at my dad's in Austin -- yes, we are now in Year Two of what was supposed to be just a few weeks of hospice care for lung cancer (he was diagnosed in October 2017 and went on hospice care in January 2018), and the hospice miracle continues. I come to Austin every month, and something really cool happened this visit which I wanted to write about here: my dad is learning to play the xylophone! At age 91!


Even though he loves music, he never learned anything about music in school or as a hobby. He can't/won't sing, and every time I suggested that he get some kind of instrument to play, he was completely negative about it. "I have no musical talent," he insisted.

Well, of course I know that this whole "musical talent" thing is just a fixed mindset myth: every human being is born ready to be musical, but sadly that musical impulse needs support and encouragement, and it can easily be squashed by unkind words or shaming, or simply by neglect.

What happened this week was that the hospice chaplain brought her mandolin to play some music for him, and he was fascinated by watching her play. And what a beautiful instrument! I had never seen a mandolin up close before myself. One of the songs she played was John Denver's Country Roads, and that's a special favorite for my dad.

So, even though my dad insisted that he didn't want an instrument and said that he was disgusted by the whole idea, I made an executive decision and ordered him a xylophone anyway! And................


.............. it worked! He is absolutely delighted by the thing and is already learning two songs: Mary Had a Little Lamb (which is all CDE and G) and Row Row Row Your Boat (which is CDE and F and G). We can't get him to sing along while he plays, which makes it harder, but no matter: there's no wrong way to get started, and there is no wrong way to continue... because he's MAKING music.

His xylophone is a very simple 13-note set of keys with no sharps and flats, and I also ordered a xylophone for his main caregiver (who loves music and sings beautifully) which has a full two octaves, plus sharps and flats.


And it was so cool to see how curious he is about her xylophone too: he can see that it is similar to his because it has the same white keys with the same letters (plus a few more), and then it also has all those black keys. This morning he played around on it, just listening to the sounds to hear what he could hear. And he accidentally started playing Frere Jacques on his own, and he recognized that's what he was playing! How cool is that???!

Plus, he could also see from her xylophone that this is like playing the piano, but without any finger work: the black and white keys are indeed like keys on a piano. Of course I told him that if he wanted to learn to play a piano keyboard later, we could get him one of those too, and then he could play any melody he wants.

One of the joys of the xylophone, though, is that it is not electric at all. This is good old analog music, and that analog experience means you hear changes in the resonance of the key based on how you hit it, how you hold the mallet, etc. When you get a good hit, the sound really is lovely, like wind chimes.

And, as promised in the title of this post, there is a digital part to this analog music story too. My dad's xylophone is meant for kids and it came with some simple children's songs written out with the lyrics plus the letter notes, easy to read and follow along (you can see the music cards in the picture above).

But there's no reason he should be limited to children's songs! These thirteen notes should be good for all kinds of music, right? So I went looking online for a source of music for beginners without any musical notation because I would never dream of trying to teach him to read sheet music. I wanted just the lyrics and the letter notes, and I found this wonderful website: NoobNotes.net. Here's how it presents a song, just the letters, with a little dot to indicate the lower octave and a ^ caret for the higher octave:


Not only does it have a wide selection of all kinds of songs, but it also has a feature that allows you to transpose, which meant I was able to go through the songs, adjusting to see if I could find something with a range between low G and high E, and with no sharps or flats, which would mean a tune that he could play on the little xylophone. When I found one that would work, I printed it out to add to our music collection.
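That transposition hunt can also be expressed as a tiny search: shift the melody by every possible number of semitones and keep the shifts where every note lands on one of the 13 keys. This sketch uses MIDI note numbers (G3 = 55 up to E5 = 76) as the encoding, which is my own assumption, not the NoobNotes letter format:

```python
# Sketch of the "can this song be transposed onto the little xylophone?" check.
# Notes are MIDI numbers; the 13 keys are the white notes from G3 up to E5.
# This encoding is an assumption for illustration, not the NoobNotes format.

XYLOPHONE_KEYS = {55, 57, 59, 60, 62, 64, 65, 67, 69, 71,
                  72, 74, 76}  # G3..E5, no sharps or flats

def playable_transpositions(melody: list[int]) -> list[int]:
    """Return every semitone shift that puts the whole melody onto the xylophone keys."""
    return [shift for shift in range(-24, 25)
            if all(note + shift in XYLOPHONE_KEYS for note in melody)]

# "Mary Had a Little Lamb" opening: E D C D E E E (in C major, around middle C)
mary = [64, 62, 60, 62, 64, 64, 64]
print(playable_transpositions(mary))  # → [-5, 0, 5, 7, 12]
```

Any nonempty result means the song is playable on the little xylophone; an empty list means the tune needs notes outside those 13 keys no matter how you transpose it.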


What a treasure trove! I found some of his favorite songs like Always on My Mind, Can't Help Falling in Love with You, Love Me Tender, and Wonderful World, plus many more. All conjured up with just 13 notes! In total, I found 23 songs at the site that are favorites of his and that fall within that 13-note range. Including... John Denver's Country Roads. So I alerted the chaplain that if she didn't mind playing Country Roads in his key, they could play together.

And that is the power of digital, letting me search for songs and transpose, with a great visual presentation that immediately alerted me to the presence of sharps and flats. Digital search in the service of analog experience!

So, I am really happy with how this trip has gone: I thought my main task this time was going to be to do my dad's taxes. And yes, I did the taxes (ugh), but this musical breakthrough has turned out to be the real success story. My dad has been given an incredible chance to really put things in his life right over the past year, and I am so glad that making his own music is a part of that miracle story.

Here is one of my favorite songs that you can play with just those 13 notes: Morning Has Broken.


Make music, people!


Saturday, March 16, 2019

My Soylent-Green Moment: Instructure and the Future of Canvas

This past week ranks as one of the worst weeks of my professional life: I learned that Instructure is going to be using (is already using?) the data collected about students in Canvas for machine learning and algorithms. I'm still completely shocked. If you haven't read the statements by Instructure CEO Dan Goldsmith in this report by Phil Hill, here is the article:
Instructure: Plans to expand beyond Canvas LMS into machine learning and AI

It's a kind of "Soylent Green" moment for me, realizing that a company and a product in which I had put a lot of faith and trust is going to be pursuing an agenda which I cannot endorse and in which I will not participate.



In this blog post, I'll explain my understanding of the situation, and then close with three main concerns that I have. There will be many more posts to come, and I hope those who know more than I do about machine learning in education will chime in and help me further my own education about this grim topic.

The Now: Canvas Data for Classes and Schools

I've not been impressed by the current Instructure data analytics since their approach is based only on surface behaviors, with no attempt to ask students the "why" for those behaviors (for example, short time spent on content page: because the student is bored? because they are confused? because it was the wrong page? because they have limited time available? because they got distracted by something else? etc.). Yes, Instructure collects a lot of data from students (all those eyeballs! all those clicks!), but just because they have a lot of data does not make it meaningful or useful. Speaking for myself, I get no benefit of any kind from the "Analytics" page for each student in my class that the Canvas LMS wants to show me:



I know that some schools also use the data from Canvas on an institutional level, but that's not something I know a lot about, and I also know there are commercial products, like Dropout Detective, that help schools extend their use of the data in Canvas. Just how a school tracks and uses the data it gathers about its students is for each school to decide.

At my school, for example, there is a strong presumption of student privacy when it comes to enrollment and grading data, as you would expect from FERPA. As an instructor, I use my students' ID numbers to report the students' grades (I am required to do that at the end of the semester, and I am urged to report midsemester grades, but not required), and that is all I can do. I cannot find out what other courses a student is enrolled in or has enrolled in, nor can I find out a student's grades or GPA.

And that is how it should be: it is not my business. Yes, that data exists. And yes, in some cases that data might also be helpful to me in working with a student. But just because the data exists and might be helpful does not mean that I can use it. The student starts with a fundamental right to privacy about their enrollment and grades, and it is up to the school to make decisions about how that data is shared beyond the classroom, like when advisors are able to look at a student's courses and grades overall, or aggregate analysis, like the way the university publicly reports on the aggregate GPA of student athletes, for example.

The Future: Instructure Robo-Tutor in the Sky

So, while my students' performance in their other classes is not my business, Instructure has decided to make it their business. In fact, they have decided to make it the future of their business. Goldsmith is emphatic: the Instructure database is no longer about data reports shared with instructors and with schools. Instead, it is about AI and machine learning. Instructure is going to be using my students' data (my students, your students, all the students) in order to teach its machine to predict what students will do, and then the system will act on those predictions. Quoting Instructure CEO Dan Goldsmith (from Phil's article, and yes, if they do have "the most comprehensive database on the educational experience in the globe," well, that's because we gave them all our data):


Welcome to your worst education nightmare: they are going to lump together all the data across all the schools, predict outcomes, and modify our behavior accordingly... thus sayeth Dan Goldsmith:


In future posts, I'll write in more detail about why this is bound to fail. The hubris here is really alarming; it's as if the executive team at Instructure learned nothing from the costly failures of other edtech machine-learning solutionists during the late, not-great era of the MOOCs. Back in February 2019, Michael Feldstein had speculated that this kind of hype might be subsiding (Is Ed Tech Hype in Remission?), but here we are just a few weeks later, and the hype is strong. Very strong.

Three Concerns

For now, I have three concerns I want to focus on:

1. What exactly did I agree to? To my shame, I put a lot of trust in Instructure, so it is indeed true that I clicked a checkbox somewhere at some point without reading the privacy policy and related legal policies. My students clicked such a checkbox too. At the Instructure website there is a Privacy Policy that relates to personal identifying information (you can access that from the Canvas Dashboard), and once you get to the Instructure website, you can also find an Acceptable Use Policy, but it seems primarily focused on indemnifying Instructure from wrongdoing by users (illegal content, objectionable content, etc.). I'm not a lawyer, but I guess it all hinges on this: "Instructure reserves all rights not granted in the AUP Guidelines." That sounds like they can use all the non-personally-identifying data as delimited in the separate privacy policy in any way they want, is that right?

They do state that they "respect the intellectual property of others and ask that you do too," but it's not at all clear whether they regard the content we create inside the system (assignments submitted, quizzes created and taken, discussion board posts, etc.) as our intellectual property that they should respect and not exploit without our permission. Hopefully someone who knows more than I do can figure out how this AUP compares to the kind of terms of service used by a company like, say, Coursera, which from the start was committed to machine learning and the exploitation of user content in its system.

I don't know what the Coursera terms of service look like now, but back when they first got started, they were very explicit about reusing our content to build their machine-learning system, as I wrote about when I took a first-generation Coursera course back in 2012: Coursera TOS: All your essay are belong to us. See that blog post for language like this: "you grant Coursera and the Participating Institutions a fully transferable, worldwide, perpetual, royalty-free and non-exclusive license to use, distribute, sublicense, reproduce, modify, adapt, publicly perform and publicly display such User Content," etc. I didn't see that kind of language in the Instructure policies, but I'm honestly not sure where to look.

Instructure does have a "Privacy Portal" with a cutesy graphic (visit the page to see the curtain being drawn and clouds of steam arising from behind the shower curtain). I thought the bold text beside the graphic would link to more information, but it is not linked. There's a privacy policy, an acceptable use policy, and a data processing policy linked across the top of the page, but I don't see anything labeled "terms of service" like what Coursera had in place. The shower curtain is labeled "privacy shield." Yeah, right.


2. What about opting out? Without an opt-out, Instructure is putting us in an impossible situation, way worse than with TurnItIn, for example. If a student insists that they will not use TurnItIn (as I think every student should do: just say no!), then it's easy to find work-arounds; teachers would just have to read the student's work for themselves without robo-assistance. But if a student says, no, they will not use Canvas because they do not want their data to be exploited for corporate profit, then that puts the teacher in a really awkward position. If you put all your content and course activities and assessments inside Canvas and a student does not want Instructure to use their data, what can the teacher do? It seems to me that Instructure needs, at a minimum, an opt-out for people who do not want their data to be used in this way by our new corporate overlords. Even better: it could all be opt-in, so that instead of assuming students and teachers all want to give their data to Instructure without compensation, you start with the assumption that we do not want to do that, and then Instructure can persuade us to opt in after all.

3. What about FERPA? Right now instructors at my school can put grades in Canvas for institutional reporting purposes (although I actually put mine directly into the SIS instead because the Canvas grading schemes can't accommodate my course design). My school then controls very strictly how that grade data is used, as I explained above. Now, however, it looks like that grade data is something Instructure is going to be mining, at the course level and at the assessment level, so that its machine-learning engine can track a student's performance both within classes and from course to course, analyzing their grades and related data to build its algorithms. To me, that seems like a violation of privacy. In legal terms, perhaps it is not a problem because the data is anonymized, but just because it is legal does not make it right. We are apparently giving Instructure extraordinary freedom to take our students' grades and supporting work and exploit them not just at an institutional level but, as Goldsmith stated (see above), across institutions, in ways that will be totally beyond our control. It's like TurnItIn profiting from our students' work (to the tune of 1.7 billion dollars, also in this week's news) without any form of compensation to the students, but way worse. WAY worse. It's not just the students' essays now. It's... everything. Every eyeball. Every click. Teachers and students alike.

Of course, I know Instructure, just like TurnItIn, will hire the lawyers they need to make sure they can get away with this. But how sad is that? I never thought I would write a sentence that says "Instructure, just like TurnItIn" ... and yes, I'm angry about it. Angry at Instructure for squandering money, time, and people's trust on what will turn out to be hype rather than reality (but more on that in a separate post). I'm also angry at myself for having put so much trust in Instructure. When I expressed my anger at the Canvas Community this week, I was told that my opinions violated the Community Guidelines, which require that everything we post there be "uplifting," so that is why I am back blogging here again after blogging for a couple of years at the Community. I have nothing uplifting to say about the new turn Instructure is taking, and I need a blog space where I am free to say that I am angry about this.

Human Learning

But every cloud (including a SaaS cloud) has a silver lining. I am now going to take my casual layperson's knowledge of machine learning and predictive algorithms in education (mostly gleaned from reading about the robograding of student writing) and learn more. If the machines are learning, we'd better get to work on our own learning too! And hey, perfect timing: it's Spring Break and I'll be spending two days in airports. Which means two days of reading.

I'm going to start with Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O'Neil.

Then what should I read next? Let me know here or at Twitter (@OnlineCrsLady).