So, this wasn’t the post I had intended to write today. But it’s what has come front and center, so I guess it’s the one that will get written. Today the folks at Instructure (makers of the Canvas learning platform) released an Alexa Skill for Canvas that allows faculty, students, or parents to query Alexa for Canvas-based information and activities. In the parlance of today’s politicians, I am troubled by this.
If you haven’t seen the announcement, here’s what the company posted on Twitter:
— Canvas LMS (@CanvasLMS) July 27, 2017
The announcement itself is pretty short and light on details, which isn’t necessarily unexpected for a press release during a company’s annual conference. The technical piece of the announcement is nerd cool for sure. It’s the implications for student data privacy that are giving me heartburn. And now here’s the part with some disclaimers. I am not an attorney. Nothing I say here should be construed as legal advice. If something I say here concerns you as well, you should check with your institution’s legal counsel. I do, however, have 20+ years of experience in higher education IT, so I have more than a passing familiarity with the issues I’m going to discuss.
Before getting into the crux of the matter, a little higher ed legal primer (just in case). Almost everything we do in higher education is in some way shaped by a law (and associated regulations and guidance from the Department of Education) called FERPA (the Family Educational Rights and Privacy Act). When you boil it down to its essence, it says that institutions are not allowed to share any part of a student’s educational record with outside parties unless those parties are defined as actors for the institution or the student consents to the release. There is a good deal more to the law than that, but for this discussion I think it’s a sufficient overarching description. At our institution, we have defined Instructure as a third-party actor. This is what allows us to use Canvas as our learning platform. Without that we wouldn’t be able to enroll students in Canvas courses, or let them submit assessments, or have faculty grade them, and so on. As part of our relationship with Instructure, they have policies in place that say they will follow FERPA as well and protect our FERPA-protected data. Now contrast that with Amazon’s Alexa Terms of Use, which include the following:
1.3 Alexa Interactions. You control Alexa with your voice. Alexa streams audio to the cloud when you interact with Alexa. Alexa processes and retains your Alexa Interactions, such as your voice inputs, music playlists, and your Alexa to-do and shopping lists, in the cloud to provide and improve our services. Learn more about these voice services including how to delete voice recordings associated with your account.
In addition, there is a section about Alexa Calling and Messaging Schedule that says:
While that second one is in a specific section for Calling and Messaging Schedule, it does refer generally to Alexa interactions, so my read is that any Alexa interaction is stored in the same way as described in the section above. If somebody knows for sure, please contact me or leave a comment and I’ll correct this.
So what we have now, to my eye, is a Canvas service that takes a voice request from Alexa (which is stored by Amazon) and sends data back to me in the form of a voice response (which, again, is stored by Amazon). I’ve been able to find no information on the Amazon site about their compliance with FERPA, I can’t find any information on any specific agreement between Instructure and Amazon regarding this, and I know my institution hasn’t declared Amazon an institutional actor for the purposes of FERPA. But why am I presenting this as a FERPA issue? Well, looking at the announcement, many of the queries to which Alexa will respond are things that are considered part of the student’s educational record (like details of courses). Given that, here is what I can logically say about FERPA-protected data in this case:
- If a student enables the interaction, they are consenting to the release of their own information, so the FERPA question largely answers itself: students can authorize the disclosure of their own educational records.
- If a parent enables the interaction (assuming your school allows parents access to Canvas), things are less clear (at least for higher education institutions). If the student is considered a dependent, then my understanding is that the parent can act for the student in cases like this. So it’s really just case #1 repeated. If the student is considered independent, they can’t. In that case you potentially have a parent who is not authorized to release student information doing so. So it will boil down to which students your institution considers dependent and which it considers independent (how you determine that is way outside the scope of this post).
- If a faculty member enables it, they are likely releasing FERPA protected data about a student without the student’s consent to a third party who is not an institutional actor.
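To make the data flow concrete, here is a minimal sketch of what a skill backend like this might do. This is entirely hypothetical: the instance URL and token plumbing are my own illustrations, not Instructure’s actual implementation. The Canvas REST endpoint for listing a user’s courses (`/api/v1/courses` with a bearer token) is real; everything else is an assumption.

```python
# Hypothetical sketch of an Alexa-to-Canvas query handler.
# CANVAS_BASE and the token handling are illustrative, NOT Instructure's code.
import json
import urllib.request

CANVAS_BASE = "https://canvas.example.edu"  # hypothetical institution instance


def fetch_active_courses(token: str) -> list:
    """Call the Canvas REST API for the user's active course enrollments."""
    req = urllib.request.Request(
        f"{CANVAS_BASE}/api/v1/courses?enrollment_state=active",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def speech_from_courses(courses: list) -> str:
    """Format course names into the text Alexa will speak aloud.

    The FERPA-relevant point: this string contains course details (part of
    the student's educational record) and travels back through Amazon's
    cloud, where Alexa interactions are retained.
    """
    names = [c["name"] for c in courses]
    return "You are enrolled in " + ", ".join(names) + "."
```

Whatever the real implementation looks like, the shape is the same: the spoken question and the spoken answer both pass through, and are retained by, Amazon’s cloud; only the middle hop touches Canvas.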
So where does this leave my institution? It’s worth noting that I have seen many projects derailed by unfounded FERPA concerns – I call it beating a project to death with the FERPA stick. But I think in this case the concerns are well founded (as I’ve tried to articulate above). So for now we probably won’t enable this service until we’ve had a chance to have a very detailed conversation with Instructure about it. And while all this is of concern, it’s not even the thing that I find most troubling. This is where we move from Big Data to Little Ethics.
There’s been nothing official I’ve found yet about the process for developing this new service, but I did see this tweet:
Yep, and it’s available now. Came out of a Canvas hackathon. And the developers worked on their own time to build it out.
— Mark Orlan (@MarkOrlan) July 27, 2017
As I understand it, the Canvas hackathon is both time at the conference to work with the developers on specific issues and Instructure’s internal program where engineers get a week a couple of times a year to work on any Canvas-related project they want to. As I mentioned above, from a purely technical standpoint, this is a neat thing that came out of that. But where was the moment where the engineers (or engineer, singular) working on this thought about whose data was going to be sent to Amazon? When did a supervisor or product manager ask how this might affect student agency? At what point did someone get feedback from the community about this service? Right now I can only speculate about the answers to those questions, but given the product I see today, what I can infer doesn’t say good things about the consideration given to the ethics of the situation.
Instructure is not alone in this. Yesterday iRobot, the company that makes the Roomba robotic vacuum cleaner, announced it was planning to sell information about the inside of your home to third parties. Legally they can almost certainly do it, because if you have one of these and have enabled certain cloud options, you gave iRobot the rights to your data (again, it’s in the TOS you didn’t read). And that’s just two examples in the last two days. It’s part of a disturbing trend where technology companies (and frankly other kinds of companies too) are more concerned about whether they can do a thing than whether they should do a thing.
Institutions aren’t off the hook either. If you decide you’re legally fine enabling the service, you’ve answered the “can we do the thing” question. What about the “should we do the thing” question? At a minimum, how will you help students, parents, and faculty understand the implications of the decision to enable this service and agree to yet another long terms of service agreement they didn’t read? How will you determine how this affects students’ agency over their data? As I said yesterday on Twitter, if anyone is wondering what the value of a liberal arts education is, it’s producing college graduates who are prepared to have these kinds of questions at the front of their minds and to grapple with them before building products, rather than as an afterthought (or as a reaction to some random person blogging about the issue). It’s the combination of big data with little ethics that really troubles me.