I’m not confused, I lost my glasses

I am always fascinated and creeped out by these stories about adapting system behavior to user emotion. The system described here is being tested by analyzing facial expressions to detect engagement with educational materials; those engagement readings are then used to predict test performance. I’d love to see some of the extracted data on what engaged expressions look like. I’ve had too many conversations with colleagues where I’ve asked “You teach X a lot, is that angry look they get their thinking look?” to expect that engaged expressions must look entertained or pleased, and I know my students have that conversation about my own facial expressions as well. Applications like this also seem significantly more useful (and the flow of personal data about one’s emotions easier to manage) if such a system were embedded in one’s own computer and thus tuned to the vagaries of one’s own facial expressions.
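
As a rough illustration of what the prediction half of such a pipeline might look like, here is a minimal sketch in Python. Everything in it is invented for illustration: the per-frame engagement scores (which would really come from some facial-expression classifier, the hard part the article is about), the summary features, and the test scores are all stand-ins, not anything from the actual system.

```python
# Hypothetical sketch: aggregate a student's per-frame "engagement" trace
# into summary features, then regress test scores on those features.
# All data here is fabricated purely to show the shape of the pipeline.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Pretend each student has a sequence of per-frame engagement scores in
# [0, 1], as might be emitted by an expression classifier watching them.
n_students, n_frames = 40, 300
engagement = rng.uniform(0.0, 1.0, size=(n_students, n_frames))

# Collapse each trace into a few plausible summary features.
features = np.column_stack([
    engagement.mean(axis=1),  # average engagement over the session
    engagement.std(axis=1),   # how much engagement fluctuated
    engagement.min(axis=1),   # the least-engaged moment
])

# Fabricated test scores, loosely tied to mean engagement plus noise.
scores = 60 + 30 * features[:, 0] + rng.normal(0, 5, n_students)

model = LinearRegression().fit(features, scores)
print("R^2 on training data:", model.score(features, scores))
```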

I am sure the intended use for such a tool is online educational materials, whether in a flipped-classroom setting, a MOOC, or what have you. But I can’t help but picture physical classrooms fitted with cameras at the front of the room, scanning all of the students and feeding real-time engagement graphs to a display on the lectern. So file this away, along with Google Glass, as another piece of evidence that we’ll see camera-blocking devices, or straight-up masks as a fashion accessory, become more prominent in the coming decades.
