
Expectations of Privacy
Data Collected In Class Should Not Be Misused

By Lisa Neal / September 2003


In a recent Boston Globe column about transponders, well-known Harvard Law School professor and author Alan Dershowitz said, "The effect is to provide them with more useful information about us. We become less and less sensitive to the expectation of privacy." Transponders, which many of us place on our car windshields so we can bypass toll booths, collect information about when and where we drive. What do transponders and the Massachusetts Turnpike Authority have to do with technology-assisted learning? Very little, except for the similarities in how capturing information offers benefits as well as the potential to infringe on privacy.

A few weeks ago, I taught synchronous sessions using both Centra and LearnLinc. Each allows a certain level of monitoring, in the sense that the session leader can ask questions and see not only who responds, but how quickly they respond. Similar capabilities can be found in traditional schools—according to another recent article in The Boston Globe, handheld devices that allow students to respond to questions are now prevalent in the large lecture halls of the University of Massachusetts. Whether in the classroom or online, the benefits are similar: The instructor finds out what material students understand, and students get the answers they need. At the same time, the instructor "sees" who is listening by the speed of response. There are relatively few distractions in a large lecture hall. In an online class, however, multitasking is a common temptation. It is all too easy, as I know from my own experience--although I shouldn't admit it--to get distracted by email, phone calls, and urgent tasks.

Neither the large lecture hall nor the virtual classroom approaches the richness of interaction possible in a small face-to-face seminar, where a teacher is acutely aware of furrowed brows as well as every sign of "getting it." But in the virtual classroom, keyed-in responses, virtual hand-raising, and feedback indicators provide this essential information to the instructor--information that has traditionally remained private. And all of this idiosyncratic, subjective, and contextual data is captured and recorded. Such information is always available to the instructor; the issue is whether others should have access to it as well and, if so, how it should be used.

When taking a course, students implicitly agree to the instructor's rules. Instructors have a certain amount of power over students through grades and other information they provide to the institution offering the course. However, most institutions deploy technology without providing adequate--if any--information about what data is captured and how it can be used. I first became aware of this when, years ago, I ran strategic planning sessions using anonymous brainstorming and voting tools. My clients never asked me how the data was stored, but I could easily see who contributed which idea and how everyone voted. I'm glad no manager ever asked me to divulge this data because there was nothing in writing that would have protected my refusal on ethical grounds.

What's the solution to the privacy question? In both the turnpike and education examples, we need explicit policies that clearly set forth what data can be collected, for what purposes, who has access to it, and how long it is retained. And we must establish a trust mechanism so that users understand the agreements they have with data-collecting entities. We must also look for innovative uses of data that benefit everyone, whether adjusting toll booth schedules during peak travel periods to reduce waits, or developing clearer and more engaging lectures. In the educational realm, such policies, trust models, and improved practices would benefit students, instructors, and institutions—everybody wins.


