There’s probably a camera watching you read this right now.
It might be above your head, on your desk, or in your pocket — in an unprecedented era of digital surveillance, cameras have become ubiquitous. So one would think students might have brushed off the revelation in February that the new M&M vending machines on campus contained cameras that could detect basic demographic attributes of their users.
Instead, students expressed their surprise and concern via the r/uwaterloo subreddit, and community members sent dozens of complaints to the Ontario Privacy Commissioner, with the story gaining international attention. So, what gives?
Several students said their reactions were largely driven by the fact that the inclusion of the cameras was not proactively disclosed to the university and to students.
“I think if there’s any kind of surveillance technology in something, you should at least be told, ‘Hey, there’s a camera,’” said Vanessa Boyce, a first-year physics and astronomy student. “Knowing companies, they might try to disguise it as, ‘Oh, it’s an anti-theft camera,’… but yeah, you should be informed if there is a camera. And if that camera is turned on and pointed at you.”
They added that while they’re used to security cameras and “be[ing] in the background of stuff, which probably also says something about how normalized surveillance is,” the singular focus on individuals is what bothered them about the use of cameras in the machines.
Dev Patel, a first-year accounting and finance student, said that the lack of notice about the cameras violated an implicit social contract that big tech companies like Apple have gotten us used to with regards to privacy and surveillance.
“For example, our phones have cameras on us and they take regular pictures of our face, right? But that information is already disclosed on the terms and conditions of Apple,” he said, referring to the discovery several years ago that iPhones emit 30,000 infrared dots in a known pattern whenever a face is detected as part of its TrueDepth IR camera. The information helps generate a 3D map of the user’s face, aiding the phone’s facial recognition abilities for features like FaceID.
Other community members voiced a similar sense of being accustomed to detection technology. “Facial analysis technology in particular is becoming so deeply integrated into everything we do that that initial surprise is also followed by a ‘Of course, of course there’s some sort of tracking device in these machines,’” said Krystle Shore, a postdoctoral researcher in the department of sociology and legal studies whose research focuses on the use of surveillance technology to solve social problems.
In a previous comment on the matter, Invenda, the manufacturer of the machines, told Imprint that its systems “adhere rigorously to GDPR regulations and refrain expressly from managing, retaining, or processing any personally identifiable information,” referring to the General Data Protection Regulation, the European Union (E.U.) law that applies to any company selling products or services in the E.U., including Invenda.
They did not explicitly address why students were not proactively made aware of the presence of detection technology in the machines.
Shore offered another potential reason behind the uproar, related to the growing collective understanding of the harmful implications of biometric surveillance practices and corporations “collecting and using our information for their own benefit.”
“Amidst all these concerns, about what information is being collected without our knowledge and consent, and where is it going, how is it being used, who’s benefiting from it, what might happen with it in the future… I think that, along with the sort of unexpected nature of this particular practice, left people feeling really uneasy,” she said, adding that the concerns “are especially salient, of course, for poor communities or underhoused communities and communities of colour who experience the disproportionate harms of these surveillance practices.”
Still, while students took issue with how the cameras were rolled out, their use in itself did not concern them as much.
“I don’t expect any privacy when you’re out and about,” Patel said. “Because for one, you have cameras all around campus. They’re not just to spy on you. They’re for safety.”
Other students similarly justified increased surveillance on safety grounds.
Puja Thaker, a second-year planning student, said that surveillance is justified particularly in crowded areas, which can sometimes conceal issues like the harassment of women. “You should be able to feel some sort of comfort,” she said. “And coming from an urban planning student, safety is one of the number one priorities in city planning. And especially on campus settings, a lot of students need that level of safety and comfort to continue their education.”
But does surveillance always equate to an increased sense of safety? Shore says not necessarily.
She cited the aftermath of the stabbing attacks in Hagey Hall last June, which sent the professor and two students of a gender studies class to the hospital, as an example of the assumption that more surveillance makes spaces safer. “We know that more surveillance does not necessarily equate with more safety,” she said. “Actually what it can do is sort of create a culture of fear and distrust among, you know, the community where the surveillance is being implemented.”
Surveillance can contribute to harmful practices and a disproportionate focus on marginalized communities. For example, though increases in closed-circuit television (CCTV) and police body cameras have been suggested as methods of combating police brutality, CCTV footage can misidentify individuals, and research on the effectiveness of features like body cameras is mixed. Both types of surveillance also target individual actions rather than structural inequalities.
While Shore herself thinks we should be able to give consent to surveillance practices, “that’s not the world we live in these days.” Responding properly to such an issue, she said, requires a balance of lobbying various levels of government and doing community work.
“Often people think the legislation needs to change at a big level, like the federal or provincial level, but we can lobby for municipal legislation and regulation. We can push for institutional regulation… here at the university,” Shore said.
She added that various organizations, including the Design Justice Network, Black Lives Matter, the Electronic Frontier Foundation, and the Canadian Civil Liberties Association, are already calling for action from technology companies and organizations that use surveillance. This action includes changing the ways we think about business practices and technology design, and the communities that are affected by those practices.
“We need to bring in the communities who are affected by these practices when we’re designing surveillance and when we’re implementing it, get their perspective, centre their perspective,” Shore said.
She praised students who took note of the cameras for being vocal at a time when it’s easy to become desensitized or overwhelmed by such practices. “When I’m teaching about this topic, I push my students to really just take note of the surveillance that’s going on around them,” Shore said. She hopes that rather than letting it become “background noise [in] our everyday life,” her students will question whether such surveillance is necessary and who really benefits from it, and then “certainly join the fight to make some changes, so that it’s not quite so pervasive and harmful.”
A university spokesperson said that UW is “committed to continuous improvement by regularly reviewing and refining procurement procedures to enhance effectiveness and address any identified issues like the technology that was installed in the vending machines.”