How smart home tech could perpetuate discrimination and racial profiling
Will smart doorbells ring in a new surveillance state?
Would you trust an algorithm to point out suspicious people in your neighbourhood?
That's where new technologies from Amazon and Google appear to be headed, as the tech giants establish their presence in the home security market. But according to some experts, this kind of civilian surveillance could have real consequences for privacy, and could make racial profiling worse.
Google's "Nest Hello" is a smart doorbell that stores video constantly and uses facial recognition technology. Amazon has released the "Alexa Guard" app in the U.S. that will use existing Alexa devices to listen for potential break-ins.
"Home security is the number one reason people buy smart home products," Stacey Higginbotham, host of the Internet of Things podcast, told Spark host Nora Young.
She said home security is also a way for companies like Google and Amazon to get in (or on) the door of homes whose owners don't see a need for a smart speaker or similar device.
"It's a very different thing to stick a camera that's facing outward on your house, and use that to track what's going on in your neighborhood. That is much less concerning to most consumers, but then it does open up a whole lot of new services for these providers," she said.
Reproducing discrimination
But it's those outward-looking cameras, and what they do with what they see, that have Chris Gilliard concerned. He's a professor at Macomb Community College in Michigan, and his research focuses on how data mining and algorithmic decision-making can reproduce discrimination.
Amazon's Ring smart doorbell has a companion app called Neighbors. It lets users share and comment on security concerns in their area, and even post video footage, which often comes from the doorbell's camera. There have been reports that police departments in a number of U.S. cities have offered free or discounted Ring doorbells to local communities.
"I'm concerned about the move to a 'more surveillance' society. I don't think that more cameras equals more safety; In fact I think the inverse is true, that for marginalized communities, more cameras often mean less safety," Gilliard said.
He pointed to examples where people of colour have been targeted by residents or police simply for being in an area that is predominantly white.
"I think there is a very significant difference between a device or a system that alerts someone when an entrance has been breached—you know when the door's open when it's not supposed to be open... to a system that is constantly watching and listening, picking up all kinds of signals."