SURVEILLANCE IMPLICATIONS FOR OLDER ADULTS AGING IN PLACE WITH SMART HOME TECHNOLOGY
AUTHOR(S) & CREDENTIALS: Jessica Percy Campbell, PhD in Political Science from the University of Victoria, and Cassy Hemphill, Communications and Engagement Coordinator at the APPTA Hub and AGE-WELL NCE.
AFFILIATED INSTITUTION(S): AGE-WELL National Innovation Hub: APPTA
CAN YOU TELL US A BIT ABOUT YOURSELF?
Jessica Percy Campbell is a recent PhD graduate of the Political Science department at the University of Victoria. In April 2023, Jessica successfully defended her dissertation on the privacy and surveillance implications for older adults who are aging in place with Google and Amazon smart speakers. Jessica is also a sessional instructor in the Political Science department at the University of Victoria, where she teaches Politics of Surveillance, and a (remote) clinical research analyst for the University Health Network.
CAN YOU TELL ME A BIT MORE ABOUT YOUR PROJECT?
Aging in Place with Google and Amazon Smart Speakers: Privacy and Surveillance Implications for Older Adults explores some of the potential ethical issues involved in using commercial-grade smart home technologies for health and aging purposes. As we move further into the realm of smart home technologies for aging, Jessica asked: how can we best safeguard privacy and autonomy, assess and protect against algorithmic discrimination, and limit the commodification of user data?
WHAT MOTIVATED OR INSPIRED YOU TO BEGIN WORKING ON THIS PROJECT AND CAN YOU DESCRIBE THE MAIN ISSUE OR CHALLENGE YOU INTENDED TO ADDRESS WITH IT?
What began as a concern for her grandmother quickly turned into a passion. When Jessica's grandmother started living alone for the first time, many suggested installing smart home or wearable technologies, such as smart security cameras or an emergency call button in case of a fall. Older adults are often encouraged by marketers to use voice-activated digital assistants to help them live in their homes for longer by acting as digital companions, mobility aids, appointment reminders, and more. Although these devices offer plenty of benefits, the suggestion of introducing them into her grandmother's household led Jessica to research smart homes for older adults. To her surprise, Jessica found that smart-home technology brands such as Google and Amazon were often marketing their speakers to older people. While helpful in the ways listed above, smart-home technologies can also infringe on older adults' autonomy and carry various privacy and surveillance implications.
WHY DO YOU THINK RESEARCH IN THIS AREA IS IMPORTANT?
As our aging population challenges strained health and senior care systems, smart home technology is positioned to alleviate some of the pressure. At the same time, under surveillance capitalism, developers and marketers stand to profit from collecting massive amounts of user data to predict, modify, and control behavior through targeted advertisements. While Canadian private sector privacy legislation hinges on meaningful user consent for data collection, obtaining such consent can prove difficult for smart speaker users in general, especially for older adults with limited experience with technology.
WHAT DO YOU THINK MUST OCCUR IN ORDER TO REDUCE THE PRIVACY AND SURVEILLANCE IMPLICATIONS OF SMART-HOME TECHNOLOGY FOR OLDER ADULTS?
Jessica believes that a combination of user-centric design and education initiatives is required. She elaborated on these areas with the following approaches, respectively: including users of all ages and technological abilities in the design processes of the products they use, and creating publicly funded, community-based initiatives to help older people learn how to protect their privacy, both on online platforms and with their smart home devices.
Along with user-centric design and education initiatives, Jessica also highlighted the importance of Linnet Taylor's three pillars of data justice as an approach to regulation. These include: (1) the right to visibility/invisibility (e.g., being represented in certain databases when it is beneficial, such as healthcare databases, while also maintaining the right to exclude oneself from others, such as commercial databases); (2) disengagement (e.g., the right to be given meaningful alternatives when technological solutions are not the preferred options); and (3) anti-discrimination (e.g., the right to assess and challenge the potential for bias in AI outputs).