“The Privacy Implications of Virtual Personal Assistants” (BusinessWest/August 7, 2018)

Everyone is now familiar with Alexa, Siri, and Google Assistant, the virtual personal assistants (VPAs) marketed by Amazon, Apple, and Google, respectively.

VPAs contain voice-activated applications that promise users a chipper, responsive intelligence for dealing with everyday tasks like phone calls, calendar reminders, coffee orders, streaming entertainment, and list making. In the courtroom, however, law enforcement, digital privacy activists, technology companies — and, yes, Alexa herself — have been exploring the First and Fourth Amendment implications of VPAs’ eclectic résumé.

While VPAs are working for their users, they are also working for Google, Amazon, Apple, and other companies interested in consumers’ habits, interests, and data. Alexa, for example, is regularly ‘listening’ and scanning for her ‘wake word.’ When she hears it, she records the vocal input and her response, then uploads that data to a server in the cloud, effectively reporting it up the chain to her digital overlords at Amazon.

According to the Alexa terms of use, Amazon retains these ‘Alexa interactions,’ which include music playlists and shopping lists in addition to ‘vocal input,’ for an unspecified amount of time. This retention is allegedly to provide, personalize, and improve those services, but it also undoubtedly supplies those technology companies with a veritable harvest of valuable data.

Read more: https://businesswest.com/blog/the-privacy-implications-of-virtual-personal-assistants/

Related People: Lauren C. Ostberg

Related Services: Cybersecurity