Smart speakers – like Google Home and Amazon Echo – have changed the way our homes and offices function. Indeed, these voice-activated speakers execute simple commands provided by voice or smartphone application. With nothing more than a question, one can direct the smart speaker to, among other things, play music and podcasts, provide a weather forecast, or set an alarm.
The technology is straightforward. For example, Google Assistant, the voice-activated software associated with the Google Home, “listens” for a hotword.* When the smart speaker hears the hotword, the device switches to “active listening” mode, records and analyzes the audio that follows, and executes the command it contains. While the audio is used to carry out the user’s commands, the recorded data is also used to (1) target personalized advertising to end users, and (2) improve the device’s voice recognition capabilities.
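To make that flow concrete, here is a minimal, hypothetical Python sketch of the listen-wake-execute loop described above. It is an illustration only: the function names and the string matching are assumptions of this sketch, not Google’s actual implementation, and real devices match acoustic models against raw audio rather than text.

```python
# Hypothetical sketch of the "listen -> wake -> record -> execute" flow.
# Real smart speakers match acoustic models against raw audio; plain
# strings stand in for transcribed audio here purely for illustration.

HOTWORDS = ("okay google", "hey google")  # common wake words (see footnote *)

def extract_command(phrase: str):
    """Passive listening: if the phrase opens with a hotword, return the
    command that follows it; otherwise return None and keep waiting."""
    lowered = phrase.lower()
    for hotword in HOTWORDS:
        if lowered.startswith(hotword):
            return phrase[len(hotword):].lstrip(" ,")
    return None

def run(audio_stream):
    for phrase in audio_stream:            # the device "listens" continuously
        command = extract_command(phrase)
        if command is not None:            # switch to "active listening" mode
            print(f"Executing: {command!r}")
        # A "false accept" (see footnote ***) is this check firing on audio
        # that merely sounded like a hotword, recording it unintentionally.

if __name__ == "__main__":
    run(["Hey Google, what time is it?", "ordinary kitchen conversation"])
```

In this toy version, only the first phrase triggers active listening; the second is ignored. The class action discussed below alleges, in effect, that the real-world equivalent of this check fired when it should not have.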
Given these competing uses, it should come as no surprise that smart speakers implicate privacy concerns. Those concerns recently received attention in a class action lawsuit filed in California.** At issue in that lawsuit was the smart speakers’ recording and storing of users’ private conversations without the users’ knowledge or consent, potentially violating privacy rights and expectations.***
Smart speakers, while fun and convenient, are not without risk. Users should consider the following:
Devices with Cameras – Certain smart speakers include cameras that can be used to video-conference or chat with friends and family. Just as the device can listen without one’s knowledge or consent, a device with a camera may capture video recordings as well. And so, to avoid unintended data collection, it may be a good practice to deactivate the camera when not in use.
Vulnerability to Hackers – Like any other computer or smart device, smart speakers and their software are susceptible to hackers. Deactivating the camera and microphone when not in use is good practice to minimize that vulnerability. For those who want additional safeguards, consider covering the camera or unplugging the device when it is not in use. Although these protective measures detract from the convenience of the technology, they may be precautions worth taking.
Personal Information – Some smart speakers and voice-activated software can be used to pay bills, transfer money, check balances, and the like. However, this functionality requires users to give the smart speaker access to their confidential banking information. Given the privacy issues mentioned above and discussed in the California class action, using the speakers for banking and similar tasks introduces an additional layer of risk: your financial information (not just your kitchen conversations) may be accessible to hackers who infiltrate the device’s software.
Many of us welcome new technology that makes our daily lives more convenient. In seeking this convenience, however, we must also be mindful of the attendant privacy risks. And so, before you enter your kitchen and ask, “Hey Google, what time is it?” understand that people may be watching and listening.
* The “hotwords” or “wake words” that call Google Assistant to attention are commonly “Okay Google” or “Hey Google.” For the Amazon Echo, the wake word is usually “Alexa.”
** In In re Google Assistant Privacy Litigation, the class, consisting of purchasers of any Google Assistant-enabled device, brought various privacy violation claims under both federal and California state law.
*** For a full recitation of the facts at issue in the lawsuit, consult In re Google Assistant Privacy Litigation. Briefly stated, the class brought suit because the various smart speakers were recording and storing audio when they “heard” what they thought was (but actually was not) a “hotword” or “wake word.” These unintended activations, known as “false accepts,” resulted in recordings and transcripts that the end user never intended to create. And, rather than delete the recordings and transcripts generated by “false accept” activations, Google used them for its own purposes, as if they were authorized recordings. Additionally, plaintiffs alleged that many of the recordings obtained by Google contained conversations of children who could not consent to being recorded. Amazon has been the subject of similar lawsuits concerning its Amazon Echo device (see https://nypost.com/2019/06/13/amazon-sued-for-recording-childrens-voices-via-alexa/).