
Healthcare Workers Allege Amazon Alexa Violates Privacy

Lawsuit Highlights Virtual Assistant Device Risks
A lawsuit alleges Amazon Alexa device recordings violate privacy and wiretapping laws.

Amazon's Alexa virtual assistant device and applications are unlawfully recording and storing highly sensitive and private conversations, including discussions of patient information, that were not meant to be recorded, four healthcare workers allege in a lawsuit seeking class action status.

The lawsuit also alleges that, at the time customers purchased the products, Amazon failed to disclose that the product "makes, stores, analyzes and uses recordings of these interactions … to listen to, interpret, and evaluate these records … for its own business purposes."

Patient Information

The plaintiffs - healthcare sector workers who purchased the devices for use in their homes - allege that they noticed their Alexa devices would activate even when they, or other people in their homes, had not issued any commands.

One of the plaintiffs works in the psychiatric field "and communicates and works with HIPAA-protected information that Alexa may have captured without [the plaintiff's] intent," according to the lawsuit. Another plaintiff is a substance abuse counselor.

The lawsuit notes that each of the four healthcare workers stopped using their Alexa devices when they learned that "Amazon may be recording, storing and listening to their conversations" about patients without their intent.

"Amazon has a repository of tens of millions of conversations that it maintains and listens to and that is available for Amazon’s use," the lawsuit alleges.

'Millions' of Interactions

The lawsuit alleges that Amazon represents that its Alexa devices work by listening for a specific "wake word," such as "Alexa," which, once spoken, triggers the Alexa device or service to listen to users and respond to user commands.

"What it does not represent, however, is that when Alexa hears any word or phrase it identifies - correctly or not - as a 'wake word,' Amazon initiates a process to record the surrounding audio and create and permanently store the recording," the complaint alleges.

"The recordings may capture the user’s voice, and the voice of any individual near the Alexa device, as well as commands, other sounds and identifying information like usage data, location data, and personal information."

Amazon has "millions of recorded interactions between users and its Alexa Devices," the lawsuit contends.

"Worse, not all those recordings contain conversations that consumers intend for an Alexa device to hear. In fact, because Alexa devices are trained to start recording when the device believes it heard a 'wake word,' user conversations may be recorded when the Alexa device misinterprets the user’s speech and incorrectly identifies a 'wake word' that was not said. Thus, Alexa devices may be recording private conversations that users never intended Alexa, or anyone else, to hear," the complaint alleges.

Alexa devices, in some cases, record activity even when not intentionally addressed with a "wake word" and send the recordings to Amazon, the lawsuit alleges. Amazon's artificial intelligence systems, employees and third-party contractors "freely listen to and analyze their contents and make use thereof for Amazon’s business purposes," the complaint alleges.

At the time consumers purchased their Alexa devices, "Amazon failed to disclose its widespread creation, storage and use of those records for its own business purposes that extend beyond improving or personalizing Alexa’s services," the lawsuit alleges. "Instead, Amazon represented that Alexa sent audio to the cloud for the sole purpose of generating an appropriate response."

Amazon's "surreptitiously recording" consumers has violated federal and state wiretapping, privacy and consumer protection laws, the lawsuit alleges. "The only way to stop Amazon from making these recordings is to mute the Alexa Device’s microphone or unplug the device, thereby defeating its utility."

Seeking Damages, Injunction

The lawsuit also claims that although Alexa device users may request that Amazon delete all of the information obtained from a smart speaker device, "a user may not stop Amazon from collecting the recordings in the first place. Amazon did not provide users with the ability to delete records until 2019 and, by then, Amazon’s analysts may have already listened to the recordings before that ability was enabled."

The lawsuit is seeking to represent all adult U.S. citizens who have owned and used an Alexa device or downloaded and used the Alexa app since 2017.

"Given that over 200 million Alexa devices have been sold worldwide, with approximately half of those in the United States, the number of class members is likely in the tens of millions," the suit states.

The lawsuit seeks damages, "an order declaring that the acts and practices of Amazon violate the state and federal privacy laws," and "a permanent injunction" preventing Amazon from "continuing to harm plaintiffs and members of the Class and the public."

Amazon Statement

Amazon declined Information Security Media Group's request for comment on the lawsuit.

But in a statement, it says: "Alexa and Echo devices are designed to only detect your chosen wake word (Alexa, Amazon, Computer, or Echo). No audio is stored or sent to the cloud unless the device detects the wake word - or Alexa is activated by pressing a button. Customers have several options to manage their recordings, including the option to not have their recordings saved at all and the ability to automatically delete recordings on an ongoing three- or 18-month basis."

Uphill Climb

Some legal experts say the lawsuit's plaintiffs face an uphill battle.

"The class action lawsuit charges Amazon with violations of federal and Washington state wiretap laws and Washington state consumer protection law," says privacy and security attorney Paul Hales of the law firm Hales Law Group. "Eye-catching claims that Alexa might have recorded HIPAA-protected health information are irrelevant at this stage. First plaintiffs must convince the court they even have standing to bring this class action by overcoming an onslaught of pretrial motions the defense is certain to raise," he says.

Claims of unsuspected HIPAA breaches by home-based healthcare workers "highlight a dangerous problem in this COVID-19 era," he says. "Telehealth and telecommuting have skyrocketed to cope with the pandemic. Healthcare providers working from home must protect health information just as when they work in a clinic or office."

While the healthcare industry is making progress in better securing information, "remote workplace privacy risk levels remain high," he says. "As a result of this lawsuit, cautious compliance officials may add ‘turn off Alexa’ to remote worker privacy checklists."

Risky Move?

Technology attorney Steven Teppler, a partner at the law firm Mandelbaum Salsburg P.C., agrees that the plaintiffs face hurdles in their lawsuit, saying "both sides will have grist for the mill."

"First, the class definition is extremely broad," he notes. "The argument made by plaintiffs is … compelling. But seriously, why on earth would a practicing physician have an active Alexa-based smart device anywhere near where confidential information is being exchanged?" he asks.

"The physician may have a good argument for misrepresentation by Amazon, but there is an equally coherent argument that the physician - or anyone handling confidential/sensitive information, including attorneys … failed to implement commercially reasonable security."

Teppler adds: "There are privacy issues involved with any 'smart' device or service, because it involves the exchange of information. It should come as no surprise that the 'smarter' the device or service, the more information it will collect, and likely seek to further monetize."


About the Author

Marianne Kolbasuk McGee

Executive Editor, HealthcareInfoSecurity, ISMG

McGee is executive editor of Information Security Media Group's HealthcareInfoSecurity.com media site. She has about 30 years of IT journalism experience, with a focus on healthcare information technology issues for more than 15 years. Before joining ISMG in 2012, she was a reporter at InformationWeek magazine and news site and played a lead role in the launch of InformationWeek's healthcare IT media site.



