Healthcare workers accuse Alexa of possibly recording protected info

In a class action filed this week, healthcare workers alleged that their Alexa-enabled devices may have recorded their conversations – including potentially protected information.

Some of the plaintiffs, who include a substance abuse counselor and a healthcare customer service representative, say they work with HIPAA-protected information. Others say they have private conversations with patients.

All four raise concerns that Alexa may have captured sensitive information they never intended to share.

“Amazon’s conduct in surreptitiously recording consumers has violated federal and state wiretapping, privacy, and consumer protection laws,” alleged the lawsuit, which was filed in federal court in the Western District of Washington. Amazon did not respond to requests for comment.

WHY IT MATTERS  

The plaintiffs’ complaints are twofold: They say that users may unintentionally awaken Amazon Alexa-enabled devices, and that Amazon uses human reviewers and artificial intelligence to listen to, interpret and evaluate those recordings for its own business purposes.

“Despite Alexa’s built-in listening and recording functionalities, Amazon failed to disclose that it makes, stores, analyzes and uses recordings of these interactions at the time plaintiffs’ and putative class members’ [sic] purchased their Alexa devices,” read the lawsuit.

The four plaintiffs, all of whom work in the healthcare industry in some capacity, say they either stopped using Alexa devices or purchased newer models with a mute function out of concern that their conversations may be unintentionally recorded, stored and listened to.

The suit cites studies, such as one from Northeastern University, that have found smart speakers are activated by words other than “wake words.”  

For Amazon devices, researchers found activations with sentences including “I care about,” “I messed up,” and “I got something,” as well as “head coach,” “pickle” and “I’m sorry.”

Some of the activations, researchers found, were long enough to record potentially sensitive audio.

In 2019, Amazon announced an “ongoing effort” to ensure that transcripts would be deleted from Alexa’s servers after customers deleted voice recordings. Amazon executives also noted in 2020 that customers can “opt out” of human annotation of transcribed data and can choose to automatically delete voice recordings older than three months or 18 months.

“By then, Amazon’s analysts may have already listened to the recordings before that ability was enabled,” the lawsuit argues.

THE LARGER TREND  

Amazon has made inroads over the past few years in implementing voice-enabled features aimed at addressing medical needs.

But some users still express skepticism about using voice technology and AI for health issues.  

And in December 2019, privacy organizations in the United Kingdom raised concerns about a deal that allowed Amazon to use NHS data.  

ON THE RECORD    

“Plaintiffs expected [their] Alexa Device to only ‘listen’ when prompted by the use of the ‘wake word,’ and did not expect that recordings would be intercepted, stored, or evaluated by Amazon,” read the lawsuit.   

“Had Plaintiffs known that Amazon permanently stored and listed [sic] to recordings made by its Alexa device, Plaintiffs would have either not purchased the Alexa Device or demanded to pay less,” it continued.

Kat Jercich is senior editor of Healthcare IT News.
Twitter: @kjercich
Email: [email protected]
Healthcare IT News is a HIMSS Media publication.