Think twice before sharing medical secrets with Amazon’s Alexa, Florida’s CFO warns

  • TECH
  • Thursday, 18 Apr 2019

An Alexa-enabled Echo Plus. (Amazon)

Amid reports that Amazon employs thousands of people to listen to customers talk to their virtual personal assistant Alexa, Florida’s chief financial officer is warning residents to be careful what they say about their private health matters. 

He also revealed steps Alexa owners might not realise they can take to protect themselves. 

Bloomberg News last week reported that Amazon employees around the globe are listening to voice recordings captured in the homes of owners of its popular line of Echo smart speakers. The recordings are transcribed and fed back into the devices' operating software to help it better understand human speech. 

Occasionally workers hear possible evidence of crimes, including sexual assaults, the report stated. But Amazon says its employees have no way to track down the identities of the voices they hear. 

Meanwhile, Amazon has announced that it’s developing ways that patients can one day use the Echo to communicate securely with health care providers and pharmacists. 

Florida CFO Jimmy Patronis says customers should be wary about how secure their information actually is with Alexa in their house. 

"How comfortable are you with a complete stranger listening to audio files from your home? That’s a reality with Amazon’s Alexa technology," Patronis said in an April 15 news release. "Although it’s built to make our lives easier and more productive, there are real privacy risks." 

Alexa owners should not only beef up privacy protections but also weigh the risks of sharing personal information, especially prescriptions or other medical details, Patronis said. 

Because of the 2017 data breach at the Equifax credit monitoring company, 143 million people now face the potential of lifelong threats of identity theft, the news release said. Added Patronis: "It’s only a matter of time before voice technology suffers a breach." 

Patronis offered three steps Alexa owners can take to protect their privacy: 

Turn off the microphone and camera when you aren’t using your Alexa device. Alexa products capture all voice commands and other sounds in the room. If you don’t intend to use the device, turn your mic off to ensure it can’t record things being said. For newer devices with cameras, turn off that function when you know it won’t be used. 

Delete old Alexa recordings. All of your recordings remain on Amazon’s cloud until you delete them, Patronis said. Some may include private information. You can listen to and delete them by going to Settings > History in the Alexa app. You can also utilise the dashboard available at 

Limit or disable Drop In permissions immediately. The Drop In feature on Alexa devices is designed to create an intercom system with another device. Be very careful when granting any Drop In permissions, as the feature allows others to access your mic and speaker, whether they are in another room or another city. – Sun Sentinel/Tribune News Service

