Alexa, 'Destroy Civilization!' Why Amazon's Device Is Destroying Privacy As We Know It

April 22, 2021, Updated 6:30 a.m. ET

Gamblers at an unnamed US casino were more interested in the action on the poker and blackjack tables than in what was going on inside the tropical aquarium in the wall. Security ignored it too, until it was used to breach the establishment’s high-roller database, with 10 gigabytes of sensitive financial information siphoned off to a server in Finland. The aquarium was fitted with a smart system that connected it to the internet and allowed staff to remotely feed the fish and adjust the temperature and salinity.

Hackers exploited a weakness in the software the system ran on and used it as a digital trapdoor into the casino’s main network.

The hack occurred in 2017. In the years since, homes and offices have filled with similar connected smart devices, ranging from smart speakers to fitness monitors, scales, kettles, vacuum cleaners, coffee machines, door locks, kids’ toys, and lightbulbs.

According to Statista, we are projected to spend $62bn on smart home devices this year, rising to $88bn by 2025. These gadgets connect to smartphones, to smart speakers, or directly to home wi-fi networks, and the vast web they form is called the Internet of Things, or IoT. In this techno-utopia, lives are run by algorithms, artificial intelligence (AI), apps, and gadgetry that collects sensitive personal data and sends it to be stored and analyzed on cloud servers controlled by faceless organizations.

Surprisingly, given their place at the heart of our homes, security and privacy are barely a consideration when we buy a set of smart scales or a smart toothbrush. They barely factor into the design and manufacture of these devices either, which, according to Craig Young, principal security researcher at Tripwire, presents the world with a doomsday-scenario-sized problem.

“How bad could it get? Complete loss of internet functionality,” he offers. “In Ukraine and Estonia, systems were attacked and citizens spent weeks without ATMs, government pension checks or electricity. These are dangerous situations that really disrupt faith in society, because society is so dependent on the internet now.

“These sorts of events absolutely could happen from hostile agents using weak security in IoT devices to overwhelm the infrastructure of the internet.”

Experts like Young describe each connected smart device as an ‘attack surface’. There were 7.6 billion of them in 2018; by 2030 there will be 24.1 billion. That is a lot of vulnerability.

So, is Alexa your friend or foe?

Generally, concerns about connected consumer devices fall into two categories: privacy and security. Privacy relates to the data devices collect, where that data goes, who sees it, and what it is used for. Security relates to the robustness of the software architecture devices use.

How most devices handle the privacy of your data is at best opaque.

Manufacturers of goods that collect, store, and sell personal data should, under the law in most Western countries, seek consent to handle that data and have protocols in place to protect it. In practice, those permissions are usually buried in terms-and-conditions documents that take hours to read, and denying consent generally cripples functionality. Studies show that few people ever read the T&Cs; most click to authorize automatically. Manufacturers such as Amazon, for whom data is a huge revenue stream, know this.

Indeed, one sector of the IoT market in which privacy concerns are particularly acute is smart speakers, dominated by Amazon’s Echos and Dots and their integrated digital personal assistant, Alexa. Along with Apple HomePods and Google Nests, these devices sit quietly in millions of homes listening to the intimate details of our lives.

The common misconception is that they only spring into life when consciously activated with a command word. But that is not the case. Studies by Northeastern University and Imperial College London show that voice assistants embedded in speakers and smartphones, such as Apple’s Siri, Microsoft’s Cortana, Alexa, and Google Assistant are being falsely triggered up to 19 times a day. Anyone with an Alexa-enabled Fitbit watch will understand just how frequently this can happen.

The researchers found smart speaker devices commonly mistakenly hear wake words such as “Hey Google” when people are talking and therefore record private conversations. 

But the privacy issues go much deeper. Alexa is always on, even when she is not roused by her command words, as David Emm, principal security researcher at Kaspersky, reveals. “Until a couple of years ago, most people were under the impression that Alexa woke up when you said the trigger word and that was that. At various points in the last couple of years, however, it has transpired that is not the case and that in fact Alexa is alive.

“Amazon has shared the fact that it does collect a lot of information, which it says it uses purely to improve the service. Nevertheless, the fact that information can be picked up has privacy implications.”

Personal digital assistants are designed to learn your habits, your needs and your desires and seamlessly integrate themselves into your life. Companies use the language of ‘personalization’ and ‘intuitive functionality’. 

When Google launched its first assistant, Google Now, in 2012, company Chief Economist Hal Varian explained that the system should know what you want and tell you before you ask the question. The more of yourself you give to the system, the more you reap the value of the application.

In 2016, Microsoft CEO Satya Nadella was equally enthusiastic about the company’s assistant, Cortana.

“It knows you deeply. It knows your context, your family, your work. It knows the world. It is unbounded,” he gushed.

But companies are not designing these AIs out of an altruistic desire to make your life easier. In the world of technology there is a well-worn adage: “If something is free, you are the product.” The commercial imperative behind Alexa, Cortana, and Google Assistant is data. Shoshana Zuboff, author of The Age of Surveillance Capitalism, describes personal digital assistants as ‘Trojan horses’ that render and monetize our lives. They record data and send it to data farms. How it is then used remains unclear.

Kaspersky’s Emm continues: “The business model of companies like Amazon is increasingly rooted in data. But there are huge questions over how well they look after the vast swathes of data they are hoarding.”

There are hundreds of anecdotal stories on social media from people convinced that their conversations have been recorded by Alexa, Google Assistant, or Siri and then used to create targeted ads. But experiments to prove the concept have been inconclusive, and often the examples cited can be traced back to a forgotten Google search or a past Amazon purchase and some advanced intuitive sales algorithms. There is no evidence to suggest data is being used in this way, although the potential is obviously there.

Dr. Garfield Benjamin is a Postdoctoral Researcher at Solent University in the UK.

He explains: “The approach of these companies so far seems to be to gather as much data as possible and decide if it’s useful later, but that often means they don’t keep track of all the data they have collected and have varying levels of security over how well it is kept, not to mention questions over following privacy regulations. There are specific questions over whether you can even give permission for data to be collected if, say, you have guests over.”

The logic is clear. While there is an expectation that conversations shared on social media platforms are public, surely conversations in your home are private? Unfortunately not, if you have a smart speaker, because even your voice has commercial value to companies determined to develop perfect voice capabilities. Big tech is on a global hunt for terabytes of human speech, which is then used to train AIs to understand and respond to the commands and queries we give them.

Conversations and snippets of talk recorded by smart speakers are sent to third-party contractors, who analyze them and use the data to improve functionality. The tech companies insist the process is fully anonymized, but undercover investigations and testimony from employees within speech data analytics companies reveal that the recordings they transcribe include highly intimate conversations and information that easily identifies where a conversation came from.

This type of data use should be made explicit to smart speaker owners, argue privacy advocates. Yet one in three smart speaker owners is unaware of how their voice recordings are stored. Amazon, meanwhile, explains that the latest 4th Generation Echo is ‘designed to protect your privacy, built with multiple layers of privacy protection and control, including a Microphone Off button that electronically disconnects the microphones’. And privacy concerns are by no means restricted to the smart speaker market.

In 2015, for example, it was discovered that some Samsung smart TVs were recording all speech within their vicinity and sending the recordings to be transcribed by voice recognition specialist Nuance Communications. Samsung acknowledged this in the TVs’ privacy policy and disclaimed responsibility for third-party policies.

In 2017, TV manufacturer Vizio paid $2.2 million to the Federal Trade Commission for capturing data about owners’ viewing habits from its sets and selling the information to advertisers and other third parties.

In the same year, My Friend Cayla, an 18-inch internet-connected doll made by Genesis Toys, was banned in Germany. The toy used a built-in microphone to record commands, which were then processed and uploaded to third-party servers where they were analyzed and stored. The German Federal Network Agency ordered parents to destroy any such dolls in their possession, as they constituted a concealed espionage device.

Innocuous devices become data-harvesting machines when they are fitted with smart technology and become IoT devices. For example, one forensic cyber investigator we spoke to described how a washing machine with an internet-connected app was used to prove an alibi in a criminal case.

The rush to turn anything and everything into a smart device is understandable given that the data broker industry is worth billions of dollars a year and every snippet has a price. The financial imperative to monetize data is core to big tech firms, and the abjuration of responsibility once that data is passed to a third party is worrying. Data protection regulations help by setting guidelines for what companies can do with data. Transparency should be automatic, but it is rarely a factor.

As Emm explains: “The issue isn’t whether someone is able to access your information when it’s sitting in situ in a device, the issue is what happens to that information when it is being taken somewhere and stored elsewhere. Practice varies between companies and it’s hard to tell if some are better than others because we only find out about problems when there is a publicized breach.”

All the interviewees we spoke to in our investigation agreed on one thing: the situation is going to get a whole lot worse, because as more connected devices are linked together in homes, more data is shared.

Benjamin described this as ‘social function creep’, whereby increasing aspects of our lives are managed through connected devices made by different companies which collect extremely sensitive information like health data. 

“The nature of the companies who manage these devices is changing,” he says. “Do we trust companies like Amazon or Google with our health data? Even if they don’t have our medical records, any searches we do for health-related information can enable them to build up a profile of us.” And those profiles are worth money.

Privacy, then, is a massive concern for individuals. The other side of the IoT coin, security, is where things get scary at a societal level, as our investigation discovered.

The IoT, according to the investigators and experts we interviewed, has the potential to be a Pandora’s box, easily exploitable by criminals, terrorists, and state-sponsored hackers. The design and manufacture of these devices is described as a ‘Wild West’ in which insecure, weak systems are commonplace in billions of products that we buy cheaply and install on our home networks. And we just don’t care.

Dr. Duncan Hodges is Senior Lecturer in Cyberspace Operations at Cranfield University. 

He says: “Security is not important for buyers or manufacturers. The market differentiator is price point, which means it is hard to make a purchasing decision based on security. How do you know whether to buy smart lightbulb A over smart lightbulb B when there is no information available? We have devices which are incredibly vulnerable, and we put them into the most sensitive locations we have, our homes, and feed them all sorts of sensitive information.”

At the very least, visible devices provide signposts for criminals. A $150 smart doorbell, for example, suggests the homeowner may have other expensive tech inside. Then there is the threat that an opportunist spotting a smart speaker near a window could command it to open a smart lock. More sophisticated attackers can use lasers, shone onto a smart speaker’s microphone from a distance, to issue silent commands. Location data can also be pulled from compromised devices, which then become trackers through which criminals can monitor the owner’s movements.

IoT devices such as smart thermostats can be exploited by criminals planning personal attacks. Investigator Dr. Sarah Morris of the Centre for Electronic Warfare, Information, and Cyber Digital Investigation Unit in the UK, explains: “We’ve seen a number of cases where people have had their smartphones stolen specifically so people can access IoT devices within the home. They then use data from devices to work out what their victims are up to and identify when the victim leaves the home, so they can get to them.

“We see it in divorce cases too, where someone exploits the other party’s credentials from outside the home to utilize the tech. We’ve seen IT technicians use access to laptops for stalking purposes.”

This is just the beginning. Criminals have yet to catch up with the burgeoning IoT market, and security experts predict an ensuing ‘crime harvest’ as more felons realize how many vulnerabilities insecure devices expose.

And there is no end of opportunity, because so many of the gadgets flooding the market are made overseas by manufacturers who produce identical products that are simply rebadged for multiple sellers.

As Hodges explains: “If you go on Amazon and look for smart light bulbs, you have to scroll through a lot of products before you get to a manufacturer you’ve heard of. A lot of these cheap devices have systemic vulnerabilities like backdoors and hard-coded passwords because they’ve all come from the same factory but are just branded differently.”

There is evidence that criminals are already monitoring chatter on forums where vulnerabilities are discussed. And even if the software inside a device is patched with firmware updates, data travelling to and from it is still at risk. People will continue to make poor purchasing decisions and to leave gadgets open to exploitation through poor cyber hygiene, such as keeping default passwords.

As Morris laments: “It will take horror stories before people change their behaviors.”

And this is where Craig Young’s doomsday scenario enters the story. An attack on society launched by billions of compromised devices, working in hive-mind unison to send a major power back to the dark ages, sounds like the plot of the latest Marvel movie. Except it has already happened.

On October 21, 2016, the internet on the East Coast of the US went down as the result of a cyber-attack. Authorities initially assumed it was the work of a hostile nation state. The failure was caused by a distributed denial of service (DDoS) attack, which works by overwhelming a system with requests for data, for example by sending a web server so many requests to serve a page that it crashes under the demand.

DDoS attacks are launched by botnets: groups of internet-connected computers, or bots, under the control of an outside party. Devices become bots when they are infected by malware that allows an assailant to control aspects of their functionality without the owner knowing. Botnets can consist of millions of infected devices and are traded by criminals on the dark web. For a few hundred dollars you can take control of an army of compromised devices and crash your adversary’s website.

The 2016 attack was launched by a botnet called Mirai. It was specifically developed to exploit security weaknesses in IoT devices of exactly the type our interviewees warn about.

Mirai scanned the internet for access points into insecure devices and attempted to log in to them using over 60 common default username and password combinations. It was simple and effective. The botnet grew to include around 560,000 infected gadgets. 
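
To see why half a million weak gadgets matter, consider a rough, hypothetical back-of-envelope sketch in Python. The per-device request rate and the target’s capacity below are illustrative assumptions, not reported figures; only the 560,000-device botnet size comes from the account above.

# Illustrative back-of-envelope calculation, not an attack tool.
# The request rate and server capacity are assumed values for this sketch.
DEVICES_IN_BOTNET = 560_000          # botnet size reported for Mirai
REQUESTS_PER_DEVICE_PER_SEC = 50     # assumption: a trivial load for one gadget
SERVER_CAPACITY_PER_SEC = 1_000_000  # assumption: requests the target can absorb

attack_volume = DEVICES_IN_BOTNET * REQUESTS_PER_DEVICE_PER_SEC
print(f"Aggregate attack traffic: {attack_volume:,} requests per second")
print(f"Target capacity:          {SERVER_CAPACITY_PER_SEC:,} requests per second")
print(f"Overload factor:          {attack_volume / SERVER_CAPACITY_PER_SEC:.0f}x")

On those assumptions the target faces roughly 28 million requests per second, around 28 times what it can absorb, which is why even cheap, low-powered devices are worth hijacking in bulk.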

Unlike many botnets that are created by Eastern European organized crime syndicates or nation-state intelligence services, Mirai was the work of a Rutgers undergraduate named Paras Jha. Initially he used his army of compromised home gadgets to launch DDoS attacks against his university and then against servers that hosted Minecraft games.

He posted the botnet’s code online, allowing copycats to use it, which led to the big attack on October 21.

Jha was eventually caught and, along with associates Dalton Norman and Josiah White, pleaded guilty to conspiracy charges. Each was sentenced to five years’ probation and ordered to pay $127,000 in restitution. They were also required to surrender cryptocurrency seized by investigators. In a surprise twist, each was also ordered to serve community service working with the FBI on cybercrime and cybersecurity matters.

Having been released into the wild, Mirai now has numerous spinoffs that continue to cause problems.

The episode perfectly illustrates the potentially catastrophic consequences of allowing billions of vulnerable devices into our homes and onto our home wi-fi networks.

As Hodges explains: “If you owned one of these devices there was nothing you could do to defend against attack.”

And even top-end, established brands have vulnerabilities, because they allow third parties to create apps that work on their products. Amazon, for example, allows third parties to make apps for Alexa, which it calls ‘skills’. Apple has a similar system for app developers. It is feasible that a tech-savvy third party with nefarious intent could capitalize on this for criminal purposes. Researchers have already demonstrated that they can make Alexa skills that effectively trick the owner into thinking the speaker is inactive when it is listening.

And therein lies possibly the biggest pitfall in the IoT market. Many of these devices incorporate software and code of obscure origin, made by small organizations that have raced to get ideas to market without necessarily taking the time to think about secure architecture for their products.

As Young reveals: “The software and drivers for these devices and apps are a big hodge-podge from a range of sources.”

It is not hard to imagine malicious agents infiltrating this market in code and algorithms to embed Trojans in programs that eventually become incorporated into millions of consumer products. Governments are rightly concerned about technology from firms such as China’s Huawei being incorporated into the nation’s 5G network, but they pay scant regard to the code running the nation’s connected doorbells, smart speakers and baby monitors.

Young continues: “Typically, when I’m analyzing a commercial hardware product, I’ll find it includes open source and proprietary technologies. Some come from the manufacturer; more often than not, components come from other third parties. You generally have a lot of cooks in the mix. The industry is not regulated. It’s a wild west.

“A popular device with a large install base and insecure software that reports to cloud infrastructure for software updates is an attractive target for criminal operators or a national-level adversary.”

And the consequences could be catastrophic.

“If they were the right devices you could shut down the internet,” Young predicts.

It’s a sobering thought for the next time you ask Alexa to play your end-of-days playlist.
