The Dark Side Of Alexa, Google And Other Personal Digital Assistants

A few years ago, digital assistants such as Amazon Alexa, Apple Siri, and Google Assistant seemed futuristic. That future is here, and these assistants are an integral and growing part of everyday life.

Digital assistants can be found in your office, home, car, hotel, phone and elsewhere. They track and collect data in real time, draw information from sources such as smart devices and cloud services, and use artificial intelligence to interpret that information in context and learn from it. Although we have come a long way in developing and deploying artificial intelligence, much work remains to be done in this area.

Much of the information these online helpers collect and use is personally identifiable and potentially sensitive. Could Alexa or another digital assistant compromise the privacy and security of our data? It could. These digital assistants have a dark side.

My expertise is in data privacy, data management and artificial intelligence. I previously served as an information and privacy officer at the Ontario Office of the Ombudsman for Information and Privacy.

A welcoming assistant

Imagine the following situation. You are expecting guests. Your first guest arrives at your front porch, and your outdoor security camera recognizes them as they walk up to your home. A friendly voice greets the guest and unlocks the door. Once they are inside, your digital assistant explains that you are on your way and will be home soon.

Through your home audio system, your digital assistant plays some of your guest's favourite songs (drawn from your shared Spotify network). It asks whether pumpkin spice is still their favourite coffee flavour, or whether they would prefer something else, such as French vanilla or Colombian roast. Soon your guest picks up a fresh cup from the smart coffee maker. Its welcoming tasks complete, your digital assistant falls silent, and your guest makes a few phone calls while waiting for you.

It is striking how accurately and independently a digital assistant can identify a guest, choose their favourite songs, remember their coffee preference and control smart appliances.
Privacy risks

But should you be concerned about what your digital assistant is doing behind the scenes?

Through our phones, digital assistants can record our conversations, images and many other pieces of confidential information, such as the websites we visit. They apply machine learning to what they gather about us in order to improve themselves. Their software is designed and maintained by companies that are constantly looking for new ways to collect and use our information.

As with other computing applications, a core problem with these digital assistants is that they are prone to technical and operational faults. Digital assistants can also be hacked remotely, resulting in breaches of consumer privacy.

For example, a couple in Oregon had to unplug their Alexa device, Amazon's virtual assistant, after a private conversation was recorded and sent to one of the people on their contact list.

In another case, a man in Germany accidentally received 1,700 Alexa audio recordings belonging to a complete stranger. The recordings revealed the person's name, habits, job and other confidential information.
A privacy divide

The growing popularity and availability of personal digital assistants is creating a new kind of digital divide. An interesting irony is that people who recognize and feel strongly about privacy risks tend to limit their use of these tools, while users who are less inclined to protect their privacy integrate digital assistants deeply into their daily lives.

Digital assistants continuously collect data while they listen for their "wake" or activation word. They do not limit that collection to their owners or to authorized users: they can capture and process personal information, such as voice recordings, from anyone within range.
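To see why "waiting for a wake word" still involves continuous data collection, consider the following minimal sketch of a wake-word listening loop. It is not any vendor's actual implementation; all names, the frame rate and the buffer size are hypothetical, and real detectors run speech models on audio rather than matching text. The point it illustrates is that the microphone stream is processed frame by frame, and a rolling buffer of recent audio already exists before the wake word is ever heard.

```python
# Minimal, hypothetical sketch of a wake-word loop (not a real assistant's code).
from collections import deque

WAKE_WORD = "alexa"          # hypothetical activation word
BUFFER_SECONDS = 3           # seconds of audio kept in memory while "idle"
FRAMES_PER_SECOND = 10       # toy frame rate for this simulation


def microphone_frames():
    """Stand-in for a real microphone stream: yields short audio frames.
    Each frame is just a text token here so the example stays runnable."""
    ambient = ["...", "guest: hello?", "...", "alexa", "play music", "..."]
    for frame in ambient:
        yield frame


def contains_wake_word(frame: str) -> bool:
    # A real detector would run an on-device speech model on every frame;
    # this toy version simply matches the word in text.
    return WAKE_WORD in frame.lower()


def listen_loop():
    # Rolling buffer: even before activation, the last few seconds of
    # audio are held in memory and analysed frame by frame.
    buffer = deque(maxlen=BUFFER_SECONDS * FRAMES_PER_SECOND)
    for frame in microphone_frames():
        buffer.append(frame)              # data captured while "inactive"
        if contains_wake_word(frame):
            # Only now would audio typically be sent to the cloud, but
            # everything in `buffer` was already collected locally.
            print("Wake word heard; buffered context:", list(buffer))


if __name__ == "__main__":
    listen_loop()
```

The design choice that matters here is the rolling buffer: because detection requires analysing every frame, anyone speaking near the device contributes audio to that buffer, whether or not they are an authorized user.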
