Sure, we all know that some privacy trade-offs get made in the name of technology, but recently we've become aware of something quite alarming. If you have one of these home speaker devices, Google is recording your audio, and that audio is being listened to by people at the company.
These smart speakers come in handy: you can control your music, your lights, and much more, depending on how you have things set up. More and more people are opting for the Google Home or devices like Amazon's Alexa. They might make life a lot easier, but are they invading more of our privacy than is necessary?
In a statement, Google said:
“We partner with language experts around the world to improve speech technology by transcribing a small set of queries – this work is critical to developing technology that powers products like the Google Assistant.”
“Language experts only review around 0.2 percent of all audio snippets, and these snippets are not associated with user accounts as part of the review process.”
That is still a huge amount of audio given how many people use these devices, but more concerning is that one of the people reviewing it violated Google's data security policies, and who knows how many others have or will in the future. A breach like this can go a number of ways, and we do not yet know everything that was compromised this time around.
The statement continues as follows:
“We just learned that one of these reviewers has violated our data security policies by leaking confidential Dutch audio data. Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action.”
“We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.”
You see, this device and others like it begin recording audio when someone says something like “Ok, Google.” From there, the recordings are sent to contractors who listen to them so the AI can better handle different languages, accents, and speech patterns. Turning this feature off makes the device noticeably ‘dumbed down’ and leaves users unsatisfied at best.
These compromised recordings were somehow obtained by the Belgian public broadcaster VRT, which reviewed them and found that 153 of the more than 1,000 recordings it was given had been captured by accident. At least one included a person’s actual address and other personal information.
VRT NWS listened to more than a thousand excerpts, 153 of which were conversations that should never have been recorded and during which the command ‘Okay Google’ was clearly not given.
But as soon as someone in the vicinity utters a word that sounds a bit like ‘Okay Google’, Google Home starts to record.
This means that a lot of conversations are recorded unintentionally: bedroom conversations, conversations between parents and their children, but also blazing rows and professional phone calls containing lots of private information.
Mistaken recordings can also occur when someone presses the wrong button on his or her phone or unintentionally gives a command.
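To see why near-misses can trigger a recording, here is a toy sketch in Python. It uses plain text similarity with a made-up threshold; a real wake-word detector runs a trained acoustic model on audio, not string matching, so this is only an illustration of the false-positive problem, not Google's actual method.

```python
from difflib import SequenceMatcher

WAKE_PHRASE = "okay google"
# Hypothetical cutoff for "sounds close enough"; real detectors
# tune an acoustic-model score, not a text-similarity ratio.
THRESHOLD = 0.75

def sounds_like_wake_phrase(heard: str) -> bool:
    """Return True when a heard phrase resembles the wake phrase."""
    ratio = SequenceMatcher(None, heard.lower(), WAKE_PHRASE).ratio()
    return ratio >= THRESHOLD

# The exact wake phrase triggers recording...
print(sounds_like_wake_phrase("Okay Google"))  # True
# ...but so does a phrase that merely resembles it,
# which is how bedroom chatter ends up on a server.
print(sounds_like_wake_phrase("okay cool"))    # True
print(sounds_like_wake_phrase("completely unrelated sentence"))  # False
```

The point of the sketch: any detector that accepts "close enough" matches will sometimes fire on ordinary conversation, and once it fires, everything that follows gets recorded and shipped off.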
… and violence
One of our three independent sources says he had to transcribe a recording in which he heard a woman in clear distress. What are employees supposed to do with such information? We are told there are no clear guidelines for such cases. It is, however, an important ethical matter. Employees only receive specific directions for account numbers and passwords, which are marked as sensitive information.
The people who analyze these recordings also overhear lots of medical questions. You wouldn’t be the first to consult Doctor Google when you have some sort of illness.
The recordings also strikingly confirm one of the Internet’s rules: men seem to look for porn a lot, even via smart speakers.
Do you use a smart speaker of some kind? How much of what you say do you think ends up shared elsewhere? I, for one, think this is something we should at least be aware of.