Smart-home devices: 3rd-party privacy risks


Rewind to 1996, when I landed a job at CERN in Geneva and started a phase of my life that changed me forever. One of the exceptional engineers I met (Ivan) had configured his home into a primitive version of the 'smart home', although it wasn't called that then.

Everything was connected to a dashboard. He knew every time someone entered or left the house, and every time someone visited the bathroom and for how long. He had video connected which he could access from his PC. I think he had also programmed other functional aspects of the house, such as lighting, although I am not sure. What I do remember is how my work colleagues and I, although impressed by his home, were sceptical of the privacy implications. My female colleagues and I could not imagine living in a house in which our partner knew every time a bathroom visit happened and for how long.

How short-sighted we were. I am now, really for the first time, taking a dive into smart technologies in the home. I haven't even started yet and am already challenged to identify the controller-processor or controller-controller accountability. What data is shared with Google or Amazon, and who is accountable? I've looked around to see what other smart-home product vendors are writing in their privacy notices, and I have found nothing yet. The page from Google describes how their own products work pertaining to privacy, but still nothing on what happens with the 3rd-party cameras now populating the market. This blog post is me brainstorming with myself.

Looking at the components of a smart device, using a thermostat as our example: (1) the thermostat, (2) Google Assistant/Alexa, (3) the App/code for the smart device in the Google Assistant/Alexa dashboard. So what is shared, and where does it go?
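To frame where the data flows between these three components, here is a toy sketch of the thermostat example. All the function names and the instruction format are my own invention, purely to illustrate the separation: the hub hears the voice, the device's code only ever sees the translated instruction.

```python
# Toy model of the three components: (1) thermostat, (2) assistant hub,
# (3) the vendor's App/code in the hub. Names are illustrative only.

VOICE_COMMAND = "set temperature to 21"

def assistant_hub(voice_command: str) -> dict:
    """Google Assistant / Alexa: receives the raw voice recording,
    keeps it, and translates it into a digital instruction."""
    # (speech recognition happens here, inside Google/Amazon's cloud)
    return {"instruction": "SET_TEMP", "value": 21}

def device_app(instruction: dict) -> str:
    """The vendor's code plugged into the hub: it receives only the
    translated instruction, never the voice recording itself."""
    return f"thermostat -> {instruction['instruction']}={instruction['value']}"

instruction = assistant_hub(VOICE_COMMAND)
result = device_app(instruction)
print(result)
```

The point of the sketch is simply that the voice data stops at component (2); only a structured instruction reaches component (3) and the device.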

(1) The thermostat will have its own memory chip, enough to store data and send it onward. In the old days data would be stored in a temporary cache, but nowadays devices are never switched off, and the temporary cache is normally backed by permanent memory on a hardware chip. The risk arises if you sell the device and there is no hardware/factory reset button to wipe the chip. This is not a high privacy risk, as thermostat readings are not highly sensitive data, unless the temperature is set unusually high, which can be the case when someone is sick or a new baby has arrived in the household. It could be quite an issue if the smart device is a camera, as in the incident with the Google Nest Indoor Cam.

(2) What is shared with Google Assistant/Alexa? The most publicity has been around the voice data collected and how it has been used. The most talked-about privacy-invasive issues I've come across so far are (i) background noise being collected continuously, and (ii) voice commands collected for the purpose of triggering some action, e.g. switching lights on, additionally being used by Google/Amazon to improve their voice-recognition services without informing the users, i.e. a lack of transparency in data collection and use practices.

(3) The App itself may collect other data in order to deliver the service, e.g. GPS/location data, which is sent to the provider of the App. The question is whether this flows via Google/Alexa. I guess so, as the device manufacturer is not creating their own standalone App; they have created a piece of code to plug into the Google/Alexa hub/dashboard.

What I see is that when a smart device is connected to Google/Alexa, the user is sharing their voice data with Google/Alexa. This is not biometric data, it is voice data. Voice data which is not shared with the provider of the device. It is Google/Alexa which translates the voice data into a digital format which the device can understand and act upon.

This means that (1) Google/Alexa needs to authenticate to the device App/code, which should grant just enough authorisation to send the instruction and nothing more, and (2) Google/Alexa shares instructions (derived from the voice data) with the smart device. (3) If (1) is done correctly, no data is sent from a 3rd-party smart device back to Google/Alexa.
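A minimal sketch of what points (1) and (2) could look like on the device side, assuming a shared-secret (HMAC) authentication scheme that I've picked purely for illustration. The secret, the instruction names, and the reject logic are all my own assumptions; the point is that the code authenticates the hub and then accepts a one-way instruction, with no code path that returns device data to the caller.

```python
import hmac
import hashlib

# Assumed to be provisioned once, when the device is paired with the hub.
SHARED_SECRET = b"example-secret"

# The only instructions the device-side code will ever act on.
ALLOWED = {"SET_TEMP", "HEAT_ON", "HEAT_OFF"}

def sign(message: bytes) -> str:
    """HMAC signature over the payload, proving the sender knows the secret."""
    return hmac.new(SHARED_SECRET, message, hashlib.sha256).hexdigest()

def handle_request(payload: str, signature: str) -> str:
    # (1) authenticate: only the paired hub can produce a valid signature
    if not hmac.compare_digest(sign(payload.encode()), signature):
        return "REJECTED: bad signature"
    # (2) authorise a one-way instruction and nothing more:
    # anything outside the whitelist is refused, and nothing is sent back
    cmd = payload.split()[0]
    if cmd not in ALLOWED:
        return "REJECTED: unknown instruction"
    return f"OK: executing {payload}"

msg = "SET_TEMP 21"
print(handle_request(msg, sign(msg.encode())))            # accepted
print(handle_request("READ_LOGS", sign(b"READ_LOGS")))    # refused
```

Real hub integrations (Actions on Google, Alexa Smart Home skills) use OAuth-style account linking rather than a raw shared secret, but the shape of the argument is the same: authenticate first, then authorise only the narrow instruction channel.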

I'm not sure whether I'm missing anything here, but IMHO the risk to the provider of the smart device is to ensure that the code created to plug into the Google/Alexa hub/dashboard does nothing more than authenticate (2-way) and accept a one-way instruction (from the hub to the device) on what the device must do.

Although I guess I'm forgetting here about the contextual data which is sent back to the Google/Alexa hub in order to make decisions?
