Find issue number three of Third: Vivre avec les objets connectés
There is a world out there that many are neither aware of nor understand in terms of its impact on our day-to-day lives. The Internet of Things (IoT) is not a new ecosystem – in fact it dates back several decades – but as we strive to make our dumb technologies smart, we face new risks which threaten our very existence as autonomous beings.
In recent years we have seen a growing trend to connect everything to the IoT, and from the perspective of the consumer this offers great convenience: being able to monitor your home while you are away on a business trip, or having your kitchen appliances automatically order groceries when you run low on supplies, sounds like a win-win.
But what many people do not understand is that this added convenience comes with a very real threat to their fundamental rights, as the manufacturers who sell these devices, together with the service providers supporting them, strive to collect more and more data for purposes which have nothing to do with operational necessity and everything to do with surveillance capitalism. By surveillance capitalism we mean that this digital surveillance of our behaviour, interactions, preferences and opinions is « being weaponized against us with military efficiency »1.
Through the use of sensors, microphones and cameras, our homes have become the source for a multi-billion euro industry centered around big data analytics – an industry built on data which has been surreptitiously extracted at close to zero cost under the guise of « smart ».
And convenience is an attractive offer as our lives grow ever more complex in a world seemingly caught in constant, rapid change – especially when the cost of that convenience is perceived to be so competitively priced, or even free.
The real story is that these products and services are not designed to enhance our lives – they are designed to harvest as much « behavioural surplus »2 as possible: data extracted through these technologies which can then be put to other purposes. In the case of surveillance capitalism, that usually means the data will be used to build profiles which allow behaviour to be manipulated through psychological analysis. That manipulation might aim to persuade us to buy a specific product or service (behavioural advertising), or it could be used in other ways, such as to influence how we vote in a political election.
There are countless examples of products which have been used in this way. My Friend Cayla3, a « smart » doll which asked children for personal information and could hold limited conversations with its child owners, was removed from shelves across the world after it was discovered that the technology within the doll was recording conversations and sending them back to corporate servers for analysis.
Over the summer of 2019 there were multiple scandals associated with smart assistants, involving all the mainstream devices from Google, Amazon, Microsoft and Apple, when it was discovered that recordings of our interactions with these devices were being sent back to their data centers and listened to by staff4. To make matters worse, the recordings included a significant number of « false positives », where the device had recorded events which were never meant to be treated as an interaction with it (including arguing couples, sexual activity5, etc.).
In many cases, devices we purchase to add to our « smart » homes are designed to cease functioning if we decide we do not want them to record such information about our daily lives – even where that data is not necessary for the device to work.
For example, a popular manufacturer of « smart » speakers deployed a software update to its IoT-connected devices which simply stopped them working if the owner refused to accept the company’s data harvesting policies6. This data was not required for the devices to function – indeed, prior to the remotely deployed update the speakers worked perfectly well. This was simply a case of placing customers under duress: « Give us your data or we will break your expensive device ». In any other context it would be considered racketeering or extortion and treated as a criminal act, but in the world of IoT it is rapidly becoming standard operating procedure.
Under European laws such as the General Data Protection Regulation (GDPR) and the e-Privacy Directive, such behaviour is unlawful. For example, under the GDPR consent is not considered valid if the « provision of a service, is dependent on the consent despite such consent not being necessary for such performance »7, and the e-Privacy Directive requires that consent be sought for any storage of, or access to, information on the device of the end user which is not « strictly necessary » for the performance of the requested service8.
As such, in the case of the « smart » speaker discussed above, one would argue that forcing customers to consent to data processing of this nature under the threat of deactivating their speakers meets the requirements of neither law – yet a lack of enforcement of both laws (until recently) has allowed a host of similar abuses too numerous to count.
Further, the GDPR requires that all products and services which entail the processing of personal data implement the principles of Privacy by Design (hence the title of this article) – yet common practice is, in fact, to embed Surveillance by Design instead, in order to siphon up all of that behavioural surplus.
Of course, the corporations behind all of these products swear they take your privacy very seriously, and explain their surveillance machinations deep within their privacy policies – which are so inaccessible to the common « consumer » as to be rendered useless.
In their paper « The Cost of Reading Privacy Policies »9, Aleecia McDonald and Lorie Faith Cranor estimated that the « national opportunity cost for just the time to read policies is on the order of $781 billion » if every Internet user in the United States were to read the privacy policies of the services they use – and this is considered a conservative estimate. Despite the paper being over a decade old, and despite the GDPR’s requirement that such notices be written in clear and plain language, little has changed in practice10: many privacy policies still consist of a wall of legal text few would understand and even fewer would be likely to read.
In a recent test case filed by this author with the French privacy and data protection regulator, the Commission Nationale de l’Informatique et des Libertés (CNIL), against the manufacturer of « smart » scales, this model has now been challenged.
In this particular case, upon purchasing the Withings Body+ smart scales, it was discovered that in order to actually use the functions of the device to track various health-related attributes such as weight, BMI, etc., one has to install an application on a smartphone called Health Mate11.
There was no information about this at the point of sale, on the packaging or in the box – the first time one becomes aware of it is upon installation of the application, and even then only if one is an evangelistic privacy geek willing to read the text rather than simply click the enticing « Accept » button.
What many are likely to miss is that these Body+ scales – just like our Amazon Alexa, EightSleep bed, Samsung Smart TV, Fitbit fitness tracker and the many thousands of other IoT-connected devices – are all considered « terminal equipment » under the e-Privacy Directive in the EU, and as such are all subject to the specific requirement to obtain consent for access to, or storage of, information on that « terminal equipment ».
Further, since the introduction of the GDPR (from which the e-Privacy Directive now takes its definition of consent), that consent must be freely given: unless the processing is strictly necessary for the functioning of the device or the delivery of the requested service, consent is required, and access to the service cannot be withheld should one refuse to consent to the processing of that behavioural surplus.
The timing of this article could not be more relevant either, as on 1st October 2019 the Court of Justice of the European Union (CJEU) published its judgment in Case C-673/17 (Planet 49)12, reiterating the requirements for consent in these circumstances and opening the door to a tsunami of legal complaints against companies which had until now operated in a regulatory vacuum (in that, despite the law existing, it has rarely been enforced over the last decade).
In the case against Withings, this author has filed a complaint spanning several issues as discussed in this article.
Withings, of course, argue that they send the data to their « cloud » environment in order to provide charts and analytics to the end user (a very convenient way to obtain the data for other purposes, such as their marketing activities) – but this author points out that such data could be processed in the application on the smartphone itself, thus following the principles of Privacy by Design as required by the GDPR.
Other issues, such as the fact that health data is special category data and therefore requires a more stringent approach to processing than other types of personal data, have also been included in the complaint.
Prior to the CJEU judgment on Planet 49 there was little prospect of a favourable outcome asserting the fundamental rights afforded to us under EU law – but given the new case law one remains hopeful that the CNIL will now effectively enforce the GDPR and e-Privacy Directive in this particular case.
This author’s case is by no means isolated – there are an estimated 7 billion IoT devices currently deployed in the world, a number set to triple over the next 5 years to over 21 billion13. Whereas many of these devices have until now been deployed for industrial purposes, the predicted explosive growth is driven by the accelerating emergence and acceptance of smart consumer devices and the development of smart environments and cities – all of which will further increase the processing of behavioural surplus, in turn increasing the risks to our fundamental rights.
Many may question why one would be troubled by such processing – it removes the mundane tasks from our everyday lives, and if you have nothing to hide, there is nothing to fear…
The reality is very different. As the Cambridge Analytica scandal illustrated, this big data can be weaponized – Tim Cook was not being alarmist. Our very democracy is at risk through the manipulation of the electorate based on their behavioural and psychographic profiles.
But it is even more serious than our democracy alone: the risk surveillance capitalism creates extends into the very core of our species – the self. If, through the constant observation and processing of behavioural surplus, we become ever more the subject of manipulation, the impact on autonomy, self-determination and agency is profound.
If the very decisions we make in our day-to-day lives are being manipulated through psychological operations (psyops) designed to elicit emotional rather than rational responses, we lose the freedoms which define us as a species.
Freedom of speech, freedom of association, of opinion, of thought, and every other freedom which we hold so dear and protect as fundamental human rights – these concepts would no longer exist.
As psychology has taught us, when people become aware that they are being observed, they change their behaviour (known as the « Hawthorne Effect ») – they become artificial. Never has this been clearer than with the emergence of the social credit system now deployed in China, where, through the constant monitoring of everyone, the Chinese government seeks to control the behaviour of the entire population.
And if we lose all of these freedoms, what does this mean for future generations, and what does it mean for the future of the human race and society? If we lose freedom of thought, expression and association to the constant manipulation of behaviour through psychographics and surveillance, what does this mean for future innovation, in a world where you think only what you are nudged to think and experience only what global corporations want you to experience?
Thankfully, the legislative environment is beginning to catch up – there is a global movement towards more privacy-protective laws such as the GDPR in the EU.
As individuals we all need to pay more attention to the devices, products and services we are using and we need to stop simply clicking « Accept » in order to access these devices, products and services.
By no means would one suggest that we should all suddenly exclude ourselves from digital society – merely that we be more vigilant and take the appropriate steps to hold to account those who would seek to exploit our rights.
It costs nothing to file a complaint with a regulator in the EU – so when you come across a service or product that requires you to consent to the processing of your data in a way which is not necessary for that product or service to function, file a complaint.
Remember: if something is free, you are likely the product. The only way we can turn the tide and mitigate the risks outlined in this article is to take a stand, exercise our rights, and make sure that future generations have the same opportunities and freedoms which we enjoyed prior to the emergence of surveillance capitalism.
For if we fail to act now, the future is already lost.
In this first English-language contribution to the journal Third, Alexander Hanff draws our attention, in a clear and direct tone, to the fact that, despite binding legal obligations, certain manufacturers of connected objects seek to invade our privacy. What we take away from it is the importance of a collective effort to adopt genuine digital hygiene, particularly with regard to connected objects, which can prove highly intrusive.
1 | https://www.bbc.com/news/av/technology-45969382/tim-cook-personal-data-being-weaponised.
2 | Shoshana Zuboff, « The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power ».
3 | https://www.forbrukerradet.no/siste-nytt/connected-toys-violate-consumer-laws/.
4 | https://www.theverge.com/2019/7/11/20690020/google-assistant-home-human-contractors-listening-recordings-vrt-nws.
5 | https://www.mirror.co.uk/tech/amazon-staff-listen-users-having-18798294.
6 | https://www.theregister.co.uk/2017/10/11/sonos_privacy_speakers/.
7 | https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016R0679 (Recital 43).
8 | https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=celex%3A32002L0058 (Article 5(3)).
9 | I/S: A Journal of Law and Policy for the Information Society, 2008 Privacy Year in Review issue, http://www.is-journal.org/.
10 | https://techcrunch.com/2019/08/10/most-eu-cookie-consent-notices-are-meaningless-or-manipulative-study-finds/.
11 | https://www.withings.com/es/en/legal/privacy-policy.
12 | http://curia.europa.eu/juris/liste.jsf?num=C-673/17.
13 | https://techjury.net/blog/how-many-iot-devices-are-there/.