The point is that sometimes it is VERY useful to use tracking technologies, for example to protect vulnerable persons, e.g. small children and elderly people (who tend to wander). So the decision by the Norrköping kindergarten not to allow the use of tracking armbands on toddlers/small children was, IMHO, a bad one.
As a parent it would give me peace of mind. Human rights law gives us both a ‘right to feel safe’ and a ‘right to a private life’. These rights can often conflict with each other, which results in the wrong decisions being made. Hence, in fear of breaching the GDPR, a school has made a rather incorrect decision about a measure which has so many benefits for all. What’s more, RFID tags/sensors are not biometrics, so this has no relation to the other decision. Sensors do not even need to be linked to an identity. All the school needs to know is whether they have lost a child, not which one… that they can work out pretty quickly by seeing which children they still have.
This presents another problem, in that decisions are made by persons who are not able to perform this careful balancing act and really identify the potential risk of harm to the natural person. In the case of the Norrköping school I can see no risks which outweigh the benefits to the ‘right to feel safe’.
Thanks to Inge Frisk for bringing this decision in Norrköping to my attention.
The ruling is in Swedish, but to summarise: the school was using facial recognition on its students. Facial recognition is biometric data, hence sensitive (“special categories of data” in the GDPR). The school used consent as the legal basis, but this was considered unlawful due to the imbalance in the relationship between the controller (the school) and the data subject (a student of 16+ yrs). Basically the student had no choice.
But there is more. The Swedish data protection authority based their decision on the following:
Art 5 – the personal data collected was intrusive, and more was collected than was needed for the purpose
Art 9 – it is forbidden to collect sensitive data unless a legal exception applies, and the school did not have one.
Arts 35–36 – it seems that a DPIA was not carried out.
What does this mean for other schools, or indeed any public or private entity looking to use intrusive biometrics? Do a data protection impact assessment (DPIA): from this you will be able to get a clear picture of the potential risk of harm to the rights and freedoms of the data subject.
For me personally and professionally, I’m just happy that China’s big brother approach has been nipped in the bud here in Sweden 🙂
Now what happens if this technology is connected to Facebook, so that they don’t need to guess how old you are based on how you look, but can see it for a fact? And that you have children, dogs, cats, and whatever more there is to glean, given how Facebook will organise unstructured data into a structured format so it is easy to link and process.
Facial recognition technologies are also being used to deny access to casinos in Las Vegas. Now imagine if every club and bar could effectively do away with the traditional bouncer and instead implement this technology.
Previously I have talked a lot about storecards, RFID and how this type of invasion of privacy could make you vulnerable to tailored ads… although maybe you like this; it really depends on your viewpoint. Now, however, it may not matter whether you have a storecard, RFID or whatever: your face will reveal all, your FB account will feed the digital ads, and I guess you won’t have any say in this at all!
The largest biometric database is being created in India, with over 1.2 billion identities. Scary. A serious concern, beyond the personal privacy aspect, is the security of this database. From a positive standpoint, the driving argument is that it provides the means to overcome the significant levels of corruption in India, particularly for those living outside the cities and at the mercy of corrupt officials. In fact this database, if implemented well, would free these very persons from certain tyranny. Another dilemma…
Interesting that participation is voluntary. Same as Sweden’s approach.
Read more here. It is fascinating reading and worth a visit 🙂 .
An excellent article on the use of CCTV, biometrics, databases, etc., in schools in the UK.
Can you imagine that, on the uncertainty of whether CCTV should be permissible in toilets, Sayner (managing director of Proxis, a security installation company) reasons that “it depends exactly on what it is looking at,” adding, “If you’ve got nothing to hide, why should you object to that?” I just love this “nothing to hide” argument. For myself, I’m not too keen on being the star of some camera footage when I visit the ladies’ room!
It’s not just the FBI that is keen to collect the DNA of innocent persons. In Australia, Mr McDevitt, chief executive of CrimTrac, the agency which maintains the database, said the next step was taking samples from people charged but not convicted, and from people charged with minor crimes as well as serious offences. Read more…