Fontys

Technology Impact Cycle Tool

Depending on the technology you're assessing, some categories are more important than others. Here you can indicate how important you think this category is for this technology.
If this category is not applicable for this technology, please select 'Yes'.

(Only the gray shaded text is printed on the Quick Scan canvas)

If this technology registers personal data, you have to be aware of privacy legislation and the concept of privacy. Personal data can be interpreted broadly. Maybe this technology does not collect personal data itself, but it can be used to assemble personal data. If this technology collects special categories of personal data (such as health or ethnicity), you should be extra aware.

(This question is part of the Quick Scan)

An example: Griefbot

The General Data Protection Regulation defines personal data as data relating to an identified or identifiable natural person. Natural persons are living persons, so the GDPR in principle does not apply to deceased persons. However, our Griefbot is also filled with data of living persons, especially those with a close relation to the deceased, and the GDPR does apply to that data. The data of the persons that use the Griefbot is also processed in different ways, and some of this data falls within the scope of the GDPR. For both of these categories we know that some of the data that feeds the Griefbot contains sensitive categories of data as well. For example, some Griefbots are fuelled with mail conversations, which means that a lot of very sensitive personal data can be included. Many people share very sensitive data, like passwords and bank accounts, with their loved ones, which poses a serious risk.
Are you aware of the main privacy and data protection principles you need to adhere to? If not, check the links below. Do you think the invasion of privacy by this technology is allowed? Do you think people will feel that their privacy is invaded? Does this technology take principles like proportionality and subsidiarity into account?

An example: Griefbot

Yes, the deceased may not have had a say in the creation of his/her Griefbot. Even though deceased, you might want some aspects of your private life protected after death. In some countries, for example, it is prohibited to defame deceased people, something that can easily happen when a Griefbot is created after death without knowledge and consent. Moreover, a Griefbot can reveal not only information about the deceased, but also the data of those that come into contact with the Griefbot. Privacy includes the right to reputation, and there are two dangers in this respect. First, that the Griefbot is too static and does not let the deceased evolve as he or she would have in real life. For example, when someone dies at a time when same-sex marriage is not accepted, in real life he or she might have come to accept it, but the Griefbot does not automatically do so, creating an image of the deceased as a conservative and maybe even discriminatory person. On the other hand, as Griefbots are self-learning, they can also evolve the deceased into a person he or she was not in real life, for example becoming discriminatory on the basis of the information that is fed to the Griefbot after the death of the actual person.
In Europe (for example), data collections have to be GDPR-compliant. Do you think that this technology is compliant? How do you know? Did the technology provide for the appropriate measures to make sure compliance is in place?

An example: Griefbot

Privacy is not the same as data protection. Privacy as a fundamental human right is embedded in national and international constitutions and conventions. For the EU, important pieces of legislation are the Charter of Fundamental Rights of the European Union and the European Convention on Human Rights. On the basis of art. 8 of the European Convention on Human Rights, an invasion of privacy can be justified if it is foreseen by law, serves one of the interests mentioned in article 8 (2) and is necessary in a democratic society. This is judged on the basis of two principles: proportionality and subsidiarity. Proportionality means that the measures you take are suitable to reach the purpose, and subsidiarity means that no less invasive means are available. The GDPR is a very extensive piece of legislation but is based on eight primary principles: purpose specification, purpose limitation, data minimisation, data quality, security, accountability, transparency and participatory rights. So, to be compliant with privacy and data protection legislation, we strive to design the Griefbot with these principles in mind.
Privacy is often implemented by design, which means, for example, that only the bare minimum of data is collected and used, and that this process is continuously improved upon. Does this technology (really) need all the personal data it is processing to provide its functionality (data minimisation)? Did the technology consider other privacy-by-design principles?

An example: Griefbot

In view of the principles of data protection by design and data protection by default, consideration must be given in the development of the Griefbot to measures that mitigate privacy risks. What kind of data do we need to give the Griefbot the functionality we envisage? What exactly is our purpose, and what is needed to achieve it? It would make a difference, for example, if a Griefbot is specific to one person: only data concerning person A and the deceased are then used for the Griefbot that is meant for person A, such as emails and WhatsApp conversations between person A and the deceased, instead of the entire email database of the deceased and all WhatsApp messages of the deceased with anyone. It also needs to be considered whether the data of persons other than person A and the deceased can be pseudonymised or, preferably, anonymised. Security is very important, so that only person A can "feed" the Griefbot and has access to it.
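The two measures described above, data minimisation and pseudonymisation, can be sketched in code. This is a hypothetical illustration only, not part of the tool or any actual Griefbot implementation: the message fields, names and salt are all assumptions.

```python
import hashlib

def pseudonymise(name, salt="per-deployment-secret"):
    # Assumption: a salted hash gives a stable, non-identifying token
    # for a third party who appears in the archive.
    return "person_" + hashlib.sha256((salt + name).encode()).hexdigest()[:8]

def minimise_archive(messages, user, deceased):
    # Data minimisation: keep only the one-on-one conversation between
    # the user (person A) and the deceased; discard everything else.
    return [m for m in messages
            if {m["sender"], m["recipient"]} == {user, deceased}]

# Hypothetical archive: "A" is the user, "D" the deceased, "B" a third party.
archive = [
    {"sender": "A", "recipient": "D", "text": "hi"},
    {"sender": "D", "recipient": "A", "text": "hello"},
    {"sender": "B", "recipient": "D", "text": "not A's business"},
]
kept = minimise_archive(archive, "A", "D")  # third-party message is dropped
```

The point of the sketch is the order of operations: minimise first, so data about third parties never enters the bot, and pseudonymise only what must remain.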
Is this technology creating data that could follow users throughout their lifetimes, affect their reputations, and impact their future opportunities? Will the data this technology is generating have long-term consequences for the freedoms and reputation of individuals?

An example: Griefbot

As for the impact of the data as such, depending on what data exactly will be used to feed the Griefbot, we do not see reasons for future impact. However, as the Griefbot functions on the basis of self-learning algorithms, it might be a problem that the Griefbot develops in an unforeseen way, creating a persona that no longer corresponds to the actual deceased and displaying behaviour that would not be expected from the actual deceased.

(Only the gray shaded text is printed on the Improvement Scan canvas)

Think about how this technology might invade someone's privacy or collect personal data, about the way it complies with prevailing law, and about how it mitigates data protection risks and concerns. With all that in mind, what improvements would you (want to) make to this technology? The idea of the technology impact tool is that it stimulates you to think hard about the impact of this technological solution when it comes to privacy and data protection (questions 1-5). The answers can lead to changes you (want to) make in the design or implementation of this technology, which is a good thing. Please list them here.

(This question is part of the Improvement Scan)

An example: Griefbot

We are continuously working on being compliant with the GDPR. On a more fundamental level, we are exploring two potential improvements. First, we are thinking of a Griefbot that is completely personal, so there is only a one-on-one relation with the Griefbot. This solves a lot of privacy issues. Furthermore, we think that if the Griefbot is personal and has the sole purpose of helping one specific person grieve, it would make sense for that person to have some say or control over the development of the Griefbot. Second, we are exploring the option that a deceased person has to give permission for the Griefbot while alive, just like a donor codicil.
Are you satisfied with the quality of your answers? The depth? The level of insight? Did you have enough knowledge to give good answers? Give an honest assessment of the quality of your answers.

Videos / Podcasts

Pizza order (privacy awareness campaign)
(https://www.youtube.com/watch?v=-4LtYMNl4yw)
Privacy by Design by Ann Cavoukian
(https://www.youtube.com/watch?v=7_TbOJmz6BA)
Brief explanation GDPR
(https://www.youtube.com/watch?v=j6wwBqfSk-o)
Summary GDPR
(https://www.youtube.com/watch?v=Assdm6fIHlE)
TED talk future of your personal data
(https://www.youtube.com/watch?v=JIo-V0beaBw)

Frameworks

General Data Protection Regulation
(https://eur-lex.europa.eu/eli/reg/2016/679/oj)
Data Protection Directive (95/46/EC)
(https://eur-lex.europa.eu/legal-content/en/TXT/?uri=CELEX%3A31995L0046)
European Convention on Human Rights
(https://www.echr.coe.int/Documents/Convention_ENG.pdf)
English translation Dutch Constitution
(https://www.government.nl/documents/regulations/2012/10/18/the-constitution-of-the-kingdom-of-the-netherlands-2008)

Policy Documents

Opinion Documents of the Article 29 Working Party
(https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/index_en.htm)
Opinion Documents of the Dutch Data Protection Authority
(https://autoriteitpersoonsgegevens.nl/en)
OECD privacy guidelines
(https://www.oecd.org/sti/ieconomy/privacy-guidelines.htm)

Case law

Case Law of the Courts in the Netherlands
(https://www.rechtspraak.nl)

Books / Papers

PDF of Privacy by Design Strategies
(https://www.cs.ru.nl/~jhh/publications/pds-booklet.pdf)