Fontys

Technology Impact Cycle Tool

Depending on the technology you are assessing, some categories are more important than others. Here you can indicate how important you think this category is for this technology.
If this category is not applicable for this technology, please select 'Yes'.
Who will have access to this technology and who won’t? Will people or communities who don’t have access to this technology suffer a setback compared to those who do? What will that setback look like? What new differences will there be between the “haves” and “have-nots” of this technology?

(Only the gray shaded text is printed on the Quick Scan canvas)

Brainstorm: can you find a built-in bias in this technology? Perhaps because of the way the data was collected, whether through personal bias, historical bias, political bias, or a lack of diversity among the people responsible for designing the technology? How do you know this is not the case? Be critical. Be aware of your own biases.

(This question is part of the Quick Scan)

If this technology makes automatic decisions through algorithms or machine learning, can those decisions be verified or explained? If a decision cannot be explained, how do you know that this technology's decisions are not biased and/or exclusionary? Are there alternative procedures in place?
How great is the disruptive impact on current social structures and economic stability? What existing social relations will be disturbed, and what economic activities will be reduced? Are new social relations and economic opportunities replacing the old ones, or is only a small group benefiting from the new technology?
The design of a technology often reflects the people who design it. A diverse team provides the opportunity to see the technology in diverse ways. How diverse is this team? Is the team representative of the target group? Is this team able to build an inclusive technology?

(Only the gray shaded text is printed on the Improvement Scan canvas)

Think about accessibility to this technology. Think about built-in biases, and about automatic decisions that may be biased. Think about who benefits from this technology, and about the diversity of the team that creates it. With all of that in mind, what improvements would you (want to) make? To the technology? To its context? To how it is used? Your answers to questions 1-5 help you gain insight into the potential impact of this technology when it comes to fairness. The goal of these answers is not to provide you with a 'go' or 'no-go' decision. The goal is to make you think about HOW you can improve the use of the technology: by making changes to the technology itself, to the context in which the technology is used, or to the way the technology is used.

(This question is part of the Improvement Scan)

Are you satisfied with the quality of your answers? The depth? The level of insight? Did you have enough knowledge to give good answers? Give an honest assessment of the quality of your answers.

Frameworks

How to mitigate the risks of automatic decision systems
(https://technofilosofie.com/wp-content/uploads/2016/12/FPF-Automated-Decision-Making-Harms-and-Mitigation-Charts.pdf)
Implicit Association Test
(https://implicit.harvard.edu/implicit/netherlands/)
Impact Assessment Human Rights and Algorithms (Dutch)
(https://www.uu.nl/sites/default/files/Rebo-IAMA.pdf)

Videos:

Paradise for Nerds (on the risks of not having a diverse design team)
(https://www.youtube.com/watch?v=pIQuDvHPPuQ&t=7s)