If this category is not applicable for this technology, please select 'Yes'.
Who will have access to this technology and who won’t? Will people or communities without access suffer a setback compared to those who do? What does that setback look like? What new differences will there be between the “haves” and “have-nots” of this technology?
Does your technology have a built-in bias because of the way the data was collected, whether through personal bias, historical bias, political bias, or a lack of diversity in the people responsible for designing the technology? How do you know this is not the case? How do you deal with your own biases?
How will you push back against a blind preference for automation (the assumption that AI-based systems and decisions are correct and don’t need to be verified or audited)? Can you explain a decision? Do you have alternative procedures in place?
How large is the disruptive impact on current social structures and economic stability? Which existing social relations will be disturbed, and which economic activities will be reduced? Are new social relations and economic opportunities replacing the old ones, or is only a small group benefiting from the new technology?
The design of technology is often a reflection of the people who design it. A diverse team provides the opportunity to see the technology in diverse ways. How diverse is your team? Is your team representative of the target group? Is your team able to build an inclusive technology?
The idea of the technology impact tool is that it stimulates you to think hard about the impact of your technological solution. This can lead to design changes, which is a good thing. Please list them here.