This technology is designed to solve a problem, so it is important to define exactly which problem it is meant to solve. Can you give a clear definition of the problem? What 'pain' does this technology aim to ease? Whose pain? A clear problem definition will help you determine and discuss whether you are solving the right problem.
Can you imagine ways this technology can or will be used to break the law? Think about invading someone's privacy, spying, hurting people, harassment, fraud, identity theft and so on. Or will people use this technology to avoid facing the consequences of breaking the law (using trackers to evade speed radars or using bitcoins to launder money, for example)?
If this technology registers personal data, you have to be aware of privacy legislation and the concept of privacy. Personal data can be interpreted broadly: maybe this technology does not collect personal data itself, but it can be used to assemble personal data. If this technology collects special categories of personal data (like health or ethnicity), you should be extra careful.
To answer this question, think about sub-questions like: Can the technology be perceived as stigmatising? Does the technology imply or impose a certain belief or world view? Does the technology affect users' dignity? Is the technology in line with the person the user wants to be perceived as?
For the Quick Scan, you only have to list the stakeholders. Can you think of the people that are directly or indirectly affected by this technology? Some stakeholders are obvious (like users), but we invite you to also think about the less obvious ones. Missing a stakeholder can have serious consequences.
Later, it helps to think about further questions, such as: Can you describe in a few words how the users or stakeholders will be affected by this technology? You can limit yourself to the main (core) effect you think this technology will have on the stakeholders. Did you actually consult a stakeholder? Did you consult all the stakeholders listed, or did you assume the position of some of them? Are you going to take the stakeholders into account? Do you think you should take all stakeholders into account? Are there any conflicting interests between groups of stakeholders? How will you resolve these conflicts?
There are fundamental issues with data. Data is always subjective. Data collections are never complete. Correlation and causation are tricky concepts. Data collections are often biased. Reality is far more complex than a million data points. Are you aware of these issues? How does this technology take them into account?
We strongly recommend doing crash course four to properly understand the pitfalls and shortcomings of data (and it is fun!).
Do a brainstorm. Can you find a built-in bias in this technology? Perhaps one introduced by the way the data was collected, whether through personal bias, historical bias, political bias, or a lack of diversity among the people responsible for the design of the technology? How do you know this is not the case? Be critical. Be aware of your own biases.
Is it easy for users to find out how your technology works? Can a user understand or find out why your technology behaves in a certain way? Are the goals explained? Is the idea behind the technology explained? Is the technology company transparent about the way its business model works?
One of the most prominent impacts on sustainability is energy efficiency. Consider what service you want this technology to provide and how this could be achieved with minimal energy use.
Discuss this quickly and note your first thoughts here.