If this category is not applicable for this technology, please select 'Yes'.
Can you imagine ways that your technology can or will be used to break the law? Think about invading someone's privacy. Spying. Hurting people. Harassment. Fraud or identity theft, and so on. Or will people use your technology to avoid facing the consequences of breaking the law (evading speed cameras or using bitcoin to launder money, for example)?
An example: Griefbot
Yes, under certain circumstances it can be used to break the law. If a young, underage person dies and a friend or family member of the deceased gets access to the Griefbot account, he or she can use it to draw young kids into dangerous situations, because he or she can impersonate a kid.
On the other hand, a bad actor can abuse the Griefbot for scamming purposes if he or she can "rewire" the AI behind the bot. Individuals can abuse the trust placed in the deceased to manipulate people into illegal activities, e.g. scam them out of money or hurt other people.
It is even possible to imagine that a Griefbot will be taken hostage and only returned to the original owner after a large ransom is paid.
If you brainstorm, can you imagine ways that your technology will be used to hurt, bully or harass individuals? Insult people? Create societal unrest? And so on. Can you write down examples or ways that this can be done?
An example: Griefbot
The Griefbot of a celebrity (in the broadest sense) can be abused to incite violence or other forms of societal unrest even after the person dies. If Griefbots become more common, fake Griefbots of celebrities can become a problem.
We can imagine that people with strong belief systems have problems with the idea of a Griefbot.
If you brainstorm, can you imagine ways in which your technology can be used to hurt, insult or discriminate against certain societal groups? For example, a face recognition technology that does not recognize people of certain races, or a technology that is gender-specific. Can you write down examples or ways that this can be done?
An example: Griefbot
Poor people can be excluded by the pricing of the product, giving them no access to the memories of their loved ones.
Individuals who don't have any proficiency in technology also run the risk of being excluded from use.
If you brainstorm, can you imagine ways in which your technology increases the gap between classes, genders or races? Do you see bad actors using your technology to polarize society? Can you write down examples or ways in which this can be done?
An example: Griefbot
The only way we can see the Griefbot pitting people against each other is among the family members or friends of the deceased. If a certain group finds some information about the deceased undesirable and doesn't want to believe it, that can create unrest between family members.
There can also be unrest if one side of the family wants the Griefbot and the other does not.
Fake Griefbots can also create problems, but that is very speculative.
Can you think of ways in which your technology can be used as the equivalent of fake news, bots or deepfake videos, for example? Can you write down examples or ways in which this can be done?
An example: Griefbot
After someone's death the Griefbot could be programmed to "hide" certain information, views, motivations or ideas the deceased held. This could be abused to omit parts of the deceased's personality from the Griefbot.
The idea of the technology impact tool is that it stimulates you to think hard about the impact of your technological solution (questions 1-5). The answers can lead to changes in design or implementation, which is a good thing. Please list them here.
An example: Griefbot
Yes. We think the Griefbot is a very personal solution, so we created a very personal two-factor security setup: you can only use the Griefbot after double authentication.
We also encrypted all data in our data centers and made sure security is at a very high level, so it is also impossible to hack our Griefbots. An elaborate backup procedure makes sure that you can always return to the Griefbot of a few weeks or months ago.
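For illustration only, here is a minimal Python sketch of what such a setup could look like, assuming a password as the first factor, a one-time code delivered out of band as the second, and symmetric encryption of the stored conversations. The class and method names (GriefbotVault, start_login, read_all) are hypothetical and this is not a description of the actual Griefbot implementation.

import hashlib
import hmac
import secrets

from cryptography.fernet import Fernet  # third-party: pip install cryptography


class GriefbotVault:
    """Hypothetical store: conversations are encrypted at rest and can only be
    read after a password check (first factor) plus a one-time code (second factor)."""

    def __init__(self, password):
        self._salt = secrets.token_bytes(16)
        self._pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), self._salt, 100_000)
        self._fernet = Fernet(Fernet.generate_key())  # key management is out of scope here
        self._records = []
        self._pending_code = None

    def store(self, message):
        # Encrypt every conversation snippet before it touches storage.
        self._records.append(self._fernet.encrypt(message.encode()))

    def start_login(self, password):
        # First factor: verify the password, then issue a single-use code that a
        # real system would deliver out of band (SMS, e-mail, authenticator app).
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), self._salt, 100_000)
        if not hmac.compare_digest(candidate, self._pw_hash):
            return None
        self._pending_code = f"{secrets.randbelow(1_000_000):06d}"
        return self._pending_code  # returned here only to keep the sketch self-contained

    def read_all(self, one_time_code):
        # Second factor: only a valid, unused code decrypts the stored records.
        if self._pending_code is None or not hmac.compare_digest(one_time_code, self._pending_code):
            raise PermissionError("two-factor check failed")
        self._pending_code = None  # codes are single-use
        return [self._fernet.decrypt(r).decode() for r in self._records]


vault = GriefbotVault("correct horse battery staple")
vault.store("Remember the summer we spent at the lake?")
code = vault.start_login("correct horse battery staple")  # code would be sent to the user's phone
print(vault.read_all(code))

A real deployment would also need versioned, encrypted backups to support the "return to an earlier Griefbot" feature mentioned above, which this sketch leaves out.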