Emotion AI has the potential to fundamentally change the way we work, live, and interact with ourselves and the world around us. Developing solutions with these technologies requires AI algorithms to learn from large-scale, often sensitive datasets, which makes preserving people’s privacy a significant challenge. The issue of trust is pervasive in the ethics of Big Data, even more so after the Facebook-Cambridge Analytica scandal, in which the “psychometric profiles” of millions of American citizens were harvested and used to influence the 2016 US presidential election. This is why Feelenials has started the #NotMyDataMovement, which advocates for people’s right to data and privacy.
Like any technology, emotion AI is a tool: it is neither good nor evil. From supporting mental health treatment to helping prevent criminal activity, the positive possibilities are endless; unfortunately, so are the negative ones. What we do with AI, and how we do it, determines the impact it has on society. As technology becomes an increasingly important and integrated part of people’s lives and organizations’ operations, data ethics must be translated into sound business practices that balance internal and external interests. Implementing data ethics begins with considering the human impact from all sides of data use and rigorously analyzing whether those impacts are beneficial, neutral, or potentially risky for people and society. We must do more than build data-protection safeguards against data manipulation and breaches of user trust; we must embrace the responsibility of data ethics.
Our answer is data anonymization and privacy by design. Anonymization is achieved through a series of techniques that disassociate an individual’s record from their identity in a particular dataset: if data cannot be associated with the individual to whom it relates, it cannot harm that person. An anonymization strategy that protects the identity of participants is central to Feelenials’ business values. Planning anonymization before data collection begins allows users to give better-informed consent about what we actually do with our emotion AI technology. When measuring emotions, we let our users know exactly what information is captured and what it is used for. We are not interested in storing your personal information; we are only interested in capturing global emotional biorhythms in an anonymous and transparent way. At Feelenials we do not store any personal information or data. In the case of facial recognition, for example, the photo is destroyed as soon as the emotion is analyzed, so no personal or identifiable information is tied to our emotional records. Feelenials defaults to privacy, which is embedded into product design and development from start to finish. We don’t create personal profiles of you or save any of your personal information. We’re interested in emotions, not your data.
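To make this discard-after-analysis pattern concrete, here is a minimal Python sketch of the idea. It is not Feelenials’ actual code: the names classify_emotion and record_emotion are hypothetical, and the classifier is a stub standing in for a real facial-emotion model. The point it illustrates is that the image only ever lives in memory, and the stored record carries no identity by construction.

```python
import datetime
from typing import Dict, List

def classify_emotion(image_bytes: bytes) -> str:
    """Stand-in for a real facial-emotion model (hypothetical).

    A production system would run model inference here; this stub
    returns a fixed label so the sketch is runnable.
    """
    return "neutral"

def record_emotion(image_bytes: bytes, store: List[Dict]) -> None:
    """Analyze an image held only in memory; persist only the label."""
    emotion = classify_emotion(image_bytes)
    # Privacy by design: the raw image is never written to disk.
    # Dropping the only reference lets Python reclaim the bytes,
    # so no biometric data outlives the analysis step.
    del image_bytes
    # The stored record is identity-free by construction: no user ID,
    # device ID, filename, or image hash is kept.
    store.append({
        "emotion": emotion,
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })

# Usage: analyze a frame, then inspect what actually got stored.
records: List[Dict] = []
record_emotion(b"\x89PNG...fake image bytes", records)
print(records)  # [{'emotion': 'neutral', 'timestamp': '...'}]
```

Because the record contains only an emotion label and a timestamp, aggregate “emotional biorhythms” can still be computed, while there is simply nothing in storage that could be traced back to an individual.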
We are currently experiencing rapid technological and generational shifts, driven by the globalization of the economy and profound turbulence in the digital, physical, and biological dynamics we find ourselves in. To keep up with the ever-expanding digital world, sustainable data ethics codes must evolve beyond tick-the-box compliance and rule enforcement.