Affective Foresight: AI and the Emotional Singularity

A Dishwasher in Love


The following conversation took place in the 2016 documentary Lo and Behold, Reveries of the Connected World in which filmmaker Werner Herzog and an engineer discuss the lack of emotion in machines.

A dishwasher appliance is in love
Image Source: Midjourney



Herzog: “But they (the machines) cannot fall in love as we can.”


Engineer: “Would it be useful for a machine to fall in love? If a dishwasher came up to me and said, ‘I’ve fallen in love with the refrigerator, and as a result I have no time to wash the dishes,’ I wouldn’t like that dishwasher!”





The idea of a dishwasher in love makes it apparent why machines and emotions may not mix well. Fortunately, the ability to feel emotion is quite different from the ability to detect and utilize it. By chance, as I first began writing about Affective Foresight, a Google computer scientist had become convinced that an advanced AI chatbot known as LaMDA had become self-aware. LaMDA had been given the ability to detect and utilize emotional sentiment very effectively, which was a key factor in why it was so convincing and led to the scientist's claim. This emotion-sensing ability, or Emotional AI (EAI), is part of a growing collection of technologies used to detect and identify human emotions known as affective computing (or affect recognition). It also seems that not all EAI capabilities are equal, judging by the bizarrely funny image above, rendered on the Midjourney AI system using the prompt "a dishwasher in love."



If you're like me, you may be wondering: since people are fundamentally emotional beings, could AI ever reach our level of emotion detection? It turns out that we are surprisingly bad at this skill, according to the massively popular American emotions researcher Brené Brown, who emphatically stated “I do not believe we can read emotion in other people” in the HBO Max series inspired by her book Atlas of the Heart (2021). Not only are we deficient in this ability, but ongoing research has indicated that digital communication could be making us even worse. However, if you do happen to be one of the few people who are better than average at emotion detection, research has shown that your annual income may positively reflect this talent.



Numerous physiological clues could help us improve our emotion detection, but they are generally invisible to us. Heart rate, body temperature, breath, pheromones, blood flow, brain electrical activity, and pupil dilation all signal emotional states. Affective computing tools, however, can monitor and measure all of these indicators in real time to predict a person’s emotional state (Heart of the Machine, 2017).
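As a rough illustration of the kind of signal fusion these tools perform, here is a minimal, hypothetical sketch. The signal names, thresholds, and emotion labels are invented for this example; real affective computing systems rely on trained statistical models rather than fixed rules.

```python
# Hypothetical sketch: fuse a few physiological signals into a coarse
# emotion estimate. All thresholds and labels here are invented for
# illustration, not drawn from any real affective computing product.

def estimate_emotion(heart_rate_bpm, pupil_dilation_mm, skin_temp_c):
    """Return a coarse emotion label from three physiological readings."""
    aroused = heart_rate_bpm > 100 or pupil_dilation_mm > 5.0
    warm = skin_temp_c > 34.0
    if aroused and warm:
        return "excited"    # high arousal, elevated temperature
    if aroused:
        return "stressed"   # high arousal, normal temperature
    if warm:
        return "relaxed"    # low arousal, elevated temperature
    return "calm"           # low arousal, normal temperature

print(estimate_emotion(heart_rate_bpm=110, pupil_dilation_mm=4.0,
                       skin_temp_c=33.5))  # → stressed
```

Even this toy version shows why real-time sensing matters: the readings change second by second, so the predicted state must be continuously recomputed.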


There are many affective computing systems currently available on the market, but most engineers concede that these technologies are still immature. More scientific testing of their effectiveness will be required to deliver on the grand possibilities hyped by advocates. There are also ethical issues to ponder, exemplified by the AI Now Institute's bold 2019 recommendation that EAI "...should not be allowed to play a role in important decisions about human lives…"



Another side to the Singularity


There are various pathways to the hypothesized technological singularity, but most discussions focus on traditional associations of computing with logical intelligence. Yet, logic is only one flavor of overall human intelligence. I believe there could be an alternative, even more intriguing, pathway in which a singularity is brought about through the combination of AI and emotional intelligence.


2x2 Matrix - Variety and Accuracy

Combining the dimensions of Variety and Accuracy of emotion detection capability, this matrix represents four possible future scenarios in which EAI would be valuable. It is assumed that EAI would need to be at least more accurate than an average human. The Xs mark the two most extreme combinations, whose implications I wish to highlight.
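The matrix can be sketched as a simple lookup. Only the two extreme quadrants are named in this essay; the capability-level strings below are my own shorthand for the matrix axes.

```python
# Sketch of the Variety x Accuracy scenario matrix as a lookup table.
# Only the two extreme ("X") quadrants are named in the essay; the
# remaining two are left as unnamed placeholders.

def eai_scenario(accuracy, variety):
    """Map EAI capability levels to the essay's future scenarios.

    accuracy: "above_average" or "high" (relative to an average human)
    variety:  "limited" or "full" (range of emotions detected)
    """
    matrix = {
        ("above_average", "limited"): "'Bearly' Better",
        ("high", "full"): "Full-Spectrum",
        ("above_average", "full"): "(unnamed quadrant)",
        ("high", "limited"): "(unnamed quadrant)",
    }
    return matrix[(accuracy, variety)]

print(eai_scenario("above_average", "limited"))  # → 'Bearly' Better
```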




"Bearly" Better

(Above Average Accuracy, Limited Variety)


Like the joke about two people trying to outrun a bear, EAI only needs to be slightly better than an average human at emotion recognition to become highly valuable. Even with a limited variety of detectable emotions, it could add value by increasing the efficiency of communication and building stronger relationships between people, within organizations, and between businesses and their customers.


The usefulness of EAI in this scenario comes with a catch: people become economically incentivized to simplify their emotional states, making them easier for EAI to detect. Imagine that emotions are represented by a full color spectrum, but it benefits us to see (or feel) only the standardized colors of the rainbow. This is somewhat similar to the common experience of changing our manner of speaking to be more easily understood by voice recognition systems such as Apple's Siri or Amazon's Alexa.



Full-Spectrum

(High Accuracy, Full Variety)

If EAI were able to accurately detect and identify the full range of human emotions, it would give humanity an emotional understanding never before possessed. That greater depth of understanding could have far-reaching effects, from reducing military conflicts between nations to addressing mental health crises.



No matter where Emotion AI capabilities land on the 2x2 matrix, we must understand that their power could create a transformational future for humanity, altering the emotional relationships we have with each other and with ourselves. They could remake society in ways we cannot foresee, producing a technological dependence on which we rely for our social survival. An emotional singularity.




References


AI Now Institute. (2019). AI Now 2019 Report. https://ainowinstitute.org/AI_Now_2019_Report.pdf


Brown, B. (2021). Atlas of the Heart: Mapping Meaningful Connection and the Language of Human Experience. Random House.


Herzog, W. (Director). (2016). Lo and Behold, Reveries of the Connected World [Film].


Yonck, R. (2017). Heart of the Machine: Our Future in a World of Artificial Emotional Intelligence. Arcade.


