A recent study found that people recognize emotion faster in emojis than in real human faces, a finding that could have surprising consequences for the future of human language as it continues to blend with technology.
Researchers made several other surprising discoveries about just how good people are at recognizing the emotions portrayed by emojis, but before getting into those, it's worth defining these modern-day hieroglyphs so you can better understand how the study was conducted.
Before emojis, people combined symbols like colons, dashes, and brackets to add a little emotion to their messages and get across happiness, frustration, or sarcasm. The first emojis you would recognize today weren't created until 1999.
Japanese artist Shigetaka Kurita created the first digital representations of emotion for a Japanese mobile phone company, designing 176 12-by-12-pixel images that quickly grew in popularity because they let people add emotional context to their messages.
Wired noted that a message like "I understand" might sound cold to the recipient on its own, while adding a heart emoji conveys the same message along with a "sense of warmth and sympathy." As Wired put it: "It was the beginning of a new visual language."
That visual language has blossomed into a worldwide phenomenon, one whose emotions we now recognize faster than those on real human faces, according to a new study from a group of researchers in Italy who published their findings in Social Neuroscience.
The aim of the study was to see which parts of the brain participants used when identifying an emoji, and how long it took them to recognize the symbol, as they engaged in a "categorization task involving an emotional word paradigm," the study's authors noted.