Here's how EmoNet learned to read emotions

The system, called EmoNet, doesn't just look at facial expressions to make sense of emotions: it also considers the general context.

By Gianluca Riccio
in Technology
August 7 2019

Marianne Reddan has spent the past 10 years scrutinizing human faces for traces of two distinct but very similar emotions: surprise and fear. And even after all that time, she has only just learned to tell them apart.

That is why Reddan, a postdoctoral researcher at Stanford University, realized something was about to change when she learned that EmoNet, a machine-learning system, had also learned to distinguish the two emotions.

The system, called "EmoNet," doesn't just look at facial expressions to make sense of emotions. Also look at the general context to determine general feeling, as a flesh-and-blood person would do.

The research, published in the journal Science Advances, took researchers at the University of Colorado Boulder and Duke University a year to develop; the neural network was "trained" with large amounts of data.


From objects to emotions

Reddan and colleagues started from AlexNet, a deep learning model (inspired by the dynamics of the visual cortex) that trains a computer to recognize objects. They repurposed it to recognize emotions instead of objects.

Philip Kragel, a researcher at the University of Colorado's Institute of Cognitive Science, fed the neural network 25,000 images and had it sort them into 20 emotion categories.

The long list included common emotions such as anxiety and boredom, as well as less common emotional experiences such as "aesthetic appreciation" and "empathic pain".
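As an illustration only (this is not the researchers' code; the layer choices, hyperparameters and training loop are assumptions), repurposing a pretrained AlexNet for 20-way emotion classification could look roughly like this in PyTorch:

```python
# Minimal sketch, assuming a standard transfer-learning setup:
# reuse AlexNet's object-recognition features, retrain only the final layer
# to output 20 emotion categories instead of 1,000 object classes.
import torch
import torch.nn as nn
from torchvision import models

NUM_EMOTIONS = 20  # e.g. anxiety, boredom, aesthetic appreciation, empathic pain, ...

# Load AlexNet pretrained on object recognition (ImageNet)
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)

# Freeze the convolutional feature layers and swap the final object
# classifier for a 20-way emotion classifier.
for param in model.features.parameters():
    param.requires_grad = False
model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_EMOTIONS)

# Fine-tune only the new head on labeled emotion images.
optimizer = torch.optim.Adam(model.classifier[6].parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One training step on a batch of (image, emotion-label) pairs."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Here only the last layer is retrained; the study's actual fine-tuning strategy may well differ.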

In the second phase, the categorized emotions were compared with human ones. Eight volunteers, lying in a functional magnetic resonance imaging (fMRI) scanner, viewed 112 images. Their brain activity was measured and compared with the neural network's responses to the same images (and emotions).
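The article doesn't detail how the comparison was made. One common way to relate a model's emotion scores to brain activity is representational similarity analysis; the sketch below is a hypothetical illustration of that idea, with random placeholder data, not the study's actual analysis pipeline:

```python
# Hypothetical sketch: compare the structure of the network's 20-dimensional
# emotion scores for the 112 images with the structure of fMRI activity
# patterns recorded while volunteers viewed the same images.
import numpy as np

N_IMAGES, N_EMOTIONS, N_VOXELS = 112, 20, 5000  # illustrative sizes

model_scores = np.random.rand(N_IMAGES, N_EMOTIONS)   # placeholder model outputs
brain_patterns = np.random.rand(N_IMAGES, N_VOXELS)   # placeholder fMRI patterns

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - correlation between rows."""
    return 1.0 - np.corrcoef(patterns)

# If the two dissimilarity structures correlate, the network organizes the
# images by emotion in a way that resembles the brain's own organization.
model_rdm, brain_rdm = rdm(model_scores), rdm(brain_patterns)
iu = np.triu_indices(N_IMAGES, k=1)  # compare upper triangles only
similarity = np.corrcoef(model_rdm[iu], brain_rdm[iu])[0, 1]
print(f"model-brain representational similarity: {similarity:.3f}")
```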

Building a neural network that reproduces the human brain is a scientific challenge that has gone on for years. Yet even the most advanced machines struggle with the range of human experiences. "Emotions are a huge part of our daily lives," says Kragel. "If neural networks can't decipher them properly, they will always have a limited understanding of how the brain works."

Kragel was surprised at how well EmoNet works, but that doesn't mean the system is perfect yet. The two most accurately mapped categories are "sexual desire" and "craving," but it sometimes struggles with emotions that are expressed dynamically. Surprise, for example, can quickly evolve into joy or anger depending on the situation. EmoNet also has great difficulty telling apart nuances between emotions such as adoration, amusement and joy, because they are so closely correlated.

Are there any risks?

Hannah Davis, a professor of generative music at New York University, believes that teaching emotions to a computer is not dangerous in itself. "It would be dangerous," she says, "if we started to classify our own emotions with the same schematism and the same lack of nuance."

How can you blame her? Labeling an emotion from a photo does not mean understanding it or feeling empathy. And already today, with social networks, one gets the impression that people have narrowed their emotions down to the set of emoticons available to them.

"Is the model capable of feeling emotions? Definitely not. It is just sorting images into a handful of categories, certainly not grasping the complexity of human experience. Could it feel emotions in the future? I can't rule it out. Maybe."

Maybe.

Tags: AI, deep learning, artificial intelligence, machine learning
