Technology

Here's how EmoNet learned to read emotions

The system, called EmoNet, doesn't just look at facial expressions to make sense of emotions: it also reads the general context.

August 7, 2019
Gianluca Riccio

Marianne Reddan has spent the past 10 years scrutinizing human faces for traces of two distinct but very similar emotions: surprise and fear. Even after all that time, she has only barely learned to tell them apart.

That is why Reddan, a postdoctoral researcher at Stanford University, realized something was about to change when she learned that EmoNet, a machine learning-based system, had also learned to distinguish the two emotions.

The system, called "EmoNet," doesn't just look at facial expressions to make sense of emotions. It also looks at the general context to determine the overall feeling, just as a flesh-and-blood person would.

It took researchers at the University of Colorado Boulder and Duke University a year to carry out this research (published in the journal Science Advances) and to "train" the neural network with large amounts of data.

From objects to emotions

Reddan and her colleagues started from AlexNet, a deep learning model (built on the dynamics of the visual cortex) that trains a computer to recognize objects. They reprogrammed it to classify emotions instead of objects.


Philip Kragel, a researcher at the University of Colorado's Institute of Cognitive Science, fed the neural network 25,000 images and had it sort them into 20 emotion categories.

The long list included common emotions such as anxiety and boredom, as well as less common emotional experiences, such as "aesthetic appreciation" or "empathic pain".
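To make the "reprogramming" step concrete, here is a minimal sketch of the general transfer-learning approach described above: a pretrained AlexNet has its final 1,000-way object layer replaced by a 20-way emotion layer and is fine-tuned on emotion-labelled images. This is not the study's code; apart from AlexNet and the 20 categories mentioned in the article, every choice below (frozen layers, optimizer, learning rate) is an illustrative assumption.

```python
# Minimal transfer-learning sketch (not the study's code): repurpose a
# pretrained AlexNet from 1,000 object classes to 20 emotion categories.
import torch
import torch.nn as nn
from torchvision import models

NUM_EMOTION_CATEGORIES = 20  # e.g. anxiety, boredom, empathic pain, ...

# AlexNet pretrained on ImageNet object recognition.
model = models.alexnet(weights="IMAGENET1K_V1")

# Keep the convolutional feature extractor fixed (illustrative choice)
# and retrain only the classifier head on emotion-labelled images.
for param in model.features.parameters():
    param.requires_grad = False

# Replace the final 1,000-way object layer with a 20-way emotion layer.
model.classifier[6] = nn.Linear(model.classifier[6].in_features,
                                NUM_EMOTION_CATEGORIES)

# Standard fine-tuning setup; the ~25,000 labelled images mentioned in the
# article would be loaded with a DataLoader and iterated over here.
optimizer = torch.optim.Adam(model.classifier.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()
```

Freezing the convolutional layers keeps the learned visual features intact and retrains only the classifier, one common way to repurpose a network like AlexNet; the study's actual training procedure may differ.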

In the second phase, the machine's categorizations were compared with human ones. Eight volunteers, lying in a functional magnetic resonance imaging (fMRI) scanner, viewed 112 images. Their brain activity was measured while the neural network analyzed the same images in parallel, so that the responses could be matched against the images (and emotions) it already knew.
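As a purely hypothetical illustration of what relating the model to brain data could look like (the study's actual analysis is not reproduced here), one could fit a simple linear mapping from the network's 20 emotion scores per image to a volunteer's measured responses to the same 112 images. All names, shapes, and data below are placeholders.

```python
# Hypothetical sketch: relate model emotion scores to fMRI responses.
# Data here is random noise; only the 112-image / 20-category shapes
# come from the article.
import numpy as np

n_images, n_categories, n_voxels = 112, 20, 2000
rng = np.random.default_rng(0)

# Model output: one 20-dimensional emotion score vector per image.
emotion_scores = rng.random((n_images, n_categories))

# fMRI data: one brain-activity pattern per image for a single volunteer.
brain_activity = rng.random((n_images, n_voxels))

# Least-squares mapping from emotion scores to voxel responses.
weights, *_ = np.linalg.lstsq(emotion_scores, brain_activity, rcond=None)

# How well do the model's emotion categories explain each voxel's response?
predicted = emotion_scores @ weights
r = [np.corrcoef(predicted[:, v], brain_activity[:, v])[0, 1]
     for v in range(n_voxels)]
print("mean prediction correlation:", float(np.mean(r)))
```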

Building a neural network that reproduces the human brain is a scientific challenge that has been going on for years. Yet even the most advanced machines struggle with the full range of human experience. "Emotions are a huge part of our daily lives," says Kragel. "If neural networks cannot decipher them properly, they will always have a limited understanding of how the brain works."

Kragel was surprised at how well EmoNet works, but that doesn't mean the system is perfect. The two most accurately mapped categories are "sexual desire" and "greed/craving," but the model sometimes stumbles on dynamically expressed emotions: surprise, for example, can rapidly evolve into joy or anger depending on the situation. EmoNet also has great difficulty picking out the differences and nuances between emotions such as adoration, amusement, and joy, because they are so closely correlated.

Are there any risks?

Hannah Davis, a professor of generative music at New York University, believes that teaching emotions to a computer is not in itself dangerous. "It would be dangerous," she says, "if we started to categorize our own emotions with the same rigid schemas and the same lack of nuance."

And how can you blame her? Encoding an emotion from a photo does not mean understanding it or empathizing with it. Already today, on social networks, one gets the impression that people have reduced their emotions to the handful of emoticons available to them.

"Is the model able to feel emotions? Definitely not. It is only sorting them into a few categories, certainly not grasping the complexity of human experience. Could it feel emotions in the future? That cannot be ruled out. Perhaps."

Maybe.

Tags: AI, deep learning, artificial intelligence, machine learning

