"I like my new telephone, my computer works just fine, my calculator is perfect, but Lord, I miss my mind!. "

Monday, January 3, 2011

Computers with emotions



Can computers understand emotions? Can computers express emotions? Can they feel emotions? The latest video from the University of Cambridge shows how emotions can be used to improve interaction between humans and computers.

When people talk to each other, they express their feelings through facial expressions, tone of voice and body postures. They even do this when they are interacting with machines. These hidden signals are an important part of human communication, but computers ignore them.

Professor Peter Robinson leads a team in the Computer Laboratory at the University of Cambridge that is exploring the role of emotions in human-computer interaction. His research is explored in the film The Emotional Computer.

"We're building emotionally intelligent computers, ones that can read my mind and know how I feel," Professor Robinson says. "Computers are really good at understanding what someone is typing or even saying. But they need to understand not just what I'm saying, but how I'm saying it."

The research team is collaborating closely with Professor Simon Baron-Cohen's team in the University's Autism Research Centre. Those researchers study the difficulties that some people have in understanding emotions, and their insights help to address the same problems in computers.

Facial expressions are an important way of understanding people's feelings. One system tracks features on a person's face, calculates the gestures that are being made and infers emotions from them. It gets the right answer over 70% of the time, which is as good as most human observers.
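
As a rough illustration of that last inference step only, here is a toy Python sketch that maps a few hypothetical facial-gesture activations to the nearest emotion "prototype". The gesture names, the prototype table and the distance rule are all assumptions made for this example; the Cambridge system's actual features and classifier are not described in the article.

# Toy sketch (not the Cambridge system): pick the emotion whose prototype
# gesture pattern is closest to the observed activations, each in [0, 1].
EMOTION_PROTOTYPES = {
    # (brow_raise, brow_furrow, smile, jaw_drop) -- hypothetical gestures
    "happy":     (0.2, 0.0, 0.9, 0.1),
    "surprised": (0.9, 0.0, 0.1, 0.8),
    "confused":  (0.3, 0.8, 0.0, 0.1),
    "neutral":   (0.1, 0.1, 0.1, 0.0),
}

def infer_emotion(gestures):
    """Return the prototype label closest (Euclidean distance) to the observation."""
    def distance(proto):
        return sum((g - p) ** 2 for g, p in zip(gestures, proto)) ** 0.5
    return min(EMOTION_PROTOTYPES, key=lambda label: distance(EMOTION_PROTOTYPES[label]))

if __name__ == "__main__":
    observed = (0.15, 0.05, 0.85, 0.05)   # a frame dominated by the smile gesture
    print(infer_emotion(observed))        # -> happy

A real system would track many facial features over time and use a trained statistical classifier rather than a hand-written table, but the shape of the problem is the same: features in, emotion label out.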

Other systems infer emotions from the way something is said by analyzing speech intonation, and from body posture and gestures.
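
Purely as an illustration of the intonation idea, the sketch below estimates the pitch of a short synthetic audio frame by autocorrelation and applies a toy rule to label it. The threshold and the "excited"/"calm" labels are assumptions for this example, not details of the systems described above.

# Toy sketch: rough pitch estimate for one voiced frame, then a crude label.
import numpy as np

def estimate_pitch(frame, sample_rate, fmin=80.0, fmax=400.0):
    """Very rough autocorrelation pitch estimate (Hz) for a voiced frame."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sample_rate / fmax), int(sample_rate / fmin)
    lag = lo + int(np.argmax(corr[lo:hi]))   # strongest repeat within the voice range
    return sample_rate / lag

def arousal_from_pitch(pitch_hz, high=220.0):
    """Label higher-pitched speech 'excited', lower 'calm' (toy rule only)."""
    return "excited" if pitch_hz >= high else "calm"

if __name__ == "__main__":
    sr = 16000
    t = np.arange(2048) / sr                      # 128 ms frame
    frame = np.sin(2 * np.pi * 180.0 * t)         # synthetic 180 Hz "voice"
    pitch = estimate_pitch(frame, sr)
    print(round(pitch), arousal_from_pitch(pitch))  # -> 180 calm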

Ian Davies, one of the research students in Professor Robinson's team, is looking at applications of these technologies in command and control systems. "Even in something as simple as a car, we need to know if the driver is concentrating or confused, so that we can avoid overloading him with distractions from a mobile phone, the radio, or a satellite navigation system."
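
A minimal sketch of the gating idea Davies describes might look like the following; the driver-state fields, the threshold and the policy itself are invented for illustration and are not taken from the research.

# Toy sketch: hold back non-urgent distractions when the driver seems overloaded.
from dataclasses import dataclass

@dataclass
class DriverState:
    concentration: float  # 0 = idle, 1 = fully absorbed in driving
    confusion: float      # 0 = at ease, 1 = visibly confused

def allow_distraction(state: DriverState, urgent: bool, load_threshold: float = 0.7) -> bool:
    """Let urgent alerts through; defer phone, radio and sat-nav prompts otherwise."""
    if urgent:
        return True
    overloaded = max(state.concentration, state.confusion) >= load_threshold
    return not overloaded

if __name__ == "__main__":
    busy = DriverState(concentration=0.9, confusion=0.4)
    print(allow_distraction(busy, urgent=False))  # False: hold the phone call
    print(allow_distraction(busy, urgent=True))   # True: a collision warning still gets through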

Merely understanding emotions is not enough. Professor Robinson wants computers to express emotions as well, whether they are cartoon animations or physical robots. PhD student Tadas Baltrušaitis, another team member, works on animating figures to mimic a person's facial expressions, while fellow PhD candidate Laurel Riek is experimenting with a robotic head modelled on Charles Babbage, which appears in the film.

"Charles has two dozen motors controlling 'muscles' in his face, giving him a wide range of expressions," Robinson explains. "We can use him to explore empathy, rapport building, and co-operation in emotional interactions between people and computers."
