People talking back to a computer is common enough — usually in a moment of pique or frustration. Getting the computer to respond in kind is a far different task, one that computer scientists are undertaking with various degrees of success and consternation.
The challenge isn’t simply a matter of inventing new software and sometimes hardware, difficult enough as that is, but also of coming to grips with some of the ethics involved.
If computers are to have emotional components, what role would they play in everyday life? Do human beings really want an emotional relationship with a mechanical mind?
The field is called “affective technology.” Machine prototypes exist that measure human emotional expression through physiological signals such as facial expressions and voice changes and allow a humanlike response, as described in papers and lectures prepared by Rosalind W. Picard, founder and director of the Media Laboratory’s Affective Computing Research Group at the Massachusetts Institute of Technology. She spoke about her group’s work in May at the American Association for the Advancement of Science in Washington.
The term “affective technology” has different meanings for different groups around the country doing research on human interaction with computers. Graduate student Kirsten Boehner of Cornell University’s Human-Computer Interaction Group, for instance, works on how computers can “cause me to reflect on my own emotion,” rather than on how a computer can imitate emotion.
AT&T Research Labs in New Jersey, which pioneered speech recognition systems, has yet to attempt incorporating emotion detection, according to AT&T Labs researcher Rich Cox. But he definitely sees the potential in what he calls “auditory cues in voices that would allow you to detect different kinds of emotion. … Knowing the emotion of the person on the other end [of a conversation] who may help the machine accomplish its task — depending on the task.”
Some of the most mind-bending research under way at M.I.T. focuses on how computers can be made capable of copying certain human skills.
“We’re able to make good guesses — educated guesses — about someone’s emotional state and in some cases approach what humans are able to do,” says M.I.T. graduate student Carson Reynolds, citing research that scientists and mathematicians have done to classify the emotional information that might be found in human speech.
“A tricky thing to note is that humans don’t detect each other’s emotions with perfect accuracy,” he says.
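The kind of classification Mr. Reynolds describes — an educated guess about emotional state from measurable speech features — can be sketched in a few lines. The feature pairs and centroid values below are invented for illustration only; they are not from the M.I.T. research.

```python
# Toy nearest-centroid sketch of emotion classification from speech features.
# Pitch/energy centroids are made-up values, not real research data.
import math

CENTROIDS = {  # (mean pitch in Hz, mean energy) per emotion -- illustrative
    "calm":    (120.0, 0.2),
    "excited": (220.0, 0.8),
    "angry":   (180.0, 0.9),
}

def guess_emotion(pitch_hz, energy):
    """Return the emotion whose feature centroid is nearest to the input."""
    return min(CENTROIDS,
               key=lambda e: math.dist((pitch_hz, energy), CENTROIDS[e]))

print(guess_emotion(210.0, 0.7))  # nearest centroid is "excited"
```

Real systems use far richer acoustic features and statistical models, which is why, as Mr. Reynolds notes, even humans only manage imperfect accuracy.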
He has worked on a software program called Emotemail that communicates the emotional expression of a person alongside the e-mail text message (see emotemail.media.mit.edu).
“People can use it to communicate some extra information,” he says. The system uses a camera to capture visual cues in the writer’s face. “As you are typing, each paragraph embeds a picture of your face next to the text so [the receiver of the message] can see whether you are smiling or scowling.”
In addition, portions of the background of the text are shaded to show the relative amount of time spent typing each paragraph. “The idea is that typing speed may have to do with someone’s emotional state. If you spent a long time typing, that might have some meaning that goes over and above semantic meaning.”
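The typing-time shading could work along these lines — a hypothetical sketch, not the actual Emotemail implementation: each paragraph's background shade is scaled relative to the slowest-typed paragraph in the message.

```python
# Hypothetical sketch of Emotemail-style paragraph shading. The real system
# also embeds webcam snapshots; here we model only the typing-time cue.

def shade_paragraphs(durations):
    """Map per-paragraph typing times (seconds) to 0-255 gray shades.

    More time spent typing -> darker background, scaled against the
    slowest paragraph in the message.
    """
    longest = max(durations) or 1  # avoid division by zero
    return [int(255 * (1 - d / longest)) for d in durations]

# Three paragraphs: the middle one took longest, so it shades darkest (0).
print(shade_paragraphs([10.0, 40.0, 5.0]))  # [191, 0, 223]
```

The scaling is relative rather than absolute, matching the article's description of shading that shows the *relative* time spent on each paragraph.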
Another M.I.T. project is a glove that acts as a skin conductivity sensor, reflecting the wearer’s emotional state as it is observed in his galvanic skin responses.
“There are physiological connections between sweat glands and psychological activity,” Mr. Reynolds notes. “If someone is in a high arousal state, the galvanic skin response is known to go up.”
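The glove's premise — higher skin conductance suggesting higher arousal — reduces to a simple mapping. The thresholds below are invented for demonstration; a real sensor would need per-wearer calibration against a resting baseline.

```python
# Illustrative sketch of interpreting galvanic skin response (GSR).
# Conductance is in microsiemens; thresholds are made-up values.

def arousal_level(conductance_us, baseline_us):
    """Classify arousal from the rise in skin conductance over baseline."""
    rise = conductance_us - baseline_us
    if rise > 2.0:
        return "high"
    if rise > 0.5:
        return "elevated"
    return "baseline"

print(arousal_level(7.1, 4.0))  # a 3.1 uS rise reads as "high" arousal
```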