People talking back to a computer is common enough — usually in a moment of pique or frustration. Getting the computer to respond in kind is a far different task, one that computer scientists are undertaking with various degrees of success and consternation.
The challenge isn’t simply a matter of inventing new software and sometimes hardware, difficult enough as that is, but also of coming to grips with some of the ethics involved.
If computers are to have emotional components, what role would they play in everyday life? Do human beings really want an emotional relationship with a mechanical mind?
The field is called “affective technology.” Machine prototypes exist that measure human emotional expression through signals such as facial movements, voice changes and physiological readings, and that can respond in a humanlike way, as described in papers and lectures prepared by Rosalind W. Picard, founder and director of the Affective Computing Research Group at the Massachusetts Institute of Technology’s Media Laboratory. She spoke about her group’s work in May at a meeting of the American Association for the Advancement of Science in Washington.
The term “affective technology” has different meanings for different groups around the country doing research on human interaction with computers. Graduate student Kirsten Boehner of Cornell University’s Human-Computer Interaction Group, for instance, works on how computers can “cause me to reflect on my own emotion,” rather than on how a computer can imitate emotion.
AT&T Research Labs in New Jersey, which pioneered speech recognition systems, has yet to attempt incorporating emotion detection, according to AT&T Labs researcher Rich Cox. But he definitely sees the potential in what he calls “auditory cues in voices that would allow you to detect different kinds of emotion. … Knowing the emotion of the person on the other end [of a conversation] may help the machine accomplish its task — depending on the task.”
Some of the most mind-bending research under way at M.I.T. focuses on how computers can be made capable of copying certain human skills.
“We’re able to make good guesses — educated guesses — about someone’s emotional state and in some cases approach what humans are able to do,” says M.I.T. graduate student Carson Reynolds, citing research that scientists and mathematicians have done to classify the emotional information that might be found in human speech.
“A tricky thing to note is that humans don’t detect each other’s emotions with perfect accuracy,” he says.
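The kind of emotional information researchers look for in speech can be illustrated with two very simple acoustic measures. The sketch below is purely illustrative — it is not code from AT&T or M.I.T. — and computes short-term energy (loudness) and zero-crossing rate (a rough correlate of pitch and voicing), two features that real emotion classifiers build on in far richer form:

```python
import math

# Illustrative only: two simple acoustic features often associated with
# vocal emotion. Real systems combine many more features with trained
# statistical classifiers.

def short_term_energy(samples):
    """Mean squared amplitude -- a crude loudness measure."""
    return sum(s * s for s in samples) / len(samples)

def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs that change sign --
    a rough correlate of dominant frequency."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    return crossings / (len(samples) - 1)

# A quiet, low-frequency tone vs. a louder, higher-frequency one.
calm = [0.1 * math.sin(2 * math.pi * 5 * t / 100) for t in range(100)]
agitated = [0.8 * math.sin(2 * math.pi * 20 * t / 100) for t in range(100)]

print(short_term_energy(calm) < short_term_energy(agitated))    # True
print(zero_crossing_rate(calm) < zero_crossing_rate(agitated))  # True
```

An “agitated” voice here is simply modeled as louder and higher-pitched; the imperfect mapping from such features to actual emotion is exactly the difficulty Mr. Reynolds describes.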
He has worked on a software program called Emotemail that communicates the emotional expression of a person alongside the e-mail text message (see emotemail.media.mit.edu).
“People can use it to communicate some extra information,” he says. The system uses a camera to capture visual cues in the writer’s face. “As you are typing, each paragraph embeds a picture of your face next to the text so [the receiver of the message] can see whether you are smiling or scowling.”
In addition, portions of the background of the text are shaded to show the relative amount of time spent typing each paragraph. “The idea is that typing speed may have to do with someone’s emotional state. If you spent a long time typing, that might have some meaning that goes over and above semantic meaning.”
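The shading idea can be sketched in a few lines. This is a hypothetical reconstruction, not Emotemail’s actual code: given the seconds spent typing each paragraph, it assigns each one a grayscale background color, darker for paragraphs that took longer:

```python
# Hypothetical sketch of per-paragraph shading (not the actual Emotemail
# implementation): paragraphs that took a larger share of the total
# typing time get a darker background.

def shade_paragraphs(durations):
    """Map per-paragraph typing durations (seconds) to hex grayscale
    background colors; darker means more time spent."""
    total = sum(durations)
    shades = []
    for d in durations:
        frac = d / total if total else 0.0
        gray = int(255 - 128 * frac)  # 255 = white; darkens with time share
        shades.append(f"#{gray:02x}{gray:02x}{gray:02x}")
    return shades

# Three paragraphs taking 10, 30 and 60 seconds to type.
print(shade_paragraphs([10.0, 30.0, 60.0]))
# ['#f2f2f2', '#d8d8d8', '#b2b2b2']
```

The slow third paragraph gets the darkest shade — a visual hint, in the spirit of the quote above, that something beyond the words themselves may be going on.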
Another M.I.T. project is a glove that acts as a skin conductivity sensor, reflecting the wearer’s emotional state as it is observed in his galvanic skin responses.
“There are physiological connections between sweat glands and psychological activity,” Mr. Reynolds notes. “If someone is in a high arousal state, the galvanic skin response is known to go up.”
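The arousal detection Mr. Reynolds describes amounts, at its simplest, to watching for a sustained rise in skin conductance above a resting baseline. The sketch below is an assumption-laden illustration — the baseline, threshold and units (microsiemens) are invented for the example and are not taken from the M.I.T. glove:

```python
# Hypothetical sketch: flag "high arousal" when smoothed skin-conductance
# readings rise well above a resting baseline. All numbers illustrative.

def moving_average(samples, window=3):
    """Simple trailing moving average to smooth sensor noise."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def arousal_flags(samples_us, baseline_us=2.0, rise_us=0.5):
    """Return True wherever smoothed conductance (microsiemens)
    exceeds the resting baseline by more than rise_us."""
    smoothed = moving_average(samples_us)
    return [s > baseline_us + rise_us for s in smoothed]

# A resting reading followed by a sharp conductance increase.
print(arousal_flags([2.0, 2.1, 2.0, 2.8, 3.2, 3.1]))
# [False, False, False, False, True, True]
```

The smoothing step matters: galvanic skin response is noisy, and a single spiky sample should not count as an emotional event.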