People talking back to a computer is common enough -- usually in a moment of pique or frustration. Getting the computer to respond in kind is a far different task, one that computer scientists are undertaking with various degrees of success and consternation.
The challenge isn't simply a matter of inventing new software and sometimes hardware, difficult as that is, but also of coming to grips with some of the ethics involved.
If computers are to have emotional components, what role would they play in everyday life? Do human beings really want an emotional relationship with a mechanical mind?
The field is called "affective technology." Machine prototypes exist that measure human emotional expression through physiological signals such as facial expressions and voice changes and allow a humanlike response, as described in papers and lectures prepared by Rosalind W. Picard, founder and director of the Media Laboratory's Affective Computing Research Group at the Massachusetts Institute of Technology. She spoke about her group's work in May at the American Association for the Advancement of Science in Washington.
The term "affective technology" has different meanings for different groups around the country doing research on human interaction with computers. Graduate student Kirsten Boehner of Cornell University's Human-Computer Interaction Group, for instance, works on how computers can "cause me to reflect on my own emotion," rather than on how a computer can imitate emotion.
AT&T Research Labs in New Jersey, which pioneered speech recognition systems, has yet to attempt incorporating emotion detection, according to AT&T Labs researcher Rich Cox. But he definitely sees the potential in what he calls "auditory cues in voices that would allow you to detect different kinds of emotion. ... Knowing the emotion of the person on the other end [of a conversation] who may help the machine accomplish its task -- depending on the task."
Some of the most mind-bending research under way at M.I.T. focuses on how computers can be made capable of copying certain human skills.
"We're able to make good guesses -- educated guesses -- about someone's emotional state and in some cases approach what humans are able to do," says M.I.T. graduate student Carson Reynolds, citing research that scientists and mathematicians have done to classify the emotional information that might be found in human speech.
"A tricky thing to note is that humans don't detect each other's emotions with perfect accuracy," he says.
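In practice, classifying the emotional information in speech starts by reducing the audio to measurable acoustic features. The sketch below is illustrative only, not the researchers' actual software: it computes two classic features, frame energy (how loud the speaker is) and zero-crossing rate (a rough correlate of pitch and voice quality), which a learned classifier could then map to emotional states.

```python
# Toy sketch of acoustic feature extraction for emotion classification.
# The feature set is a common simplification and an assumption here, not
# the M.I.T. group's actual pipeline.

def frame_features(samples):
    """Compute energy and zero-crossing rate for one frame of audio samples.

    Energy tracks loudness; zero-crossing rate (the fraction of adjacent
    sample pairs that change sign) loosely tracks pitch and voicing.
    """
    energy = sum(s * s for s in samples) / len(samples)
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    zcr = crossings / (len(samples) - 1)
    return energy, zcr

# A rapidly alternating frame: high zero-crossing rate.
print(frame_features([1.0, -1.0, 1.0, -1.0]))
```

A real system would compute many such features over sliding windows and feed them to a statistical model trained on labeled recordings, which is where the "educated guesses" come from.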
He has worked on a software program called Emotemail that communicates the emotional expression of a person alongside the e-mail text message (see emotemail.media.mit.edu).
"People can use it to communicate some extra information," he says. The system uses a camera to capture visual cues in the writer's face. "As you are typing, each paragraph embeds a picture of your face next to the text so [the receiver of the message] can see whether you are smiling or scowling."
In addition, portions of the background of the text are shaded to show the relative amount of time spent typing each paragraph. "The idea is that typing speed may have to do with someone's emotional state. If you spent a long time typing, that might have some meaning that goes over and above semantic meaning."
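The shading idea can be sketched in a few lines. This is a hypothetical illustration, not Emotemail's actual code: each paragraph's share of the total composition time is mapped to a gray level, so paragraphs the writer lingered over appear darker.

```python
# Hypothetical Emotemail-style shading: map each paragraph's typing time
# to a background gray level (0-255). The mapping and constants are
# illustrative assumptions, not taken from the real system.

def shade_paragraphs(paragraph_times):
    """Return one gray level per paragraph; more typing time -> darker."""
    total = sum(paragraph_times) or 1  # avoid division by zero
    shades = []
    for t in paragraph_times:
        fraction = t / total              # share of total composition time
        gray = int(255 - 155 * fraction)  # larger share -> darker shade
        shades.append(gray)
    return shades

# Three paragraphs typed in 5, 20, and 5 seconds: the middle one,
# which took the longest, gets the darkest background.
print(shade_paragraphs([5.0, 20.0, 5.0]))
```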
Another M.I.T. project is a glove that acts as a skin conductivity sensor, reflecting the wearer's emotional state as it is observed in his galvanic skin responses.
"There are physiological connections between sweat glands and psychological activity," Mr. Reynolds notes. "If someone is in a high arousal state, the galvanic skin response is known to go up."
The various signals generated are studied by mathematicians and scientists who create a software program based on patterns that relate to different emotional states.
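A minimal version of that pattern-finding step might look like the sketch below. It is an assumption-laden illustration, not the group's actual software: it reduces a window of skin-conductance readings to two simple features and applies fixed thresholds, where a real system would learn the decision boundary from labeled data.

```python
# Illustrative sketch of classifying arousal from galvanic skin response.
# Feature choices and thresholds are hypothetical; real systems learn
# these patterns from labeled physiological recordings.

def gsr_features(samples):
    """Extract simple features from a window of skin-conductance samples."""
    mean = sum(samples) / len(samples)
    # Peak-to-peak swing: rapid conductance rises often accompany arousal.
    swing = max(samples) - min(samples)
    return mean, swing

def classify_arousal(samples, mean_threshold=5.0, swing_threshold=1.0):
    """Label a window 'high arousal' if conductance is elevated or volatile."""
    mean, swing = gsr_features(samples)
    if mean > mean_threshold or swing > swing_threshold:
        return "high arousal"
    return "low arousal"

calm = [2.0, 2.1, 2.0, 2.2]      # hypothetical readings, microsiemens
stressed = [5.5, 6.8, 7.2, 6.9]
print(classify_arousal(calm))
print(classify_arousal(stressed))
```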
"At some point it becomes specious to talk about a computer really seeing emotion the way a human might," Mr. Reynolds concedes. "What it is doing is taking pieces of information that have something to do with a person's state of mind and finding patterns that relate to what people are saying."
The technology that allows computers to read emotions and respond in kind is at a very primitive stage overall, says Connie Bertka, director of the AAAS Dialogue on Science, Ethics and Religion, adding that she personally isn't looking for her computer to respond any differently than it does now.
"It's fun and interesting to write about, but in terms of implications, there are bigger issues. If computers can provide emotional responses, will people get attached to them?" she asks.
"If my 5-year-old loses too many games of tic-tac-toe in a row, and I sense her frustration, I might decide to let her win one. Maybe the computer will be able to do the same for me. But do I want it to be able to make that choice?"
Computers don't have emotional intelligence yet, in the sense of being able to express emotion intelligently, points out Ms. Picard, who wrote at length on the subject in a 1996 MIT Press book called "HAL's Legacy: 2001's Computer As Dream and Reality." HAL, of course, was the anthropomorphic computer in Stanley Kubrick's 1968 movie "2001: A Space Odyssey."
Ms. Picard is especially interested in finding ways the technology could help children overcome frustrations in the learning process -- using the computer almost as a companion to work alongside the child who is attempting to process a great deal of information at once. Another special area of interest for her is how such research can be applied in the health field.
Both involve developing systems that sense and respond to an individual's affective state, she says.
"Children in learning situations may be frustrated and bored and not want to continue. How then to intervene and help them to be better learners and love learning?" she wonders.
Likewise, she says, many people struggling with drug and other substance abuse experience frustration, stress and boredom similar to what learners face, and might be similarly helped.
"I imagine someone in the future who is overeating and whose problem is stress. They say, 'I need comfort food' but are annoyed by the habit and would do anything to stop it. They have chosen to try to change their behavior -- and this is an important marker.
"Maybe they would have a cell phone that would sense their stress and give them a ring from someone like a family member or therapist reminding them they don't have to eat -- and potentially be more effective than one hour [of therapy] every other week. It would be like carrying around a trained psychiatrist."