Stanford University

News Service


NEWS RELEASE

01/06/95

CONTACT: Stanford University News Service (415) 723-2558

Social science research influences computer product design

STANFORD -- A new home computer product to be introduced with fanfare at the Consumer Electronics Show in Las Vegas Saturday, Jan. 7, is based on research on human-computer interaction conducted at Stanford's Center for the Study of Language and Information.

"Microsoft's new Bob home computer program is an example of how formerly arcane knowledge about human behavior has become as relevant as computer science to the communication technology marketplace," said John Perry, director of CSLI. The 12-year-old Stanford center does research in the related fields of information, computing and cognition.

"The interface between humans and computers is where the action in computers is now, and so research on how people think and behave is becoming hot stuff," Perry said.

Two social scientists, Clifford Nass and Byron Reeves, professors in the Communication Department, provided their theories and research results to Microsoft Corp.'s "social interface" program designers. The program's first product, called Bob, is to be introduced Saturday, Jan. 7 by Microsoft chairman and CEO Bill Gates at the Consumer Electronics Show in Las Vegas. Reeves and Nass are currently serving as consultants to Microsoft.

Their research can be applied, however, to other forms of information technology, including voicemail and interactive television.

"Nass' and Reeves' work considers to what extent people react to technology as if it were more real than it is," Perry said. "They have found that to a very considerable extent people treat their computers and other computer-driven technology in the same ways that they treat people - as if the computer possessed reason, feelings, etc. People also treat pictures on screens as real objects, rather than as representations of real objects. This is relevant to anyone who wants to design technology or content that is as effective as it can be," Perry said.

This work can also be controversial, Nass said. For example, some women have complained about findings from his and Reeves' experiments with computer voices showing that people are prone to gender stereotyping in voice-based technologies. "Female voices are perceived as less effective evaluators and more nurturing than are male-voiced systems. Female-voiced computers are perceived as better teachers of love and relationships and worse teachers of technical subjects than are male-voiced teaching systems," the two reported in CSLI's annual research report.

"We are not supporting gender stereotyping but we are identifying something that people designing products should be sensitive about," Nass said. "It's an important finding also, because it says that you can't blame women for gender stereotyping because of the way they dress and behave. Here is a black box that doesn't dress or behave differently than men, and it still gets gender stereotyped."

Other companies that have supported Nass and Reeves' inquiries into the human-computer interface include Apple, IBM, Hewlett-Packard, U.S. West, Bell Northern and Compaq. These and other companies also support other CSLI research as industrial affiliates, Perry said. The center's work on computers that can take speech dictation is being marketed for people who are unable to use keyboards by Audion, a small company that expects demand to grow as the technology is used both to prevent and to treat carpal tunnel syndrome, he said.

Some companies interested in automatic translation are following research on speech disfluencies by Herbert Clark and Thomas Wasow. Disfluencies are utterances like "ah" and "um," which turn out to be important ways speakers signal meaning to their listeners. "Ah" generally signals a brief pause while the speaker searches his or her mind for a word, while "um" signals a longer pause as the speaker searches for an idea, Perry said. This finding applies to communication technology development, he said, in that it helps explain people's frustration when computers don't signal that they are processing information and haven't just quit working altogether.

Still other work by center researchers focuses on applying this type of research to the current and future capabilities of digital technology. "We have gone from thinking of computers as tools to computers as people's partners or agents," Perry said. "They don't just give you information but should help you act on it, taking into account your preferences, desires and goals."

Research by computer scientist Yoav Shoham involves getting computers to behave like agents or partners rather than tools, Perry said. Nass and Reeves' work "implies that people will naturally treat the technology as agents, not as tools. Companies can ignore this reality at their peril or try to figure out ways to exploit it, which is Shoham's side of this," Perry said.

Exploiting it, from Reeves' point of view, means computers must follow basic social rules to be well accepted. "People are strongly biased towards using social rules, such as politeness, to guide their behavior towards communication technology," Reeves said.

Microsoft's Bob is one attempt to exploit human desire for socially competent technology, he said, by incorporating many features of "natural environments and social relations."

"The question for Microsoft was how to make a computing product easier to use and fun. Cliff and I gave a talk in December 1992 and said that they should make it social and natural," Reeves said. "We said that people are good at having social relations - talking with each other and interpreting cues such as facial expressions. They are also good at dealing with a natural environment such as the movement of objects and people in rooms, so if an interface can interact with the user to take advantage of these human talents, then you might not need a manual."

Bob does not have a manual, but it includes personal characters - human-like animals or what researchers call "agents" - who converse with the computer's user about doing jobs, such as writing a letter, sending electronic mail or paying bills with the computer. The user can choose from about a dozen characters as a guide.

"A strong-willed computer user might want a fast-paced character who is going to operate at his or her speed and not say 'perhaps now we should think about...', but other people like softer, friendlier personalities" to help them navigate through the computer environment, which is displayed on the screen as rooms in a house, Reeves said. There is also an "irreverent" rat named Scuzz designed to appeal mostly to adolescent males. Scuzz bangs out a loud chord on his guitar when the printer runs out of paper.

"We've found that dominant people like dominant characters, that people smile when characters on the screen smile at them, and that they take offense when the computer breaks rules of politeness. Every character needs a way to say hello and good-bye, to get your attention, to say you've done a good job."

Two "open questions," Reeves said, are whether people will find such programs boring after a while, and whether they will pay off in business environments by reducing what have become enormous training costs related to new information technology.

More human-like computer technology costs more in terms of memory and operating capacity, Perry said, so there are many trade-offs to consider.

"I think this project does demonstrate, however, that basic research and expertise in fields besides computer science are becoming very important to companies developing information technology."

-kpo-


