16 December 2009

Androids and posthumans.


Something more on that crossroads between anthropology and robotics.




Learning to love to hate robots

ROBOTIC helpers are not yet in every home. But in recent years robots have steadily marched into the real world to perform tasks such as cleaning floors, delivering drugs or simply entertaining.
That has let anthropologists and roboticists give these mechanical workers their first report cards - and results are mixed. Despite evidence that we can find robots useful, even lovable colleagues, they can also trigger annoyance and violence. The results should help make future robots easier to work with.
One study by Jodi Forlizzi at Carnegie Mellon University in Pittsburgh, Pennsylvania, highlights how popular culture can affect a robot's reception.
People she introduced to Roomba, a robotic vacuum cleaner made by iRobot of Bedford, Massachusetts, compared it with their knowledge of robots that explore Mars, forming low expectations of Roomba's abilities. But making a bad first impression seemed to help Roomba; it invariably surpassed expectations, helping people bond with their machine.
A six-month study of how Roomba affected households, conducted by Ja-Young Sung at the Georgia Institute of Technology in Atlanta, backs up that finding.
"Some people saw it as a lifetime partner - they had a real emotional attachment to it," says Sung. Even those who returned to their previous cleaning routine didn't blame the robot, instead saying it was their routine that was at fault.
Yet not all robots have fared so well. A handful of hospitals across the US use a robot called TUG, made by Pittsburgh-based firm Aethon, to carry drugs between wards.
When Bilge Mutlu at the University of Wisconsin, Madison, studied how hospital workers got along with TUG, which navigates hallways and uses elevators, he found it polarised opinion. Some saw it as a team player and loved it; others thought it was attention-seeking and resented it (see "Love, hate and a hospital robot").
These studies may be small in scale, but they help illuminate the skills robots need if they are to get along with humans. The designers of the few robots that have reached the real world to work alongside people are breaking new ground - just not always successfully.
Maria Håkansson at the Swedish Institute of Computer Science in Stockholm recently ran a 10-month project to see how six families got on with the Pleo robotic dinosaur, made by Innvo Labs in Nevada.
Pleo's impressive behaviours, such as responding to sounds or touch, led to initially positive reactions. But niggles like limited battery life, and a perception that it wasn't actually as capable of developing a personality as claimed on the box, undermined those impressions, says Håkansson. Frustrated by seeing no evidence of learning, one person declared: "Pleo was just as stupid as a week ago."
Robots that come across as sophisticated and smart, unlike Roomba, are in trouble if they are seen not to deliver.
Back in the lab, some roboticists think they have a solution, says Kathleen Richardson, an anthropologist studying robots at University College London. "[Researchers] design lots of robots to look like children so that people will imagine they have a more childlike mentality," she says. In this way they attempt to guide those crucial first impressions and induce people to be more forgiving of mistakes.
It has also been assumed that a functional appearance affords protection from high expectations and later disappointment, something that studies of Roomba back up.
Still, Mutlu's work shows a humble appearance doesn't always have that effect. Because the box-like TUG robot could navigate independently and used a human-like voice, "people expected it to show appropriate behaviours related to these qualities," Mutlu says.
A box on wheels it may be, but TUG was also a hospital worker, and its colleagues expected it to have some social smarts, the absence of which led to frustration - for example, when it always spoke in the same way in both quiet and busy situations.
Those findings aren't going unnoticed: Aethon is looking into some of the issues raised by the TUG study. "We have redesigned the cart and user interface controls, adjusted voice volumes and included the ability to make voices louder or softer at particular times of day," says Dave Wolfe, product manager at Aethon.
As more studies like these feed back to the designers of robots to work alongside humans, their products should find fewer kicks aimed in their direction. For many of those on the market now, though, the report card still reads: could do better.

Love, hate and a hospital robot

When a box-like robot called TUG went to work in local hospitals carrying drugs between wards, Bilge Mutlu of the University of Wisconsin, Madison, had a chance to see how people react to a robot colleague.
TUG, which is made by Aethon, can navigate a building's corridors and elevators on its own and tell humans it has arrived with a delivery. Mutlu found that reactions to it were strong, and mixed.
Staff in the post-natal ward loved the robot. "I think it is a delight," said one worker, while another called it "my buddy", adding, "I like him. I like him a lot."
But people on the oncology ward weren't impressed, saying that TUG was extremely annoying.
The fact that the robot couldn't tell if it was a good time to interrupt and announce its presence was a big problem for some people, as one member of the nursing staff described: "I called it nasty names and told it, 'Would you shut the hell up? Can't you see I'm on the phone? If you say "TUG has arrived" one more time I'm going to kick you in your camera.' "
Some staff members actually did lash out and kick TUG in frustration; more admitted to considering it.
The lack of any social awareness led interviewees to complain that they felt "disrespected" by the robot. "It doesn't have the manners we teach our children," said one. "I find it insulting that I stand out of the way for patients... but it just barrels right on."
Luckily for TUG, its unvarying, one-size-fits-all social skills happened to be a natural fit in the relaxed atmosphere of the post-natal ward, says Mutlu. But the same default settings were interpreted as demanding and attention-seeking on the oncology ward, which is a more stressful and busy place to work. "If you are going to design robots with human-like capabilities you have to design the appropriate social behaviour that goes along with it," Mutlu says.
