''' HEY HUMANS! : GO SQUARE THE CIRCLE '''
To keep human workers at ease, collaborative robots should also have an appropriate size and appearance.
Takayuki Kanda of the ATR Intelligent Robotics and Communication Laboratories in Kyoto says that collaborative humanoid robots should generally be no larger than a six-year-old child, a size most adults reckon they could overpower if necessary.
Large eyes make robots seem friendlier and, crucially, more aware of their surroundings. But humanoid features can lead to problematically unrealistic expectations, says Ulrich Reiser of Fraunhofer IPA, a manufacturing-research institute in Stuttgart that makes a €250,000 home-assistant robot called Care-O-bot.
He notes that people tend to distrust robots with protruding sensors, ''Terminator''-like exposed cables, or a jerry-rigged, student-project look.
To interact smoothly with people, robots will also need ''social intelligence''. It turns out, for example, that people are more trusting of robots that use metaphors rather than abstract language, says Bilge Mutlu, head of a robotics laboratory at the University of Wisconsin-Madison.
He has found that robots are more persuasive when they refer to the opinions of humans and limit pauses to about a third of a second, to avoid appearing confused.
Robots' gazes must also be carefully programmed lest a stare make someone uncomfortable. Timing eye contact for ''intimacy regulation'' is tricky, Dr Mutlu says, in part because gazes are also used in dialogue to seize and yield the floor.
When a person enters a room, robots inside should pause for a moment and acknowledge the newcomer, a sign of deference that puts people at ease, says the University of British Columbia's Dr Croft.
Robots also appear friendlier when their gaze follows a person's moving hands, says Maya Cakmak of Willow Garage, the California-based maker of the PR2, a $400,000 robot skilled enough to make an omelette, albeit slowly.
It will be a decade or two at least before the descendants of the PR2, Care-O-bot and other ''home assistance'' or ''companion'' robots are nimble and intelligent enough to zip autonomously through houses performing chores.
They will need far better sensors, movement-control actuators and batteries, and much, much smarter software.
They must also be capable of displaying empathy or they will be rejected, says Kerstin Dautenhahn, head of a ''social robotics'' team at the University of Hertfordshire in Britain.
Her team's Care-O-bot robots crunch data from 60-odd household sensors that monitor door and cupboard hinges, taps, electrical appliances and so forth.
If medicine isn't taken, say, the robot may alert relatives or the hospital.
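As a rough illustration of the kind of rule such monitoring software might apply, the short Python sketch below checks whether a medicine cabinet has been opened within an expected dosing window and, if not, produces an alert. The sensor name, the 12-hour window and the alert wording are hypothetical assumptions made for illustration, not details of the Hertfordshire or Fraunhofer systems.

from datetime import datetime, timedelta
from typing import Optional

# Hypothetical sketch of a sensor-based medication check.
# The sensor key, dosing window and alert text are illustrative assumptions.
DOSE_WINDOW = timedelta(hours=12)

def check_medication(sensor_log: dict, now: datetime) -> Optional[str]:
    """Return an alert message if the medicine cabinet was not opened in time."""
    last_opened = sensor_log.get("medicine_cabinet_hinge")
    if last_opened is None or now - last_opened > DOSE_WINDOW:
        return "Medication may have been missed: notify relatives or the hospital."
    return None

# Example usage with a fabricated hinge-sensor reading
log = {"medicine_cabinet_hinge": datetime(2014, 3, 1, 8, 0)}
alert = check_medication(log, now=datetime(2014, 3, 1, 22, 30))
if alert:
    print(alert)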
It is vital that a robot of this sort is not perceived as hostile, but as having the owner's best interests at heart.
One way to do this is to give robots a defining human trait: the ability to make mistakes.
Maha Salem, a researcher under Dr Dautenhahn, programmed a humanoid Asimo robot, made by Honda, to make occasional human mistakes such as pointing to one drawer while talking about another. When it comes to household robots, test subjects prefer those that err over infallible ones, Dr Salem says.
Another approach uses sensors to assess the state of nearby humans, so that robots can respond appropriately. With funding from the European Union, researchers are using bracelets equipped with electrodes to enable classroom robots to determine whether students are confused, bored or anxious.
The robots can adapt their teaching style accordingly, says Iolanda Leite of the Instituto Superior Técnico, a Portuguese university participating in the programme, which is called EMOTE and is intended to foster bonding between people and robots.
Such bonding could have some surprising uses. In experiments carried out at Yale University involving a bipedal humanoid called NAO, made by a French firm called Aldebaran Robotics, children proved to be just as willing to share secrets with it as they were with an adult.
The researcher who performed the experiments, Cindy Bethel, now at Mississippi State University in Starkville, has also found that children who have witnessed a crime are less likely to be misled in a forensic interview with a robot than with a human expert, even one trained to obtain testimony.
Mark Ballard of the Starkville Police Department, who has been working with Dr Bethel, reckons that the robots needed to conduct ''child-friendly'' forensic interviews will be available by 2020.
What's next? Market research is not much good at predicting developments in the field of collaborative robots, says Bruno Bonnell of Robolution Capital, a robotics investment fund in France.
For one thing, he says, people say they want complete control over robots, but once they start using them they actually prefer them to be as autonomous as possible.
Working alongside robots changes the way people think about them, in other words.
Whether on the factory floor, at home or in the classroom, the evolving relationship between humans and robots will be defined by a process of collaboration.
With respectful dedication to the Students, Professors and Teachers of the world. See Ya all on !WOW! -the World Students Society Computers-Internet-Wireless:
"' The Smartest Monkey Ever "'
Good night and God bless!
SAM Daily Times - The Voice of the Voiceless