Emotion-sensing computer software that models and responds to students' cognitive and emotional states -- including frustration and boredom -- has been developed by Sidney D'Mello, assistant professor of psychology and concurrent assistant professor of computer science and engineering at the University of Notre Dame, along with Art Graesser of the University of Memphis and a colleague from the Massachusetts Institute of Technology.
The new technology, which mirrors the give-and-take of a human tutor, not only offers tremendous learning possibilities for students but also redefines human-computer interaction.
"AutoTutor" and "Affective AutoTutor" can gauge the student's level of knowledge by asking probing questions; analyzing the student's responses to those questions; proactively identifying and correcting misconceptions; responding to the student's own questions, gripes and comments; and even sensing a student's frustration or boredom through facial expression and body posture and dynamically changing its strategies to help the student conquer those negative emotions.
"Most of the 20th-century systems required humans to communicate with computers through windows, icons, menus and pointing devices," says D'Mello, who specializes in human-computer interaction and artificial intelligence in education.
"But humans have always communicated with each other through speech and a host of nonverbal cues such as facial expressions, eye contact, posture and gesture. In addition to enhancing the content of the message, the new technology provides information regarding the cognitive states, motivation levels and social dynamics of the students."
Read More: Science Daily