For the first time, scientists at Carnegie Mellon University have identified which emotion a person is experiencing based on brain activity.
The study, published in the June 19 issue of PLOS ONE, combines functional magnetic resonance imaging (fMRI) and machine learning to read emotions accurately from individuals’ brain signals. Led by researchers in CMU’s Dietrich College of Humanities and Social Sciences, the findings illustrate how the brain categorizes feelings, giving researchers the first reliable process to analyze emotions. Until now, research on emotions had long been stymied by the lack of reliable methods to evaluate them, mostly because people are often reluctant to report their feelings honestly. Further complicating matters, many emotional responses may not be consciously experienced.
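The paper does not publish its code, but the general approach — training a classifier on labeled brain-activation patterns, then using it to label new scans — can be sketched. Everything below is hypothetical: synthetic "voxel" data and a simple nearest-centroid classifier stand in for the authors' actual fMRI data and model.

```python
import math
import random

random.seed(0)

# Hypothetical setup: each "scan" is a vector of voxel activations.
# Real fMRI studies use thousands of voxels; 50 keeps the sketch readable.
N_VOXELS = 50
EMOTIONS = ["happiness", "sadness", "anger", "fear"]

def synthetic_scan(emotion):
    """Generate a fake activation pattern: a per-emotion baseline plus noise."""
    base = EMOTIONS.index(emotion) + 1  # distinct mean activation per emotion
    return [base + random.gauss(0, 0.5) for _ in range(N_VOXELS)]

# "Training" data: several labeled scans per emotion.
train = [(e, synthetic_scan(e)) for e in EMOTIONS for _ in range(20)]

# Average the training scans for each emotion to get one centroid pattern...
centroids = {
    e: [sum(scan[i] for lbl, scan in train if lbl == e) / 20
        for i in range(N_VOXELS)]
    for e in EMOTIONS
}

def classify(scan):
    """...then label a new scan with the emotion whose centroid is closest."""
    def dist(c):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(scan, c)))
    return min(EMOTIONS, key=lambda e: dist(centroids[e]))

print(classify(synthetic_scan("anger")))  # → anger (on this synthetic data)
```

The key property this toy example shares with the study is that the classifier never asks anyone what they feel: the label comes entirely from the activation pattern, which is what lets the method sidestep unreliable self-report.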
Identifying emotions based on neural activity builds on previous discoveries by CMU’s Marcel Just and Tom M. Mitchell, which used similar techniques to create a computational model that identifies individuals’ thoughts of concrete objects, often dubbed “mind reading.”
“This research introduces a new method with potential to identify emotions without relying on people’s ability to self-report,” said Karim Kassam, assistant professor of social and decision sciences and lead author of the study. “It could be used to assess an individual’s emotional response to almost any kind of stimulus, for example, a flag, a brand name or a political candidate.”
While hospital robots sound like the stuff of the future, the technology is already in wide use today.
If you’ve been waiting for the day when robot doctors will cut you open, monitor your recovery, and keep you company in your hospital room, you won’t have to wait much longer.
“We’re in the first inning of a nine-inning exercise. The average patient walks in a hospital and is not touched by robotics. That’s going to change in 10 years,” said John Simon, a partner at Boston-based investment firm Sigma Prime Ventures.
That adoption rate, Simon argues, is based on cost: As the price of robotics adoption decreases, hospitals may be more likely to invest in new technology. At their core, robots aren’t all that different from any other hospital gear.
The problem for hospitals, however, is that there’s a danger in pursuing robotics too far. “With medical robots, if you automate something too much, people won’t accept it,” Simon said.
This leaves a fine line for hospitals and doctors to manage. While some automation and robotics are good, the last thing a hospital wants is to embrace robots to such an extent that it alienates patients.
Little of that, however, is on the minds of hospitals today. Right now, most of them are just trying to figure out how to get robots in the front door. Here are a few ways robots are changing hospitals today.