It wasn’t your usual scientific research presentation. Two dancers, one representing a robot and the other a human, took turns moving around each other. As the dance progressed, the human was at first fearful, then curious, and finally happy.
The performance in June during the Movement, Music, and Brain Health National Science Foundation (NSF) AccelNet meeting on the UMBC campus was the brainchild of three UMBC faculty who have joined forces to explore whether and how dancing robots might offer humans new tools to improve their mental health. The research builds on established practices of human-to-human dance/movement therapy, which can be used to treat mental health challenges such as schizophrenia, anxiety, and depression.
The exact form that robotic dance therapy might take, and the range of mental health conditions it could treat, are still big open questions for the team. The project is led by Ramana Vinjamuri, an associate professor in computer science and electrical engineering who has done extensive work on brain-computer interfaces, and Andrea Kleinsmith, an associate professor in information systems who specializes in ways that computers can assess human emotions.
“As a healthcare opportunity, dancing with a robot may sound weird at first,” says Ann Sofie Clemmensen, an associate professor of dance, who is also part of the interdisciplinary team. “Why not just dance with a human?” But, she says, people who are socially isolated or struggle with the stressors of human interactions might benefit from robot partners. “As humans we project emotions on objects, but the objects do not judge back,” she says.
(l-r): Ramana Vinjamuri, Andrea Kleinsmith, and Ann Sofie Clemmensen are collaborating on a project to explore a possible role for robots in dance therapy. (Photos courtesy of Vinjamuri, Kleinsmith, and Clemmensen)
“The most exciting thing about this project for me is the collaboration,” says Vinjamuri. “I’ve never done something like this, and so the possibility to bring these fields together to tackle an important issue like mental health is super exciting.”
First steps
The groundwork for the research was laid over more than a decade of work in Vinjamuri’s lab searching for “alphabets,” or “synergies,” of hand movements and associated brain activity that combine to produce the variety of our everyday movements. Vinjamuri’s Ph.D. student Parthan Olikkal had recently developed contactless human motion tracking methods, which he applied to teach humanoid robots these alphabets, combining them to form new movements.
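The article doesn’t say how these movement “alphabets” are extracted, but a common approach in the motor-synergy literature is to decompose recorded joint trajectories with principal component analysis (PCA). The sketch below is a minimal illustration under that assumption, using made-up joint-angle data rather than anything from the lab:

```python
# A minimal sketch of extracting movement "synergies" via PCA, a
# standard technique in the motor-control literature. The data shapes
# and variable names here are hypothetical, not the lab's.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical recording: 1,000 time samples of 15 hand-joint angles.
joint_angles = rng.standard_normal((1000, 15))

# Keep enough components ("synergies") to explain 90% of the variance.
pca = PCA(n_components=0.90)
weights = pca.fit_transform(joint_angles)   # per-sample synergy activations
synergies = pca.components_                 # each row is one movement "letter"

print(f"{synergies.shape[0]} synergies explain "
      f"{pca.explained_variance_ratio_.sum():.0%} of movement variance")

# Any observed posture can then be approximated as a weighted
# combination of these few synergies plus the mean posture.
reconstructed = weights @ synergies + pca.mean_
```

In this framing, each component acts as one “letter”: a small set of them, weighted and summed, approximates any recorded movement.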
Against this backdrop, the spark for the interdisciplinary venture was struck when the College of Engineering and Information Technology (COEIT) launched a “COEIT Interdisciplinary Projects” program to encourage faculty to explore collaborations across disciplines to tackle big challenges. Vinjamuri reached out to Kleinsmith and Clemmensen to discuss the possibility of teaming up.
Together, the researchers developed a project proposal to study key questions surrounding the idea of robot-assisted dance therapy. They named the proposal SIVAM after the Indian mythological god of dance (also short for “Synergy-based, Intuitive, Virtual and Augmented therapy for Mental health”). The research would look into questions such as whether the coordination in a person’s arms and legs could be a proxy measure of mental well-being, how existing dance therapy movements affect brain activity, and how a humanoid robot dance partner compares in effectiveness to a flesh-and-blood one.
Creative solutions at the technological frontiers
Like any big endeavor, the project encountered unexpected hurdles. The robot the team already had couldn’t move fast enough, or with a wide enough range of motion, to serve as a dance partner. (A new robot will soon be ordered.) The team also had to wait for delivery of a special EEG cap that could measure a dancer’s brain activity without the typical gel and wires that would get in the way. The cap was also equipped to filter out the signal noise that comes from a person moving around.
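The article doesn’t describe how the cap’s filtering works, and real motion-artifact removal for EEG is considerably more involved (independent component analysis is a common tool), but a simple band-pass filter conveys the basic idea of suppressing out-of-band noise. A minimal sketch with synthetic data:

```python
# A minimal sketch of band-pass filtering a noisy EEG-like signal.
# Real motion-artifact removal is more sophisticated (e.g., ICA);
# this only shows the idea of keeping the frequency band of interest.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0  # hypothetical sampling rate in Hz
t = np.arange(0, 5, 1 / fs)

# Synthetic signal: a 10 Hz alpha-band rhythm plus slow movement
# drift (0.3 Hz) and broadband sensor noise.
eeg = (np.sin(2 * np.pi * 10 * t)
       + 3 * np.sin(2 * np.pi * 0.3 * t)
       + 0.5 * np.random.default_rng(0).standard_normal(t.size))

# Fourth-order Butterworth band-pass keeping 1-40 Hz, applied forward
# and backward (filtfilt) to avoid phase distortion.
b, a = butter(4, [1.0, 40.0], btype="bandpass", fs=fs)
cleaned = filtfilt(b, a, eeg)

print(f"signal range before: {np.ptp(eeg):.1f}, after: {np.ptp(cleaned):.1f}")
```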
When the team realized they would have to wait for the humanoid robot, they pivoted to developing a digital avatar. They designed a camera and software system to track a person’s motions, then created a digital figure to mirror the movements back; mirroring is an established technique in dance/movement therapy.
Developing the motion tracking system has been a major part of the project to date. “Even just a few years ago, it was so much more difficult to digitally capture a person’s movements without them wearing reflective markers that a camera can easily track,” says Kleinsmith. Now, the team is using the latest computer vision and machine learning tools to implement a markerless tracking system. Eliminating the need for specialized attire should make the system more accessible and useful.
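The article doesn’t name the team’s tools, but as a rough illustration, a markerless tracking loop built on off-the-shelf libraries such as OpenCV and MediaPipe might look like the sketch below. The webcam source and the skeleton-overlay “avatar” are stand-in assumptions, not the project’s actual pipeline:

```python
# A minimal markerless motion-tracking sketch using OpenCV and
# MediaPipe Pose. This is an illustrative assumption, not the team's
# actual system: any off-the-shelf pose estimator would do.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default webcam stands in for the lab camera

with mp_pose.Pose(min_detection_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break

        # Flip horizontally so the display acts like a dance mirror.
        mirrored = cv2.flip(frame, 1)

        # Estimate 33 body landmarks from the raw RGB frame, with no
        # reflective markers or special attire needed.
        results = pose.process(cv2.cvtColor(mirrored, cv2.COLOR_BGR2RGB))

        # Draw the skeleton over the mirrored view, a crude stand-in
        # for driving a digital avatar that mirrors the dancer.
        if results.pose_landmarks:
            mp_draw.draw_landmarks(
                mirrored, results.pose_landmarks, mp_pose.POSE_CONNECTIONS)

        cv2.imshow("avatar mirror (sketch)", mirrored)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break

cap.release()
cv2.destroyAllWindows()
```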
The team also laid the groundwork for the next stages of the project by testing sensors, including the new cap and wireless sensors that can measure physiological signals such as heart rate, skin conductance, and body temperature. All the equipment will help the team test novel ways of assessing, and perhaps ultimately altering, human subjects’ emotional states.
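How those signals would feed into an emotion assessment isn’t specified, but a common first step in affect-recognition work is reducing the raw streams to a handful of coarse features. A minimal sketch, with hypothetical signal names, sampling rates, and data:

```python
# A hedged sketch of turning raw physiological streams into simple
# features for emotion assessment. Everything here is hypothetical,
# not the project's actual pipeline.
import numpy as np

def summarize(heart_rate_bpm, skin_conductance_us, temp_c):
    """Reduce raw sensor streams to a few coarse affect-related features."""
    return {
        "mean_hr": float(np.mean(heart_rate_bpm)),
        "hr_variability": float(np.std(np.diff(heart_rate_bpm))),
        # Count skin-conductance peaks as a rough arousal proxy.
        "scr_events": int(np.sum(
            (skin_conductance_us[1:-1] > skin_conductance_us[:-2])
            & (skin_conductance_us[1:-1] > skin_conductance_us[2:]))),
        "temp_drift": float(temp_c[-1] - temp_c[0]),
    }

# Hypothetical 60-second recording at 1 Hz.
rng = np.random.default_rng(1)
features = summarize(
    heart_rate_bpm=72 + rng.normal(0, 2, 60),
    skin_conductance_us=5 + np.cumsum(rng.normal(0, 0.05, 60)),
    temp_c=33.0 + rng.normal(0, 0.02, 60),
)
print(features)
```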
“If you tighten your body, that may mean anger or fright, if you are more loose, you are more relaxed,” says Clemmensen. “And it’s possible that you could then guide a person through movement into that emotional state. The next part of this research is to get the data on that, and I’m quite excited about it.”
A technology-infused stage debut
The June performance was a chance for the team to creatively demonstrate their progress to brain researchers and artists from around the world.
In the first half of the performance, the human dancer, played by UMBC graduate Juju Ayoub ’25, dance, and a “robot” dancer, played by Sarah McHale ’24, dance, sat opposite one another and took turns moving, their movements captured and displayed on a large screen by digital avatars. In the second half, the human and robot met on the dance floor as the human cycled through the emotions of fear, curiosity, and happiness. Sensors on Ayoub measured her brain activity, heart rate, and other emotion-related signals, which were displayed on the screen. The dancers improvised the second half within an accumulative structure provided by Clemmensen.
On left, dancers Juju Ayoub and Sarah McHale get ready to perform while Ph.D. student Parthan Olikkal sets up equipment. On right, Sarah McHale dances in front of the digital avatars. (Photos by Kiirstn Pagan ’11)
“Philosophically speaking, the first part of the performance represents humans and robots working in their own spaces. Part two is where they’re trying to work together, going through these phases of fear, curiosity, and then finally collaboration—and hopefully a happy collaboration,” says Vinjamuri.
The human researchers on the project have certainly found their own happy collaboration.
Clemmensen says she appreciated how the group’s focus could zoom out and in, moving from discussions of big ideas to tricky troubleshooting of a single piece of equipment.
“I would like to see if I can take that verbal process into the creative space of dance choreography too,” she says.
The students involved in the project—Olikkal, fellow Ph.D. students Sruthi Sundharram and Golnaz Moharrer, and undergraduates Oritsejolomisan Mebaghanje ’25, computer science, and first-year computer science student Viraj Janeja—agree it was a mind-stretching and rewarding experience.
“I was very excited to be involved in the performance, which was an unusual and creative experience,” says Sundharram, a first-year Ph.D. student in computer science in Vinjamuri’s lab who helped set up and connect the cap and sensors before the dance. “It was nerve-racking right before the start, fearing that something wouldn’t work,” she says with a laugh. But the dancers helped ease her jitters, and the performance went well.
“The best part of the experience for me was seeing the virtual environment for the project come alive,” says Mebaghanje, who worked as the lead software developer on the project. “I also really enjoyed working with my team and debugging issues together.”
Olikkal, who has been involved in the project from the beginning, and who worked primarily on the motion capture system, says he’s been able to hone his career aspirations in a meaningful way after joining Vinjamuri’s lab in 2019 as a master’s student.
“Once I started really putting my heart into the research and seeing how these systems can help people, maybe not always immediately but certainly down the line, I felt like I had found my calling,” he says.
After the dancers exited the stage of the Fine Arts Recital Hall, Vinjamuri took the microphone to thank the whole team. And he hinted at the exciting work that lies ahead: “Maybe next time there will be a real robot on stage.”