<?xml version="1.0"?>
<News hasArchived="false" page="1" pageCount="1" pageSize="10" timestamp="Sun, 26 Apr 2026 23:40:31 -0400" url="https://beta.my.umbc.edu/groups/umbc-ai/posts.xml?tag=human-robot-interaction">
<NewsItem contentIssues="false" id="138871" important="false" status="posted" url="https://beta.my.umbc.edu/groups/umbc-ai/posts/138871">
<Title>Emotion Recognition for Human-Robot Interaction</Title>
<Tagline>via multimodal fusion and deep learning</Tagline>
<Body>
<![CDATA[
    <div class="html-content"><div>UMBC Ph.D. student <a href="https://www.linkedin.com/in/farshad-safavi/" rel="nofollow external" class="bo">Farshad Safavi</a> presented his dissertation proposal on recognizing emotions for better human-robot interaction on February 9, 2024.</div><div><br></div><div><strong>Emotion Recognition via Multimodal Fusion for Human-Robot Interaction Using Deep Learning</strong></div><div><br></div><div>One of the primary challenges in Human-Robot Interaction (HRI) is enabling robots to effectively understand and respond to human emotions. Humans express emotions through verbal and non-verbal cues, while robots typically rely on pre-programmed algorithms and physical gestures. Our research aims to develop HRI systems that bridge this gap by leveraging multimodal emotion detection. Emotions play a crucial role in human communication and decision-making, significantly influencing human-robot interactions. We aim for robots to understand and respond to human emotions by integrating neurophysiological and behavioral channels. Initially, we examine unimodal facial expression recognition using Convolutional Neural Networks (CNNs) and Vision Transformers (ViTs). Next, we enhance the model with a Mixture of Transformers (MiT). Using this enhanced model, we have developed a human-robot interaction perception system. Subsequently, we investigate multimodal emotion recognition for conveying emotions in HRI. While unimodal techniques have been used to recognize emotions from various sources, research indicates that emotion recognition is inherently multimodal. Fusion representations provide a more comprehensive view of the emotional state, thereby enhancing emotion recognition accuracy. Therefore, exploring the role of multimodal fusion through computational models and neurophysiological experiments is essential.
Our framework uses machine learning and deep learning to interpret complex physiological and facial expression data, enabling nuanced human-robot interactions. We focus on the offline fusion of multimodal methods, combine brain and behavior models, and explore real-time fusion solutions. These emotion-based human-robot interactions will be validated through neurophysiological experiments, with the goal of achieving seamless, intuitive interaction grounded in a thorough understanding of human emotions.</div><div><br></div><div>Committee: Drs. <a href="https://www.csee.umbc.edu/ramana-vinjamuri/" rel="nofollow external" class="bo">Ramana Kumar Vinjamuri</a> (Chair/Advisor), <a href="https://redirect.cs.umbc.edu/~adali/" rel="nofollow external" class="bo">Tulay Adali</a>, <a href="https://redirect.cs.umbc.edu/~nilanb/index.html" rel="nofollow external" class="bo">Nilanjan Banerjee</a>, <a href="https://www.csee.umbc.edu/people/faculty/justin-brooks/" rel="nofollow external" class="bo">Justin Brooks</a>, <a href="https://www.linkedin.com/in/scott-kerick-4a0b2211/" rel="nofollow external" class="bo">Scott Kerick</a></div></div>
]]>
</Body>
<Summary>UMBC Ph.D. student Farshad Safavi presented his dissertation proposal on recognizing emotions for better human-robot interaction on February 9, 2024.     Emotion Recognition via Multimodal Fusion...</Summary>
<TrackingUrl>https://beta.my.umbc.edu/api/v0/pixel/news/138871/guest@my.umbc.edu/01bfb71f306d8ab5651b90138a78fcf8/api/pixel</TrackingUrl>
<Tag>ai</Tag>
<Tag>dissertation</Tag>
<Tag>human-robot-interaction</Tag>
<Tag>robotics</Tag>
<Group token="umbc-ai">UMBC AI</Group>
<GroupUrl>https://beta.my.umbc.edu/groups/umbc-ai</GroupUrl>
<AvatarUrl>https://assets4-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xsmall.png?1691095779</AvatarUrl>
<AvatarUrl size="original">https://assets2-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/original.png?1691095779</AvatarUrl>
<AvatarUrl size="xxlarge">https://assets1-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xxlarge.png?1691095779</AvatarUrl>
<AvatarUrl size="xlarge">https://assets1-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xlarge.png?1691095779</AvatarUrl>
<AvatarUrl size="large">https://assets1-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/large.png?1691095779</AvatarUrl>
<AvatarUrl size="medium">https://assets3-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/medium.png?1691095779</AvatarUrl>
<AvatarUrl size="small">https://assets3-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/small.png?1691095779</AvatarUrl>
<AvatarUrl size="xsmall">https://assets4-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xsmall.png?1691095779</AvatarUrl>
<AvatarUrl size="xxsmall">https://assets1-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xxsmall.png?1691095779</AvatarUrl>
<Sponsor>UMBC AI</Sponsor>
<PawCount>0</PawCount>
<CommentCount>0</CommentCount>
<CommentsAllowed>true</CommentsAllowed>
<PostedAt>Sat, 10 Feb 2024 10:15:10 -0500</PostedAt>
<EditAt>Tue, 27 Feb 2024 17:38:35 -0500</EditAt>
</NewsItem>

</News>
