<?xml version="1.0"?>
<News hasArchived="false" page="1" pageCount="1" pageSize="10" timestamp="Sun, 26 Apr 2026 09:49:02 -0400" url="https://beta.my.umbc.edu/groups/umbc-ai/posts.xml?tag=computer-vision">
<NewsItem contentIssues="true" id="147361" important="false" status="posted" url="https://beta.my.umbc.edu/groups/umbc-ai/posts/147361">
<Title>Talk: Seeing Beneath the Surface: Vision-Enabled Robots for Long-term Ocean Monitoring</Title>
<Tagline>4:00&#8211;5:15pm ET Wed, Feb. 19, 2025, UMBC ITE 231 &amp; online</Tagline>
<Body>
<![CDATA[
    <div class="html-content"><h3><span>Seeing Beneath the Surface: Vision-Enabled Robots for Long-term Ocean Monitoring</span></h3><div><span><br></span></div><div><h4><a href="https://xiaominlin.github.io/" rel="nofollow external" class="bo"><strong>Xiaomin Lin</strong></a>, JHU</h4></div><h4>4–5:15pm ET Wed, Feb. 19, 2025, ITE 231, UMBC &amp; <a href="https://umbc.webex.com/meet/gokhale" rel="nofollow external" class="bo">online</a></h4><div><br></div><div>Autonomous systems operating in complex and unstructured environments, especially underwater, require robust perception, adaptive navigation, and intelligent reasoning to function effectively. However, traditional AI models often struggle in these settings due to sensory limitations, dynamic obstacles, and computational constraints. This talk highlights these challenges and presents emerging technologies in subsea sensing and low-power autonomous operation. The first part of the talk explores <strong>multimodal sensing</strong>, demonstrating how optical, acoustic, and fused modalities enhance perception in low-visibility environments. The second part introduces <strong><a href="https://en.wikipedia.org/wiki/Active_perception" rel="nofollow external" class="bo">active perception</a></strong>, where robots dynamically select the most informative viewpoints to optimize navigation and exploration. Finally, the third part discusses <strong>efficient reasoning</strong>, showcasing how compact language models enable real-time decision-making for autonomous exploration and task execution. By integrating these three pillars, this research advances the next generation of intelligent autonomous systems for underwater robotics, environmental monitoring, and beyond.</div><div><br></div><div>Dr. <a href="https://xiaominlin.github.io/" rel="nofollow external" class="bo"><strong>Xiaomin Lin</strong></a> is a Postdoctoral Researcher at Johns Hopkins University, working at the intersection of AI, robotics, and edge computing. 
He received his Ph.D. in Electrical and Computer Engineering from the University of Maryland, College Park, where his dissertation focused on simulation-driven learning for autonomous underwater systems. His research spans perception-driven autonomy, multimodal sensing, and efficient AI deployment on edge devices. His work has been recognized with the Best Paper Award at IROS 2024 (Autonomous Robotic Systems in Aquaculture) and the Best Poster Award at the Maryland Robotics Center Symposium. Dr. Lin's research has been funded by USDA, ONR, and AFRL, and he actively collaborates with academia and industry to push the boundaries of subsea autonomy.</div>
    <hr><a href="https://ai.umbc.edu/" rel="nofollow external" class="bo"><strong>UMBC Center for AI</strong></a></div>
]]>
</Body>
<Summary>Seeing Beneath the Surface: Vision-Enabled Robots for Long-term Ocean Monitoring. Xiaomin Lin, JHU. 4–5:15pm ET Wed, Feb. 19, 2025, ITE 231, UMBC &amp; online. Autonomous systems operating...</Summary>
<Website>https://www.tejasgokhale.com/seminar.html</Website>
<TrackingUrl>https://beta.my.umbc.edu/api/v0/pixel/news/147361/guest@my.umbc.edu/0103990caf9e49b065d6af5a880dfbad/api/pixel</TrackingUrl>
<Tag>active-perception</Tag>
<Tag>ai</Tag>
<Tag>computer-vision</Tag>
<Tag>multimodal</Tag>
<Tag>robot</Tag>
<Tag>robotics</Tag>
<Tag>talk</Tag>
<Tag>vision</Tag>
<Group token="umbc-ai">UMBC AI</Group>
<GroupUrl>https://beta.my.umbc.edu/groups/umbc-ai</GroupUrl>
<AvatarUrl>https://assets4-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xsmall.png?1691095779</AvatarUrl>
<AvatarUrl size="original">https://assets2-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/original.png?1691095779</AvatarUrl>
<AvatarUrl size="xxlarge">https://assets1-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xxlarge.png?1691095779</AvatarUrl>
<AvatarUrl size="xlarge">https://assets1-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xlarge.png?1691095779</AvatarUrl>
<AvatarUrl size="large">https://assets1-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/large.png?1691095779</AvatarUrl>
<AvatarUrl size="medium">https://assets3-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/medium.png?1691095779</AvatarUrl>
<AvatarUrl size="small">https://assets3-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/small.png?1691095779</AvatarUrl>
<AvatarUrl size="xsmall">https://assets4-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xsmall.png?1691095779</AvatarUrl>
<AvatarUrl size="xxsmall">https://assets1-beta.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xxsmall.png?1691095779</AvatarUrl>
<Sponsor>Advances in Perception, Prediction, and Reasoning Lab</Sponsor>
<ThumbnailUrl size="xxlarge">https://assets1-beta.my.umbc.edu/system/shared/thumbnails/news/000/147/361/e9801948a3571637871183d7091368ae/xxlarge.jpg?1739638480</ThumbnailUrl>
<ThumbnailUrl size="xlarge">https://assets2-beta.my.umbc.edu/system/shared/thumbnails/news/000/147/361/e9801948a3571637871183d7091368ae/xlarge.jpg?1739638480</ThumbnailUrl>
<ThumbnailUrl size="large">https://assets2-beta.my.umbc.edu/system/shared/thumbnails/news/000/147/361/e9801948a3571637871183d7091368ae/large.jpg?1739638480</ThumbnailUrl>
<ThumbnailUrl size="medium">https://assets1-beta.my.umbc.edu/system/shared/thumbnails/news/000/147/361/e9801948a3571637871183d7091368ae/medium.jpg?1739638480</ThumbnailUrl>
<ThumbnailUrl size="small">https://assets4-beta.my.umbc.edu/system/shared/thumbnails/news/000/147/361/e9801948a3571637871183d7091368ae/small.jpg?1739638480</ThumbnailUrl>
<ThumbnailUrl size="xsmall">https://assets1-beta.my.umbc.edu/system/shared/thumbnails/news/000/147/361/e9801948a3571637871183d7091368ae/xsmall.jpg?1739638480</ThumbnailUrl>
<ThumbnailUrl size="xxsmall">https://assets2-beta.my.umbc.edu/system/shared/thumbnails/news/000/147/361/e9801948a3571637871183d7091368ae/xxsmall.jpg?1739638480</ThumbnailUrl>
<PawCount>0</PawCount>
<CommentCount>0</CommentCount>
<CommentsAllowed>true</CommentsAllowed>
<PostedAt>Sat, 15 Feb 2025 12:03:24 -0500</PostedAt>
<EditAt>Sat, 15 Feb 2025 12:51:40 -0500</EditAt>
</NewsItem>

</News>
