University of Colorado Boulder’s Gabe Sibley presents “Mobile Robot Perception for Long-term Autonomy” as part of the IRIM Robotics Seminar Series. The event will be held in the TSRB Banquet Hall from noon to 1 p.m. and is open to the public.
This talk will cover recent advances in mobile perception, planning, and control from the Autonomous Robotics and Perception Group (ARPG) at the University of Colorado, Boulder. We will discuss results spanning scalable visual-inertial SLAM, dense 3D SLAM, semantic SLAM, unsupervised object discovery, and photometric SLAM, as well as how these approaches can be tightly integrated using model predictive control for high-speed ground vehicles.
Gabe Sibley is an assistant professor in Computer Science at the University of Colorado, Boulder. Before joining CU, Sibley was an assistant professor in Computer Science at George Washington University and director of the Autonomous Robotics & Perception Lab.
Previously, he was a junior research fellow at Somerville College, Oxford, and a post-doctoral research assistant in the Mobile Robotics Group of the Oxford University Engineering Department working with Professor Paul Newman.
Sibley was a Ph.D. student at the Robotic Embedded Systems Laboratory at the University of Southern California under the supervision of Professor Gaurav Sukhatme and a Robotics Engineer in the Computer Vision Group at NASA-JPL under Dr. Larry Matthies. At NASA-JPL, Sibley worked on long-range data-fusion algorithms for planetary landing vehicles, unmanned sea vehicles, and unmanned ground vehicles.
Sibley’s core interest is in probabilistic perception algorithms and estimation theory that enable long-term autonomous operation of mobile robotic systems, particularly in unknown environments. He has extensive experience with vision-based, real-time localization and mapping systems, and he is interested in a fundamental understanding of the sufficient statistics that can be used to represent the state of the world. His research uses real-time, embodied robot systems equipped with a variety of sensors, including lasers, cameras, and inertial sensors, to advance and validate algorithms and knowledge representations useful for enabling long-term autonomous operation.