Andrew Owens receives NSF CAREER Award for research to improve machine perception systems

Prof. Owens’ research will help fully autonomous systems interact with their environments without human supervision.
Andrew Owens

Prof. Andrew Owens received a National Science Foundation (NSF) Faculty Early Career Development Program (CAREER) Award to develop machine perception systems that build associations between touch, sound, and vision, allowing them to predict one physical characteristic from the others. The project, “Learning Multimodal Representations of the Physical World,” is funded through the NSF Division of Information & Intelligent Systems.

Machine perception uses machine learning algorithms to process sensory data from the environment. Computer vision, for example, can help autonomous vehicles navigate, detect fake or manipulated media, and aid doctors in analyzing data such as CT scans. Owens aims to develop more advanced systems that draw on multiple types of sensor data to build a more complete picture of their surroundings and make predictions from it.

“Missing from today’s vision-based models is an understanding that the objects around us can squish, deform, and bend in complex ways, and that their seemingly simple surfaces can hide weaves of fabric and tiny pores,” said Owens, Assistant Professor of Electrical Engineering and Computer Science.

These material properties and microgeometries, the fine-scale variations in an object's 3D surface, affect the ways that humans perceive and interact with their surroundings. For example, when tapped with a spoon, a glass champagne flute and its plastic lookalike make different sounds, indicating that the glass version is more fragile. In some cases, multimodal information carries serious implications, such as the crack of slippery ice underfoot.

After paired images and touch samples are collected throughout a room, the model can estimate touch at user-specified 3D positions within the scene. Video courtesy of Andrew Owens.

To enable artificial intelligence systems to understand these sensory links, Owens and his research group will collect data that associates the visual, auditory, and tactile properties of objects and surfaces at a variety of indoor and outdoor locations, using a series of cameras, microphones, and force sensors.
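One common way to picture this kind of association learning is a contrastive recipe: each sensing modality gets its own encoder, and co-recorded pairs, such as an image of a surface and the touch reading taken at the same spot, are trained to land near each other in a shared embedding space. The sketch below is purely illustrative and is not drawn from the project itself; the encoder architectures, dimensions, and names are all invented for the example.

```python
# Illustrative sketch only: a generic contrastive-alignment recipe for paired
# sensor data (images and touch readings). All classes, dimensions, and names
# here are hypothetical stand-ins, not the project's actual model or code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModalityEncoder(nn.Module):
    """Maps one modality (e.g., an image patch or a touch reading) to an embedding."""
    def __init__(self, input_dim: int, embed_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, 256), nn.ReLU(),
            nn.Linear(256, embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.normalize(self.net(x), dim=-1)  # unit-length embeddings

def contrastive_loss(vision_emb: torch.Tensor, touch_emb: torch.Tensor,
                     temperature: float = 0.07) -> torch.Tensor:
    """CLIP-style loss: co-recorded vision/touch pairs should have matching embeddings."""
    logits = vision_emb @ touch_emb.t() / temperature  # (batch, batch) similarity scores
    targets = torch.arange(logits.size(0))             # i-th image pairs with i-th touch sample
    return (F.cross_entropy(logits, targets) +
            F.cross_entropy(logits.t(), targets)) / 2

# Toy usage with random stand-in data for a batch of co-recorded samples.
vision_encoder = ModalityEncoder(input_dim=512)  # e.g., pooled image features
touch_encoder = ModalityEncoder(input_dim=64)    # e.g., flattened force-sensor readings
images, touches = torch.randn(32, 512), torch.randn(32, 64)
loss = contrastive_loss(vision_encoder(images), touch_encoder(touches))
loss.backward()  # gradients push the two encoders toward a shared embedding space
```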

The datasets compiled by Owens and his team may ultimately enable robots to interact more capably with their surroundings, for example by grasping objects and walking over uneven surfaces like gravel or stairs. Owens plans to test the resulting methods on robots in collaboration with robotics researchers.

“Improved tactile features can potentially be used in warehouse automation, manufacturing robotics, and medical diagnosis,” Owens said.

Owens is well-suited to this work, with previous research that associates visual stimuli with sound and incorporates visual and tactile sensory information into complete 3D scenes. He has also worked on improving robotic grasping. With the NSF CAREER Award, Owens will expand these lines of work to collect and apply “audio-visual-tactile” data simultaneously in a variety of 3D spaces.

A robotic arm grasps and lifts 11 different objects, including cups, boxes, and toys, between two touch sensors using visuo-tactile feedback, much like a human using a pointer finger and thumb. Image courtesy of Andrew Owens.

Owens earned his PhD at the Massachusetts Institute of Technology (MIT) and was a postdoctoral researcher at the University of California, Berkeley, prior to coming to Michigan in 2020. He received the 2023-24 U-M College of Engineering 1938E Award and the 2022 U-M EECS Outstanding Achievement Award. He has previously received research funding from DARPA, Toyota Research Institute, Sony, and Adobe.