MLFL Wiki |
Robot Perception for Manipulation
Abstract: Our team (the MIT MCube Lab) participated in the Amazon Robotics Challenge (ARC) from 2015 to 2017 and won the stowing task in 2017. The challenge was to develop a fully autonomous system to pick and place products in a warehouse setting, where various products may coexist inside a single bin. Robots are still far from matching human speed, flexibility, and reliability in this task. In the first half of the talk, I will describe our approach to the ARC, list the lessons we learned, and propose a wish list of technologies that would be useful for developing similar systems in the future. One major item on the wish list is to exploit physics and contact sensing in the perception system, where vision usually plays a big role but is still insufficient in practice, due to occlusions, inaccurate models, ambiguous appearances, and so on. I will spend the second half of the talk on our work on using tactile sensing to estimate the pose of an object in real time. I will draw connections between our problem and the mobile robot localization problem, and show how we can apply frameworks developed in the localization community. Our results show that incorporating this extra information improves accuracy over vision alone. Moreover, even when visual cues are temporarily missing, our system can still reason about the object's state during manipulation.
Bio: Peter K.T. Yu received a B.S. in Computer Science from National Chiao-Tung University, Hsinchu, Taiwan, in 2010, and an M.S. in Computer Science from National Taiwan University, Taipei, Taiwan, in 2012. He is currently a Ph.D. candidate in Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, supervised by Prof. Alberto Rodriguez in the Manipulation and Mechanisms Lab. He focuses on state estimation involving contact to enable reactive control in contact manipulation tasks. |
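The abstract's connection between tactile pose estimation and mobile robot localization can be made concrete with a particle filter, a standard framework from the localization community. The sketch below is illustrative only and is not the speaker's actual system: it estimates the unknown 2D center of a circular object of known radius from a sequence of noisy contact points, where each touch constrains the center to lie (approximately) one radius away from the contact. All names, noise levels, and the circular-object assumption are hypothetical simplifications.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: a circular object of known radius at an unknown center.
RADIUS = 0.05                     # object radius in meters (known from the model)
true_center = np.array([0.30, 0.10])
MEAS_STD = 0.005                  # assumed contact-sensing noise (m)

def simulate_contact(center):
    """Return a noisy contact point on the object's boundary."""
    ang = rng.uniform(0.0, 2.0 * np.pi)
    point = center + RADIUS * np.array([np.cos(ang), np.sin(ang)])
    return point + rng.normal(0.0, MEAS_STD, size=2)

# Particle filter over the object's center, as in mobile robot localization.
N = 2000
particles = rng.uniform([0.0, -0.2], [0.6, 0.4], size=(N, 2))  # broad prior

for _ in range(30):               # 30 sequential touches
    contact = simulate_contact(true_center)
    # Likelihood: the contact should lie RADIUS away from a hypothesized center,
    # so each touch reweights particles toward an annulus around the contact.
    dist = np.linalg.norm(particles - contact, axis=1)
    w = np.exp(-0.5 * ((dist - RADIUS) / MEAS_STD) ** 2)
    w /= w.sum()
    # Resample in proportion to weight, then jitter to keep particle diversity.
    idx = rng.choice(N, size=N, p=w)
    particles = particles[idx] + rng.normal(0.0, 0.002, size=(N, 2))

estimate = particles.mean(axis=0)
print("estimated center:", estimate)
```

Each touch alone is ambiguous (an annulus of possible centers), but the intersection of several annuli localizes the object, which mirrors how range measurements localize a mobile robot; a real tactile system would use a richer contact and friction model in the likelihood.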