I am an Associate Professor of Computer Science and Engineering at the University of Michigan. My work and collaborations aim to discover methods for computational reasoning and perception that will enable robots to effectively assist people in common human environments. This research pertains primarily to interactive robotics, with contributions to the technology of robot perception and mobile manipulation as well as to the usability of this technology by people in real situations. Because science is, at its core, independently verifiable knowledge, open-source contributions and reproducibility are a critical feature of my work.
My active projects include perceptual reasoning for goal-directed robotic manipulation, interactive systems for assisted robot teleoperation, and independent living technologies for aging populations. My past projects include robotic person following and gesture recognition, robot learning from demonstration, physics-based tracking of human motion from video, protocols and libraries for web/cloud robotics, markerless model and motion capture, inertial motion capture, balance control of simulated humanoids, and humanoid imitation learning.
An overview of my work and thoughts on web robotics (as a convergence of human-robot interaction and cloud robotics) was presented at National Geographic in 2013:
A popular spinout from this talk was my TED presentation with Henry Evans that goes into more detail about web robotics and remote presence for people with disabilities (which I guess is everyone):
A more-or-less complete listing of my publications is available from my profile on Google Scholar. This profile also includes approximate citation counts and statistics for the "impact" of my research. A long list of papers shows that I have been productive and busy. However, I suspect my real research impact comes from a small selection of our best papers. To this end, I have selected my favorite papers from my publication list and provided them below for (relatively) easy reading:
- A. Fod, M. Mataric, O. Jenkins, Automated Derivation of Primitives for Movement Classification, Autonomous Robots, 12(1), pp. 39-54, 2002.
- O. Jenkins, M. Mataric, A Spatio-temporal Extension to Isomap Nonlinear Dimension Reduction, International Conference on Machine Learning (ICML), 2004.
- M. Vondrak, L. Sigal, O. Jenkins, Dynamical Simulation Priors for Human Motion Tracking, IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(1), pp. 52-65, 2013.
- D. Grollman, O. Jenkins, Incremental Learning of Subtasks from Unsegmented Demonstration, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2010.
- C. Crick, S. Osentoski, G. Jay, O. Jenkins, Human and Robot Perception in Large-scale Learning from Demonstration, ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2011.
and one more that has promise:
- Z. Sui, O. Jenkins, K. Desingh, Axiomatic Particle Filtering for Goal-directed Robotic Manipulation, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), to appear, 2015.
- Fall 2015: U-M EECS 598-010 Interactive Robot Manipulators
- Spring 2014/5: Brown CS 148 Introduction to Autonomous Robotics
- Fall 2013/4: Brown CS 195E/2951P Human-Robot Interaction Seminar
Odest Chadwicke Jenkins, Ph.D., is an Associate Professor of Computer Science and Engineering at the University of Michigan. Prof. Jenkins earned his B.S. in Computer Science and Mathematics at Alma College (1996), M.S. in Computer Science at Georgia Tech (1998), and Ph.D. in Computer Science at the University of Southern California (2003). He previously served on the faculty of Brown University in Computer Science (2004-15). His research addresses problems in interactive robotics and human-robot interaction, primarily focused on mobile manipulation, robot perception, and robot learning from demonstration. His research often intersects topics in computer vision, machine learning, and computer animation. Prof. Jenkins was named a Sloan Research Fellow in 2009. He is a recipient of the Presidential Early Career Award for Scientists and Engineers (PECASE) for his work in physics-based human tracking from video. His work has also been supported by Young Investigator awards from the Office of Naval Research (ONR) for his research in learning dynamical primitives from human motion, the Air Force Office of Scientific Research (AFOSR) for his work in manifold learning and multi-robot coordination, and the National Science Foundation (NSF) for robot learning from multivalued human demonstrations.