The HCRL has been quite productive, submitting the papers below. Congratulations to all the HCRL researchers!
R. Schlossman, G.C. Thomas, L. Sentis, Toward Exploiting the Natural Dynamics of Series Elastic Robots by Actuator-Centric Sequential Linear Programming, Submitted September 2017
D.H. Kim, J. Ahn, J. Lee, O. Campbell, L. Sentis, Whole-Body Control Incorporating Slip, Inequality Constraints, and Task Hierarchy, Submitted September 2017
S.J. Jorgensen, O. Campbell, T. Llado, J. Lee, B. Shang, L. Sentis, Prioritized Kinematic Control of Joint-Constrained Head-Eye Robots using the Intermediate Value Approach, Submitted September 2017
M. Kim, J. Lee, S.J. Jorgensen, L. Sentis, Social Navigation Planning Based on People’s Awareness of Robots, Submitted September 2017
J. Lee, D.H. Kim, K.S. Kim, L. Sentis, Versatile Whole Body Controllers for Constrained and Underactuated Robots: Actuating Torque and Acceleration Energy Minimizations, Submitted September 2017
J. Ahn, O. Campbell, D.H. Kim, L. Sentis, Fast, Sampling-Based Kinodynamic Bipedal Locomotion Planning with Moving Obstacles, Submitted September 2017
This video summarizes the deployment of applications on Valkyrie performed by Steven Jens Jorgensen from the U. of Texas at Austin and other researchers at NASA, IHMC, and U. Michigan. Given desired end-effector poses, a nonlinear optimization routine solves the whole-body Inverse Kinematics (IK) of NASA’s Valkyrie robot while satisfying balance constraints. The joint position solutions are converted to the appropriate messages and sent to IHMC’s controller interface, which interpolates from the robot’s initial (current) configuration to the desired configuration using third-order functions (a polynomial for positions and a Hermite curve for orientations). By specifying just the hand pose, a preliminary grasp planner uses the whole-body IK solver to command Valkyrie’s hand to the desired pose. The whole-body work is a collaboration between NASA and the Human-Centered Robotics Lab (HCRL) at the University of Texas at Austin. The grasp planner is from the Laboratory for Progress at the University of Michigan. This work was partially supported by a NASA Space Technology Research Fellowship (NSTRF) under grant number NNX15AQ42H.
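As a rough illustration of the third-order position interpolation described above, here is a minimal sketch of a cubic blend between two joint configurations. The function name, boundary conditions (zero velocity at both endpoints), and NumPy-based interface are our own assumptions for illustration, not IHMC's actual controller code; orientation interpolation with Hermite curves is omitted.

```python
import numpy as np

def cubic_interp(q0, qf, T, t):
    """Cubic (third-order) interpolation from configuration q0 to qf over duration T.

    Assumed boundary conditions: q(0) = q0, q(T) = qf, and zero joint
    velocity at both endpoints. This is one common choice for smooth
    joint-space motions; the real controller may differ.
    """
    q0, qf = np.asarray(q0, float), np.asarray(qf, float)
    s = np.clip(t / T, 0.0, 1.0)          # normalized time in [0, 1]
    blend = 3 * s**2 - 2 * s**3           # cubic with zero slope at s=0 and s=1
    return q0 + blend * (qf - q0)
```

Evaluating this at a fixed rate yields the intermediate joint targets streamed to the robot; at `t = 0` it returns the current configuration and at `t = T` the IK solution.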
The video below shows the new Draco liquid-cooled prototype leg, produced as a collaboration between the Human Centered Robotics Lab and Apptronik Systems. We have developed a new liquid-cooled viscoelastic actuator capable of delivering significantly more mechanical power than legged systems that rely on convection cooling. The leg was presented at the Office of Naval Research S&T Expo in Washington, DC.
Big congratulations to Ye for having his theoretical locomotion paper accepted!!
Y. Zhao, B. Fernandez, L. Sentis, Robust Optimal Planning and Control of Non-Periodic Bipedal Locomotion with A Centroidal Momentum Model, International Journal of Robotics Research, Accepted, July 2017
Gray, Donghyun, and Luis had a blast talking about our lab’s new pet topic, “Uncertainty in Human-Centered Robots.” #icra2017 #singapore #marinabaysands
Abstract: Uncertainty permeates all control approaches and significantly complicates controller design. This is especially true for human-centered robots, which rely on oversimplifications such as ignoring high-frequency behaviors or real-time delays to central computers. In this talk, Luis will join forces with HCRL students Gray and Donghyun to present detailed mathematical work on choosing structure for measuring uncertainty in a meaningful statistical sense, motivate the nature of uncertainty in hardware systems involving high-performance series elastic actuators, and devise a planning and control framework that embraces uncertainty from external disturbances via reinforcement learning of locomotion responses.
PI Luis Sentis gave a 45-minute interview during SXSW discussing cognitive computational control and human-centered robotics. Here is a preview snapshot. The full interview will be aired later in the year by XPRIZE.
Qualification Material for RoboCup@Home 2017: Semi-autonomous grocery organization using the Dreamer compliant humanoid robot with whole-body control. Speech recognition and synthesis enabled.
Participants: Kwan Suk Kim, Jaemin Lee, Minkyu Kim, Steve Jorgensen, Luis Sentis
Title: A Developer’s Primer for Coding Human Behavior in Bots
Abstract: This session demonstrated ways to model human-robot interaction (HRI) using a practical coding scenario. We explored how to program a humanoid robot head that manages its eye contact to maximize “connection” and minimize “social awkwardness” in human interaction. The session addressed practical computational questions and framed cognitive modeling problems based on intuitive mechanical analogies. We leveraged the power of feedback Whole-Body Control to generate useful behaviors, and demonstrated the results on the Dreamer humanoid robot head!
Participants: Steven Jorgensen, Travis Llado, Orion Campbell, Luis Sentis