Introducing our work on control of liquid-cooled viscoelastic bipedal robots. Apptronik developed this excellent humanoid lower-body robot, dubbed DRACO, for UT Austin, and students at UT's HCRL lab devised actuator control algorithms and integrated Time-to-Velocity-Reversal locomotion and whole-body locomotion control algorithms. The result is unsupported dynamic balancing of DRACO. A link to the paper preprint is here:
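For context on what a time-to-velocity-reversal planner computes, here is a minimal linear-inverted-pendulum (LIP) sketch in Python; the function names, gains, and the capture-point fallback are illustrative assumptions, not the controller running on DRACO.

```python
import numpy as np

# Minimal linear-inverted-pendulum (LIP) sketch of time-to-velocity-reversal
# (TVR) style foot placement. All names and gains are illustrative assumptions,
# not DRACO's actual controller parameters.

def time_to_velocity_reversal(x, v, com_height, g=9.81):
    """Time at which the LIP center-of-mass velocity crosses zero.

    LIP dynamics give v(t) = x*w*sinh(w*t) + v*cosh(w*t), with w = sqrt(g/h).
    Velocity reverses when tanh(w*t) = -v / (w*x), which only has a positive
    solution when the CoM is decelerating (x and v have opposite signs and
    |v| < w*|x|).
    """
    w = np.sqrt(g / com_height)
    if x == 0.0:
        return None
    ratio = -v / (w * x)
    if not (0.0 < ratio < 1.0):
        return None  # CoM never reverses: it diverges past the stance foot
    return np.arctanh(ratio) / w


def foot_placement(x, v, com_height, k_v=0.1, g=9.81):
    """Pick the next foot position relative to the current stance foot.

    A simple capture-point-like rule: step near the CoM position predicted at
    velocity reversal, plus a small velocity-proportional bias so the next
    LIP phase decelerates the CoM.
    """
    w = np.sqrt(g / com_height)
    t_r = time_to_velocity_reversal(x, v, com_height, g)
    if t_r is None:
        # Falling: place the foot near the instantaneous capture point.
        return x + v / w
    x_r = x * np.cosh(w * t_r) + (v / w) * np.sinh(w * t_r)
    return x_r + k_v * v


if __name__ == "__main__":
    # CoM 5 cm ahead of the stance foot, moving back toward it at 0.1 m/s.
    print(foot_placement(x=0.05, v=-0.1, com_height=0.9))
```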
Students and PI Enjoying the ICRA Conference
Talk by Prof. Sentis at ICRA 2019
Our lab just returned from ICRA 2019 where we had a terrific time. We’re sharing a video of the talk at the workshop on legged robots: https://icra2019wslocomotion.wordpress.com/program
Sagittarius Force Augmentation Exoskeleton Revealed
Apptronik and the Human-Centered Robotics Lab at UT Austin have joined forces to develop a force-augmentation exoskeleton called Sagittarius. The video shows a subject wearing a 12-degree-of-freedom, human-interactive, high-power-density lower-body exoskeleton developed by Apptronik. A whole-body augmentative exoskeleton control algorithm has been jointly developed, allowing the exoskeleton to remove the gravitational payload while the wearer stands or walks.
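As a rough illustration of what removing gravitational payload means at the joint level, the sketch below computes gravity-compensation torques for a planar two-link leg carrying a payload at its attachment point; the link parameters and payload handling are simplifying assumptions for illustration, not Apptronik's whole-body controller.

```python
import numpy as np

# Illustrative gravity-compensation sketch for a 2-link planar leg.
# Link lengths, masses, and the payload are made-up values; the real
# Sagittarius controller uses the full whole-body dynamics model.

L1, L2 = 0.45, 0.45      # link lengths [m] (assumed)
M1, M2 = 5.0, 4.0        # link masses [kg] (assumed)
G = 9.81

def gravity_vector(q):
    """Joint torques that cancel the links' own weight, tau_g = G(q)."""
    q1, q2 = q
    g1 = (M1 * L1 / 2 + M2 * L1) * G * np.cos(q1) \
         + M2 * (L2 / 2) * G * np.cos(q1 + q2)
    g2 = M2 * (L2 / 2) * G * np.cos(q1 + q2)
    return np.array([g1, g2])

def jacobian(q):
    """Position Jacobian of the chain's endpoint (payload attachment)."""
    q1, q2 = q
    return np.array([
        [-L1 * np.sin(q1) - L2 * np.sin(q1 + q2), -L2 * np.sin(q1 + q2)],
        [ L1 * np.cos(q1) + L2 * np.cos(q1 + q2),  L2 * np.cos(q1 + q2)],
    ])

def augmentation_torque(q, payload_mass):
    """Gravity compensation for the links plus the carried payload.

    tau = G(q) + J(q)^T * f_payload, where f_payload is the upward force
    needed to hold the payload at the attachment point. The wearer then
    feels inertial and friction effects, but not the static load.
    """
    f_payload = np.array([0.0, payload_mass * G])
    return gravity_vector(q) + jacobian(q).T @ f_payload

if __name__ == "__main__":
    q = np.deg2rad([80.0, -20.0])
    print(augmentation_torque(q, payload_mass=20.0))
```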
New paper submissions led by students Binghan and Rachel
B. He, H. Huang, G.C. Thomas, L. Sentis, Complex Stiffness Model of Physical Human-Robot Interaction: Implications for Control of Performance Augmentation Exoskeletons, Submitted, 2019
R. Schlossman, M. Kim, U. Topcu, L. Sentis, Toward Achieving Formal Guarantees for Human-Aware Controllers in Human-Robot Interactions, Submitted, 2019
Paper Submission: Robots Helping Humans to Coordinate Workload Backlog
This work explores the use of formal methods to construct human-aware robot controllers to support the productivity requirements of humans. We tackle these types of scenarios via human workload-informed models and reactive synthesis. This strategy allows us to synthesize controllers that fulfill formal specifications that are expressed as linear temporal logic formulas.
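To give a flavor of the specification language, a workload-aware requirement of the kind described above could be written as a linear temporal logic formula like the one below; the atomic propositions are illustrative, not taken from the paper.

```latex
% Illustrative LTL specification (hypothetical atomic propositions):
% "whenever the human's task backlog is high, the robot eventually takes over
%  a task, and the robot never enters the workspace while the human occupies it."
\[
\varphi \;=\; \mathbf{G}\bigl(\mathit{backlog\_high} \rightarrow \mathbf{F}\,\mathit{robot\_takes\_task}\bigr)
\;\wedge\; \mathbf{G}\,\neg\bigl(\mathit{robot\_in\_workspace} \wedge \mathit{human\_in\_workspace}\bigr)
\]
```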
Paper Submission: A Versatile Framework for Autonomous Door Operation
Here, we advance real-time grasp pose estimation of single or multiple handles from RGB-D images, providing a speed-up for assistive human-centered behaviors. We propose a versatile Bayesian framework that endows robots with the ability to infer a door's kinematic model from observations of its motion. Combining this probabilistic approach with a state-of-the-art motion planner, we achieve efficient door grasping and subsequent door operation regardless of the kinematic model, using the Toyota Human Support Robot.
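To convey the flavor of the kinematic-model inference, here is a toy Bayesian comparison between a revolute (hinged) and a prismatic (sliding) door model given noisy handle positions; the Gaussian residual likelihood, priors, and 2-D fitting are simplified assumptions rather than the paper's full framework.

```python
import numpy as np

# Toy Bayesian comparison of door kinematic models from observed handle
# positions in the plane. The noise model, priors, and fitting procedure are
# simplified assumptions for illustration only.

SIGMA = 0.01  # assumed measurement noise std [m]

def fit_revolute(points):
    """Algebraic circle fit (hinge = center); returns sum of squared residuals."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x**2 + y**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx**2 + cy**2)
    residuals = np.hypot(x - cx, y - cy) - r
    return np.sum(residuals**2)

def fit_prismatic(points):
    """Least-squares line fit (sliding door); returns sum of squared residuals."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    residuals = centered @ vt[-1]  # distance along the line's normal
    return np.sum(residuals**2)

def posterior_over_models(points, prior=(0.5, 0.5)):
    """P(model | data) under a Gaussian residual likelihood and the given prior."""
    log_liks = np.array([
        -fit_revolute(points) / (2 * SIGMA**2),
        -fit_prismatic(points) / (2 * SIGMA**2),
    ])
    log_post = np.log(prior) + log_liks
    log_post -= log_post.max()          # numerical stability
    post = np.exp(log_post)
    return dict(zip(["revolute", "prismatic"], post / post.sum()))

if __name__ == "__main__":
    # Handle swinging on a 0.8 m door, observed with a little noise.
    angles = np.linspace(0.0, 0.9, 15)
    arc = 0.8 * np.column_stack([np.cos(angles), np.sin(angles)])
    arc += np.random.default_rng(0).normal(0.0, SIGMA, arc.shape)
    print(posterior_over_models(arc))
```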
arXiv Preprint: M. Arduengo, C. Torras, L. Sentis, A Versatile Framework for Robust and Adaptive Door Operation with a Mobile Manipulator Robot