We are happy to share that Miguel Arduengo, a student jointly supervised by Catalonia’s UPC and our lab at UT Austin, was awarded the best undergraduate thesis prize in artificial intelligence by the Catalan Association of AI. Click on the image for the PDF.
UT’s Human Centered Robotics Group student Steven Jens Jorgensen collaborated with NASA Johnson Space Center (JSC) and its affiliates to test a unique application of NASA’s Valkyrie humanoid robot: semi-autonomously disposing of explosives in a simulated urban scenario. Congratulations to NASA JSC’s ER4 division for leading the effort, and to personnel from TRACLabs, the Institute for Human and Machine Cognition (IHMC), Jacobs Technology, METECS, CACI, and The University of Texas at Austin for this team accomplishment.
Introducing our work on the control of liquid-cooled viscoelastic bipedal robots. Apptronik developed this excellent humanoid lower-body robot, dubbed DRACO, for UT Austin, and students at UT’s HCRL devised the actuator control algorithms and integrated Time-to-Velocity-Reversal locomotion and Whole-Body Locomotion Control algorithms. The result is unsupported dynamic balancing of DRACO. A link to the paper preprint is here (click on the image):
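To give a flavor of the idea behind a Time-to-Velocity-Reversal planner: under linear-inverted-pendulum (LIP) dynamics, one can predict when the center-of-mass velocity will reverse and use that time to decide foot placement. The sketch below is only an illustration of that LIP calculation, with made-up numbers; it is not the controller running on DRACO.

```python
import math

def time_to_velocity_reversal(x0, v0, omega):
    """Under LIP dynamics x'' = omega^2 * x (CoM position x measured from
    the stance pivot), the CoM velocity evolves as
        v(t) = x0*omega*sinh(omega*t) + v0*cosh(omega*t),
    so v(t) = 0 when tanh(omega*t) = -v0 / (omega * x0).
    Returns the reversal time, or None if the velocity never reverses."""
    if x0 == 0.0:
        return None
    r = -v0 / (omega * x0)
    if not (0.0 < r < 1.0):
        return None  # CoM keeps moving in the same direction
    return math.atanh(r) / omega

# Illustrative numbers: 1 m CoM height, CoM 10 cm ahead of the pivot,
# moving back toward it at 0.2 m/s.
omega = math.sqrt(9.81 / 1.0)
t_r = time_to_velocity_reversal(0.10, -0.20, omega)

# Sanity check: the LIP velocity at t_r is (numerically) zero.
v_at_tr = 0.10 * omega * math.sinh(omega * t_r) - 0.20 * math.cosh(omega * t_r)
```

A planner of this style would compare `t_r` against the swing-leg timing: a short reversal time means the robot must step quickly to keep balancing.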
Apptronik and the Human-Centered Robotics Lab at UT Austin have joined forces to develop a force-augmentative exoskeleton called Sagittarius. The video shows a subject wearing a 12-degree-of-freedom, human-interactive, high-power-density lower-body exoskeleton developed by Apptronik. A jointly developed whole-body augmentative exoskeleton control algorithm allows the exoskeleton to offload the wearer’s gravitational payload while standing or walking.
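The gravity-offloading idea can be illustrated on a planar two-link model: compute the configuration-dependent gravity torques and command them at the joints, so the actuators rather than the wearer hold the links against gravity. The masses and lengths below are invented for illustration; the actual Sagittarius controller is a whole-body method, not this two-link sketch.

```python
import math

def gravity_torques(q1, q2, m1=5.0, m2=4.0, l1=0.45, lc1=0.22, lc2=0.20, g=9.81):
    """Gravity torques for a planar 2-link chain (q1 from the horizontal,
    q2 relative to link 1). Standard textbook expression:
        tau2 = m2*lc2*g*cos(q1+q2)
        tau1 = (m1*lc1 + m2*l1)*g*cos(q1) + tau2
    Commanding these torques cancels the gravitational load at the joints."""
    tau2 = m2 * lc2 * g * math.cos(q1 + q2)
    tau1 = (m1 * lc1 + m2 * l1) * g * math.cos(q1) + tau2
    return tau1, tau2

# Fully vertical configuration: gravity exerts no moment about either joint.
tau1_up, tau2_up = gravity_torques(math.pi / 2, 0.0)

# Fully horizontal configuration: both joints carry a positive load,
# with the proximal joint carrying more.
tau1_flat, tau2_flat = gravity_torques(0.0, 0.0)
```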
This work explores the use of formal methods to construct human-aware robot controllers that support the productivity requirements of humans. We tackle such scenarios via human-workload-informed models and reactive synthesis. This strategy allows us to synthesize controllers that fulfill formal specifications expressed as linear temporal logic (LTL) formulas.
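To make the synthesis idea concrete, here is a toy hand-written Mealy machine of the kind a reactive-synthesis tool would produce automatically from a small spec such as G(busy → wait) ∧ G((¬busy ∧ previously busy) → handoff): the controller reacts to the environment input at each step so that every execution satisfies the formula. The spec, inputs, and outputs are illustrative inventions, not our actual specifications.

```python
def make_controller():
    """Two-state Mealy machine realizing the toy specification
        G(busy -> wait)  and  G((!busy and previously busy) -> handoff):
    the robot pauses while the human is busy, performs a handoff action
    on the step the human becomes free, and otherwise works."""
    state = {"was_busy": False}

    def step(busy):
        if busy:
            out = "wait"
        elif state["was_busy"]:
            out = "handoff"
        else:
            out = "work"
        state["was_busy"] = busy
        return out

    return step

ctrl = make_controller()
trace = [ctrl(b) for b in [False, True, True, False, False]]
# trace == ["work", "wait", "wait", "handoff", "work"]
```

The point of synthesis is that such machines are derived from the LTL formula with correctness guaranteed by construction, rather than written and checked by hand.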
Here, we advance real-time grasping-pose estimation of single or multiple handles from RGB-D images, speeding up assistive human-centered behaviors. We propose a versatile Bayesian framework that endows robots with the ability to infer various door kinematic models from observations of the door’s motion. Combining this probabilistic approach with a state-of-the-art motion planner, we achieve efficient door grasping and subsequent door operation, regardless of the kinematic model, using the Toyota Human Support Robot.
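The model-inference step can be sketched as Bayesian model selection between two candidate kinematic models (a revolute hinge vs. a prismatic slide) given a noisy handle trajectory: score each model by how well it explains the observations, then normalize. The hinge location, noise scale, and trajectories below are invented for illustration; the actual framework infers richer models from RGB-D observations.

```python
import math

def residuals_revolute(points, hinge):
    """Deviation of each point's distance-to-hinge from the mean radius."""
    hx, hy = hinge
    radii = [math.hypot(x - hx, y - hy) for x, y in points]
    mean_r = sum(radii) / len(radii)
    return [r - mean_r for r in radii]

def residuals_prismatic(points):
    """Vertical residuals of a least-squares line fit y = a*x + b."""
    n = len(points)
    sx = sum(x for x, _ in points); sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points); sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return [y - (a * x + b) for x, y in points]

def model_posterior(points, hinge, sigma=0.005):
    """Posterior over {revolute, prismatic}, assuming a uniform prior and
    i.i.d. Gaussian residual noise with standard deviation sigma."""
    sse = {
        "revolute": sum(r * r for r in residuals_revolute(points, hinge)),
        "prismatic": sum(r * r for r in residuals_prismatic(points)),
    }
    log_l = {m: -s / (2 * sigma ** 2) for m, s in sse.items()}
    mx = max(log_l.values())
    weights = {m: math.exp(v - mx) for m, v in log_l.items()}
    z = sum(weights.values())
    return {m: w / z for m, w in weights.items()}

# Handle observations on a circular arc around a hinge at the origin
# (a swinging door), radius 0.8 m.
arc = [(0.8 * math.cos(t), 0.8 * math.sin(t))
       for t in [0.05 * k for k in range(12)]]
post = model_posterior(arc, hinge=(0.0, 0.0))
```

Once the posterior concentrates on one model, the corresponding kinematic constraint can be handed to the motion planner to generate the door-opening trajectory.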