Apptronik and the Human-Centered Robotics Lab at UT Austin have joined forces to develop a force-augmentative exoskeleton called Sagittarius. The video shows a subject wearing a 12-degree-of-freedom, human-interactive, high-power-density lower-body exoskeleton developed by Apptronik. A whole-body augmentative exoskeleton control algorithm has been jointly developed, allowing the exoskeleton to offload the gravitational payload from the wearer while standing or walking.
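To give a feel for what "removing gravitational payload" means at the joint level, here is a minimal sketch, under simplifying assumptions, of gravity compensation for a point-mass payload on a planar 2-link limb. This is illustrative only and not the Apptronik/Sagittarius whole-body controller; the link lengths, mass, and function names are hypothetical.

```python
import math

# Minimal sketch (illustrative, NOT the Sagittarius controller):
# gravity compensation for a point-mass payload carried at the tip of a
# planar 2-link limb. The compensating joint torques are
#     tau = J(q)^T f,   with f = (0, m*g)
# the upward force that cancels the payload's weight.

def payload_compensation_torques(q1, q2, m, l1=0.4, l2=0.4, g=9.81):
    """Joint torques that cancel a point-mass payload at the tip.

    q1, q2: joint angles (rad), measured from horizontal.
    m: payload mass (kg); l1, l2: link lengths (m), hypothetical values.
    """
    c1, c12 = math.cos(q1), math.cos(q1 + q2)
    s1, s12 = math.sin(q1), math.sin(q1 + q2)
    # Planar Jacobian of the tip position with respect to (q1, q2).
    J = [[-l1 * s1 - l2 * s12, -l2 * s12],
         [ l1 * c1 + l2 * c12,  l2 * c12]]
    f = (0.0, m * g)  # upward force opposing the payload's weight
    tau1 = J[0][0] * f[0] + J[1][0] * f[1]
    tau2 = J[0][1] * f[0] + J[1][1] * f[1]
    return tau1, tau2

# Limb held horizontal: full moment arm, hence the largest torques.
tau = payload_compensation_torques(0.0, 0.0, m=5.0)
```

With both joints at zero (limb horizontal) and a 5 kg payload, the hip-equivalent joint must supply torque over the full 0.8 m moment arm, while the distal joint supplies torque only over its own 0.4 m link.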
This work explores the use of formal methods to construct human-aware robot controllers that support the productivity requirements of humans. We tackle such scenarios via human-workload-informed models and reactive synthesis. This strategy allows us to synthesize controllers that fulfill formal specifications expressed as linear temporal logic (LTL) formulas.
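To illustrate the kind of property an LTL specification encodes, here is a minimal sketch that evaluates a response formula G(trigger -> F response) over a finite trace. This is not the synthesis procedure itself (reactive synthesis tools produce a controller automaton from the spec rather than checking traces); the trace fields and predicate names are hypothetical.

```python
# Minimal sketch (illustrative, not a synthesis tool): evaluating
# LTL-style operators over a finite execution trace. Real reactive
# synthesis takes the formula and produces a correct-by-construction
# controller; here we only show what the formula demands.

def always(pred, trace):
    """G pred: pred holds at every step of the finite trace."""
    return all(pred(state) for state in trace)

def eventually(pred, trace):
    """F pred: pred holds at some step of the finite trace."""
    return any(pred(state) for state in trace)

def responds(trigger, response, trace):
    """G (trigger -> F response): every trigger is eventually answered."""
    return all(
        eventually(response, trace[i:])
        for i, state in enumerate(trace)
        if trigger(state)
    )

# Hypothetical trace: each state records the human's workload level and
# whether the robot steps in to assist.
trace = [
    {"workload_high": False, "robot_assists": False},
    {"workload_high": True,  "robot_assists": False},
    {"workload_high": False, "robot_assists": True},
]

# Spec: whenever human workload is high, the robot eventually assists.
ok = responds(lambda s: s["workload_high"],
              lambda s: s["robot_assists"],
              trace)
```

A controller synthesized against such a specification is guaranteed to satisfy it on every execution, whereas this sketch merely checks one recorded trace.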
Here, we advance real-time grasping-pose estimation of single or multiple handles from RGB-D images, providing a speed-up for assistive human-centered behaviors. We propose a versatile Bayesian framework that endows robots with the ability to infer various door kinematic models from observations of the door's motion. Combining this probabilistic approach with a state-of-the-art motion planner, we achieve efficient door grasping and subsequent door operation, regardless of the kinematic model, using the Toyota Human Support Robot.
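The core idea of inferring a door's kinematic model from observed motion can be sketched as Bayesian model comparison: score a revolute (arc) model against a prismatic (line) model by their likelihood under Gaussian observation noise. This is a toy illustration under stated assumptions, not the paper's framework; the candidate hinge location, noise level, and all function names are hypothetical.

```python
import math

# Minimal sketch (illustrative, not the paper's implementation): decide
# whether observed handle positions better fit a revolute (arc) or
# prismatic (line) kinematic model by comparing Gaussian log-likelihoods
# of the model-fit residuals.

def line_residuals(points):
    """Perpendicular distances to the chord through first/last point."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = math.hypot(dx, dy)
    return [abs(dy * (x - x0) - dx * (y - y0)) / norm for x, y in points]

def arc_residuals(points, center):
    """Deviations from the mean radius about a candidate hinge."""
    cx, cy = center
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    r = sum(radii) / len(radii)
    return [abs(ri - r) for ri in radii]

def log_likelihood(residuals, sigma=0.01):
    """Log-likelihood of residuals under zero-mean Gaussian noise."""
    return sum(-0.5 * (r / sigma) ** 2
               - math.log(sigma * math.sqrt(2 * math.pi))
               for r in residuals)

# Hypothetical observations: a handle swinging about a hinge at the
# origin with a 0.8 m radius (i.e. a revolute door).
obs = [(0.8 * math.cos(t), 0.8 * math.sin(t))
       for t in [0.0, 0.15, 0.3, 0.45, 0.6]]

ll_rev = log_likelihood(arc_residuals(obs, center=(0.0, 0.0)))
ll_pri = log_likelihood(line_residuals(obs))
best = "revolute" if ll_rev > ll_pri else "prismatic"
```

With equal model priors, comparing the log-likelihoods is equivalent to comparing posteriors; a fuller treatment would also marginalize over the unknown hinge location and radius rather than assuming a candidate center.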
Former PhD student Ye Zhao, now an assistant professor at Georgia Tech, submitted a comprehensive paper on task and motion planning in collaboration with the Human Centered Robotics Laboratory. Click on the image below to access the paper on arXiv.
We are happy to share a preliminary video of our new track of work on transparency-controller synthesis for augmentation exoskeletons, carried out by PhD students Binghan He and Gray Thomas. The hardware platform was designed by Apptronik.