Because rethinking the gripper is costly and resource-intensive, many startups see no need to do so. MIT's new approach to robotic gripping has been likened to operating an arcade claw machine, but with real-time reflexes and feedback. An algorithm instructs the robot to perform one of three grasp maneuvers in response to real-time measurements, letting the fingers grab, pinch, or drag an object until they achieve a better hold. The gripper adapts to each object's particular shape and weight distribution using sensors on the fingertips and feedback from a camera mounted on the base. The team demonstrated the system picking and placing household objects without having to readjust in more than 90% of 117 attempts.
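The article does not publish the controller's decision rule, but the idea of selecting one of three maneuvers from live sensor readings can be sketched as a simple control loop. Everything below is an assumption for illustration: the `GraspState` fields, the thresholds, and the stopping condition are invented, not MIT's actual API or parameters.

```python
from dataclasses import dataclass

@dataclass
class GraspState:
    """Hypothetical snapshot of one control tick; fields are illustrative."""
    contact_force: float   # net force at the fingertip sensors (N)
    slip_detected: bool    # whether the object is sliding in the grip
    object_width: float    # width estimate from the base camera (cm)

def choose_maneuver(state: GraspState) -> str:
    """Pick one of the three maneuvers from real-time measurements.

    Thresholds here are made up for illustration; the real decision
    rule is not described in the article.
    """
    if state.slip_detected:
        return "drag"      # reposition the fingers until the hold improves
    if state.object_width < 3.0:
        return "pinch"     # small objects get a fingertip pinch
    return "grab"          # default enveloping grasp

def control_loop(states):
    """Apply one maneuver per sensor reading until the hold is stable."""
    actions = []
    for s in states:
        actions.append(choose_maneuver(s))
        if s.contact_force > 5.0 and not s.slip_detected:
            break          # stable hold achieved; stop adjusting
    return actions

readings = [
    GraspState(contact_force=1.0, slip_detected=True,  object_width=4.0),
    GraspState(contact_force=4.0, slip_detected=False, object_width=2.0),
    GraspState(contact_force=6.0, slip_detected=False, object_width=2.0),
]
print(control_loop(readings))  # → ['drag', 'pinch', 'pinch']
```

The point of the sketch is the structure, not the numbers: rather than withdrawing and replanning after a failed grasp, the controller keeps acting on fresh measurements each tick until the hold stabilizes.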
The project builds on actuators developed for the school's mini cheetah robot, which let that machine react to uneven terrain on the fly. The new system is built around an arm with two multi-joint fingers, combining the base camera with the fingertip sensors. By adjusting to an object in real time, the gripper can pick up objects with strange or unexpected weight distributions without having to withdraw and try again.