Zebeth Media Solutions


Watch two Mini Cheetah robots square off on the soccer field • ZebethMedia

Some robotics challenges have immediately clear applications; others are more about helping systems tackle broader problems. Teaching small robots to play soccer against one another fits firmly into the latter category. The authors of a new paper detailing the use of reinforcement learning to teach MIT's Mini Cheetah robot to play goalie note:

"Soccer goalkeeping using quadrupeds is a challenging problem that combines highly dynamic locomotion with precise and fast non-prehensile object (ball) manipulation. The robot needs to react to and intercept a potentially flying ball using dynamic locomotion maneuvers in a very short amount of time, usually less than one second. In this paper, we propose to address this problem using a hierarchical model-free RL framework."

Image Credits: Hybrid Robotics

Effectively, the robot needs to lock onto a projectile and maneuver itself to block the ball in under a second. The robot's parameters are defined in an emulator, and the Mini Cheetah relies on a trio of moves (sidestep, dive and jump) to block the ball on its way to the goal, determining its trajectory while in motion. To test the efficacy of the program, the team pitted the system against both a human opponent and a fellow Mini Cheetah. Notably, the same basic framework used to defend the goal can be applied to offense. The paper's authors note, "In this work, we focused solely on the goalkeeping task, but the proposed framework can be extended to other scenarios, such as multi-skill soccer ball kicking."
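The hierarchical idea here, a high-level policy that estimates the ball's trajectory and hands control to one of a few pretrained low-level skills, can be sketched roughly as follows. This is a minimal illustration of the concept, not the paper's implementation; the skill names match the article, but the trajectory model, workspace thresholds, and function names are all assumptions.

```python
import numpy as np

SKILLS = ("sidestep", "jump", "dive")

def predict_intercept(pos, vel, goal_x=0.0, g=9.81):
    """Predict where and when the ball crosses the goal plane (x = goal_x),
    assuming simple projectile motion (an illustrative model only)."""
    t = (goal_x - pos[0]) / vel[0]            # time until the ball reaches the goal plane
    y = pos[1] + vel[1] * t                   # lateral intercept point
    z = pos[2] + vel[2] * t - 0.5 * g * t**2  # height at intercept, clipped at the ground
    return t, y, max(z, 0.0)

def select_skill(pos, vel, reach_y=0.3, reach_z=0.4):
    """High-level policy: pick the low-level skill whose workspace
    covers the predicted intercept point (thresholds are made up)."""
    t, y, z = predict_intercept(pos, vel)
    if abs(y) <= reach_y and z <= reach_z:
        return "sidestep"   # close and low: shuffle in front of the ball
    if z > reach_z:
        return "jump"       # high ball: jump to block
    return "dive"           # wide, low ball: dive laterally

# A ball 3 m out, moving toward the goal at 6 m/s with lateral drift
skill = select_skill(pos=np.array([3.0, 0.0, 0.5]),
                     vel=np.array([-6.0, 1.2, 1.0]))
print(skill)
```

In the real system each "skill" is itself a learned whole-body controller rather than a label, and the high-level selector is trained with RL rather than hand-coded thresholds; the sketch only shows how the decision layer sits above the motion layer.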

Touchlab to begin piloting its robotic skin sensors in a hospital setting • ZebethMedia

Manipulation and sensing have long been considered two key pillars for unlocking robotics' potential. There's a fair bit of overlap between the two, of course. As grippers have become a fundamental element of industrial robotics, these systems require the proper mechanisms for interacting with the world around them. Vision has long been key to all of this, but companies are increasingly looking to tactility as a method for gathering data. Among other things, it gives the robot a better sense of how much pressure to apply to a given object, be it a piece of produce or a human being.

A couple of months back, Edinburgh, Scotland-based startup Touchlab won the pitch-off at our TC Sessions: Robotics event, against some stiff competition. The judges agreed that the company's approach to the creation of robotic skin is an important one that can help unlock fuller potential for sensing. XPrize has thus far agreed as well: the company is currently a finalist for the $10 million XPrize Avatar competition, and is working with German robotics firm Schunk, which is providing the gripper for the finals.

Image Credits: Touchlab

"Our mission is to make this electronic skin for robots to give machines the power of human touch," co-founder and CEO Zaki Hussein said, speaking to ZebethMedia from the company's new office space. "There are a lot of elements going into replicating human touch. We manufacture this sensing technology. It's thinner than human skin and it can give you the position and pressure wherever you put it on the robot. And it will also give you 3D forces at the point of contact, which allows robots to be able to do dexterous and challenging activities."

To start, the company is looking into teleoperation applications (hence the whole XPrize Avatar thing), specifically using the system to remotely operate robots in understaffed hospitals.
On one end, a TIAGo++ robot outfitted with the company's sensors lends human workers a pair of extra hands; on the other, an operator wears a haptic VR bodysuit that translates all of the touch data. Such technologies currently have their limitations, though.

"We have a layer of software that translates the pressure of the skin to the suit. We're also using haptic gloves," says Hussein. "Currently, our skin gathers a lot more data than we can transmit to the user over haptic interfaces. So there's a little bit of a bottleneck. We can use the full potential of the best haptic interface of the day, but there is a point where the robot is feeling more than the user is able to."

Additional information gathered by the robot is translated through a variety of other channels, such as visual data via a VR headset. The company is close to beginning real-world pilots with the system. "It will be in February," says Hussein. "We've got a three-month hospital trial with geriatric patients in the geriatric acute ward. This is a world-first, where this robot will be deployed in that setting."
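The bottleneck Hussein describes, a dense tactile map feeding a haptic suit with only a handful of actuators, amounts to a downsampling problem. The sketch below is purely illustrative and is not Touchlab's software; the grid sizes, function name, and mean-pooling approach are all assumptions chosen to make the idea concrete.

```python
import numpy as np

def pool_pressure(pressure_map, n_rows, n_cols):
    """Reduce a dense tactile pressure grid to an n_rows x n_cols actuator
    grid by averaging each patch (mean pooling)."""
    h, w = pressure_map.shape
    out = np.zeros((n_rows, n_cols))
    for i in range(n_rows):
        for j in range(n_cols):
            # Average the taxels that map onto actuator (i, j)
            patch = pressure_map[i * h // n_rows:(i + 1) * h // n_rows,
                                 j * w // n_cols:(j + 1) * w // n_cols]
            out[i, j] = patch.mean()
    return out

# A 64x64 taxel map collapsed to a 4x4 actuator grid: most of the spatial
# detail is discarded, which is the "robot feels more than the user" problem.
dense = np.random.rand(64, 64)
coarse = pool_pressure(dense, 4, 4)
print(coarse.shape)
```

Any extra channels, such as overlaying a pressure heatmap in the operator's VR headset, are ways of routing around this loss rather than eliminating it.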
