Video Friday: Robotic Gecko Gripper, and More

OnRobot Gecko Gripper. Image: OnRobot


Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

HRI 2019 – March 11-14, 2019 – Daegu, Korea
RoboSoft 2019 – April 14-18, 2019 – Seoul, Korea
Nîmes Robotics Festival – May 17-19, 2019 – Nîmes, France
ICRA 2019 – May 20-24, 2019 – Montreal, Canada
2nd Annual Robotics Summit & Expo – June 4-6, 2019 – Boston, Mass., USA

Let us know if you have suggestions for next week, and enjoy today’s videos.


[embedded content]

The Gecko Gripper uses the same adhesive system for gripping as the feet of a gecko, with millions of fine fibers that adhere to the surface of the workpiece and generate strong van der Waals forces.

[ OnRobot ]


Bombs? Who cares about bombs! Let’s shovel some SNOW!

[embedded content]

WHERE WERE YOU WHEN I NEEDED YOU, ROBOT!

And then Endeavor spoils everything by reminding us that “this is humor. Snow clearing is not the robot’s primary mission.”

[ Endeavor ]


Tech United’s Turtles, which compete in the RoboCup midsize league, are looking particularly skilled this year:

[embedded content]

[ Tech United ]


Harvard’s CS189 class from last year involved candy delivery, because what could be better than candy delivery?

[embedded content]

Welcome to the latest in Automated StoreFronts: FETCH-Candy! A candy store, where robots fetch you candies of your choice from dispensers and bring them to you. A robot greets you at the service station where you place your order, then it goes to the dispenser with the candy you chose, activates the dispenser, and brings the candy back to the service station for you to pick up. Many robots work at once, whizzing about, but never getting in each other’s way, even taking turns at busy dispensers. Always polite and always chatty, they make sure you love the fetch candy experience!
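The turn-taking behavior described above can be sketched as a per-dispenser first-come, first-served queue. This is a toy illustration only; the names and structure here are hypothetical, not Harvard's actual CS189 code.

```python
# Toy sketch of robots taking turns at a busy candy dispenser:
# each dispenser keeps a FIFO queue, so robots never collide over it.
from collections import deque

class Dispenser:
    def __init__(self, candy):
        self.candy = candy
        self.queue = deque()  # robots waiting their turn

    def request(self, robot):
        """A robot joins the line for this dispenser."""
        self.queue.append(robot)

    def dispense(self):
        """Serve the robot at the head of the line, if any."""
        if self.queue:
            robot = self.queue.popleft()
            return (robot, self.candy)
        return None

gumdrops = Dispenser("gumdrop")
for name in ["robot-1", "robot-2"]:
    gumdrops.request(name)

# Robots are served in arrival order.
served = [gumdrops.dispense(), gumdrops.dispense()]
```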

[ Harvard ]


I’d make some joke about how this 15-second clip of a Universal arm and Robotiq gripper kicking (?) a field goal is more interesting than the actual Super Bowl was, but that wouldn’t be much of a joke, would it?

[embedded content]

[ Robotiq ]


Nybble, everyone’s favorite open source robotic kitten, went to CES to meet some (marginally less cute) robots.

[embedded content]

[ Nybble ]


Building upon previous soft exosuit technology, researchers at the Wyss Institute and Harvard SEAS have developed a soft exosuit for running. The exosuit applies forces to the hip joint through thin, flexible wires, assisting the muscles during each stride. Using an off-board actuation system, it can reduce the metabolic cost of running by 5.4% compared to not wearing the exosuit.

[embedded content]

[ Wyss ]


In Stevenage, UK, a rover is being built that will carry a drill and a suite of instruments dedicated to exobiology and geochemistry research. It will be the first mission to combine the capability to move across the surface and to study Mars at depth.

The primary goal of the ExoMars programme is to address the question of whether life has ever existed on the red planet.

[embedded content]

[ ESA ]


Cobalt Robotics, co-founded by special friend ’o the blog Travis Deyle, has gotten to the point in their corporate trajectory where they’re producing slick videos about their robot.

[embedded content]

[ Cobalt ]


The Visual Teach and Repeat (VT&R) Package is a vision-based outdoor navigation package designed for research and application development. The kit enables a robot to be taught an outdoor path, and then to reliably and accurately repeat that path using only a stereo camera. VT&R can navigate GPS-denied environments and tolerates a degree of change in lighting and weather conditions.
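The teach-and-repeat idea can be sketched in a few lines: record keyframes along the taught path, then localize against the nearest stored keyframe and steer toward the one just ahead. This is a minimal illustrative sketch with made-up helpers, not Clearpath's actual VT&R API (which matches stereo features rather than raw poses).

```python
# Minimal teach-and-repeat sketch: 2D poses stand in for stereo keyframes.
import math

def teach(path):
    """Teach pass: store keyframes along the driven path."""
    return list(path)  # in practice: stereo keyframes with visual features

def nearest_keyframe(keyframes, pose):
    """Localize against the taught map: find the closest keyframe."""
    return min(keyframes, key=lambda k: math.dist(k, pose))

def repeat_step(keyframes, current_pose, lookahead=1):
    """Repeat pass: compute a heading toward the keyframe ahead of us."""
    i = keyframes.index(nearest_keyframe(keyframes, current_pose))
    target = keyframes[min(i + lookahead, len(keyframes) - 1)]
    # Heading command toward the target keyframe -- no GPS required.
    return math.atan2(target[1] - current_pose[1], target[0] - current_pose[0])

taught = teach([(0.0, 0.0), (1.0, 0.0), (2.0, 1.0), (3.0, 1.0)])
heading = repeat_step(taught, (0.9, 0.1))  # steer toward (2.0, 1.0)
```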

[embedded content]

You know, if you make your robot big enough, obstacle avoidance starts to become much less of a problem.

[ Clearpath ]


Switzerland is the land of innovation, thanks mostly to ANYmal. Featuring a special guest appearance from Flyability!

[embedded content]

[ ANYbotics ]


Canadian Space Agency astronaut Jeremy Hansen gives you a tour of the Robotics Mission Control Centre.

[embedded content]

[ CSA ]


In cooperation with our partner Dürr, we developed a smart automation process for the headlight adjustment of the brand new Ford Focus. The operator and the sensitive lightweight robot KUKA LBR iiwa work safely in the same work area without a safety fence.

[embedded content]

[ Kuka ]


Making espresso: so easy, a robot can do it.

[embedded content]

[ RE2 ]


In this video, ROUGHIE performs a variety of advanced soaring maneuvers, demonstrating that it is the most maneuverable internally actuated underwater glider. The vehicle uses its unique roll mechanism together with a switching control strategy to concatenate a series of steady wings-level and turning flights, accomplishing behaviors comparable to air gliders.

[embedded content]

[ Michigan Tech ]


Torc and AAA Northern California, Nevada & Utah partnered to develop safety criteria for the development and deployment of self-driving vehicles. In November 2018, the partners completed a two-week challenge that put Torc’s self-driving software through a set of comprehensive safety scenarios, ranging from simple to complex.

[embedded content]

[ Torc ]


Herbert Simon started teaching at CMU in 1949, and in 1978, he won a Nobel Prize. The prize was in economics, but he’s also considered a founder of the field of artificial intelligence, and in this 1985 interview he discusses the previous 30 years of AI research at CMU.

[embedded content]

[ CMU ]


Did you know that by 2035, Artificial Intelligence is projected to potentially double the economic growth of developed countries and increase labour productivity by 40%? By 2030, the McKinsey Global Institute estimates that up to one third of work activities could be displaced, and that 3-14% of the global workforce will need to switch occupational categories and learn new skills. Despite this shift in the job market, however, there are exciting new jobs and innovations on the horizon.

This event explored the future use of AI technology and its implications for human society. Professor Alan Winfield, Professor of Robot Ethics at the University of the West of England, Bristol, led a dialogue on the future use of AI in our everyday lives and its ethical implications. This was followed by a moderated discussion with Professor Alnoor Bhimani, Professor of Management Accounting at the London School of Economics, addressing topics such as the possible benefits and impact of AI in improving quality of life.

[embedded content]

[ AI Futures ]


Introductory lecture of the MIT Self-Driving Cars series (6.S094) with an overview of the autonomous vehicle industry in 2018 and looking forward to 2019, including Waymo, Tesla, Cruise, Ford, GM, and out-of-the-box ideas of boring tunnels, flying cars, connected vehicles, and more. This covers the state of the art in terms of industry developments and not the perception and planning algorithm development. The latter will be covered in detail in future lectures. For more lecture videos on deep learning, reinforcement learning (RL), artificial intelligence (AI & AGI), and podcast conversations, visit our website or follow TensorFlow code tutorials on our GitHub repo.

[embedded content]

[ MIT ]


This week’s CMU RI Seminar comes from CMU’s own Yaser Ajmal Sheikh, on “Social Perception for Machines.”

[embedded content]

Despite decades of progress, machines remain intelligent tools rather than collaborative partners in individual human enterprise. A key reason is that machine perception of interpersonal communication is largely unsolved, and a computationally accessible representation of such behavior remains elusive. In this talk, I will describe our research arc over the past decade at CMU to make human signaling a perceptible channel of information for machines. This research includes the construction of the Panoptic Studio, a multisensor facility designed to capture social behavior, and the development of OpenPose, a real-time 2D pose estimation approach whose demo you may have encountered on the fourth floor of NSH. I will share recent progress in moving from the lab to the real world and discuss futures in this research expedition.

[ CMU RI ]


