SRI is developing a robotic system designed to make traffic stops safer. The prototype allows police officers to interact with a driver from their car. It even deploys spikes to prevent the driver from speeding away.
Video Friday: Watch Boston Dynamics’ Spot Robots Pull a Truck
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):
Nîmes Robotics Festival – May 17-19, 2019 – Nîmes, France
Isolierband Robotics Competition – May 19, 2019 – Israel
ICRA 2019 – May 20-24, 2019 – Montreal, Canada
URC 2019 – May 30-June 1, 2019 – Hanksville, Utah, USA
2nd Annual Robotics Summit & Expo – June 4-6, 2019 – Boston, Mass., USA
ICUAS 2019 – June 11-14, 2019 – Atlanta, GA, USA
Energy Drone Coalition Summit – June 12-13, 2019 – Woodlands, Texas, USA
Hamlyn Symposium on Medical Robotics – June 23-26, 2019 – London, U.K.
ETH Robotics Summer School – June 27-July 1, 2019 – Zurich, Switzerland
MARSS 2019 – July 1-5, 2019 – Helsinki, Finland
Let us know if you have suggestions for next week, and enjoy today’s videos.
I’m Reuben Brewer, a Senior Robotics Research Engineer in SRI International’s Applied Technologies and Science Department (ATSD).
I’m here to show you my robot for making traffic stops safer for both police officers and motorists. Every year, 16,915,140 drivers are pulled over in traffic; 195,078 motorists have physical force used on them, and 4,488 officers are assaulted. 89 of those motorists die, and 11 of those officers die. With such dangerous interactions between people, maybe it’s time to send a robot in between them, one that can’t hurt or be hurt. This prototype is a work in progress that I started in my garage and now work on at SRI International. It’s only part of the solution, but I hope one day it could save lives.
[ SRI ]
Nine teams hailing from four continents gathered in Idaho Springs, Colorado, the week of April 5-11, 2019, to test autonomous air and ground systems for navigating the dark, dangerous, dirty, and unpredictable underground domain. The SubT Integration Exercise, known as STIX, took place at the Colorado School of Mines’ Edgar Experimental Mine. The event provided a shakeout opportunity for competitors in advance of the Tunnel Circuit in August, the first of three subdomains that teams will tackle in DARPA’s Subterranean Challenge.
[ DARPA ]
This is some insane skill from IHMC’s Atlas, walking over wobbly bricks and planks barely wider than its own feet. And check out how it puts one foot directly in front of the other, which makes it much more difficult to balance.
Atlas humanoid robot (DRC version) walking across narrow terrain using autonomous planning. The robot senses the terrain with LIDAR and builds a map of planar regions. A path planning algorithm plans footsteps across the planar regions to a goal location, specified by an operator. The robot is currently about 50% successful over this type of terrain. We plan to increase the rate of success by adding balance using angular momentum and by better considering joint ranges of motion. Narrow terrain is difficult due to the need to do some “cross-over” steps, which are tricky due to limited range of motion in the hip joint, and also due to having a small polygon of support when one foot is directly in front of the other. Control, Perception, and Planning algorithms by IHMC Robotics. Atlas robot built by Boston Dynamics. Walking recorded on May 1, 2019.
[ IHMC ]
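The planning step IHMC describes—finding a sequence of footholds across planar regions to a goal—can be sketched in miniature. This is an illustrative toy (1-D regions, a greedy back-off search), not IHMC's actual planner; the `MAX_STEP` limit and 1 cm back-off are assumed parameters.

```python
# Toy sketch of footstep planning over planar regions: regions are 1-D
# intervals along the walking direction, and the planner greedily picks
# the longest admissible step that still lands on a region.

MAX_STEP = 0.4  # maximum forward step length in meters (assumed)

def on_region(x, regions):
    """True if a foothold at x lies on some planar region."""
    return any(lo <= x <= hi for lo, hi in regions)

def plan_footsteps(start, goal, regions):
    """Return a list of foothold positions from start to goal, or None."""
    steps = [start]
    x = start
    while x < goal:
        # Try the longest step first, backing off in 1 cm increments
        # until the foothold lands on a planar region.
        for step in [MAX_STEP - 0.01 * i for i in range(int(MAX_STEP * 100))]:
            target = min(x + step, goal)
            if on_region(target, regions) and target > x:
                x = target
                steps.append(x)
                break
        else:
            return None  # no admissible foothold: planning fails
    return steps

# Two planks with a gap between 1.0 m and 1.2 m; the planner must place
# a foothold before the gap and step across it.
regions = [(0.0, 1.0), (1.2, 2.0)]
path = plan_footsteps(0.0, 2.0, regions)
print(path is not None and all(on_region(x, regions) for x in path))
```

A real planner searches in 2-D (or 3-D) with kinematic and balance constraints, which is where the cross-over steps mentioned above become hard.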
We present ZeRONE, a new indoor drone that does not use rotating blades for propulsion. The proposed device is a helium-blimp-type drone that uses the wind generated by the ultrasonic vibration of piezo elements for propulsion. Compared to normal drones with rotating propellers, the drone is much safer because its only moving parts are the piezo elements, whose surfaces vibrate on the order of micrometers. The drone can float for a few weeks, and the ultrasonic propulsion system is quiet. We implement a prototype of the drone and evaluate its performance and unique characteristics in experiments.
[ ZeRONE ]
Scaling up the software system on service robots significantly increases the maintenance burden on developers and the risk of resource contention on the robot’s onboard computer. As a result, developers spend much time configuring, deploying, and monitoring the robot software system, and robots consume significant computer resources when all software processes are running. We propose Rorg, a Linux container-based scheme to manage, schedule, and monitor software components on service robots. Rorg allows developers to pack software into self-contained images and run them in isolated environments using Linux containers; it also allows the robot to turn software components on and off on demand to avoid resource contention. We evaluate Rorg with a long-term autonomous tour guide robot: it manages 41 software components on the robot, relieves our maintenance burden, and reduces CPU usage by 45.5% and memory usage by 16.5% on average.
Hey, did you spot Diego the giant robot baby in the background there? I sure did!
[ UCSD ]
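Rorg's on-demand idea—only run a component when it fits the resource budget—can be sketched with a toy manager. The class and method names below are illustrative stand-ins, not Rorg's API; the real system works with Linux containers rather than an in-process bookkeeping table.

```python
# Minimal sketch of on-demand component scheduling: the manager starts a
# component only if the total CPU cost of running components stays under
# budget, and stopping a component frees capacity for another.

class ComponentManager:
    def __init__(self, cpu_budget):
        self.cpu_budget = cpu_budget      # e.g., fraction of total CPU
        self.components = {}              # name -> estimated CPU cost
        self.running = set()

    def register(self, name, cpu_cost):
        self.components[name] = cpu_cost

    def cpu_in_use(self):
        return sum(self.components[n] for n in self.running)

    def start(self, name):
        """Start a component only if it fits in the remaining budget."""
        if self.cpu_in_use() + self.components[name] > self.cpu_budget:
            return False  # would cause resource contention
        self.running.add(name)
        return True

    def stop(self, name):
        self.running.discard(name)

mgr = ComponentManager(cpu_budget=1.0)
mgr.register("navigation", 0.5)
mgr.register("speech", 0.3)
mgr.register("face_recognition", 0.4)

mgr.start("navigation")                 # 0.5 in use
mgr.start("speech")                     # 0.8 in use
print(mgr.start("face_recognition"))    # False: 1.2 would exceed the budget
mgr.stop("speech")                      # free capacity on demand
print(mgr.start("face_recognition"))    # True: 0.9 fits
```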
Cassie Blue takes a field trip. Look at that sensor-packed torso! Spoiler alert: Cassie and gravel piles don’t get along.
Meditative movement involves regulating attention to the body while moving, to create a state of meditation. This can be difficult for beginners; we propose that drones can facilitate it, since they can move with the body and give feedback on its movements. We designed a two-handed control map for the drone that engages multiple parts of the body, a light foam casing to give the impression that the drone is floating, and an onboard light that gives feedback on the speed of the movement. The user experiences both leading and following the drone to explore the interplay between mapping, form, feedback, and instruction, relating to an expansion of the attention regulation framework, which is used to inform the design of interactive meditative experiences and human-drone interactions.
Now just swat that drone like a fly—and meditate in peace.
You’d think that soft terrain would be comfortable for a robot to walk on, but it sounds like teaching a robot how to stay stable when traversing things that are squishy is tricky. IIT and HyQ are on it, though.
We present STANCE, which stands for Soft Terrain Adaptation and Compliance Estimation. STANCE is an online soft terrain adaptation algorithm that can adapt to any type of terrain compliance (stiff or soft). STANCE allows HyQ to adapt its locomotion strategy depending on the type of terrain. As a result, HyQ was able to traverse and transition between multiple terrains with different compliance without pre-tuning.
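The compliance-estimation half of the idea can be sketched with a linear spring model of foot contact: fit a stiffness from force/penetration samples, then switch gaits on the estimate. This is an illustrative model and threshold, not the authors' actual formulation.

```python
# Toy terrain-compliance estimation: model each foot contact as a linear
# spring (F = k * deflection) and estimate the stiffness k from samples
# with a least-squares fit through the origin.

def estimate_stiffness(deflections, forces):
    """Least-squares fit of F = k * d through the origin."""
    num = sum(d * f for d, f in zip(deflections, forces))
    den = sum(d * d for d in deflections)
    return num / den

# Simulated contact samples on stiff vs. soft ground (stiffness in N/m).
stiff_d = [0.001, 0.002, 0.003]
stiff_f = [100.0, 200.0, 300.0]      # about 100,000 N/m
soft_d = [0.01, 0.02, 0.03]
soft_f = [50.0, 100.0, 150.0]        # about 5,000 N/m

k_stiff = estimate_stiffness(stiff_d, stiff_f)
k_soft = estimate_stiffness(soft_d, soft_f)

# A controller could switch locomotion strategy on the estimate
# (the 20,000 N/m threshold here is an arbitrary placeholder):
strategy = "soft-terrain gait" if k_soft < 20000 else "stiff-terrain gait"
print(round(k_stiff), round(k_soft), strategy)
```

An online version would update the estimate recursively at each touchdown instead of batching samples.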
Sometimes, robots are going to fail. That’s fine. The key is being able to recover, and keep going with your task.
Possible failure modes during bin picking are (1) not finding appropriate objects and (2) false-positive objects that lead to empty grasps. The video shows various recovery strategies, including moving the camera into different positions and detecting false positives using force sensing.
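The recovery loop the video describes can be sketched as a retry policy over those two failure modes. Every function below is a hypothetical stand-in for a real perception/control stack, and the force threshold is an assumed value.

```python
# Sketch of bin-picking recovery: if no object is found, move the camera
# to a new viewpoint and look again; after a grasp, use the measured
# gripper force to detect an empty (false-positive) grasp and retry.

GRASP_FORCE_THRESHOLD = 0.5  # N; below this the gripper likely holds nothing

def pick_with_recovery(detect, grasp, measure_force, viewpoints, max_tries=4):
    """Return True once a grasp verified by force sensing succeeds."""
    for attempt in range(max_tries):
        # Failure mode 1: nothing detected -> try the next camera viewpoint.
        viewpoint = viewpoints[attempt % len(viewpoints)]
        target = detect(viewpoint)
        if target is None:
            continue
        grasp(target)
        # Failure mode 2: empty grasp -> force sensing exposes it.
        if measure_force() >= GRASP_FORCE_THRESHOLD:
            return True
    return False

# Simulated stack: the top viewpoint sees nothing, the side viewpoint sees
# a part; the first grasp on it is empty, the second holds about 1 N.
forces = iter([0.0, 1.1])
ok = pick_with_recovery(
    detect=lambda vp: "part" if vp == "side" else None,
    grasp=lambda target: None,
    measure_force=lambda: next(forces),
    viewpoints=["top", "side"],
)
print(ok)
```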
Flyability has a new version of their fully protected indoor inspection drone-in-a-ball.
This is Elios 2. Made for indoor inspection, it is the most intuitive, reliable, and precise drone, designed for the tough jobs. Elios 2 is remarkably intuitive to fly, making anyone feel like a seasoned pilot, right from the first flight.
– A brand new collision-resilient design
– GPS-free stabilization in dark and troubled air flows
– Distance lock to follow long and repetitive features
– A 12MP camera with a stunning 0.18 mm/px resolution
– The most powerful and intelligent lighting system ever built on a commercial drone
[ Flyability ]
The Mogao Caves in China are a complex of hundreds of caves crammed full of 1,000 years’ worth of Buddhist art. As you might expect, people want to see this stuff, but the millions of them showing up every year are taking a toll. One idea is to replace some of the more fragile statues with “clones” made with 3D scans and robot artists.
[ Asahi ]
Training Cassie to walk in simulation using deep reinforcement learning transfers better than you might expect onto the real robot.
The semi-structured and repetitive environment of a container port is unsurprisingly an ideal place for autonomous vehicles.
In a first-ever advancement in human medicine and aviation technology, a University of Maryland unmanned aircraft has delivered a donor kidney to surgeons at the University of Maryland Medical Center (UMMC) in Baltimore for successful transplantation into a patient with kidney failure. This successful demonstration illustrates the potential of unmanned aircraft systems (UAS) for providing organ deliveries that, in many cases, could be faster, safer, and more widely available than traditional transport methods.
[ UMD ]
HEBI Robotics would like you to meet Daisy, a hexapod robotic kit with 18 degrees of freedom. Daisy is a perfect companion for gait and motion control research—and she comes ready to deploy right out of the box.
Daisy is also known as “X-Monster.”
[ HEBI Robotics ]
Five robots, 13 video screens, one trade-show booth.
Here’s what happens over the course of one week at a trade show, from setup to breakdown with thousands of attendees roaming the floor in between, all boiled down into a 30-second time-lapse recap.
Middle Size RoboCup is one of my favorite events—big robots, full autonomy, playing football/soccer in a way that’s at least a little bit competitive with humans. Here’s a bunch of videos from Tech United, featuring highlights from the semi-final match, the final match, and then a couple bonus videos with footage from a drone overhead as well as a camera on one of the robots.
This week’s CMU RI Seminar comes from CMU’s Henny Admoni, on “Understanding Human Behavior for Robotic Assistance and Collaboration.”
Human-robot collaboration has the potential to transform the way people work and live. Researchers are currently developing robots that assist people in public spaces, on the job, and in their homes. To be effective assistants, these robots must be able to recognize aspects of their human partners such as what their goals are, what their next action will be, and when they need help—in short, their task-relevant mental states. A large part of communication about mental states occurs nonverbally, through eye gaze, gestures, and other behaviors that provide implicit information. Therefore, to be effective collaborators, robots must understand nonverbal human communication as well as generate sufficiently expressive nonverbal behaviors that are understandable by their human partners. Developing effective human-robot interactions requires a multidisciplinary approach that involves fundamental robotics algorithms, insights from human psychology, and techniques from artificial intelligence, machine learning, and computer vision. In this talk, I will describe my work on robots that collaborate with and assist humans on complex tasks, such as eating a meal. I will show how robots can guide human action using nonverbal behaviors, and how natural, intuitive human behaviors can reveal human mental states that robots must respond to. Throughout the talk, I will describe how techniques and knowledge from cognitive science help us develop robot algorithms that lead to more effective interactions between people and their robot partners.
[ CMU RI ]