Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):
ROSCon 2018 – September 29-30, 2018 – Madrid, Spain
IROS 2018 – October 1-5, 2018 – Madrid, Spain
International Robot Safety Conference – October 9-11, 2018 – Detroit, Mich., USA
Japan Robot Week – October 17-19, 2018 – Tokyo, Japan
Collaborative Robots, Advanced Vision & AI Conference – October 24-25, 2018 – Santa Clara, Calif., USA
ICSR 2018 – November 28-30, 2018 – Qingdao, China
Let us know if you have suggestions for next week, and enjoy today’s videos.
Need a hand finishing your basement?
[ AIST ]
Look, a wheeled ANYmal!
Look, an armed ANYmal!
We achieved manipulation of a Rubik’s Cube using a high-speed robot hand with three fingers. The experimental system consists of high-speed vision and a high-speed robot hand; the vision system can compute the center-of-gravity position and the angle of the Rubik’s Cube at 500 fps.
The manipulation realized in this research comprises three operations: two kinds of regrasping and turning one face of the Rubik’s Cube. By combining these three operations, all the faces can be turned. In the experiment, the three operations were performed in a row in 1 second, and we succeeded in 30 continuous operations in 10 seconds.
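The description above implies a small planning problem: since the hand can only turn one face directly, the two regrasping moves must first bring the desired face into the turnable position. Here is a toy sketch of that idea (the primitive names and cube-rotation conventions are our own assumptions, not the lab’s actual system):

```python
from collections import deque

# Hypothetical primitives: the hand turns only the face at position "F",
# so other faces are brought there by two regrasp moves.
REGRASPS = {
    "roll": {"U": "F", "F": "D", "D": "B", "B": "U"},  # tip cube forward
    "yaw":  {"F": "R", "R": "B", "B": "L", "L": "F"},  # rotate about vertical axis
}

def apply_regrasp(state, move):
    """state maps position -> face label; a regrasp permutes positions."""
    new = dict(state)
    for src, dst in REGRASPS[move].items():
        new[dst] = state[src]
    return new

def plan_turn(face):
    """BFS over regrasp sequences until `face` sits at position 'F',
    then append the single face-turn primitive."""
    start = {p: p for p in "UDFBLR"}
    queue = deque([(start, [])])
    seen = {tuple(sorted(start.items()))}
    while queue:
        state, moves = queue.popleft()
        if state["F"] == face:
            return moves + ["turn"]
        for m in REGRASPS:
            nxt = apply_regrasp(state, m)
            key = tuple(sorted(nxt.items()))
            if key not in seen:
                seen.add(key)
                queue.append((nxt, moves + [m]))

# e.g. plan_turn("U") -> ["roll", "turn"]
```

With these two regrasps every face is reachable in at most three moves, which is consistent with the claim that combining the three operations lets all faces be turned.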
ROBOTIS has a bunch of surprises it will be showing off at ROSCon and IROS. Here’s one of them.
We set up ROS 2 topics for each XEL using the XEL Network GUI tool and built a mobile robot by writing ROS 2 nodes on the PC that use these topics.
[ ROBOTIS ]
When it comes to saving lives, you have to be sure you’re using the right tools. That’s why DJI invests in rigorously testing its products for use in the field. It’s not just about how a drone can fly—it’s about how people use it.
[ DJI ]
With a focus on the people and perseverance behind DARPA’s ability to make the impossible possible, trace the agency’s history from its charter following the Soviet Union’s Sputnik launch to advances across a spectrum of technologies. Building on a legacy of innovation, DARPA continues to push technological boundaries to ensure U.S. military superiority and serve the people who serve and protect our nation.
This short documentary debuted at DARPA’s 60th anniversary symposium, D60, which took place Sept. 5-7, 2018, at Gaylord National Harbor, Oxon Hill, Maryland.
[ DARPA ]
Some decisions are easy, like choosing Vector, the home robot who won’t destroy humanity. He was made by Anki to be a helpful, robotic addition to your life.
[ Anki ]
Did you miss this year’s RUC, the Robotiq User Conference? Here’s a recap of everything you missed. And don’t miss next year’s RUC.
[ Robotiq ]
The bio-inspired, six-legged walking robot LAURON V participated as team LAUROPE (LAUfROboter für die Planetare Exploration, “walking robot for planetary exploration”) in the DLR SpaceBot Camp 2015. The challenge was initiated by the German Aerospace Center (DLR Raumfahrtmanagement) as a performance evaluation for space robotics. Ten research teams from all over Germany developed rover concepts to tackle one of the most challenging scenarios in spaceflight: the exploration of an unknown planet. LAURON V was equipped with a lightweight custom-built gripper on one leg, a 3D laser scanner for autonomous navigation, and multiple cameras to cope with challenges such as difficult terrain, steep slopes, and the acquisition of samples. During the challenge, LAURON not only showed the terrain adaptability and flexibility of its behaviour-based control system by walking into a simulated crater, but also demonstrated its autonomous capabilities. It acquired a 3D map of the environment, gathered and transported samples, and autonomously navigated through the environment to explore and find all the objects. Its human operators could only supervise the action and send high-level commands from a distant control station, with an added communication delay such as would be present in a real space mission.
[ FZI ]
Every year KUKA hosts the Innovation Award, a global competition for research teams made up largely of students and their mentoring professors, addressing hot topics in robotics research. The teams compete for a 20,000-euro prize, and the winner is chosen by an impartial, independent jury from the robotics research community. The Innovation Award 2018 focused on research into robots that can interact equally well inside and outside the industrial environment, with an emphasis on direct support for humans. The concepts and applications were to be presented as realistically as possible.
And check the link below for next year’s finalists, who will demo their robot systems at Hannover Messe 2019 in Germany.
[ Kuka ]
FIBERBOTS is a digital fabrication platform that fuses cooperative robotic manufacturing with the ability to generate highly sophisticated material architectures. The platform enables the design and digital fabrication of large-scale structures with high spatial resolution, leveraging mobile fabrication nodes, or robotic “agents,” that tune the material makeup of the structure being constructed on the fly, as informed by their environment.
[ MIT Media Lab ]
Check out the ultimate “Hello World” program with our latest and greatest, Sphero BOLT. This video introduces BOLT, our most advanced robotic ball yet, and its 5 new features: a huge battery, an 8×8 LED Matrix, a light sensor, a compass, and infrared sensors.
[ Sphero ]
Robots with knives. Do we really think that’s a good idea?
The effectiveness of cutting is measured by the ability to achieve material fracture with smooth knife movements. The work performed by the knife overcomes the material’s toughness, acts against blade-material friction, and generates shape deformation. This paper studies how to control a 2-DOF robotic arm equipped with a force/torque sensor to cut through an object in a sequence of three moves: press, push, and slice. For each move, a separate control strategy in Cartesian space is designed to incorporate contact and/or force constraints while following a prescribed trajectory. Experiments conducted on several types of natural foods have demonstrated smooth motions like those that would be produced by a human hand.
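As a rough illustration of the press-push-slice sequence (our own toy, not the paper’s actual controller), here is a 1-D simulation in which a simulated force reading drives the phase switches:

```python
def press_push_slice(thickness=0.03, stiffness=2000.0,
                     f_press=5.0, push_len=0.01, dz=0.001, dx=0.001):
    """Toy 1-D sketch of the press/push/slice sequence. A simulated
    force reading grows with depth; phase transitions mimic the
    contact/force conditions described in the paper. All parameter
    values here are made up for illustration."""
    depth, travel, phase, log = 0.0, 0.0, "press", []
    while depth < thickness:
        force = stiffness * depth        # stand-in for the F/T sensor
        if phase == "press":
            depth += dz                  # straight-down motion
            if force >= f_press:         # contact force threshold reached
                phase = "push"
        elif phase == "push":
            travel += dx                 # move forward while lightly pressing
            depth += 0.5 * dz
            if travel >= push_len:
                phase = "slice"
        else:                            # "slice": coordinated down + forward
            travel += dx
            depth += dz
        log.append(phase)
    return depth, travel, log
```

Each phase uses a different mix of vertical and horizontal motion, echoing the paper’s idea of a separate Cartesian-space strategy per move.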
Toshiba is showing off its automated bin-picking system. Look Ma, no prior setup or teaching!
Without prior setup or teaching, Toshiba’s robot automatically recognizes and picks randomly stacked items of various shapes. The robot not only handles items carefully, but also packs them efficiently, without empty spaces.
[ Toshiba ]
The Open Vision Computer (OVC) was designed to support high-speed, vision-guided autonomous drone flight. In particular, our aim was to develop a system suitable for relatively small flying platforms, where size, weight, power consumption, and computational performance are all important considerations. This manuscript describes the primary features of our OVC system and explains how they are used to support fully autonomous indoor and outdoor exploration and navigation on our Falcon 250 quadrotor platform.
[ Vijay Kumar ]
Dyson people love a challenge. Each year, people across the Dyson world team up to take part in Challenge Dyson, a light-hearted internal event focused on unusual challenges. This year was no exception, as teams geared up to race remote-controlled vehicles made of foam around a giant world-map obstacle course.
[ Dyson ]
Here’s an overview of CMU’s MRSD (MS in Robotic Systems Development).
[ CMU ]
Compilation of autonomous thruster-based docking experiments run in the Spacecraft Simulator Facility at Caltech. When the thrusters fire, the blue LEDs flash and a tick sound can be heard.
[ Caltech ]
This week’s CMU RI Seminar: Michael Kaess on “Factor Graphs for Robot Perception.”
Factor graphs have become a popular tool for modeling robot perception problems. Not only can they model the bipartite relationship between sensor measurements and variables of interest for inference, but they have also been instrumental in devising novel inference algorithms that exploit the spatial and temporal structure inherent in these problems. I will overview some of the inference algorithms and present two specific applications: Simultaneous localization and mapping for underwater robots and state estimation for aerial robots. For state estimation I will introduce a novel fixed-lag smoother for visual-inertial odometry. I will also give a brief overview of factor graphs in the context of other robot perception problems.
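For a flavor of what a factor-graph formulation looks like, here is a self-contained toy: a 1-D linear pose chain with a prior, two odometry factors, and one absolute measurement, solved through the normal equations. Real systems such as GTSAM exploit the same sparsity with far more sophisticated machinery; all numbers here are made up.

```python
def solve_factor_graph(factors, n):
    """factors: list of (coeffs, measurement, sigma), where coeffs maps
    variable index -> linear coefficient. Builds the normal equations
    H x = g for the weighted least-squares problem and solves by
    Gaussian elimination (a toy stand-in for a real factor-graph solver)."""
    H = [[0.0] * n for _ in range(n)]
    g = [0.0] * n
    for coeffs, z, sigma in factors:
        w = 1.0 / sigma ** 2
        items = list(coeffs.items())
        # each factor touches only a few variables, so H stays sparse --
        # the structure that factor-graph inference algorithms exploit
        for i, a in items:
            g[i] += w * a * z
            for j, b in items:
                H[i][j] += w * a * b
    # dense Gaussian elimination with partial pivoting (fine for 3 variables)
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(H[r][c]))
        H[c], H[p] = H[p], H[c]; g[c], g[p] = g[p], g[c]
        for r in range(c + 1, n):
            f = H[r][c] / H[c][c]
            for k in range(c, n):
                H[r][k] -= f * H[c][k]
            g[r] -= f * g[c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (g[r] - sum(H[r][k] * x[k] for k in range(r + 1, n))) / H[r][r]
    return x

factors = [
    ({0: 1.0}, 0.0, 1.0),            # prior: x0 ~ 0
    ({0: -1.0, 1: 1.0}, 1.0, 1.0),   # odometry: x1 - x0 ~ 1
    ({1: -1.0, 2: 1.0}, 1.0, 1.0),   # odometry: x2 - x1 ~ 1
    ({2: 1.0}, 2.2, 1.0),            # absolute measurement: x2 ~ 2.2
]
x = solve_factor_graph(factors, 3)   # fused estimate spreads the 0.2 residual
```

The absolute measurement disagrees slightly with the odometry chain, and the least-squares solution distributes that disagreement across all three poses, which is exactly the kind of fusion factor graphs make explicit.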
[ CMU RI ]