Video Friday: Andy Rubin on Robotics, Dynamic Exoskeleton, and Two Robot Heads
Your weekly selection of awesome robot videos
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):
ICRA 2018 – May 21-25, 2018 – Brisbane, Australia
Dynamic Walking Conference – May 21-24, 2018 – Pensacola, Fla., USA
RoboCup 2018 – June 18-22, 2018 – Montreal, Canada
RSS 2018 – June 26-30, 2018 – Pittsburgh, Pa., USA
Ubiquitous Robots 2018 – June 27-30, 2018 – Honolulu, Hawaii
MARSS 2018 – July 4-8, 2018 – Nagoya, Japan
AIM 2018 – July 9-12, 2018 – Auckland, New Zealand
ICARM 2018 – July 18-20, 2018 – Singapore
ICMA 2018 – August 5-8, 2018 – Changchun, China
SSRR 2018 – August 6-8, 2018 – Philadelphia, Pa., USA
Let us know if you have suggestions for next week, and enjoy today’s videos.
Boston Dynamics CEO Marc Raibert returns to TechCrunch Sessions Robotics and doesn’t disappoint, talking about the Google acquisition, Masayoshi Son’s 300-year technology investment plan, and SpotMini’s “butt-cam.”
[ TC Sessions Robotics ]
Playground founder and CEO Andy Rubin was also a guest at TC Sessions Robotics. He talked about what he learned from leading the Google robotics program, how he sees his robot investments as pieces of a big puzzle, and how cloud-connected robots will learn to never spill milk.
Watch every panel from TC Sessions Robotics at the link below.
[ TC Sessions Robotics ]
This is the first demonstration of dynamic exoskeleton walking with paraplegics, achieved through a mathematical control framework for dynamic locomotion realized on the ATALANTE exoskeleton developed by Wandercraft.
[ AMBER-Lab ]
Warning: This video contains scenes of researchers performing physical activity.
The system captures human motion, analyzes skeletal movements and muscle activities, and visualizes the results at 30 fps with 400-500 ms latency. The current system uses four video cameras covering different views of the human motion. The images from each camera, captured at 30 fps, are processed by a PC with a GPU for OpenPose computation. The results from the four PCs flow to another PC for 3D reconstruction of high-DoF skeletal movements. Inverse dynamics and muscle activity analysis then follow before visualization.
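If you're wondering how per-camera 2D keypoints become a 3D skeleton, the textbook approach (not necessarily this lab's exact pipeline) is linear DLT triangulation: each calibrated camera's 2D detection of a joint contributes two linear constraints on its 3D position. A minimal NumPy sketch with hypothetical toy cameras:

```python
import numpy as np

def triangulate(proj_mats, points_2d):
    """Linear (DLT) triangulation of one keypoint seen by several cameras.

    proj_mats: list of 3x4 camera projection matrices
    points_2d: list of (u, v) pixel coordinates, one per camera
    Returns the 3D point in world coordinates.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        # Each view contributes two homogeneous linear equations in X.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # The solution is the right singular vector of A with the smallest
    # singular value (the approximate null space of A).
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Toy setup: two identity-intrinsics cameras separated along the x-axis.
K = np.eye(3)
P1 = K @ np.hstack([np.eye(3), [[0.0], [0.0], [0.0]]])
P2 = K @ np.hstack([np.eye(3), [[-1.0], [0.0], [0.0]]])
X_true = np.array([0.2, 0.3, 4.0])

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

X_hat = triangulate([P1, P2], [project(P1, X_true), project(P2, X_true)])
print(np.allclose(X_hat, X_true, atol=1e-6))  # True
```

In a real pipeline you would also weight each equation by OpenPose's per-keypoint confidence and reject views where the joint is occluded.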
[ Nakamura & Yamamoto Lab ]
We can do this research too!
The new QDrone, a key component of the Quanser Autonomous Vehicles Research Studio, was designed to withstand all kinds of crashes, slams, and accidents. We wanted to give researchers a reliable and robust system, so they can test and tune their controllers without worrying about vehicle damage.
[ Quanser ]
I think they are saying, “Where’s the rest of my body?!”
[ FLASH Robotics ]
Every walking robot should wear sneakers.
Cassie Blue is operating with our normal flat-ground feedback controller. We made no changes to accommodate the sand. We were quite surprised that she could walk barefoot in such soft sand! Our best guess is that this ability is a fortunate outcome of the way we control her “feet”. When a leg is in the air, we control the corresponding foot to be level with the ground. When a leg is in contact with the ground, we set the “ankle” torque to zero. In other words, we are treating the robot as being underactuated.
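The ankle policy they describe is simple enough to sketch. This is a toy illustration of the idea, not the lab's actual controller, and the gains and function names here are made up: servo the swing foot level with the ground, and command zero ankle torque whenever the leg is in stance, deliberately leaving the robot underactuated.

```python
def ankle_torque(in_contact, foot_pitch, foot_pitch_rate=0.0,
                 kp=32.0, kd=1.0):
    """Toy version of the ankle policy described above.

    Swing leg: PD-servo the foot pitch toward zero so it lands level.
    Stance leg: command zero torque, treating the robot as underactuated.
    Gains kp/kd are illustrative, not Cassie's real values.
    """
    if in_contact:
        return 0.0  # passive ankle in stance
    return -kp * foot_pitch - kd * foot_pitch_rate

print(ankle_torque(True, 0.25))   # 0.0  (stance: no torque)
print(ankle_torque(False, 0.25))  # -8.0 (swing: level the foot)
```

The nice side effect, as the caption suggests, is that a passive stance ankle can't fight the terrain, so soft sand gets treated much like flat ground.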
[ Dynamic Legged Locomotion Lab ]
DARPA recently announced a US $38.6 million contract awarded to Dynetics to make its Gremlins reusable drone swarm real.
DARPA is progressing toward its plan to demonstrate airborne launch and recovery of multiple unmanned aerial systems (UAS), targeted for late 2019. Now in its third and final phase, the goal for the Gremlins program is to develop a full-scale technology demonstration featuring the air recovery of multiple low-cost, reusable UAS, or gremlins.
[ Dynetics ]
The Mars Helicopter is a technology demonstration that will travel to the Red Planet with the Mars 2020 rover. It will attempt controlled flight in Mars’ thin atmosphere, which may enable more ambitious missions in the future.
[ NASA JPL ]
Misty Robotics founder and head of product Ian Bernstein gives an overview of Misty II’s features and innards.
[ Misty Robotics ]
On the street circuit of Formula E Rome, pro drifter Ryan Tuerck goes up against the autonomously driven DevBot for the Human + Machine Challenge. Will Tuerck’s grip skills dominate, or will DevBot’s lidar and machine vision cameras prove to be too much?
[ Roborace ] via [ Engadget ]
Under the slogan “Industrial Intelligence 4.0_beyond automation,” KUKA demonstrated the automation solutions of the future at Hannover Fair 2018. The event had several world premieres from KUKA, including LBR iisy, our new, powerfully simple collaborative robot, and “i-do,” our concept consumer service robot. One of the highlights of the show was our Smart Factory, producing customizable robot model giveaways wrapped in an intuitive and easy user interface, demonstrating that Industry 4.0 doesn’t have to be complicated.
[ KUKA ]
Cruzr is the first customized service robot adaptable to a variety of business needs, optimizing human resources and improving work efficiency. Cruzr provides a new generation of service for a variety of industrial applications and domestic environments. Offering user-friendly, humanlike interaction, Cruzr provides customized AI business services. Each of its robotic platforms can be configured for a wide range of applications to a company’s specific needs for safe and easy access to virtually endless resources.
[ UBTECH Robotics ]
Drive.ai’s self-driving system operating in Texas without a driver behind the wheel.
[ Drive.ai ]
Ah, I think I know how you did that final scene. Clever.
[ Kamigami ]
In this video, TIAGo is cleaning up a table, including hard-to-grasp objects like a spoon and a plate. The initial table setup was unknown beforehand. Dishes need to be placed in the sink (there was no dishwasher yet), while other objects should remain on the table. Dirt (the paper) should have been removed.
[ University of Koblenz ]
This teaser video shows reinforcement learning with TurtleBot3 in Gazebo.
[ ROBOTIS ]
Here’s a new way of 3D printing objects using a robotic manipulator.
This paper presents a new method to fabricate 3D models on a robotic printing system equipped with multi-axis motion. Materials are accumulated inside the volume along curved tool-paths so that the need for supporting structures can be tremendously reduced – if not completely eliminated – on all models. Our strategy to tackle the challenge of tool-path planning for multi-axis 3D printing is to perform two successive decompositions, first volume-to-surfaces and then surfaces-to-curves. The volume-to-surfaces decomposition is achieved by optimizing for a scalar field within the volume that represents the fabrication sequence. The field is constrained such that its iso-values represent curved layers that are supported from below and present a convex surface affording collision-free navigation of the printer head. After extracting all curved layers, the surfaces-to-curves decomposition covers them with tool-paths while taking into account constraints from the robotic deposition system. Our method successfully generates tool-paths for 3D printing models with large overhangs and high-genus topology. We fabricated several challenging cases on our robotic platform to verify and demonstrate its capabilities.
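The core idea of the volume-to-surfaces step is that once you have a scalar "fabrication" field over the volume, the printing layers are just its iso-value bands. Here's a deliberately simplified sketch of that bucketing step, assuming plain height as the field (the paper optimizes a much smarter field whose iso-surfaces are curved, supported from below, and collision-free):

```python
import numpy as np

def decompose_layers(field, n_layers):
    """Toy volume-to-surfaces decomposition: assign each voxel of a
    scalar fabrication field to one of n_layers iso-value bands,
    which gives the order in which material is deposited.
    """
    lo, hi = field.min(), field.max()
    edges = np.linspace(lo, hi, n_layers + 1)
    # Band index per voxel; clip so the maximum value falls in the
    # last band instead of overflowing past it.
    idx = np.clip(np.digitize(field, edges) - 1, 0, n_layers - 1)
    return idx

# A 4x4x4 solid cube, using voxel height (z index) as the field.
z = np.tile(np.arange(4, dtype=float), (4, 4, 1))
layers = decompose_layers(z, 2)
print(np.unique(layers))  # [0 1]
```

With height as the field this degenerates to ordinary planar slicing; the paper's contribution is exactly that the optimized field makes the bands curved so overhangs end up supported by previously deposited material rather than by scaffolding.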
[ Computational Design and Fabrication Lab TU Delft ]
We are getting closer to achieving robotic pollination! This is the first integrated test of BrambleBee in the greenhouse. Navigation, mapping, robot control, computer vision, and manipulation are all working together now. The hardware is also working great; check out our newly built structure in the greenhouse and the nice blackberry plants! Once we’ve worked out a reliable flower pose estimation solution and refined our end-effector design, pollination with real flowers (instead of QR flowers) will probably happen in the fall. Stay tuned!
[ WVU Interactive Robotics Laboratory (IRL) ]