We’re not yet capable of building humanoid robots that are indistinguishable from biological humans, but that doesn’t mean we’re not trying. Here are 10 real robots that are helping us achieve this futuristic milestone.
To create the “perfect” humanoid robot, a machine has to exhibit three fundamental qualities: It needs to look, move, and act like a human. There isn’t a single robot in existence that hits all of these marks, and we’re still decades away from crossing the uncanny valley with our androids.
That said, there are robots that meet at least one or two of these criteria quite well. Eventually, roboticists will pool their expertise, producing the first generation of super-realistic humanoids. Here are ten that are taking us closer to that lofty goal.
PETMAN

Developed by Boston Dynamics (with, ahem, a little help from the US Department of Defense), this absurdly realistic humanoid robot is being used to test the performance of protective clothing. Sensors in PETMAN’s artificial skin can detect any chemicals leaking through the suit, and its high-tech skin simulates human physiology inside the outfit by producing sweat and regulating temperature.
This suit will eventually be worn by human emergency workers, so PETMAN is being used in simulations of real-world conditions—and to dramatic effect. Unlike previous versions of PETMAN, this new-and-improved model can balance itself and move freely, performing tasks such as walking, bending, and anything else required of a rescue worker or soldier. PETMAN is truly awesome—and super realistic—but it’s also freaky as hell.
Junko Chihira

Described as a trilingual android, Junko Chihira is currently in development at Toshiba. Unlike PETMAN and many of the other robots on this list, she isn’t the most agile android in the world, but she’s got incredible interaction skills, along with the ability to make human-like facial expressions. She’s currently stationed at a tourist information centre on Tokyo’s waterfront, where she greets visitors in Japanese, English, and Chinese.
Junko Chihira incorporates Toshiba’s speech synthesis technology, which enables her trilingual skills. Her developers would like to give her speech recognition technology later this year, allowing her to respond to questions from tourists. Currently, the only way to interact with her is through a keyboard.
The SCHAFT Bipedal Robot
Getting robots to walk steadily and confidently on two feet has been a tremendous challenge for researchers. A bipedal robot from SCHAFT shows that, for some tasks, a torso isn’t required. This sturdy, stocky robot is already performing some meaningful work, but it could eventually lead to more agile humanoid robots.
SCHAFT’s latest bipedal creation. (Still: YouTube/mehdi_san)
Not much is known about SCHAFT, a Japanese robotics startup that was acquired by Google in 2014 and is now part of the company’s experimental technology lab. After a three-year hiatus, SCHAFT introduced the unnamed and unusual-looking bipedal robot at NEST 2016 in Tokyo.
The machine is designed as a low-cost, low-power, compact device that’s meant to “help society.” Seems vague, but this bot can carry 60 kg, travel over uneven terrain, and handle difficult stairs—a tremendous challenge for robots. Whether or not this machine can still perform these tasks with an upper torso and head remains to be seen, but as this robot shows, some tasks don’t require certain body parts.
Erica and Geminoid DK
Erica is the brainchild of Hiroshi Ishiguro, director of the Intelligent Robotics Laboratory at Osaka University in Japan. Ishiguro is famous for his super-realistic humanoid robots (including his doppelganger, Geminoid HI-4), but Erica, in addition to looking very human-like, is designed to interact naturally with her human companions by integrating a number of skills, such as voice recognition, human tracking, and natural motion generation.
Endowed with 19 degrees of freedom (a degree of freedom is a single physical movement, such as the twist of the neck, or the lifting of an arm), Erica can move her face, neck, shoulder, and waist. She speaks through a synthesised voice, and can make several facial expressions and gestural movements.
Geminoid DK is another Ishiguro-designed robot, and it’s an effort to conquer the uncanny valley. Introduced in 2011, the bot was constructed to look like roboticist Henrik Scharfe of Aalborg University in Denmark. Geminoid DK cost $200,000 (£159,872) to design and build, and it shows. Watching Geminoid DK for the first time, it took me a while to realise I was actually looking at a robotic face. The hyper-realistic robot is being used to study our emotional responses after seeing an android that looks just like a real person.
ATLAS

Voted the robot most likely to destroy humanity, this DARPA-funded machine recently underwent a major upgrade in which 75 percent of it was rebuilt. Called ATLAS Unplugged, it’s more energy efficient, stronger (uh-oh), more dexterous, and quieter than its clunky predecessor (you won’t be able to hear it coming during the robopocalypse). And, best of all (or most frightening of all), it doesn’t require that silly safety tether.
The 6-foot-2 (1.88 metres), 156.6 kg robot is now equipped with a new battery pack (which it wears on its back), allowing for onboard energy storage and greater efficiency. ATLAS Unplugged has three onboard computers for perception and task planning, and a wireless router in its head allows for untethered communication. Mercifully, it’s also equipped with a kill switch should this behemoth run off on its own.
Eventually, a future version of ATLAS could be used as a humanoid helping hand on the battlefield, or as a rescue worker in dangerous situations. Some day, when a truly realistic robotic human is finally built, we’ll look back on ATLAS as an important precursor.
Nadine

Developed by researchers at Nanyang Technological University in Singapore, Nadine is a social robot that integrates artificial intelligence with super-realistic physical features to dramatic effect. Nadine uses natural hand gestures and head movements during conversation, and her mouth moves when she talks (albeit not that well). She’s a prime example of how AI and robotics will converge to create something distinctly human-like.
Modelled after the department’s director, Nadia Magnenat Thalmann, Nadine smiles when greeted and looks people straight in the eye during conversations. Incredibly, she uses facial recognition software to remember people she has met and can even recall prior conversations. Nadine can be happy or sad depending on what’s being said, and express her own distinct personality and emotions. Nadine is powered by software similar to Apple’s Siri or Microsoft’s Cortana, and she could eventually be used as a personal assistant in domestic or office settings.
REEM-C

REEM-C is a prototype humanoid robot developed by Spain’s PAL Robotics. At 5.4 feet (1.65 metres) tall and 80 kg, its shape is loosely based on human proportions, but it’s packed with some incredible features.
REEM-C’s head is capable of two degrees of freedom (DOF), and is equipped with a stereo camera, LEDs to represent the mouth, and speakers to talk. Its arms, with seven DOF, allow it to hold 10 kg above its head. Its human-like hands have three DOF and are equipped with pressure sensors for haptic feedback. REEM-C’s six-DOF legs can propel it at 1.5 km per hour. In total, this robot has 22 degrees of freedom—impressive by any measure.
For its brain, REEM-C has a pair of i7 computers running Ubuntu. The robot’s sensors help it navigate through its environment, avoiding obstacles and people. Its designers envision it as a domestic robot, or in any number of other applications, including as a tour guide, entertainer, or security guard. REEM-C is still in the prototype phase, and it’s still a bit clunky, but given its wide array of features, it shows plenty of promise.
Romeo

Romeo. (Image: Aldebaran Robotics)

Romeo, a 55-inch-tall (1.4 metres) humanoid robot, was designed by France’s Aldebaran Robotics to help individuals, such as the elderly, who have lost their physical autonomy. Romeo’s size and physical capabilities allow him to open doors, climb stairs, and reach objects on a table. His developers are hoping that he’ll eventually be able to carry objects, including people.
The Romeo project currently involves five institutions, 13 robotics labs, and 80 engineers and researchers. Romeo isn’t the most realistic robot on this list, but his physical movements are top notch—and even a bit eerie. The way he moves his arms and closes his hands is near perfect, giving the impression that there’s an actual child controlling the movements from the inside.
OceanOne

This aquatic android is one of the more innovative robots we’ve seen in a while, capable of swimming to depths that are traditionally off limits to human divers.
Developed at Stanford University’s Artificial Intelligence Lab, OceanOne is equipped with sensitive hands that relay haptic feedback to the navigator’s controls, allowing for a shared sense of touch. The robot was designed to study coral reefs in the Red Sea, where conventional autonomous underwater vehicles (AUVs) are at risk of damaging the delicate seafloor structures. Because it looks like a human, and because it’s controlled by actual human movements, it can analyse the corals with great care and delicacy.
(Image: Osada/Seguin/DRASSM)
But OceanOne can do more than investigate coral. During its first mission, the robot dove for treasure in a shipwreck off the coast of France, working at a depth of 100 metres. It actually managed to recover a grapefruit-sized vase and return it to the deck of its ship. OceanOne is missing the lower half of its body, but robots such as this can help us integrate human features and movements into other machines. Equipped with artificial intelligence and the requisite physical skills, future versions will be able to act without any human intervention.
ASIMO

Last but not least, there’s ASIMO—Honda’s iconic humanoid robot. ASIMO may be turning 17 later this year, but this teenage android has a lot going for it, thanks to a steady stream of improvements.
ASIMO is lighter and a bit smaller than its predecessors, allowing it to move around with added grace and agility. The bot can pick up a sealed container filled with juice, unscrew the top, pick up a cup with its other hand, pour the juice, and carefully place both the cup and container back on the table. To make this possible, Honda has equipped ASIMO with sensors in its hands, allowing it to know that it’s holding something and how much it weighs. With Honda behind its development, ASIMO continues to be one of the most technologically advanced robots on the planet.
Honourable Mention: DRC-HUBO
Developed by Team KAIST from South Korea, this robot won the 2015 DARPA Robotics Challenge in Pomona, California. DRC-HUBO beat out 22 other robots to claim the $2 million grand prize, but its “transformer-like” ability was—in our opinion—a form of cheating, and not very human-like.
We excluded DRC-HUBO from our list, as remarkable as it is, because a significant portion of its success stems from its ability to move in very non-human-like ways. Specifically, it’s equipped with wheels on its knees, allowing it to overcome many of the challenges faced by true bipedal robots.
During the competition, the robot completed all eight tasks, such as opening a door and operating a drill, in the shortest amount of time, while avoiding many of the catastrophic falls experienced by its competitors. With its knee-wheels it’s not a true humanoid robot, but it’s still an impressive feat of engineering.
Taken together, it’s clear that roboticists are slowly but surely overcoming the technological hurdles required to construct a believable humanoid robot. Eventually, a robot will be developed that passes the so-called Turing Test for a number of capabilities, whether they be physical or psychological. As to whether or not these machines will be conscious, emotional, and self-reflective in the ways that humans are, that’s another question entirely.