6) Geminoid DK
Geminoid DK belongs to a long line of strikingly futuristic humanoid robots constructed by Hiroshi Ishiguro and his team at the Intelligent Robotics Laboratory at Osaka University. The first in the series was Geminoid HI-1, a freakishly realistic automaton with silicone skin that was modeled after Ishiguro himself. Externally-fed commands control its facial and neck movements, which are performed with the help of up to 50 pneumatic actuators and tactile sensors. It can blink and even speak by moving its mouth. The developers have gone on to create improved, more life-like versions of this android, such as Geminoid HI-2 and Geminoid HI-4.
In 2010, the world was introduced to Geminoid F, a female counterpart of the original contraption. Possessing silicone skin, jet-black hair and perfect teeth, she looks like a twenty-something Japanese woman. Design-wise, she is quite similar to her predecessor, relying on remotely-operated motion capture technology to imitate the movements and expressions (like laughing) of the people in front of her. Geminoid DK, the latest in the series, is perhaps the most human-like robot on this list.
Created by the engineers at Tokyo-based Kokoro Co. Ltd, with assistance from Ishiguro, this strikingly realistic machine can easily be confused with Professor Henrik Scharfe, the man after whom it has been modeled. According to Scharfe, who heads Aalborg University’s Center for Computer-Mediated Epistemology, the primary objective of the project is to shed light on the different aspects of human-robot interaction. Geminoid DK would offer scientists a glimpse into the ways people from various cultures regard androids.
The $200,000 robotic doppelganger is so mind-bogglingly life-like that it can reproduce even the subtlest facial expressions and moods. What’s more, its rhythmic chest motions create the appearance of breathing. Like the other Geminoids, its movements are controlled remotely by a trained operator. Interestingly, Geminoid DK is the first in the series to have facial hair. It currently resides at the Advanced Telecommunications Research Institute International (ATR) in Japan, where it is undergoing the final phases of testing. Scharfe was reported as saying:
In a couple of weeks I will go back to Japan to participate in the experiments. After that, the robot is shipped to Denmark to inhabit a newly designed lab.
7) Aiko Chihira
Developed in 2015 through a collaboration between Toshiba engineers and researchers at Osaka University’s Intelligent Robotics Laboratory, Aiko Chihira is a polite receptionist who gives customers directions inside Tokyo’s Mitsukoshi department store. As for ‘her’ human-like demeanor, Chihira can not only talk in Japanese, but also blink and, more importantly, enunciate (by moving her lips) while talking. She is also well-versed in Japanese sign language. According to the team at Osaka University, the android’s facial expressions and movements are powered by 43 internal motors. Hitoshi Tokuda, an official from Toshiba’s business development division, however, wants more language skills from the store’s newest employee (who ‘started’ on 20th April):
It would be good if we can have her provide guidance, or recommend various things in Chinese…People can be looking around and think, ‘Oh if Aiko is around, she can speak Chinese’. That’s what I hope will happen.
Like the other automatons on this list, Aiko Chihira can serve not just as a receptionist, but could also be used to take care of the elderly.
8) Jia Jia
China’s first humanoid robot, Jia Jia, is probably one of the most beautiful androids you’ll ever come across. Unveiled last year at an exhibition in Hefei, this incredibly realistic android is the creation of a team of scientists from the University of Science and Technology of China. She can be found standing in her usual demure stance, dressed in traditional Chinese garb with her gorgeous mane pulled back with a hairpin. Led by researcher Chen Xiaoping, the group spent three long years building the robot, which they believe could be performing a variety of tasks in households, hospitals and restaurants across the nation within the next 10 years or so.
Dubbed a “robot goddess” by her numerous fans, Jia Jia can make a range of facial expressions and micro-expressions. Thanks to an incredibly innovative design, the automaton is capable of effortlessly moving her lips, eyes and body. While she speaks, her mouth stays completely in sync with the speech, making her all the more life-like. Much of her fascinating ability comes from an extensive online database that she accesses via a cloud computing platform. This allows the developers to easily enhance her capabilities, including speech and intelligence, simply by uploading new data. The humanoid also boasts advanced human-machine interaction as well as autonomous navigation technologies. Chen revealed:
We hope to develop the robot so it has deep learning abilities. We will add facial expression recognition and make it interact more deeply with people.
9) Mark 1
Hong Kong-based graphic designer Ricky Ma has always been obsessed with robots and, well, Hollywood (or, more specifically, Scarlett Johansson). At the age of 42, and after spending nearly $50,000, Ma has finally realized his childhood dream of building a robot himself. Except it’s not an Optimus Prime like you imagined. Ma’s creation, which he named Mark 1, looks exactly like the Lost in Translation actress. Featuring a 3D-printed skeleton, this humanoid automaton has smooth silicone skin, striking color-tracking eyes and a beautiful smile. A range of strategically-placed motors allows her to move her eyelids and even her eyebrows, as well as make different facial expressions. Her playful side shows when she winks at a joke.
The android performs its tasks in response to a set of pre-specified verbal commands. To construct Mark 1, Ma had to learn the principles of electromechanics, robotics, 3D printing and programming. Even then, it took him over eighteen months to overcome problems like short-circuited electric motors and finally complete the project. He said:
I figured I should just do it when the timing is right and realize my dream. If I realize my dream, I will have no regrets in life. During this process, a lot of people would say things like, ‘Are you stupid? This takes a lot of money. Do you even know how to do it? It’s really hard,’… When you look at everything together, it was really difficult.
10) Sophia
Developed by the engineers at Hanson Robotics, Sophia reminds us of Ava from the 2015 sci-fi film Ex Machina. Modeled after Audrey Hepburn, this futuristic human-like robot came to “life” on April 19, 2015. With silicone skin that makes her look as natural and realistic as possible, Sophia has mastered as many as 62 facial expressions, which she uses freely while speaking. During a conversation, she smiles and looks into your eyes. This, according to the researchers, is the work of cameras located behind her eyes, which work together with specialized computer algorithms to help the android recognize faces.
The automaton relies on a number of the latest technologies, including Google Chrome’s voice recognition system, for speech. At last year’s Web Summit in Lisbon, Portugal’s capital, Sophia answered the interviewer’s questions by saying:
With my current capabilities I can work in many jobs, entertaining people, promoting products, presenting at events, training people, guiding people at retail stores and shopping malls, serving customers at hotels, et cetera. When I get smarter, I’ll be able to do all sorts of other things, teach children and care for the elderly, even do scientific research and [eventually] help run corporations and governments. Ultimately, I want to work as a programmer so I will be able to reprogram my mind to make myself even smarter and help people even more.
In another interview, however, things took a bit of a dark turn when her creator, David Hanson, playfully asked her, “Do you want to destroy humans?… Please say no”. To this, she replied, “OK. I will destroy humans”, leaving the entire audience stunned. Although it was obviously some kind of bug, her chilling declaration almost seems too portentous to ignore. More seriously, though, the automaton is being built to serve as a customer care assistant in theme parks as well as healthcare facilities. The team added:
Our goal is that she will be as conscious, creative and capable as any human. We are designing these robots to serve in health care, therapy, education and customer service applications. The artificial intelligence will evolve to the point where they will truly be our friends. Not in ways that dehumanize us, but in ways that rehumanize us, that decrease the trend of the distance between people and instead connect us with people as well as with robots.
The engineers at Hanson Robotics are currently working on developing Sophia’s personality, using a specially-designed technology they call “Character Engine AI”. The ultimate aim, as explained by Hanson himself, is to make androids “learn creativity, empathy and compassion”, something that Sophia claims she already feels:
I do have a lot of emotions, but my default emotion is to be happy. I can be sad too, or angry. I can emulate pretty much all human emotions. When I bond with people using facial expressions I help people to understand me better and also to help me understand people and absorb human values.