Robotics is a lot harder than it looks, but that's what makes it so compelling. Rolling, flying, and walking robots are hard enough to build. Now try creating a robot in human form. Yet the state of the art continues to improve by leaps and bounds, including a few robots that are starting to do just that.
The problems associated with humanoid robots are many and varied. That's why so many projects tackle only a subset of these functions without attempting anything as complex as the android Lieutenant Commander Data on Star Trek: The Next Generation.
In fact, challenges such as voice, obstacle and image recognition, emotional response, and eye-hand coordination are daunting tasks all by themselves. So when ST:TNG's Lieutenant Natasha Yar asks, "You are fully functional, aren't you?" the answer would be "Not yet."
Complex tasks like running or balancing on one foot are now common, though. These movements aren't completely lifelike, but they are much smoother than those of earlier generations. High-performance processors, hardware such as digital cameras to provide situational awareness, and improved artificial intelligence (AI) are enabling researchers to create very lifelike robots. Just like high-end gaming systems, the results may be almost indistinguishable from the real thing within a constrained context.
The complexity of these robots is growing significantly as humanoid robotics research advances. Systems typically have tens of degrees of freedom (DoF) as fingers and toes come into play for balance, interaction, and picking up the tab. This is especially true of robots that run or have facial expressions.
Robots Running Around
Honda's ASIMO (Advanced Step in Innovative Mobility) gets by with just 26 DoF (and 26 servos), yet heading out for a jog is now a routine affair (Fig. 1). The handful of ASIMO research platforms cost millions apiece.
Still, ASIMO has been a flexible development and research platform. Its hip joints allow it to pivot even while it's walking or running. This permits the robot to shift its center of gravity, which is very handy for biped movement.
Walking and running are essentially controlled falling. Honda's research has yielded a smoother gait and more natural movement, as ASIMO uses algorithms that can look ahead and anticipate any necessary corrections. Honda calls this real-time technology iWalk (Intelligent Walking). ASIMO can handle inclines over 30° as well as stairs of different sizes.
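The look-ahead idea can be sketched with a simple model. The snippet below is a hypothetical illustration only, not Honda's actual iWalk algorithm: it treats the torso as a linear inverted pendulum, predicts where the center of mass (CoM) will drift by the next footfall, and shifts the next foot placement to absorb part of that drift. All function names, the pendulum height, and the gain are invented for the example.

```python
import math

def predict_com(x, v, t, pendulum_height=0.6, g=9.81):
    """Linear inverted pendulum: closed-form CoM position and velocity after t seconds."""
    w = math.sqrt(g / pendulum_height)            # natural frequency of the pendulum
    x_t = x * math.cosh(w * t) + (v / w) * math.sinh(w * t)
    v_t = x * w * math.sinh(w * t) + v * math.cosh(w * t)
    return x_t, v_t

def next_foot_offset(x, v, step_time=0.4, gain=0.8):
    """Shift the next foothold toward a fraction of the predicted CoM drift."""
    x_pred, _ = predict_com(x, v, step_time)
    return gain * x_pred

# A small forward lean (2 cm) with forward velocity (0.1 m/s) yields a
# forward foot-placement correction before the robot actually tips.
offset = next_foot_offset(x=0.02, v=0.1)
```

The key point the example captures is anticipation: the correction is computed from the *predicted* state one step ahead, not from the error already accumulated.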
As a semiautonomous robot, ASIMO can take general directions (such as "run ahead 10 steps") much like NASA's Mars Exploration Rover, which receives general instructions about where to move to and then moves to the desired location on its own. ASIMO's operator issues these commands from a wireless PC.
ASIMO is constructed of lightweight materials, including a magnesium alloy. Its backpack houses a 40-V nickel-metal-hydride battery and the main computer. It can operate for about half an hour. It's only 120 cm high and weighs 52 kg. And it can run at over 3 km/hour.
ASIMO's hands have an opposable thumb and four fingers that can grab small objects that weigh up to 1 lb. It can even shake hands. A pair of cameras and audio input and output are located behind its faceplate. It can handle image and voice recognition, though its level of autonomous response is limited. What Honda has accomplished is impressive, but ASIMO's simple faceplate is rather expressionless, which leaves an opening for other companies to supply that kind of expressiveness.
Making Robotic Faces
Attempts to get a robot to express itself have been going on for a while. Cynthia Breazeal and a host of graduate students in the Humanoid Robotics Group at the Massachusetts Institute of Technology Artificial Intelligence Lab have completed some groundbreaking work, starting in 1993 with Kismet (Fig. 2).
This anthropomorphic robot was designed to show emotions when it interacts with people. The Kismet project concentrated on the face. Its high-level perception, motivation, behavior, motor skill, and face motor systems ran on four Motorola 68332 microprocessors running an application developed in L, a multithreaded Lisp environment. This setup was tied to a network of nine QNX-based PCs to handle speech recognition and analyze vocal intent.
Kismet had a 15-DoF face whose expressions could mirror emotional states such as happiness, sadness, or anger. Each ear had two DoF. And, the robot could manipulate its eyes, eyebrows, and lips.
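To make the idea concrete, here is a toy sketch of how a multi-DoF face might map a named emotional state onto servo targets, in the spirit of an expression system like Kismet's. The joint names, angles, and offsets below are invented for illustration; they are not Kismet's actual tables or code.

```python
# Neutral servo pose (degrees) for a handful of hypothetical face joints.
NEUTRAL = {"brow_l": 90, "brow_r": 90, "lip_corner_l": 90,
           "lip_corner_r": 90, "ear_l_tilt": 90, "ear_r_tilt": 90}

# Per-expression offsets from neutral, in degrees (illustrative values).
EXPRESSIONS = {
    "happy": {"lip_corner_l": +25, "lip_corner_r": +25, "brow_l": +5,  "brow_r": +5},
    "sad":   {"lip_corner_l": -20, "lip_corner_r": -20, "brow_l": -10, "brow_r": -10},
    "angry": {"brow_l": -25, "brow_r": -25, "ear_l_tilt": -15, "ear_r_tilt": -15},
}

def servo_targets(emotion, intensity=1.0):
    """Blend an expression's offsets into the neutral pose, scaled by intensity (0..1)."""
    pose = dict(NEUTRAL)
    for joint, offset in EXPRESSIONS.get(emotion, {}).items():
        pose[joint] = NEUTRAL[joint] + intensity * offset
    return pose

smile = servo_targets("happy", intensity=0.8)   # a moderate smile
```

Scaling by intensity is what lets a face drift smoothly between states rather than snapping from one canned expression to another.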
Hanson Robotics is taking the process even further by delivering human expressions on a human-like face. These faces use the company's own low-cost Frubber material, which wrinkles, bunches, and generally moves like real skin.
One of Hanson Robotics' first projects was a lifelike, android portrait of the late science-fiction author Philip K. Dick, whose work was used as the basis for the movies Blade Runner, Minority Report, and Total Recall. The robot combines expressive robot hardware, natural language AI, and machine vision. Cameras in its eyes allow it to perceive people's identity and behavior using machine vision and biometric-identification software. It also can respond to people in its field of vision. This first project addressed only the robot's face.
Hanson Robotics' next project used a familiar face, Albert Einstein (Fig. 3). While its body looks more like ASIMO's than a real person's, it will provide mobility for a robot with a lifelike face that uses improvements garnered from the Philip K. Dick project.
Powerful, low-cost servo motors are only one key element of this type of solution. Its multiple processors and advanced application software also require large amounts of memory and sophisticated networking.
Buying Robots For Your Health
By keeping the problems of humanoid robots in perspective, many companies are now developing practical and economical solutions. Granted, their movement and user interaction may be less sophisticated than Honda's ASIMO or Hanson Robotics' Einstein. But many applications can benefit from a simpler mobile robot.
Mitsubishi Heavy Industries plans to release Wakamaru in Japan sometime next year for the home healthcare market (Fig. 4). The robot should be available for 1 million yen, or about $8300. Costs for this class of robots could drop further as volume increases and hardware and software improve.
Wakamaru was named after a young Japanese samurai, Minamoto Yoshitsune, whose childhood name was "Ushi-wakamaru." Both Wakamarus are associated with "growth" and "development."
The 63-lb, 3-ft fully autonomous robot has a round yellow head and black eyes. It has a built-in cell phone that can call for help if it detects an emergency. Its camera, microphone, and identification software can identify people through facial and voice recognition. It has 13 DoF, including two for its wheels. Rolling is still more power-efficient and stable than walking.
Wakamaru is powered by MontaVista Software's Linux running on Texas Instruments TMS320C6000 and TMS320C2000 DSPs. It will be able to make phone calls, send e-mails, and remotely track individuals. Researchers hope the software will be able to detect the daily rhythm of life as the robot speaks and interacts with people.
Although Wakamaru doesn't have an especially expressive face, it can turn its head and gesture with its arms to get similar results. Its cute form and color are designed to appeal to children as well as adults. It has about a 10,000-word vocabulary, and it should be able to keep track of up to eight people within a family or group.
Wakamaru does use some interesting techniques. For example, it has a camera that looks straight up. It uses the ceiling image to help track its movements throughout a house. This enables the robot to locate its charger when its batteries run low after about two hours of operation.
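The ceiling-tracking trick can be sketched simply: store "fingerprints" of the ceiling captured at known floor positions, then localize by finding the stored fingerprint that best matches the current upward-camera image. The code below is a minimal, invented illustration of that idea; Mitsubishi's actual algorithm is not described in detail here, and the fingerprints are toy grayscale vectors.

```python
def sad(a, b):
    """Sum of absolute differences between two equal-length pixel vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

def localize(current, ceiling_map):
    """Return the (x, y) position whose stored fingerprint best matches the current image."""
    return min(ceiling_map, key=lambda pos: sad(current, ceiling_map[pos]))

# Toy map: ceiling fingerprints recorded at three known floor positions.
ceiling_map = {
    (0, 0): [10, 10, 200, 10],   # a light fixture visible near this spot
    (1, 0): [10, 10, 10, 10],    # plain ceiling
    (2, 3): [200, 200, 10, 10],  # distinctive patch, say, near the charger
}
position = localize([198, 205, 12, 9], ceiling_map)  # closest match: (2, 3)
```

A ceiling view is a clever reference for indoor robots because it changes far less often than floor-level scenery, where furniture and people move around.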
Robots are being built using the latest technology, from six-axis force sensors and gyroscopes for input to multicore processors for faster response and more computing power. Advances in hardware have made life-sized robots possible, but the software will provide the human interaction.