Ever Googled the term “robotics”? The only major company and product that show up in the top 100 hits are Microsoft and its Robotics Studio, a development tool that leaves a lot to be desired. (Not one of the dozen or so folks interviewed for this article uses it.) In fact, most of the search results include news, collegiate research, education, and events.
Yet in 2006, Korea’s Ministry of Commerce, Industry, and Energy predicted the global intelligent robotics market would reap nearly $90 billion by 2015 (up from a couple billion in 2005), growing at a rate of 57%.1 A more recent study conducted in January by ABI Research (www.abiresearch.com) indicates that the personal robotics market (including toy robots like Sony’s Aibo and task-based robots like iRobot’s Roomba) will reach $15 billion by 2015.2
Now if that’s not exciting enough for potential entrepreneurs, two things stand out as absolute truths when it comes to the technologies that enable robotics. First, many experts say that there will indeed be a robot (think humanoid, not a glorified vacuum) in every home one day. Second, the missing elephants in the room are already planning their rendezvous to capture what should amount to billions in revenues.
That’s right. The Microsofts and Intels of the world are looking at the future and trying to make sure their lunch hooks are in the kitchen ahead of time, eagerly planning to snatch whatever morsel becomes available by assimilating robot technologies into “the collective.”
No mere mortal company can beat the established semiconductor and software giants in place now. But what if you want to start a company in robotics with only one exit strategy: join the gang after it throws some dough in your general direction? What technologies that empower robotics should you focus on to get noticed by the big bosses—and not wind up getting stomped into submission?
Most experts would agree that four viable technologies and research areas would be good starting points: cheap sensors, a solid application programming interface (API), inexpensive kits, and artificial intelligence. “Let’s say we have a mobile, safe, intelligent robot for personal use. But if the cost is $100,000, would anyone buy it? The cost of the electromechanical components used in a robot is still very expensive [with respect to] sensors, actuators, etc.,” says Dennis Hong, director of the Robotics & Mechanisms Laboratory (RoMeLa) at Virginia Tech.
“We have seen this in the early ’70s with the personal-computer revolution,” adds Hong. “Unless component costs drop, personal robotics as a business won’t be able to succeed. iRobot’s Roomba is probably the only success story I can think of.”
SENSIBLY PRICED SENSORS
Unless you’re working on an R&D team for a major corporation or university, you probably get nauseous just thinking about the price of some of the sensors required for many robotics applications, especially if they require one or more laser-based sensors. Mobility is one of the primary driving factors.
“Field robots (outdoor robots) need to go over rocks, hills, bumps, across bushes, etc., for them to be useful (bomb disposal, search and rescue, scientific exploration). If it cannot reach its goal, it’s no use. For personal robots (home, indoor use), even though the environment is more structured, it still needs to climb steps, etc.,” says Hong.
“The [robotics] industry needs real-world robust sensors that are affordable,” says Dave Barrett, an Olin College associate professor of mechanical engineering and director of the school’s Senior Consulting Program for Engineering (SCOPE).
This comes as no surprise if we look at last year’s DARPA Urban Challenge. Driverless cars used robotic technologies to travel 60 miles in six hours or less on an urban course while obeying traffic regulations and dealing with other traffic and obstacles. Seven of the 11 finalists used Velodyne HDL-64E light detection and ranging (LIDAR) sensors, costing around $75,000 each (Fig. 1).
However, sensors like these may be required if the auto industry is to move forward with building cars that drive themselves. According to a statement made at January’s International Consumer Electronics Show by Sebastian Thrun, co-leader of the Stanford University team that placed second in the 2007 DARPA Urban Challenge, such cars could save around half of the nearly 42,000 lives lost each year to traffic accidents caused by human error.
Very few people could afford a car with even a single Velodyne HDL-64E sensor attached. So how do we get to a point where cars that drive themselves are created with a consumer’s (and not a DARPA Challenge) pocketbook in mind? According to Barrett, what’s really needed is a vision system with a good color camera and a powerful computer behind it.
Also, according to Todd Dickey, a lead engineer for the Honda Advanced Step in Innovative Mobility (ASIMO) project, sound detection is getting a lot of notice right now. “In a large environment where there is a lot of ambient noise, it is difficult to determine what [sounds are] being directed at the robot,” says Dickey.
So Honda is spending beaucoup dollars researching how a robot can determine which sounds are directed at it versus other sounds. For humans, this may be as simple as making eye contact or using some other gesture followed by dialogue. For robots, at least for the time being, more may be required, such as calling the robot by its name.
Researchers at Honda are looking into higher-resolution and more accurate cameras, too. According to Dickey, the latest information out of Japan indicates that the ASIMO can quickly distinguish people and items and react accordingly.
Speaking of Japan, Dr. Hiroshi Ishiguro of Osaka University offered his thoughts about where improvements are needed: “Actuators are very different from human muscles,” he says, noting that there’s a “need for more humanlike muscle actuators. To do so, actuators need to be made more linear and should not rely on reduction gears.”
But sensors aren’t the only hardware lacking in robotics technologies. William Lovell, CEO of c-Link Systems, says that “silicon-wise, there are not many motor drivers out there, with STMicroelectronics being the most predominant player.”
He also indicated that c-Link would like to see more H-bridges, as the company currently builds its own. “It would be nice to have off-the-shelf H-bridges with temp monitoring,” among other features, says Lovell.
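Until parts like that show up in silicon, the control logic typically lives in firmware. As a rough illustration only (every name here is hypothetical, not tied to any real part or to c-Link’s design), the classic four-switch truth table plus the temperature cutoff Lovell wishes were built in might look like this:

```python
# Hypothetical sketch of H-bridge control logic with temperature monitoring.
# All names are illustrative; not modeled on any specific motor-driver IC.

from dataclasses import dataclass

@dataclass
class BridgeState:
    high_a: bool  # high-side switch, leg A
    low_a: bool   # low-side switch, leg A
    high_b: bool  # high-side switch, leg B
    low_b: bool   # low-side switch, leg B

def drive(command: str, temp_c: float, max_temp_c: float = 125.0) -> BridgeState:
    """Return switch states for 'forward', 'reverse', 'brake', or 'coast'.

    If the die temperature exceeds the limit, the bridge shuts down
    (all switches open) regardless of the command.
    """
    if temp_c >= max_temp_c or command == "coast":
        return BridgeState(False, False, False, False)  # all off: motor coasts
    if command == "forward":
        return BridgeState(True, False, False, True)    # A high-side + B low-side
    if command == "reverse":
        return BridgeState(False, True, True, False)    # B high-side + A low-side
    if command == "brake":
        return BridgeState(False, True, False, True)    # both low-sides: dynamic braking
    raise ValueError(f"unknown command: {command}")
```

Note that no state ever closes both switches on the same leg, which would short the supply rail (shoot-through); an integrated driver would enforce this with dead-time hardware rather than software.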
And then, of course, there’s the software.
WANTED: ROBOTICS API
When it comes to robotics, companies are re-inventing the same mousetrap over and over again, especially on the software side. But it doesn’t take a robot scientist to note that much of this re-invention business stems from the lack of any sort of decent robotics API.
Isn’t it feasible to capture the description of robotic tasks, such as movement and rotation, in a well-written (and hopefully committee-based open-source) API? Surely, the devil is in the details. But why do companies like iRobot need to develop all of their code in house?
Couldn’t tasks such as mapping an area and object avoidance be broken down into a set of functions that, if well coded, apply to diverse areas and environments? According to Honda’s Todd Dickey, the need for a good robotics API is especially urgent in industrial robotics, where each software package and user interface is different.
On this topic, Anu Saha, the academic product marketing engineer at National Instruments (NI), says, “There are no standard tools, and people are rolling their own code in C and C++. What is needed is a tool that speaks the language of motors [in the form of] a human robotic interface (HRI).”
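The kind of vendor-neutral vocabulary being called for might look something like the minimal sketch below. To be clear, no such standard exists; every class and method name here is invented for illustration. The point is that once movement, rotation, and obstacle queries are behind a common interface, behaviors like object avoidance can be written once and run against any vendor’s driver layer:

```python
# A hypothetical sketch of a standard robotics API. Names are invented
# for illustration; this is not an existing specification.

from abc import ABC, abstractmethod

class MobileRobot(ABC):
    """Vendor-neutral interface a manufacturer's driver layer would implement."""

    @abstractmethod
    def move(self, distance_m: float) -> None:
        """Drive forward (negative values mean reverse) by the given distance."""

    @abstractmethod
    def rotate(self, angle_deg: float) -> None:
        """Rotate in place; positive angles are counterclockwise."""

    @abstractmethod
    def range_to_obstacle(self) -> float:
        """Distance in meters to the nearest obstacle straight ahead."""

def avoid_and_advance(robot: MobileRobot,
                      step_m: float = 0.5,
                      clearance_m: float = 1.0) -> None:
    """A behavior written purely against the API: turn away from anything
    closer than the clearance, otherwise take one step forward."""
    if robot.range_to_obstacle() < clearance_m:
        robot.rotate(90.0)
    else:
        robot.move(step_m)
```

Because `avoid_and_advance` touches only the abstract interface, the same code would run on a Roomba-class vacuum or an outdoor field robot, provided each vendor supplied a `MobileRobot` implementation for its hardware.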
NI also believes a design environment that works with FPGAs, such as its own LabVIEW product, is desirable since FPGAs are so very handy when it comes to developing robots. “There are some gaps, but it is a good start at getting all of the components working together,” says Saha, describing LabVIEW’s role in robotic development.
“If you look at the spectrum of software, there is a lot of academic [code] out there written in C/C++, and it is convoluted,” says Barrett. Robotics is obviously interdisciplinary, and some mechanical engineers wouldn’t give a hoot about C/C++, so a strong design environment is also desirable. “Products like LabVIEW provide an environment in which mechanical engineers can prosper, not struggle,” he says.
“LabVIEW bridges the gap between block diagrams and reality,” Barrett adds. While Microsoft offers Robotics Studio, the software is not quite on par with LabVIEW, so there are plenty of opportunities for smaller companies to be snatched up by Microsoft.
STOP KITTING AROUND
Robotics, like golf, can be an incredibly expensive hobby, with many of the DARPA challengers looking at a minimum of a million dollars just to get their foot in the door. Therefore, driving the cost of robotic kits down is a rather important goal.
Even the Lego Mindstorms NXT robotic kit makes an expensive stocking stuffer, priced at about $250 retail at lego.com. Still, this very nice learning tool for children ages 10 and up includes some rather sophisticated features, including four sensor types (light, sound, touch, and ultrasonic vision) and three interactive servo motors.
The kit also boasts an Atmel AT91SAM7S256 microcontroller based on the 32-bit ARM7TDMI RISC processor, plus an Atmel ATmega48 that combines 4 kbytes of flash, 512 bytes of SRAM, 256 bytes of EEPROM, and an eight-channel, 10-bit analog-to-digital converter (ADC) (Fig. 2 and Fig. 3). For more information on the Mindstorms kit, see “The Mind Of Mindstorms” at www.electronicdesign.com, ED Online 16149.
ARTIFICIAL INTELLIGENCE
We can’t even seem to get a general computing platform to do what we want most of the time, so expecting humanlike intelligence from a robot is a tall order. While robot scientists have made impressive progress over the years, much more work needs to be done. In fact, according to Dickey, “AI is the most heavily researched field in robotics today.”
Ishiguro and his research team at Osaka University’s Department of Adaptive Machine Systems are attempting to apply cognitive research to facilitate human-like behavior in robots. According to Ishiguro, this is by far the most difficult aspect of developing a humanoid robot.
“The timeline on the software is very challenging [to facilitate] mimicking human behavior,” says Ishiguro. Yet he feels his team can develop “something reasonable” within the next five years or so. But even coming close to perfecting this is more like 50 years off, a figure both Honda and Ishiguro agree upon (see “To Be Almost Human Or Not To Be, That Is The Question,” ED Online 14763).
“Unless it is teleoperated, the robot needs to be smart. Even if teleoperated, it needs to show some partial autonomous behavior to make it useful,” says Hong. “The DARPA Urban Challenge was a good example of addressing the challenge of developing intelligent robots. The robot cars needed to drive by themselves, negotiating the urban traffic and following all the rules. CMU (Carnegie Mellon), Stanford, and Virginia Tech are the three leading teams in this effort.”
ELEMENTARY, MY DEAR WATSON
The year 2015 and its potential for revenue aren’t as far away as they seem, so now is the time for today’s youth to get involved in robotics. However, very few colleges and universities offer bachelor’s, master’s, or PhD programs specifically in robotics.
Worcester Polytechnic Institute offers a BS in robotics engineering. Georgia Tech offers the first interdisciplinary robotics PhD. And then there are other programs, such as Carnegie Mellon’s minor in robotics as part of an engineering degree. After that, there are a few tracks and classes, and then the crickets are chirping.
That’s why programs such as Dean Kamen’s For Inspiration and Recognition of Science and Technology (FIRST), which has had an incredible impact, are so important (Fig. 4). FIRST now involves more than 150,000 students ages 6 to 18 and 44,000 mentors competing in nearly 40 countries (see “Young Engineers Need You!” at ED Online 18106 and “Team Awkward Turtle Takes Second At FIRST” at Drill Deeper 18476).
DANGER, WILL ROBINSON
We know that robotics will play a big part in the future, but it’s unclear how it will unfold. Who is going to step in to bring the cost down? The U.S. federal government seems a likely candidate.
Many other governments have shown a willingness to pay to get the cost down through research and development activities, in much the same way the cost of GPS technology was driven down over the years. For example, the Japanese government has coughed up nearly $42 million so far for the first phase of a humanoid robotics project.3
Companies like Honda and other large corporations are developing robotic technology in house and plan on outsourcing their technology to third parties once they work out all the kinks. So, the future looks bright for robotics. There’s only the question of who will step up and deliver.
1. “Hyundai Heavy Eyes Big Chunk of Global Robot Market,” www.kois.go.kr, Oct. 19, 2006
2. “Personal Robot Industry to Grow to $15 Billion by 2015,” www.gizmag.com, Jan. 2, 2008
3. “Japan looks to a robot future,” news.yahoo.com, March 1, 2008