Electronic Design

What's All This Forecasting Stuff, Anyhow?

Well, I am going to make a few comments about "forecasting." I will primarily observe that most efforts at forecasting are a waste of time. I remember when one of the CEOs of NSC in the old days (NOT Charlie Sporck) forecast that the semiconductor industry would never have any more downturns because all the business was so well controlled. For the last couple of years, the semiconductor business itself hasn't generated any downturn. But when our customers' customers stopped buying things, and then our customers stopped buying our products, our business did turn down. We still made some profits, but our customers were sick, and we caught a cold, too. Now we seem to be recovering. But I can't say we predicted, or caused, the upturn. (Our CEO Brian Halla claims he predicted the upturn for June 21, 2003. Maybe so.)

Roosters seem to crow to forecast the dawn. But if the sun doesn't come up, they can't force it to. Predicting, wishing, and causing something to happen are three different animals.

I can cause some things to happen (not very many), and I can wish for others. That doesn't necessarily make them happen. About 10 years ago, I predicted that somebody would generate hydrogen from sunlight by putting a magic potion into a bucket of water and pouring it over a cookie sheet full of sand. Apply sunlight, and hydrogen would be generated! I'm not going to say this will never happen, but I refuse to bet on it happening any time soon.

Can we forecast that Moore's Law will fail? This has been debated many times. But recently, Intel decided to make microprocessors that are more effective in terms of hours of work per milliampere-hour, rather than just a huge number of megahertz of clock rate. I mean, how many of you can type at 200 MHz? 800 MHz? 2100 MHz? 2 MHz? 0.2 MHz? 0.02 MHz? It's great for Intel to focus on getting more computing (and typing) per watt-hour of battery life.

The late Frank Goodenough, a well-known Electronic Design editor, and I were once at a power conference where people were predicting wise ways to slow down the processor's clock and cut power dissipation when there is no need for high speed. Ain't it about time it really happens? Great. But the guys who said that this was a good idea couldn't make the sun come up any quicker. Still, because I read about this in the International Herald Tribune today, I guess this isn't a real forecast. I'm just happy we're heading in the right direction.
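The idea those conference speakers were pushing is easy to sketch on the back of an envelope. Dynamic switching power in CMOS logic goes roughly as P = C x V^2 x f, so halving the clock, and dropping the supply voltage along with it, cuts power by a lot more than half. The capacitance, voltage, and frequency numbers below are illustrative assumptions, not figures from any real processor.

```python
def dynamic_power(c_farads, v_volts, f_hertz):
    """CMOS dynamic switching power: P = C * V^2 * f."""
    return c_farads * v_volts**2 * f_hertz

# Full speed: assume 1.0 nF effective switched capacitance, 1.5 V, 2 GHz.
p_fast = dynamic_power(1.0e-9, 1.5, 2.0e9)

# Slowed-down mode: halve the clock AND drop the supply to 1.1 V.
p_slow = dynamic_power(1.0e-9, 1.1, 1.0e9)

print(f"full speed: {p_fast:.2f} W")   # 4.50 W
print(f"slowed:     {p_slow:.2f} W")   # 1.21 W
print(f"savings:    {(1 - p_slow / p_fast) * 100:.0f}%")  # 73%
```

Note that halving the frequency alone only halves the power; the big win comes from the V-squared term once the lower clock rate lets you run at a lower voltage, too.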

It's often thought that Yogi Berra said, "Predicting is very hard, especially about the future." But nuclear physicist Niels Bohr said that. I don't know whether Yogi Berra or Casey Stengel ever said it, but probably neither one said it first.

A well-reputed pundit and seller of "forecasts" at a major stock-market firm recently admitted (anonymously) in a major financial newspaper that most of his "forecasts" were useless as soon as they were published. Why would I doubt this? And why would I have to wait for them to be published for them to be worthless?

Can I foresee that analog circuits aren't going to dry up and blow away, as many digital guys have been predicting for dozens of years? Yeah, but that's a cheap guess.

Comments invited!
[email protected] —or:

Mail Stop D2597A, National Semiconductor
P.O. Box 58090, Santa Clara, CA 95052-8090
