“The cabin doors are now closed. Please turn off all of your personal electronic devices… anything with a power switch… they must be completely powered off, not just in airplane mode...”
On a recent flight from San Jose, Calif., to Austin, Texas (the so-called “nerd bird” flight, because of the large number of technical professionals traveling regularly between Silicon Valley and the Austin design centers), that announcement gave me a chuckle.
It seems the U.S. Transportation Security Administration (TSA) has now determined that a power switch is one of the fundamental requirements for an “electronic device.” The TSA also has determined that “airplane mode” on these devices, which typically turns off all wireless transmission and reception, isn’t sufficient to mitigate whatever perceived threat is created by having even a few electrons flowing through these dangerous contraptions.
Interestingly, on every smart phone today, the “power switch” isn’t a switch at all. It’s merely a button that allows me to indicate to the device that I’d like the screen to be dimmed, or the processor to go to sleep, or the system to be (mostly) shut down. Power, like pretty much everything else on my phone, is controlled by the software.
The Decline Of The Power Switch
In fact, power switches have been losing their clout for quite some time. Since the days of the VCR with those blinking zeroes, real power switches on electronic devices have been on their way out. Even with devices powered “off,” their remote control sensing circuitry had to remain active, clocks and timers required energy, and stored information needed to be maintained.
Over the years, with increasing technology performance and shrinking feature sizes, we’ve seen geometric increases in device capabilities, along with dramatically lower energy consumption. More recently, audio and video have gone digital, and their corresponding appliances have shrunk to the extent that they are now just applications on my smart phone.
Their implementations are a combination of hardware and software, and the line between the two is both fuzzy and dynamic. Modules implementing specific functions are only active when needed, and then they revert to a deep sleep or are powered off completely. And all of this happens with no power switches in sight. (They’re actually still there, but now on-chip in the form of individual transistors that switch the power rails of a column or region of cells!)
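Those on-chip switches exist mostly to cut leakage, the static power a block burns even when it is doing nothing. A back-of-the-envelope sketch (all numbers invented for illustration, not taken from any real chip) shows why gating idle regions pays off:

```python
# Toy model of power gating: when a region's power rail is switched
# off, its leakage (static) power drops to nearly zero; only the
# always-on retention logic keeps drawing current.
# All numbers are illustrative, not from any real device.

def region_static_power_mw(active: bool,
                           leakage_mw: float = 50.0,
                           retention_mw: float = 0.5) -> float:
    """Static power of one power-gated region, in milliwatts."""
    return leakage_mw if active else retention_mw

# A chip with ten regions, only two of which are busy right now:
regions_active = [True, True] + [False] * 8
total = sum(region_static_power_mw(a) for a in regions_active)
print(total)  # 104.0 (mW), versus 500 mW with every region powered
```

The small retention term is why some state (clocks, stored settings) survives even when a block appears to be “off.”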
While all of this presents a nice simple view to the user, the complexity of today’s devices is staggering. In addition to the expected technology improvements we’ve seen for the last 40 years in the electronics industry tracking Moore’s Law (the doubling of available transistor count every 18 to 24 months), we’re seeing an additional dimension of complexity introduced by the power architecture.
The previous assumption that all circuits are turned on whenever the chip is powered on is not just false; in many cases, it is impossible. For many of today’s advanced chips, powering on all circuits at once could lead to catastrophic failure of the device, due either to excessive current drawn at power-up or to excessive heat generated during operation.
It is now the norm for many regions of a chip to run simultaneously at different voltages (decreasing the operating voltage dramatically decreases energy consumption, but also reduces performance), with large areas being shut down and powered back up dynamically as needed.
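Voltage is such a powerful knob because of the classic first-order model of CMOS dynamic power, P ≈ αCV²f: supply voltage enters squared. A quick illustration (the activity factor, capacitance, and operating points below are invented round numbers):

```python
# First-order CMOS dynamic power: P_dyn ~ alpha * C * V^2 * f
# (switching activity, switched capacitance, supply voltage, clock
# frequency). Voltage appears squared, so running a region slower
# at a lower voltage saves more energy than the slowdown costs.

def dynamic_power(alpha: float, c_farads: float,
                  v_volts: float, f_hz: float) -> float:
    return alpha * c_farads * v_volts ** 2 * f_hz

p_full = dynamic_power(0.2, 1e-9, 1.0, 1e9)   # 1.0 V at 1 GHz
p_slow = dynamic_power(0.2, 1e-9, 0.8, 5e8)   # 0.8 V at 500 MHz
print(round(p_slow / p_full, 2))  # 0.32: ~3x less power at half speed
```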
The traditional optimization targets for digital design have been performance (speed) and area. Performance has been the primary target because producing functional and reliable hardware depends on everything working at a prescribed speed, and circuits that don’t meet the performance goal simply can’t be used in the system. Area has been an indicator of device cost, as larger chips are more expensive to produce.
In the past decade, though, power has emerged as an additional target that has become as important as circuit performance, especially for mobile devices. Chipmakers have produced special lower-power versions of processors and support chips, with customized technologies, architectures, and performance targets, specifically to meet the low-power needs of mobile computing.
And in just the last few years, the explosion in popularity and functionality in smart phones and tablets has driven a large percentage of consumer-oriented chip designs into the realm of advanced low-power design.
This new complexity on the hardware side is introducing even more complication on the software side, where things were already notoriously difficult to verify. Now, a software bug can lead not only to unintended results or a crashed application; it can also put the device into an incorrect or illegal power state, resulting in dramatically lower battery life, device failure, or possibly even spontaneous combustion due to excessive current.
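One defensive pattern, sketched here with an invented set of states and transitions rather than any real platform’s power-management API, is to treat the power architecture as an explicit state machine and reject illegal transitions outright instead of letting buggy software leave the hardware in an undefined state:

```python
# Hypothetical guard against illegal power-state transitions.
# The states and transition table are made up for illustration;
# real platforms define their own, often much larger, sets.

LEGAL = {
    "OFF":    {"SLEEP"},            # power-up must go through SLEEP
    "SLEEP":  {"OFF", "ACTIVE"},
    "ACTIVE": {"SLEEP"},            # no direct ACTIVE -> OFF
}

class PowerDomain:
    def __init__(self) -> None:
        self.state = "OFF"

    def request(self, new_state: str) -> None:
        if new_state not in LEGAL[self.state]:
            # A buggy caller fails loudly here, rather than leaving
            # the hardware in an undefined, possibly high-current state.
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

radio = PowerDomain()
radio.request("SLEEP")
radio.request("ACTIVE")    # fine
# radio.request("OFF")     # would raise: must return to SLEEP first
```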
We’ve already seen many well-publicized instances of major mobile operating-system (OS) software releases that were quickly patched to “improve battery life.” The task of verifying a new mobile OS release in light of the expanding power architecture complexity is growing at a pace that is difficult to comprehend.
Previous ideas in software verification regarding code coverage and quality need to be updated to include concepts like, “What happens if a module, such as a GPS receiver, is powered down or cannot receive a reliable signal?” or, “If an error occurs, what are the power implications of a hardware wait or retry loop?” or, worse yet, “How do these additional problems impact concurrent applications and expand potential deadlocks?”
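As a sketch of that last point, consider a retry loop. The connect callback, delays, and retry counts below are invented for illustration; the idea is that a bounded loop that sleeps between attempts gives the power manager idle time to drop into a low-power state, whereas a tight busy-wait keeps the CPU and radio pinned awake until the battery is gone:

```python
import time

# Hypothetical illustration of the "power implications of a retry
# loop." A bounded retry with exponential backoff creates idle gaps
# the power manager can exploit, and eventually gives up instead of
# draining the battery forever. connect is a stand-in callback.

def connect_with_backoff(connect, max_tries: int = 5,
                         base_delay_s: float = 0.5) -> bool:
    for attempt in range(max_tries):
        if connect():
            return True
        # Exponential backoff: idle time for the power manager.
        time.sleep(base_delay_s * (2 ** attempt))
    return False   # bounded failure, not an endless hot loop

# Usage: a flaky server that succeeds on the third attempt.
tries = iter([False, False, True])
print(connect_with_backoff(lambda: next(tries), base_delay_s=0.001))  # True
```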
Trouble In My Phone
In my most recent smart-phone OS upgrade, I’ve definitely noticed occasional behavior that smacks of some kind of communications deadlock. The device sometimes hangs while trying to connect to a server via the 3G data connection. Unfortunately, the radio is one of the subsystems that draw significant power, so the result is a hot device and a fully drained battery in a short period of time.
Of course, the problem isn’t easily reproducible and doesn’t occur very frequently, so it could be related to anything from the specific group of applications running at the time, to the orientation of the device’s antenna, to the load on the network at the other end. These problems will continue to grow as we push complexity upward. The challenge to the industry is to create better ways of verifying the entire system, including the firmware and operating software, to ensure correct functionality.
In this case, resetting the phone breaks the deadlock and allows normal operation after the communications subsystem is reinitialized. Setting the phone to “airplane mode” to turn off the radio, and then turning it back on, also solves the problem in most instances, but strangely, not always. Maybe that flight attendant knew what she was talking about after all!