Cloud Affords Clear View of Test Data
Clouds certainly have silver linings, at least when it comes to cloud computing. According to IHS iSuppli, cloud servers will enjoy robust growth in 2012 and move on to become the fastest-growing segment of the server industry within three years. The research firm predicts that, spurred on by the rising use of online storage and infrastructure systems to support services like Apple’s iCloud, shipments of cloud servers will reach 875,000 units in 2012, up 35% from 647,000 in 2011 and nearly double the 460,000 in 2010.
The firm anticipates growth rates ranging from 23% to 30% for each of the next three years, with cloud-server shipments hitting approximately 1.8 million units in 2015. The firm reports that the five-year compound annual growth rate for cloud servers beginning in 2010 is 31%—five times greater than the forecast for the total server market.1
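As a rough check on that figure, a back-of-the-envelope calculation using the approximate 2010 baseline and 2015 forecast cited above does land near 31%:

```python
# Back-of-the-envelope check of the reported CAGR, using the approximate
# shipment figures cited above (460,000 units in 2010, ~1.8 million in 2015).
base_2010 = 460_000
forecast_2015 = 1_800_000
years = 5

cagr = (forecast_2015 / base_2010) ** (1 / years) - 1
print(f"Implied five-year CAGR: {cagr:.1%}")  # prints roughly 31%
```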
As the cloud grows, you might find it a convenient place to store your data and access it from anywhere. But if you start looking into cloud computing, you might find your vision clouded by acronyms such as IaaS, PaaS, and SaaS. They stand, respectively, for infrastructure as a service, platform as a service, and software as a service. These three models of operation, typified by Amazon Web Services, Microsoft Azure, and Salesforce.com, describe what a cloud service provider will offer you, what challenges you might face, and what your monitoring responsibilities will be.2
Handling the IT Details
One company that can help with cloud computing challenges is National Instruments, which has introduced the Technical Data Cloud (TDC). Michael Neal, LabVIEW product marketing manager, commented on the situation faced by engineers that led NI to develop the TDC: “Going back decades, NI has had customers who have been required by the necessities of their applications to roll their own solutions when it came to networking, storage, and sharing data. Whether it’s a manufacturing test system or remote monitoring of a machine to determine when to perform maintenance, the burden has been on engineering, often in partnership with IT, to come up with some kind of solution.”
Neal said that developing such a robust, fully featured, redundant solution is challenging and “really not what your typical electrical engineer or mechanical engineer went to school for or learned on the job. It’s really outside their area of expertise, which typically is building or testing products.” Working with IT can present its own challenges, he said, in getting the two teams to cooperate, establish budgets, and align goals.
The commercial availability of cloud computing, in contrast, provides a new approach: rather than trying to build their own solutions internally, customers now can realize many advantages by moving to the cloud.
“Typically the cloud is going to give you a very dynamic environment to work in—meaning you can buy a little bit or a lot, whatever it is you need—without overspending,” Neal said. “You basically design your application to run on someone else’s servers in their data center,” with the added advantage that you can obtain the necessary resources quickly without having to make a capital investment and hire operations people.
Neal acknowledged that if you have some programming capability—you took a course in C or C# or Linux—you probably could hack something together without involving corporate IT to get your solutions finished. “That’s all well and good, but reality is you are still building your own solution, and you will be responsible for designing it, testing it, and maintaining it,” he continued. “And that’s something that is extremely complicated—writing code that runs in the cloud is really not the same as writing code that runs on your desktop computer—making sure that it scales properly, that your application doesn’t go down if there is an outage in a particular data center for the vendor that you’ve chosen, and that it’s able to be resilient and shifted to a different data center in a different part of the world seamlessly.
“Even though the hardware is maintained 24×7 by the vendor, your application is your own responsibility,” Neal continued. “If you are talking about a mission-critical part of your system, you can’t afford to have it go down. You have to find a way to staff 24×7 operations support for the software.”
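To appreciate Neal’s point, consider the kind of failover logic a roll-your-own cloud application has to carry by itself. The fragment below is purely illustrative; the endpoints, timeout, and retry policy are hypothetical placeholders, not any particular vendor’s API or recommended practice:

```python
# Illustrative only: endpoint URLs, timeouts, and retry counts are hypothetical
# placeholders, not any particular cloud vendor's API or policy.
import time
import urllib.request

REGIONS = [
    "https://data.example-primary-region.invalid/upload",    # preferred data center
    "https://data.example-secondary-region.invalid/upload",  # failover data center
]

def publish(payload: bytes, retries_per_region: int = 3) -> bool:
    """Try each region in turn, backing off and retrying before failing over."""
    for endpoint in REGIONS:
        for attempt in range(retries_per_region):
            try:
                req = urllib.request.Request(endpoint, data=payload, method="POST")
                with urllib.request.urlopen(req, timeout=10) as resp:
                    if resp.status == 200:
                        return True
            except OSError:
                time.sleep(2 ** attempt)  # exponential backoff before the next try
    return False  # neither region reachable; the caller must buffer data locally
```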
Neal said NI’s goal was to help offload such responsibilities from its customers: “That’s where we moved in with our new service called TDC. We want to make it very easy for our customers to publish data to the cloud, primarily using LabVIEW (Figure 1), but it’s really totally open. It could be from a C application or from a Visual Basic application. But we made it easiest and most practical to do in LabVIEW.”
Figure 1. LabVIEW Code Snippet Illustrating the Use of the TDC API to Send Data to the Cloud
Courtesy of National Instruments
With the TDC, customers can send and retrieve data, and NI takes care of all the work behind the scenes. Said Neal, “We’ve written the application that runs on the cloud, collects the data, stores it, and sends it back out when requested. We staff the operational support for the TDC, we add features, and we perform maintenance. Our customers are able to leverage this emerging technology, and it becomes something they can put in their hands very quickly,” without having to provide their own IT expertise.
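For a sense of what publishing a measurement to a hosted data service involves under the hood, the sketch below shows the general tag-and-post pattern over HTTP. To be clear, this is not the TDC API, which NI exposes through LabVIEW (Figure 1) and other languages; the endpoint, header, and function names here are hypothetical:

```python
# Hypothetical sketch of posting a tagged, timestamped value to a hosted data
# service; the endpoint, credential header, and field names are placeholders,
# not the actual NI TDC API.
import json
import time
import urllib.request

SERVICE_URL = "https://example-data-cloud.invalid/tags"  # placeholder endpoint
API_KEY = "your-account-key"                              # placeholder credential

def publish_value(tag: str, value: float) -> None:
    """Send one timestamped value for a named tag to the hosted service."""
    body = json.dumps({"tag": tag, "timestamp": time.time(), "value": value}).encode()
    req = urllib.request.Request(
        SERVICE_URL,
        data=body,
        headers={"Content-Type": "application/json", "X-Api-Key": API_KEY},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=10):
        pass  # a real client would check the response and retry on failure

publish_value("dut.supply_current_A", 0.42)
```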
The TDC now runs on servers that are part of Microsoft Azure. Said Neal, “We’ve selected Microsoft as our vendor; Microsoft maintains all the security and physical hardware. But we designed the TDC so that if we decided tomorrow to switch to a different vendor we have the flexibility to do that.”
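NI has not described the TDC’s internal architecture, but one common way to preserve that kind of vendor flexibility is to keep all vendor-specific storage calls behind a thin abstraction layer. The hypothetical sketch below illustrates the pattern; the class and method names are illustrative, not NI’s:

```python
# Sketch of the vendor-abstraction pattern; class and method names are
# illustrative, not NI's actual design.
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """The only storage interface the rest of the application sees."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(BlobStore):
    """Stand-in backend for testing; an Azure- or AWS-backed class would
    implement the same two methods using the respective vendor SDK."""

    def __init__(self) -> None:
        self._data: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._data[key] = data

    def get(self, key: str) -> bytes:
        return self._data[key]

# Switching vendors then means swapping one constructor, not the application code.
store: BlobStore = InMemoryStore()
store.put("run-42/results.tdms", b"acquired data")
```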
Reaching the Cloud
Neal said users can create an account on the TDC and start sending data within 10 minutes, and he noted that storage costs are continually falling. But one aspect you’ll need to take into account is how you plan to get your data to the cloud. Prospective TDC users often are involved in remote-monitoring applications (monitoring snow load on the roofs of commercial buildings, for example) and may use a nearby cell tower as their Internet access point. If you’re sending gigabytes upon gigabytes of data, you could find your AT&T bill running to thousands of dollars per month, much higher than your cloud-computing bill.
In such cases, you would be well advised to reduce the amount of data you need to send to the cloud in real time. Neal suggested incorporating intelligence at your remote-monitoring node to process acquired data so that raw data need not be transmitted or at least not in real time. He cited as an example an in-vehicle test of an electronic control unit. The customer would send critical real-time data every five minutes over a 3G network. Raw data would be sent overnight when the test vehicle returned to a garage with Wi-Fi connectivity.
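A minimal sketch of that strategy appears below, assuming a hypothetical node that samples continuously, pushes summary statistics over the cellular link every five minutes, and buffers the raw samples for a bulk upload once a cheaper connection is available. The sensor read and upload functions are placeholders, not any specific product’s API:

```python
# Minimal sketch of edge-side data reduction; the sensor read and cellular
# upload are simulated placeholders, not a specific product's API.
import random
import statistics
import time

RAW_BUFFER: list[float] = []   # raw samples held for the overnight bulk upload
SUMMARY_INTERVAL_S = 300       # send a summary every five minutes over cellular

def read_sensor() -> float:
    # Placeholder for the actual DAQ read; returns a simulated value here.
    return random.uniform(20.0, 30.0)

def send_over_cellular(summary: dict) -> None:
    # Placeholder for the small, time-critical upload over the 3G link.
    print("summary sent:", summary)

def collect(duration_s: float, sample_period_s: float = 0.1) -> None:
    window: list[float] = []
    next_summary = time.monotonic() + SUMMARY_INTERVAL_S
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        sample = read_sensor()
        window.append(sample)
        RAW_BUFFER.append(sample)  # kept locally; uploaded later over Wi-Fi
        if time.monotonic() >= next_summary:
            send_over_cellular({
                "mean": statistics.fmean(window),
                "min": min(window),
                "max": max(window),
                "count": len(window),
            })
            window.clear()
            next_summary += SUMMARY_INTERVAL_S
        time.sleep(sample_period_s)
```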
Semiconductor Test Data
Another company pursuing a cloud approach on behalf of its customers is OptimalTest, which focuses on the semiconductor supply chain, acquiring data from foundries and test houses and making it available to its fabless or fab-light customers.
OptimalTest has reported success in providing supply-chain visibility to support real-time data collection, secure data transfer, and data analysis for its customer Qualcomm.3 The Qualcomm approach does not employ a public cloud, but Danny Glotter, founder and CEO of OptimalTest, said the company is engaged in a pilot cloud-computing offering based on Amazon AWS.
Michael Schuldenfrei, OptimalTest’s chief software architect, said, “We are working on a next-generation solution that will include support for running our applications on the cloud.” That approach, he said, will be suitable for smaller customers whose production runs require just a small number of testers. Such customers, he said, “won’t need any IT infrastructure at all. We will have the data uploaded from the test houses to our cloud,” with OptimalTest handling functions such as data security, ensuring physical separation of different customers’ data within the cloud deployment. Added Glotter, “We did lots of work in that area to make sure we got the right network topology to guarantee security.”
Glotter noted that of the four fabless clients he has approached, two said they did not want to move to a cloud approach, while the other two said they were very open to it.
Test Equipment for the Cloud
The cloud provides not only a means of storing and accessing test data but also a market for innovation in test equipment, an opportunity that test vendors intend to seize. Ixia, for example, announced in February its participation in the first public test of a cloud infrastructure, conducted in conjunction with EANTC and Cisco.
In the EANTC-led testing, Ixia hardware test ports (in the form of Xcellon-Flex AP load modules) emulated millions of Internet or campus users in a next-generation IP network to exercise various public/private cloud scenarios. In addition, Ixia’s ImpairNet channel-impairment products emulated a range of wide-area-network conditions.
“Public cloud networks are complex and evolving architectures that must continually be evaluated for optimum performance and security,” said Eddie Arrage, market development manager at Ixia, in a press release. “This EANTC demonstration of a Cisco-based design using Ixia’s cloud testing tools and solutions highlights the requirements needed for deploying a high-functioning and real-world-proof cloud network.”
Agilent Technologies, too, sees test opportunity in the cloud. Speaking at DesignCon, Ross Nelson, general manager for digital debug solutions in Agilent’s Electronic Measurement Group, said the nature of design and test is changing rapidly because of the increasing dominance of mobile devices and cloud computing. He said increasingly pervasive mobile-computing devices now are driving the development of new technology.
According to Nelson, mobile devices now enjoy sufficient bandwidth via Wi-Fi, 3G, and 4G networks to place heavy demands on the cloud servers that must handle the data those devices generate and receive, driving innovation in the servers that enable the cloud. As end-user demand shifts from client/laptop devices to smart mobile devices, he said, the cloud will continue to drive server-computing innovation.
This combined shift, driven by innovation in mobile and cloud computing, Nelson said, “has changed the conversation around test-and-measurement needs to address designers’ debug requirements.” He added, “Innovation is needed in mobile computing and must be continued in server-computing to support growth of the cloud.”
Speaking in advance of the Optical Fiber Conference (OFC), Andreas Gerster, program manager for photonic test at Agilent, said, “The trend toward cloud computing and services drives the importance of optical connectivity for enterprises and datacenters as well as backbone evolution.”
Figure 2. N4392A Optical Modulation Analyzer
Courtesy of Agilent Technologies
Specific instruments that Agilent introduced at OFC to address the test needs of cloud-enabling technologies such as optical connectivity include the N4392A, an ultra-compact, portable optical modulation analyzer with a 15-inch laptop-size screen (Figure 2). It is optimized for daily R&D work and designed for economical performance verification in manufacturing and component test for 40/100G components, modules, and systems.
Also at OFC, Agilent introduced the N4877A Clock Recovery Instrument and the N1078A O/E Converter, which provide instrumentation-grade clock-recovery capabilities for electrical communications signals operating at any rate from 50 Mb/s to 32 Gb/s and multimode and single-mode optical signals to 14 Gb/s and 32 Gb/s, respectively. The N1075A combines both instruments in a compact form factor. In addition, with up to a 67-GHz modulation frequency range, the new N4373D lightwave component analyzer supports S-parameter test of electro-optical components with improved usability and performance over its predecessors.
Beyond Storage
Clearly, the cloud offers many opportunities, whether you want to use it or provide the equipment that helps test it. But the cloud won’t suffice for all applications. MIL/Aero companies, for example, will resist using the public cloud for mission-critical applications. And if your storage and processing requirements fit within a laptop and you don’t need to provide immediate worldwide access to the data you acquire, the cloud might not be for you.
But the cloud is a valuable tool that’s available if and when you need it. Said Neal at NI, “When we talk to customers, they are glad that we give them an option for storing data easily. We treat the cloud like every other off-the-shelf tool” such as a multicore processor or a wireless transceiver.
Most cloud applications now center on data storage, but that will change as compute power itself moves to the cloud. Already, said Neal, NI is offering a cloud approach for speeding the compilation of FPGA code. “What really gets customers excited,” concluded Neal, “is the idea that they can deploy their own custom intelligence in the cloud—that’s when the light bulb seems to go on. They can start seeing all the potential options they would have at a fraction of the cost of doing it themselves.”
References
1. “Servers: When a Cloudy Day is a Good Thing,” IHS iSuppli Topical Report, Compute Platforms, Q4 2011.
2. “Fair to Partly Cloudy: Ensuring Monitoring Clarity in Public Cloud Deployments,” White Paper, Nimsoft, 2011.
3. Nelson, R., “Boosting Semiconductor Yield and ROI,” EE-Evaluation Engineering, March 2012, p. 30.