
These Trends Will Shape Embedded Technology in 2017

Dec. 14, 2016
Despite its ups and downs, IoT still headlines the embedded arena, with AI and augmented/virtual reality making greater inroads in this rather turbulent market.

It looks like 2017 is shaping up to be an interesting year in the embedded-technology space—for a variety of reasons. We’re coming off a year that saw a number of large mergers, and the change in the U.S. political climate is significant.

Tom Starnes, analyst with Objective Analysis, notes, “One country's loss is another’s gain as corporations try to hold on to as much of their revenues as they can in very complex and competitive markets for sophisticated technology. Meantime, the very-global chip industry has been consolidating at a rate and with a mass not seen before, with all but a few companies struggling for profitability.”

Jim Handy, the firm’s memory analyst, says, “Objective Analysis expects the memory business to be highly profitable in 2017, followed by a collapse beginning in mid-2018.”

Not only will the chip industry be chaotic; most of the related technologies will see new and disruptive advances as well. To keep things manageable, I’ll highlight just five hot topic areas:

• The Internet of Things (IoT)

• Processor technology

• Artificial intelligence

• Storage

• Virtual reality and augmented reality

1. SiFive’s Freedom 32-bit RISC-V E310 powers the Arduino-compatible HiFive1 board.

The IoT and Embedded Software

The Internet of Things (IoT) will continue to be a market that involves everyone, but confusion reigns supreme. Developers get a break if they first determine whether they’re targeting consumer, commercial, or industrial IoT, because these segments have very different networking, management, and security characteristics (see “What’s the Difference Between Consumer and Industrial IoT?”).

IoT-specific hardware will benefit many applications, but software remains the key to successful IoT solutions. “Software continues to take on a proportionally larger role in functionality, safety, and security for IoT devices, relative to hardware,” says Andrew Girson, CEO of Barr Group. “The enforcement of discipline and quality in the coding process through coding standards such as MISRA is growing. However, as indicated by Barr Group’s 2016 Embedded Safety & Security Survey, as a profession, we still have a long way to go when it comes to using industry best practices for the development of safer and more secure embedded systems.”

Developers who deliver quality code, and not just for safety- and security-related applications, will have an edge in the IoT space, because over-the-air updates only address problems after the fact. Wireless connectivity makes field fixes much cheaper, but the cost of a bug still grows exponentially as code moves from the developer through testing to deployment.

“The reality is, safety and security are clearly coupled. We believe that systems are not safe without security,” says Jim McElroy, vice president of marketing at LDRA. “Industry concern is accelerating pertaining to security, and in 2017, we expect to see our customers needing to comply with the developing security standards, as they have done in the functional safety area.”

MISRA is one of the best options for C and C++ developers, although languages like Java and even Ada offer built-in safety features that should not be overlooked. Good tools, such as static analyzers, will return significant benefits to IoT developers.
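
To make that concrete, here’s a small sketch (mine, not drawn from any of the surveys above) of the kind of defect that MISRA C rules and static analyzers routinely flag: an ignored return value and an implicit narrowing conversion, shown alongside compliant alternatives. The sensor-read function is hypothetical.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical sensor-read API, for illustration only:
 * returns 0 on success, non-zero on failure. */
static int read_sensor(uint16_t *out)
{
    *out = 512u;   /* stand-in for a real hardware access */
    return 0;
}

int main(void)
{
    uint16_t raw = 0u;

    /* Non-compliant: ignoring the return value hides failures.
     * MISRA C:2012 Rule 17.7 requires the value returned by a
     * non-void function to be used.
     *
     *     read_sensor(&raw);
     */

    /* Compliant: check the result explicitly. */
    if (read_sensor(&raw) != 0)
    {
        return 1;
    }

    /* Compliant: make the narrowing conversion explicit instead of
     * assigning a wider type to a narrower one implicitly (MISRA
     * C:2012 Rule 10.3 restricts such assignments). */
    uint8_t scaled = (uint8_t)(raw / 4u);

    (void)printf("scaled = %u\n", (unsigned int)scaled);
    return 0;
}
```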

Developers will also need to contend with more security-related hardware features. This is good news for those who can take advantage of the hardware, but it means security must be incorporated into the application’s design from the start.

Processor Technology

Advances in processor technology will continue to push IoT applications, though all embedded applications will benefit from higher performance, lower power requirements, and increased connectivity. In particular, ARM and its partners have made a significant investment in IoT solutions. The coming year will see delivery of the new Cortex-M23 and Cortex-M33 platforms, which incorporate advanced security support as a standard component (see “Cortex-M23 and M33 Incorporate Latest TrustZone Features”).
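
For a feel of what the new TrustZone support means to software, here’s a minimal sketch of a secure-side entry point using the CMSE extensions in current ARM toolchains (built with -mcmse); the function and the key it protects are invented for illustration, not taken from any vendor SDK.

```c
/* Secure-side code for an ARMv8-M part such as a Cortex-M23/M33.
 * Build the secure image with CMSE enabled, e.g.:
 *   arm-none-eabi-gcc -mcmse ...
 */
#include <arm_cmse.h>
#include <stdint.h>

static uint32_t device_key = 0xC0FFEEu;  /* stays in secure memory */

/* cmse_nonsecure_entry marks a legal call target for non-secure
 * code; the toolchain places an SG (secure gateway) veneer in
 * non-secure-callable memory and clears registers on return so
 * secure state does not leak back to the caller. */
uint32_t __attribute__((cmse_nonsecure_entry)) secure_sign(uint32_t nonce)
{
    /* Non-secure callers receive a derived value, never the key. */
    return nonce ^ device_key;
}
```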

However, developers should not overlook ARM alternatives. MIPS and x86 solutions often make more sense depending on the application. In addition, a new player has arrived, especially in the custom system-on-chip (SoC) arena: RISC-V (see “First Open Source RISC-V Chip Arrives”).

RISC-V is actually an instruction set architecture (ISA) that provides the same type of application and tool portability exhibited by ARM, MIPS, and x86 platforms. RISC-V is finally available as a standard chip in the form of SiFive’s 32-bit Freedom E310, available on the HiFive1 Arduino-compatible board (Fig. 1). Microsemi is also including RISC-V soft-core support in its line of FPGAs. These two platforms will prove handy by allowing developers to kick the tires. However, the big impact will be felt with custom SoCs. RISC-V is also defined for 64- and 128-bit architectures, but the 64-bit space will be a much more difficult nut to crack.
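
Because the HiFive1 presents itself as Arduino-compatible, kicking the tires can start with the classic blink sketch below, written in the Arduino dialect. The LED pin number is an assumption; check the board’s documentation for the actual built-in LED mapping.

```c
/* Arduino-style blink sketch for an Arduino-compatible board such
 * as the HiFive1. LED_PIN is an assumption; consult the board
 * files for the real built-in LED mapping. */
#define LED_PIN 13

void setup(void)
{
    pinMode(LED_PIN, OUTPUT);     /* drive the LED pin as an output */
}

void loop(void)
{
    digitalWrite(LED_PIN, HIGH);  /* LED on */
    delay(500);                   /* wait 500 ms */
    digitalWrite(LED_PIN, LOW);   /* LED off */
    delay(500);
}
```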

2. Intel’s Xeon Phi is bootable and has Omnipath connectivity support.

ARM’s 64-bit solutions dominate a number of application areas, such as smartphones, and its push into the enterprise continues to gain steam and supporters, with high core counts per chip as the driving force. Of course, Intel will not be outdone: its Skylake-X family is expected in mid-2017, and the Xeon family will remain the mark to hit.

GPGPUs have garnered a place in high-performance computing, where deep neural nets (DNNs) and deep-learning applications continue to gain in importance. Intel’s Knights Mill Xeon Phi, due to debut in 2017, is purpose-built for DNN processing. The existing many-core Xeon Phi already occupies this space, and the latest release provides a bootable platform with more than 60 cores and on-chip support for Omnipath connectivity (Fig. 2).

Artificial Intelligence

DNNs and deep learning are just one aspect of artificial intelligence (AI), yet they’re the hot topic right now. AI is moving quickly from research to production because it can address such a wide range of applications, including object identification, a critical capability for applications like automotive advanced driver assistance systems (ADAS). DNNs are also being used to improve handwriting recognition, noise cancellation, and much more.

Keep in mind, though, that DNNs work well for many applications but are not a definitive solution for every one of them. In fact, they will not replace most procedural control logic.

DNN solutions can often be set up as a black box, making them easier to incorporate into an application that might need something like object recognition. On the other hand, coming up with new networks for new applications can be very challenging.
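
To ground the black-box point: once trained, a DNN reduces at inference time to deterministic arithmetic that embeds cleanly in an application. The fragment below is a minimal sketch of a single fully connected layer with a ReLU activation, with toy weights standing in for a trained network.

```c
#include <stdio.h>

#define N_IN  3
#define N_OUT 2

/* Toy weights and biases standing in for a trained network. */
static const float W[N_OUT][N_IN] = {
    { 0.5f, -1.0f,  0.25f },
    { 1.5f,  0.0f, -0.5f  },
};
static const float b[N_OUT] = { 0.1f, -0.2f };

/* One fully connected layer plus ReLU: out = max(0, W*in + b). */
static void dense_relu(const float in[N_IN], float out[N_OUT])
{
    for (int i = 0; i < N_OUT; i++) {
        float acc = b[i];
        for (int j = 0; j < N_IN; j++) {
            acc += W[i][j] * in[j];
        }
        out[i] = (acc > 0.0f) ? acc : 0.0f;  /* ReLU activation */
    }
}

int main(void)
{
    const float input[N_IN] = { 1.0f, 2.0f, 3.0f };
    float output[N_OUT];

    dense_relu(input, output);
    printf("out = [%f, %f]\n", (double)output[0], (double)output[1]);
    return 0;
}
```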

Paul Pilotte, technical marketing manager at MathWorks, notes, “In 2017, data science for the masses will enable growth in businesses across a range of industries. The workflows for deep learning, prescriptive analytics, and big data will become more accessible, making it more cost-effective for businesses to train existing domain experts rather than risk finding and onboarding data scientists.
“Also in 2017, we will see a demand for intelligently integrated analytics across heterogeneous systems, driven by high volumes of sensor data from IoT systems and the need for real-time decision-making. Data processing and predictive algorithms will need to work efficiently across IT systems, IoT aggregators, hybrid clouds, on-board sensors, and in complex embedded systems.”

Don’t overlook other AI technologies; quite a few are already in use, ranging from expert systems to behavior-based robotics, and they are often complementary to DNNs. The interest in DNNs is likely to spur interest in these other approaches as well.

Storage

Storage is another key component of IoT and AI, and it’s expected to undergo significant change in 2017. DRAM and flash storage will remain dominant, but new technologies like Intel’s long-awaited 3D XPoint will challenge the status quo.

Today, 3D chip technology has become commonplace, and even high-capacity, high-performance DRAM DIMMs are heading toward 3D die stacking. High-bandwidth memory (HBM) puts significantly more storage in-package, with a much wider bus than off-chip memory can provide.

One major change revolves around more pervasive software support for non-volatile DIMMs (NVDIMMs). JEDEC now has three definitions for this technology:

• NVDIMM-N is a DRAM/flash memory hybrid that will save the DRAM contents to flash upon power failure.

• NVDIMM-F is an all-flash DIMM. It operates more slowly than DRAM, but provides much higher capacity.

• NVDIMM-P is a hybrid that presents persistent memory to the system with both byte- and block-addressable access.

Support in platforms such as database servers and memory-caching systems makes it possible to incorporate this technology immediately. Even more benefits will arrive as operating systems and applications take advantage of the underlying hardware.
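
To see what taking advantage of persistent memory looks like to an application, the sketch below updates durable state with ordinary loads and stores, using a memory-mapped file and msync() as a portable stand-in for an NVDIMM-backed region; the file name and record layout are invented for illustration.

```c
#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

/* A tiny durable record; the layout is invented for illustration. */
struct counter {
    unsigned long value;
};

int main(void)
{
    /* A regular file stands in for an NVDIMM-backed region here;
     * with real persistent memory, the mapping would come from a
     * DAX-capable device instead. */
    int fd = open("counter.bin", O_RDWR | O_CREAT, 0644);
    if (fd < 0 || ftruncate(fd, sizeof(struct counter)) != 0) {
        perror("open/ftruncate");
        return 1;
    }

    struct counter *c = mmap(NULL, sizeof(*c), PROT_READ | PROT_WRITE,
                             MAP_SHARED, fd, 0);
    if (c == MAP_FAILED) {
        perror("mmap");
        return 1;
    }

    c->value += 1;                   /* update state with a plain store */
    msync(c, sizeof(*c), MS_SYNC);   /* force the update to be durable */
    printf("count = %lu\n", c->value);

    munmap(c, sizeof(*c));
    close(fd);
    return 0;
}
```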

NVM Express (NVMe) drives are showing up on more motherboards via M.2 and U.2 connectors. SATA and SAS still offer many benefits, but performance is driving adoption of NVMe. These platforms offer interesting alternatives in the embedded space, where small-form-factor boards can take advantage of large amounts of flash memory in compact packages.
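
From user space, an NVMe namespace reads like any other block device. The sketch below pulls one block using O_DIRECT and a properly aligned buffer; the device path and block size are assumptions to verify on the target system.

```c
#define _GNU_SOURCE        /* for O_DIRECT on Linux */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

#define BLOCK_SIZE 4096    /* assumed logical block size; query the
                            * device in real code */

int main(void)
{
    /* Device path is an assumption; opening it typically requires
     * elevated privileges. */
    int fd = open("/dev/nvme0n1", O_RDONLY | O_DIRECT);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* O_DIRECT requires the buffer to be aligned for the device. */
    void *buf = NULL;
    if (posix_memalign(&buf, BLOCK_SIZE, BLOCK_SIZE) != 0) {
        close(fd);
        return 1;
    }

    ssize_t n = read(fd, buf, BLOCK_SIZE);  /* read the first block */
    printf("read %zd bytes\n", n);

    free(buf);
    close(fd);
    return 0;
}
```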

Virtual and Augmented Reality

Virtual-, augmented-, and merged-reality solutions will move from novelties to useful tools in many areas. Applications like Scope AR’s WorkLink (Fig. 3) have catapulted AR into the commercial and industrial space (see “Putting Augmented Reality to Work”).

3. Scope AR’s WorkLink brings augmented reality to tablet and smartphone users.

“To date, augmented reality has consisted of a relatively unsophisticated first-generation set of hardware, with app developers focused on creating game-like experiences that don’t really demonstrate AR’s value,” notes David Oh, head of developer relations at Meta. “However, in 2017, there will emerge more sophisticated hardware technologies like our Meta 2 (Fig. 4).”

Different companies will focus on different things as they bring products to market. Many will zero in on entertainment because that’s the low-hanging fruit; others will concentrate on solutions that build on existing operating systems.

Virtual-reality systems with higher resolution will challenge available display and processor technology, while wireless and mobile solutions will be taxed by weight and power concerns. Gaming will remain a major thrust for VR, but more applications will emerge in 2017.

4. The Meta 2 glasses address advanced augmented-reality applications.

Complementary technologies like 3D imaging and Ultrahaptics’ ultrasonic haptic response system (see “Ultrasonics Brings Haptics to Augmented and Virtual Reality”) will make these reality systems much more usable. As with IoT, developers need to understand the differences and advantages of these technologies to better match them to their applications.
