
Edge AI: Rewards are Matched by Challenges

Nov. 29, 2023
Intelligence and the edge—what could possibly go wrong? Plenty, it turns out, whether it’s power limitations or security concerns, among other issues.

This article is part of the TechXchange: AI on the Edge.

What you’ll learn:

  • Potential risks and problems associated with edge AI development and deployment.
  • Ideas on how to achieve success.
  • Where to go for more ideas and insights.


AI and the edge are made for each other. But that doesn’t mean it’s easy to make edge AI work. In fact, numerous potential “gotchas” can derail an edge AI initiative.

The edge, a huge physical and logical space at the periphery of the enterprise that includes the mobile and vehicular world, is a frontier being explored and exploited more than ever. And, of course, AI has been the biggest buzzword of the last few years. Combining the two makes plenty of sense, since AI has the potential to make the edge more independent of central control, and more useful, too.

Edge AI usually involves applying algorithms that can make decisions and predictions in near-real-time. The general challenge has been to get compute-intensive AI to succeed within edge resource constraints. So, it isn’t something to undertake casually.

One of the rationales for the edge is that local processing reduces latency as well as bandwidth needs and compute loads at a data center. So, the promise of AI at the edge is that it can make edge computing even more effective by pushing decisions closer to the data.

But edge AI can quickly bump up against realities that make it harder than might be expected. (The related topic of “tiny machine learning,” or TinyML, is covered in a separate article.)

Set Your Expectations, Have a Plan

Before diving into edge AI too deeply, spend some time thinking about what you want to accomplish. Is the right kind of data available to support that goal? What kind of processing will be needed and, therefore, what kind of hardware or cloud resources would likely be required?

With some of those ideas organized, you can begin to make some rough calculations about the cost, feasibility, and potential payback of implementing edge AI.

Powering the Edge

Power is a more important consideration at the edge than it would be in the cloud or in a data center. It may be that an AC power source is “dirty,” e.g., subjected to variations or noise due to nearby industrial activities like welding. Or it may be limited by existing wiring or lack thereof.

Many edge devices, such as simple temperature and vibration sensors, can function well for long periods on battery power. Adding local AI hardware and software to such systems can be thwarted by the much greater power demands it brings.

So, efficiency is a necessity, and options that can keep power consumption low are important. Extensive deployment of edge AI will likely require a review of the system-wide power architecture.

Options for implementing power conservation include low-power chips, hardware accelerators to make processing more efficient, and power-management systems that can optimize power use for specific goals.
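
As a rough illustration of the power-management idea, here’s a minimal duty-cycling sketch in Python. The read_sensor() and run_inference() helpers and the 60-second interval are hypothetical stand-ins; on real hardware, the sleep would be a deep-sleep or RTC-wake call rather than time.sleep(), since powering down the whole SoC is where the savings come from.

```python
import time

INFERENCE_INTERVAL_S = 60  # wake once a minute; tune to the power budget

def read_sensor():
    """Hypothetical stand-in for an ADC or I2C sensor read."""
    return 22.5  # e.g., degrees C

def run_inference(sample):
    """Hypothetical stand-in for a local model invocation."""
    return sample > 30.0  # e.g., over-temperature flag

while True:
    sample = read_sensor()
    if run_inference(sample):
        print("anomaly detected, waking radio to report")
    # On real hardware this would be a deep-sleep call, not time.sleep();
    # the device draws meaningful power only during the brief wake window.
    time.sleep(INFERENCE_INTERVAL_S)
```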

Compute and Memory Needs for the Edge

Recognizing how AI functions in a computing environment matters far more under the resource constraints of the edge. For instance, many mainstream microprocessor CPUs consume a great deal of power on iteration-heavy inference workloads, and they tend to run slower than might be desired. Hardware accelerators or even GPUs can help improve performance while reducing power consumption. Systems that can “sleep” when not processing also save energy.
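
To make the accelerator point concrete, here’s a hedged sketch of one common pattern: running a TensorFlow Lite model through a hardware delegate so that inference executes on an attached accelerator rather than the host CPU. The model path is a placeholder, and “libedgetpu.so.1” is the Coral Edge TPU delegate used purely as an example; other accelerators ship their own delegate libraries.

```python
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

# Load a compiled model and hand it to an accelerator via a delegate.
interpreter = Interpreter(
    model_path="model.tflite",  # placeholder path
    experimental_delegates=[load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input shaped to whatever the model expects.
frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()  # runs on the accelerator, not the host CPU
result = interpreter.get_tensor(output_details[0]["index"])
```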

Faced with the limitations of existing AI hardware, NIST researchers are exploring alternatives based on new technologies such as spintronics for neuromorphic computing.

The realities of the edge, limited power among them, usually mean that hardware options for running AI processing are limited, too. But vendors such as TI offer embedded processors that can run AI algorithms. When “low-code” software is added to the mix, the compute and power challenges become more manageable. These processors also offer good performance: TI’s AM62Ax, aimed at battery-powered systems, delivers 2 teraoperations per second (TOPS).

Single-chip microcontrollers designed for IoT systems often combine a general-purpose CPU, SRAM, non-volatile memory (NVM), and I/O functions. However, performance limitations generally confine these devices to basic inference AI software.

Systems-on-chip (SoCs) that include accelerators can potentially deliver more performance. NVM doesn’t always go along for the ride, though, due to chip-geometry limitations and cost.

If cost isn’t critical, higher AI performance can be achieved with a two-chip approach that adds some form of NVM.

Software Side of the Edge

The limits of edge hardware and the need for efficiency have encouraged the creation and adoption of lightweight algorithms and coding. Training, an important element in many AI/ML deployments, is a time-intensive process that can be especially frustrating in an edge environment.

Developing an AI model for the edge involves training it on a dataset that’s representative of the use case and then deploying it to the edge device. If realistic training can be accomplished in another environment, it can save time and allow for more convenient and thorough testing.
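
As a sketch of that train-elsewhere, deploy-to-the-edge workflow, the following uses TensorFlow and its TFLite converter with post-training quantization to shrink the model for a constrained device. This is one common toolchain, not the only one, and the tiny model with its 16-feature input is invented for illustration.

```python
import tensorflow as tf

# Train in a resource-rich environment (workstation or cloud)...
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(16,)),          # e.g., 16 sensor features
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
# model.fit(x_train, y_train, epochs=10)  # representative dataset goes here

# ...then shrink it for the edge with post-training quantization.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# The resulting flatbuffer is what gets deployed to the edge device.
with open("edge_model.tflite", "wb") as f:
    f.write(tflite_model)
```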

Finally, commercially available tools can help create a “low-code” or even “no-code” environment for developing and testing AI models. Some provide guidance to those new to AI programming.

Into the Cloud

Latency is always a concern with edge AI, which often makes cloud options less compelling. Nonetheless, all of the major cloud players, including AWS, Microsoft Azure, and IBM, offer edge AI options.

AI’s Role in Security Efforts?

In general, the edge has often been where security concerns arise. Physical access to the edge (depending on how it’s defined) can be hard to control, potentially allowing bad actors to access or even tamper with edge AI. Data can be lost through physical access as well as through network and device communications.

On the other hand, AI can provide more intelligence and insight at the edge. It could assist security efforts and, more importantly, reduce the amount of data needing to be handled and protected elsewhere in the enterprise.

Still, there are plenty of risks to think about.

Loss of personal data can be a significant concern in some edge AI use cases. For example, customer data can be captured and used at the edge, including by AI. Ensuring that AI handles this data in accordance with company practice and with regulations such as GDPR and the California Consumer Privacy Act (CCPA) is essential, particularly given the value hackers attach to this data when targeting edge sites.
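
One illustrative building block, sketched below, is pseudonymizing customer identifiers at the edge so that raw personal data never leaves the device. This is not a full GDPR/CCPA compliance story; the secret key shown is a placeholder, and in practice it would live in a secure element or key store.

```python
import hashlib
import hmac

# Placeholder only: a real key would be provisioned into secure storage.
SECRET_KEY = b"device-provisioned-secret"

def pseudonymize(customer_id: str) -> str:
    """Replace a raw identifier with a keyed hash before it leaves the device.

    A keyed HMAC (rather than a bare hash) resists simple dictionary
    attacks on predictable identifiers such as email addresses.
    """
    return hmac.new(SECRET_KEY, customer_id.encode(), hashlib.sha256).hexdigest()

# The record forwarded upstream carries no raw PII.
record = {"customer": pseudonymize("jane@example.com"), "score": 0.87}
```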

External attackers can brew up other kinds of trouble for edge AI, too. One of the most serious involves injecting incorrect or extraneous data, or interrupting a desired source of data. The motivation is usually malicious, and the results can range from manufacturing errors to, in the case of vehicle controls, potential injury. An attacker can also use this technique to learn how a system operates, enabling further, more sophisticated attacks; such “inference attacks” may reveal enough data to reverse-engineer systems or products.
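
A first line of defense against injected data is a plausibility filter in front of the model. The sketch below is illustrative only: the temperature thresholds are invented, and a production system would combine such checks with authentication of the data source.

```python
from typing import Optional

# Reject readings outside the sensor's rated range, or readings that jump
# implausibly fast, before the model ever sees them. Thresholds are made up.
TEMP_MIN_C, TEMP_MAX_C = -40.0, 125.0  # hypothetical rated sensor range
MAX_STEP_C = 5.0                       # max credible change between samples

def is_plausible(reading: float, previous: Optional[float]) -> bool:
    if not (TEMP_MIN_C <= reading <= TEMP_MAX_C):
        return False
    if previous is not None and abs(reading - previous) > MAX_STEP_C:
        return False
    return True

last = None
for value in (21.0, 21.4, 98.6, 21.5):  # 98.6 looks injected
    if is_plausible(value, last):
        last = value                    # accept and pass along to the model
    else:
        print(f"rejecting suspect reading: {value}")
```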

Finally, of course, edge systems can be hit by the same kinds of DDoS attacks and viruses that affect other systems. On top of that, insiders, unfortunately, continue to be a source of problems.

Read more articles in the TechXchange: AI on the Edge.

Reference

“Artificial Intelligence-Assisted Edge Computing for Wide Area Monitoring,” NIST.

