
FPGAs and eFPGAs Accelerate ML Inference at the Edge

May 18, 2021
With ML models still evolving rapidly, there is a need for flexible hardware architectures that can accelerate changing models. Learn why FPGAs, which combine high performance with flexibility, are an ideal solution for edge inference applications.

Many industries are rapidly adopting machine learning (ML) to gain insights from the ever-increasing data produced by billions of connected devices. This, combined with demand for low latency, is driving a push to move inferencing hardware closer to where the data is created. This white paper describes why FPGA-based hardware accelerators are needed to eliminate network dependencies, significantly increase performance, and reduce the latency of ML applications.
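The latency argument above can be made concrete with a back-of-envelope comparison: cloud inference pays a network round trip and server-side queuing on every request, while a local FPGA accelerator pays only its own inference time. The numbers below are illustrative assumptions, not figures from this paper:

```python
# Back-of-envelope latency comparison: cloud inference vs. local FPGA inference.
# All numeric values are illustrative assumptions, not measurements.

def cloud_latency_ms(network_rtt_ms: float, server_infer_ms: float, queue_ms: float) -> float:
    """Total latency when input data is shipped to a remote server for inference."""
    return network_rtt_ms + server_infer_ms + queue_ms

def edge_latency_ms(accel_infer_ms: float) -> float:
    """Total latency when inference runs on a local FPGA accelerator (no network hop)."""
    return accel_infer_ms

cloud = cloud_latency_ms(network_rtt_ms=40.0, server_infer_ms=5.0, queue_ms=10.0)
edge = edge_latency_ms(accel_infer_ms=8.0)
print(f"cloud: {cloud:.0f} ms, edge: {edge:.0f} ms")  # cloud: 55 ms, edge: 8 ms
```

Even when the remote server's raw inference time beats the edge accelerator's, the fixed network and queuing overhead dominates the total, which is why moving the accelerator next to the data source wins for latency-sensitive applications.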
