
The AI Evolution Calls for Adaptable Inferencing Platforms

Dec. 22, 2020
Deep learning's demand for computing power is growing at an incredible rate, doubling every three months. Learn why FPGA-based adaptable inferencing platforms are required for the next generation of AI system design.

Deep learning's demand for computing power is growing at an incredible rate, accelerating recently from doubling every year to doubling every three months. Increasing the capacity of deep neural network (DNN) models has shown improvements across areas ranging from natural language processing to image processing. Learn why FPGA-based adaptable inferencing platforms are required for the next generation of AI system design.
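To put that growth rate in perspective, a short back-of-the-envelope calculation (a minimal Python sketch, assuming a constant doubling period) shows how quickly compute demand compounds:

# Sketch: compound growth of compute demand under a constant doubling period.
# The 3-month figure comes from the article; the yearly comparison is illustrative.

def growth_factor(months: float, doubling_period_months: float) -> float:
    """Multiplicative growth in demand over `months`, given a doubling period."""
    return 2.0 ** (months / doubling_period_months)

if __name__ == "__main__":
    print(f"Doubling every 12 months: {growth_factor(12, 12):.0f}x per year")  # 2x
    print(f"Doubling every 3 months:  {growth_factor(12, 3):.0f}x per year")   # 16x

Under that assumption, compute demand grows roughly sixteenfold in a single year, which is the scale of change that motivates adaptable inferencing hardware over fixed-function designs.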
