From Data Center to End Device: AI/ML Inferencing with GDDR6
Aug. 4, 2020
Created to support 3D gaming on consoles and PCs, GDDR memory delivers the performance that makes it an ideal solution for AI/ML inferencing. As inferencing migrates from the heart of the data center to the network edge, and ultimately to a broad range of AI-powered IoT devices, GDDR memory’s combination of high bandwidth, low latency, power efficiency and suitability for high-volume applications will be increasingly important. The latest iteration of the standard, GDDR6 memory, pushes per-pin data rates to 18 gigabits per second and device bandwidths to 72 gigabytes per second.
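The relationship between the two headline figures can be checked with simple arithmetic. The sketch below assumes a standard GDDR6 device with a 32-bit data interface (organized as two 16-bit channels); the function name and parameters are illustrative, not part of any standard API.

```python
# Sketch: deriving GDDR6 device bandwidth from the per-pin data rate.
# Assumption: a standard GDDR6 device exposes a 32-bit data interface
# (two 16-bit channels), per the JEDEC device organization.

def device_bandwidth_gbytes(pin_rate_gbps: float, bus_width_bits: int = 32) -> float:
    """Aggregate device bandwidth in GB/s: per-pin rate x bus width / 8 bits per byte."""
    return pin_rate_gbps * bus_width_bits / 8

# 18 Gb/s per pin across 32 pins -> 576 Gb/s -> 72 GB/s
print(device_bandwidth_gbytes(18))  # prints 72.0
```

The same arithmetic explains why a graphics card or inferencing accelerator with several GDDR6 devices in parallel can reach aggregate bandwidths of hundreds of gigabytes per second.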
- The evolution of GDDR memory and why it is ideally suited to the needs of AI/ML inferencing
- Application of GDDR6 to support ADAS and autonomous vehicles
- Interface solutions that enable implementation of GDDR6 memory