
As AI Takes Off, Chipmakers Pump Up Performance

Feb. 1, 2024


It’s hard to miss that AI is trending strongly across the industry. Demand for it is rising everywhere, including at the edge, prompting upgrades to existing semiconductor products. Suppliers are targeting new data types and new instruction sets as well as introducing new products.

Each of these initiatives aims to squeeze more AI-optimized computing out of small devices that generally must run on the bare minimum of power at a rock-bottom price point. A big challenge is that inferencing in more advanced AI depends heavily on large language models (LLMs), which require enormous computing resources for training. That dependence also forces edge devices, for the most part, to rely on high-bandwidth communication with more powerful resources in the cloud or a data center, which can be inconvenient and power-intensive.
