Latest from Embedded


As AI Takes Off, Chipmakers Pump Up Performance (Download)

Feb. 1, 2024


It’s hard to miss that AI is trending strongly across the industry. Demand is rising everywhere, including at the edge, prompting upgrades to existing semiconductor products. Suppliers are targeting new data types and new instruction sets as well as introducing new products.

Each of these initiatives aims to squeeze more AI-optimized computing out of small devices that generally must run on the bare minimum of power at a rock-bottom price point. A big challenge is that the inferencing used in more advanced AI depends heavily on large language models (LLMs), which require enormous computing resources for training. It also, for the most part, forces edge devices to rely on high-bandwidth communication with more powerful resources in the cloud or a data center, which can be inconvenient and power-intensive.