Latest from Embedded

Brainchip Platform Uses Spiking Neural Networks for Low Power Operations

As AI Takes Off, Chipmakers Pump Up Performance (Download)

Feb. 1, 2024

Read this article online.

It’s hard to miss that AI is trending strongly across the industry. Demand is rising everywhere, including at the edge, and it's driving upgrades of existing semiconductor products. Suppliers are targeting new data types and new instruction sets as well as introducing new products.

Each of these initiatives aims to squeeze more AI-optimized computing from small devices that generally must run on the bare minimum of power at a rock-bottom price point. A big challenge is that the inferencing used in more advanced AI depends heavily on large language models (LLMs), which require enormous computing resources for training. For the most part, it also forces edge devices to rely on high-bandwidth communication with more powerful resources in the cloud or a data center, which can be both inconvenient and power-intensive.
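The article doesn’t name the specific data types suppliers are adopting, but a common reason narrower formats matter at the edge is simple arithmetic: fewer bits per weight means less memory, less bandwidth, and less energy per operation. The sketch below is a generic, illustrative example of symmetric 8-bit post-training quantization using only NumPy; the scaling scheme and function names are assumptions for illustration, not any particular vendor’s format or toolchain.

```python
# Minimal sketch of symmetric int8 post-training quantization.
# Illustrative only -- a generic textbook scheme, not a specific
# chipmaker's data type or tooling.
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 plus a single per-tensor scale."""
    scale = np.abs(weights).max() / 127.0          # largest magnitude maps to +/-127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Approximate reconstruction of the original float weights."""
    return q.astype(np.float32) * scale

w = np.random.randn(256, 256).astype(np.float32)   # toy weight matrix
q, s = quantize_int8(w)

print("fp32 bytes:", w.nbytes)     # 262,144
print("int8 bytes:", q.nbytes)     # 65,536 -- a 4x reduction
print("max abs error:", np.abs(w - dequantize(q, s)).max())
```

The 4x drop in footprint is the point: smaller weights mean less on-chip memory and less traffic to the cloud, which is exactly the power and bandwidth pressure edge devices are trying to relieve.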