Image: Google Tensor Processing Unit (courtesy of Google)

Google Hires Intel CPU Veteran in "Doubling Down" on Server SoCs

March 30, 2021
Amazon, Microsoft, and Google are increasingly open to replacing off-the-shelf server chips, largely designed by Intel, with internally designed chips tailored to their needs.

Google is hiring a former top executive from Intel's design engineering division to head up a new server chip division, adding to the technology industry's efforts to reduce its dependence on Intel.

The technology titan said that it is "doubling down" on internally designed server chips in a bid to improve the energy efficiency and performance of the data centers that run its cloud services. To accelerate those efforts, Google announced last week that it hired Intel engineering executive Uri Frank to lead a newly established division that will design and develop custom server chips.

With Frank's hiring as its vice president of engineering for server chip design, Google is taking a page from the playbooks of its larger rivals in the cloud computing market. Amazon has been developing Arm-based server chips and rolling them out over its AWS cloud service for the last three years. Microsoft is also reportedly working on Arm-based CPUs for data centers.

“Google has designed and built some of the world’s largest and most power efficient computing systems,” Frank said in an announcement of his new role. “For a long time, custom chips have been an important part of this strategy." Frank, who brings more than 20 years of experience in CPU design to Google, said he looks forward to building out the new server processor division.

Amin Vahdat, who serves as Google's VP of systems infrastructure, said the company is assembling a "world-class" engineering team in Israel. He said in a blog post that the unit would focus on working with customers and partners to make systems-on-a-chip (SoCs) for servers, expanding on the wide range of other server chips developed by Google over the last half decade.

In 2015, Google introduced its first tensor processing unit (TPU) to speed up artificial intelligence chores in its data centers and, in 2018, said it would allow other companies to buy access to the chips through its cloud computing service. Then, in 2019, the company launched OpenTitan, an open-source secure chip designed to serve as a system's root of trust.

Google also designs its own data center hardware, ranging from network switches to network accelerator cards, which combine its in-house chips with parts it sources from Intel and other firms.

The company wants to move away from buying all of these components from different vendors and wiring them together on a motherboard. Instead, it is trying to employ more SoCs, which integrate the CPU and other computing modules on the same die, or system-in-package (SiP) designs, which combine different chips in a single package, in its data centers.

“The motherboard has been our integration point, where we compose CPUs, networking, storage devices, accelerators, memory, all from different vendors, into an optimized system,” said Vahdat. “But that’s no longer sufficient: to gain higher performance and to use less power, our workloads demand even deeper integration into the underlying hardware."

By consolidating more of these functions in a single slab of silicon or in a single package, he said, Google could improve the latency and bandwidth between different components by “orders of magnitude." He said these types of server chips are very tightly integrated, offering "greatly reduced" power and cost compared to the discrete components wired together on motherboards today.

"The SoC is the new motherboard," Vahdat said.

Google, Amazon, Microsoft, and other cloud service providers (CSPs) are continuously upgrading the servers and other hardware in their data centers as global demand for cloud services grows. They spend billions of dollars on server chips, largely designed by Intel, which holds around 90% of the server CPU market; these processors cost thousands to tens of thousands of dollars each.

But they are also increasingly turning to custom chip designs better suited to their needs, which promise gains in performance per dollar over off-the-shelf server processors. While Microsoft is reportedly working on Arm-based server chips to deploy on its Azure cloud service, Amazon last year rolled out new cloud services, or "instances," based on its custom Arm-based Graviton CPU.

The cloud computing giant said that it could develop the core building blocks of its SoCs and SiPs for data centers itself, following in the footsteps of its rivals. But it is also not against buying intellectual property (IP) from outside vendors where it makes sense. Ultimately, it could develop a custom server CPU core from scratch or license the underlying blueprints from Arm or another company.

Vahdat said Google's server chip strategy is focused on flexibility. "We buy where it makes sense, build it ourselves where we have to, and aim to build ecosystems that benefit the entire industry."

Developing a server-class processor takes years, and it could be three years or more before the new engineering unit starts deploying custom server chips in Google's data centers. For now, Google is putting its best foot forward: Uri Frank brings 24 years of experience in CPU, SoC, and IP product development from Intel, where he co-led the client engineering group.

"Together with our global ecosystem of partners, we look forward to continuing to innovate at the leading edge of compute infrastructure, delivering the next generation of capabilities that are not available elsewhere," Vahdat said.
