SAN FRANCISCO: Microsoft has designed two additional infrastructure chips for its data centers that will help speed artificial intelligence operations and increase data security, it said on Tuesday at its Ignite conference.
Microsoft has devoted significant resources to developing home-grown silicon for general-purpose applications and artificial intelligence. Like its rivals Amazon.com and Google, Microsoft says there is a performance and price benefit to designing chips customized for its needs.
Designing custom chips can reduce Microsoft’s reliance on processors made by Intel and Nvidia.
Microsoft’s two new chips are designed to be installed deep within the company’s data center infrastructure. One chip increases security and the other handles data processing.
The company designs an array of data center processors because it aims to “optimize every layer of infrastructure” and ensure that its data centers crunch information at the speed AI requires, said Rani Borkar, corporate vice president of Azure Hardware Systems and Infrastructure.
Engineers will install the new security chip, called the Azure Integrated HSM, in every new server destined for a data center beginning next year. The chip is designed to keep crucial encryption keys and other security data inside the security module.
The data processing unit, or DPU, consolidates multiple components of a server into a single chip focused on cloud storage data. The company said it can run these tasks at a third of the power and four times the performance of its current hardware.
Microsoft also announced a new version of a cooling system for data center servers that relies on liquid to reduce the temperature of nearby components. The cooling unit can be used to support large-scale AI systems.