Microsoft Accelerates its Brainwave AI with Intel FPGAs

May 8, 2018
At its Build conference, Microsoft said Project Brainwave, its deep learning acceleration platform harnessing Intel FPGAs, has been integrated with Azure Machine Learning for the cloud as well as edge computing.

Brainwave has been coming. Now it’s here, and ready to bring new artificial intelligence computing tools to the cloud and the edge.

Last year we introduced you to Project Brainwave, Microsoft’s deep learning acceleration platform, which harnesses Intel FPGAs (Field Programmable Gate Arrays) to speed up cloud and AI workloads. Today at its Build 2018 conference, Microsoft announced a preview of Project Brainwave integrated with Azure Machine Learning, which the company says will make Azure the most efficient cloud computing platform for AI.
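For developers, the integration surfaces as a hosted, real-time scoring endpoint. The sketch below is a minimal illustration of what calling such an FPGA-backed Azure Machine Learning web service might look like from client code; the endpoint URL, API key, and JSON payload shape are assumptions for illustration, not the documented Brainwave API.

```python
import requests

# Hypothetical scoring endpoint for a model deployed through Azure Machine
# Learning onto FPGA-backed (Project Brainwave) infrastructure. The URL, key,
# and payload format below are illustrative assumptions, not documented values.
SCORING_URL = "https://example-region.azureml.net/score"   # assumed endpoint
API_KEY = "<your-service-key>"                             # assumed auth scheme


def score_features(features):
    """Send a feature vector to the remote model and return its predictions."""
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    }
    payload = {"data": [features]}  # assumed request schema
    response = requests.post(SCORING_URL, json=payload, headers=headers, timeout=5)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # A single low-latency request; Brainwave's pitch is that the FPGA pool
    # serves each request individually rather than waiting to build large batches.
    print(score_features([0.0] * 2048))
```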

Microsoft says Project Brainwave represents a “major leap forward” in cloud-based deep learning performance, and can make real-time AI calculations at competitive cost and with the industry’s lowest latency.

“I think this is a first step in making the FPGAs more of a general-purpose platform for customers,” said Mark Russinovich, chief technology officer for Microsoft’s Azure cloud computing platform.

FPGAs are semiconductors that can be reprogrammed to perform specialized computing tasks, allowing users to tailor compute power to specific workloads or applications. FPGAs can serve as coprocessors that accelerate CPU workloads, an approach also used in high-performance computing (HPC).
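As a rough software analogy, the coprocessor pattern looks like the dispatch sketch below: the CPU-side code keeps control flow and hands the heavy numeric kernel to an accelerator when one is attached. The `offload` hook is hypothetical; it stands in for whatever vendor runtime actually drives the device.

```python
import numpy as np


def cpu_matmul(a, b):
    """Baseline path: run the kernel on the host CPU."""
    return a @ b


def accelerated_matmul(a, b, offload=None):
    """Dispatch a matrix multiply to an accelerator if one is available.

    `offload` stands in for a (hypothetical) vendor runtime call that ships
    the operands to an FPGA or other coprocessor and returns the result.
    The CPU keeps orchestration and falls back when no device exists.
    """
    if offload is not None:
        return offload(a, b)   # hand the hot kernel to the coprocessor
    return cpu_matmul(a, b)    # otherwise stay on the CPU


if __name__ == "__main__":
    a = np.random.rand(256, 256)
    b = np.random.rand(256, 256)
    # No device is attached in this sketch, so the call falls back to the CPU.
    print(accelerated_matmul(a, b).shape)
```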

Microsoft also is announcing a limited preview to bring Project Brainwave to the edge, meaning customers could take advantage of that computing speed in their own businesses and facilities, even if their systems aren’t connected to a network or the Internet.

“We’re making real-time AI available to customers both on the cloud and on the edge,” said Doug Burger, a distinguished engineer at Microsoft who leads the group that has pioneered the idea of using FPGAs for AI work. “These new capabilities will allow the integration of AI into real-time processes to transform businesses with the power of Microsoft Azure and Microsoft AI.”

A Showcase for Intel’s FPGA Technology

As America’s technology titans invest heavily in AI capabilities to add intelligence to their products and services, they are also driving a cycle of hardware innovation.

“The era of the intelligent cloud and intelligent edge is upon us,” said Microsoft CEO Satya Nadella.

AI requires significant computing horsepower, including the use of hardware accelerators to supplement the work of traditional CPUs. This has created opportunities for NVIDIA and other makers of graphics processing units (GPUs), along with an ecosystem of startups developing custom chips optimized for AI.

Microsoft has been distinctive among the cloud titans for its focus on FPGAs as the innovation engine for its AI acceleration. It has also invested in testing ARM servers in the cloud, along with GPUs from NVIDIA.

Microsoft’s adoption of FPGAs is good news for Intel, which invested heavily in FPGA technology with its $16.7 billion acquisition of Altera, completed in late 2015.

“We are an integral technology provider for enabling AI through our deep collaboration with Microsoft,” said Daniel McNamara, corporate vice president and general manager of the Programmable Solutions Group at Intel Corporation. “AI has the potential for a wide range of usage scenarios from training to inference, language recognition to image analysis, and Intel has the widest portfolio of hardware, software and tools to enable this full spectrum of workloads.”

Intel is also developing custom AI hardware through its 2016 acquisition of Nervana, which has expertise with ASICs (Application Specific Integrated Circuits) that are highly tailored for machine learning.

Adapting to Fast-Moving AI Requirements

The public preview of Project Brainwave comes about five years after Burger, a former academic who works in Microsoft’s research labs, first began talking about the idea of using FPGAs for more efficient computer processing. Project Brainwave’s hardware design can evolve rapidly and be remapped to the FPGA after each improvement, keeping pace with new discoveries and staying current with the requirements of rapidly changing AI algorithms.

Microsoft is using Intel Stratix 10 FPGAs as the hardware accelerator in its Brainwave platform. Microsoft has described its approach as using a “soft” DNN processing unit (or DPU), synthesized onto commercially available FPGAs. Microsoft says this approach provides flexibility and the ability to rapidly implement changes as AI technology advances.

“By attaching high-performance FPGAs directly to our datacenter network, we can serve DNNs as hardware microservices, where a DNN can be mapped to a pool of remote FPGAs and called by a server with no software in the loop,” Burger explained in our DCF coverage last August. “This system architecture both reduces latency, since the CPU does not need to process incoming requests, and allows very high throughput, with the FPGA processing requests as fast as the network can stream them.”
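Burger’s description amounts to a network-attached inference service: a caller serializes a request, sends it to whichever FPGA in the pool is free, and reads back the result. The toy client below sketches that call pattern over a plain TCP socket; the host, port, and wire format are invented for illustration and do not reflect Microsoft’s actual datacenter protocol.

```python
import socket
import struct

import numpy as np

# Hypothetical address of one FPGA in the remote pool; in Brainwave the pool
# sits directly on the datacenter network, so a caller simply picks an endpoint.
FPGA_HOST, FPGA_PORT = "10.0.0.42", 9000   # assumed values


def remote_dnn_infer(features: np.ndarray) -> np.ndarray:
    """Send a float32 feature vector to a remote accelerator and read the reply.

    The length-prefixed wire format here is an illustrative assumption; the
    point is that the request goes straight over the network to hardware,
    with no model-serving software stack in the hot path.
    """
    payload = features.astype(np.float32).tobytes()
    with socket.create_connection((FPGA_HOST, FPGA_PORT), timeout=1.0) as conn:
        conn.sendall(struct.pack("!I", len(payload)) + payload)
        reply_len = struct.unpack("!I", conn.recv(4))[0]
        reply = b""
        while len(reply) < reply_len:
            reply += conn.recv(reply_len - len(reply))
    return np.frombuffer(reply, dtype=np.float32)
```

Because each request is a single small message, latency in this model is dominated by the network hop and the FPGA itself rather than by batching or host-side scheduling, which is the throughput and latency argument Burger makes above.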

This video from Microsoft provides additional details on Project Brainwave and a project with Jabil.

Explore the evolving world of edge computing further through Data Center Frontier’s special report series and ongoing coverage.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
