The Raspberry Pi Tackles HPC With 750-Unit Cluster

Nov. 14, 2017
Researchers at Los Alamos National Laboratory have combined 750 Raspberry Pi systems to create a low-budget system to test HPC software.

You may not think of the Raspberry Pi as a candidate for high performance computing. But if you pack enough of them into a chassis, you can create an inexpensive, low-energy system to test software for deployment on petascale systems.

That’s what researchers at Los Alamos National Laboratory have done, working with HPC vendors BitScope and SICORP to build a cluster of 750 Raspberry Pi systems. They say the system, which is being demonstrated at this week’s SC17 conference, can save enormous amounts of money for researchers testing HPC applications.

“It’s not like you can keep a petascale machine around for R&D work in scalable systems software,” said Gary Grider, leader of the High Performance Computing Division at Los Alamos National Laboratory, home of the Trinity supercomputer. “The Raspberry Pi modules let developers figure out how to write this software and get it to work reliably without having a dedicated testbed of the same size, which would cost a quarter billion dollars and use 25 megawatts of electricity.”

The Raspberry Pi is a credit-card sized computer that can be connected to a keyboard and TV to do just about anything a typical desktop computer can do. It was developed by the Raspberry Pi Foundation in the United Kingdom to help spread computing in education and in developing countries. A basic model starts at about $25 and consumes only a few watts of power.

Seeking a cost-effective solution to the challenges facing HPC systems software developers, Grider said, he “suddenly realized the Raspberry Pi was an inexpensive computer using 2 to 3 watts that you could use to build a several-thousand-node system large enough to provide a low-cost, low-power testbed to enable this R&D.” But he was unable to locate a suitable densely packaged Raspberry Pi system on the market.

“It was just people building clusters with Tinker Toys and Legos,” said Grider, who turned to SICORP of Albuquerque, N.M., to collaborate on a solution. Then they jointly worked with BitScope of Australia to develop easily scaled rack-mounted units.

The BitScope system consists of five rack-mounted Pi Cluster Modules, each housing 150 four-core Raspberry Pi boards built around ARM processors, fully integrated with network switching infrastructure. With a total of 750 CPUs, or 3,000 cores, working together, the system gives developers exclusive time on an inexpensive but highly parallelized platform for testing and validating scalable systems software technologies.
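The arithmetic behind those figures, and the power gap Grider describes, is easy to check. The sketch below uses only numbers quoted in the article; the 3-watt per-board figure is the upper end of Grider's "2 to 3 watts" estimate, not a measured value:

```python
# Back-of-the-envelope scale of the BitScope cluster, using the
# article's figures: 5 modules x 150 boards, 4 cores per board.
MODULES = 5
BOARDS_PER_MODULE = 150
CORES_PER_BOARD = 4
WATTS_PER_BOARD = 3.0  # assumed: upper end of Grider's 2-3 W estimate

boards = MODULES * BOARDS_PER_MODULE      # 750 boards
cores = boards * CORES_PER_BOARD          # 3,000 cores
cluster_watts = boards * WATTS_PER_BOARD  # roughly 2.25 kW

# The article cites 25 megawatts for a dedicated petascale testbed.
petascale_watts = 25_000_000
power_ratio = petascale_watts / cluster_watts

print(f"{boards} boards, {cores} cores, ~{cluster_watts / 1000:.2f} kW")
print(f"A petascale testbed would draw ~{power_ratio:,.0f}x more power")
```

Even granting that a single Pi core is no match for a server-class core, a testbed drawing kilowatts rather than megawatts is what makes round-the-clock systems-software R&D affordable.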

“Having worked with Raspberry Pi for quite some time, I’ve long thought it the ideal candidate to build low-cost cloud and cluster computing solutions for research and education,” said Bruce Tulloch, CEO of BitScope. “When SICORP approached us with Gary’s plans, we jumped at the opportunity to prove the concept.”

Check out the SICORP web site for additional details.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
