Tulsa’s Community Collaboration Model for Supercomputing

Two weeks ago the Tandy Supercomputing Center in Tulsa, Oklahoma, launched as home to one of the country’s first shared, publicly available supercomputers.

The project — born of a collaboration between The University of Tulsa, Oklahoma State University, The University of Oklahoma, Tulsa Community College, the city of Tulsa, business owners and nonprofit foundations — gives community members equal access to a $3.5 million, 100-node supercomputing system at a fraction of the cost of building their own.

Providing access to high-performance computing is also a potential boon for economic development in Tulsa, accelerating academic research and shortening the business community’s time to market.

“We’re lucky to live in a community in which collaboration isn’t a bad word,” said David Greer, executive director of the Oklahoma Innovation Institute. “All the university presidents came together and said, ‘We want to support it.’”

A Community’s Private Cloud

Five years ago the University of Tulsa engineering department set out to build a supercomputer for the school. They quickly learned that the cost of the infrastructure to house the machines alone would eat up 60 percent of the budget. The remaining 40 percent wasn’t nearly adequate to buy the computers they needed for the job, Greer said.

Talking with other researchers in Tulsa, Greer learned it was a common problem and together they hatched a plan to address it.

“We had this naïve concept that we could pool our money and build something we could all use,” said Greer.

The Tandy Supercomputing Center is essentially a private cloud built for an entire community. The center, housed at Tulsa City Hall, holds 100 nodes, each with 128 GB of RAM and two 2.7 GHz Intel Xeon processors running Red Hat Linux, for a total of 1,600 cores and roughly 30 teraflops at current capacity.
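As a rough sanity check on those numbers (a sketch, not the center’s own accounting): assuming two 8-core Xeons per node and 8 double-precision floating-point operations per core per cycle, typical of Xeons of that era, the core count and peak throughput work out as follows.

    # Back-of-the-envelope check of the quoted specs. Assumptions (not
    # confirmed by the center): two 8-core Xeons per node, and 8
    # double-precision FLOPs per core per cycle, typical of AVX-era Xeons.
    nodes = 100
    cores_per_node = 2 * 8            # two CPUs x 8 cores each
    clock_hz = 2.7e9                  # 2.7 GHz
    flops_per_cycle = 8               # assumed AVX double-precision rate

    total_cores = nodes * cores_per_node
    peak_tflops = total_cores * clock_hz * flops_per_cycle / 1e12

    print(total_cores)    # 1600, matching the article
    print(peak_tflops)    # ~34.6 theoretical peak, consistent with "about 30"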

But when it comes to building a collaborative supercomputer, it turns out, the technology is the easiest part.

“Politics is the biggest barrier to a collaborative model like this,” said George Louthan, director of the Tandy Supercomputing Center. “I’ve never heard of three separate universities pooling together resources to buy something none of them own.”

A Model for Collaboration

The key to securing the business community’s support, Greer and Louthan said, was finding a neutral location to house the computers. The city of Tulsa provided one, allowing all parties to come to the project on equal footing.

They call the collaboration model “condominium computing”: the center owns the entire infrastructure, and members pay the operating costs based on the number of nodes, or percentage of compute capacity, they need. Members are then guaranteed access to that dedicated resource as well as the ability to use more resources when they’re available.

Members pay a one-time fee of $10,000 per node plus $2,500 per node per year in maintenance. Several nodes, though, are reserved for free public use, awarded through a grant application process to startups, nonprofits and other groups that cannot afford the buy-in.
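To make the “fraction of the cost” claim concrete, here is a minimal sketch of what membership costs under the fees quoted above; the five-node, five-year scenario is a hypothetical example, not a published price list.

    # Condominium-computing cost under the quoted fees: a one-time
    # $10,000 buy-in per node plus $2,500 per node per year in maintenance.
    BUY_IN_PER_NODE = 10_000
    ANNUAL_FEE_PER_NODE = 2_500

    def member_cost(nodes: int, years: int) -> int:
        """Total cost of reserving `nodes` nodes for `years` years."""
        return nodes * (BUY_IN_PER_NODE + ANNUAL_FEE_PER_NODE * years)

    # Hypothetical: a lab reserving 5 nodes for 5 years.
    print(member_cost(5, 5))   # 112500 -- far below the $3.5M system cost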

Built at one-third capacity, the center is designed to scale as the community’s computing needs grow. Empty racks, already wired for power and networking, stand ready to take the system to as many as 324 nodes when the need arises.

Access to Technical Support

In all, the compute resources members can access aren’t much different from those available from a public cloud service such as Amazon’s EC2. The key advantage for businesses and researchers who buy into the Tandy center is the support they receive.

“For somebody who is familiar with scientific computing and capable of being a system administrator, it makes sense to spin up EC2 instances and then spin them down,” Louthan said.
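For readers unfamiliar with that workflow, here is a minimal sketch of the spin-up/spin-down cycle Louthan describes, using Amazon’s boto3 Python library; the AMI ID, instance type and instance count are placeholders, not anything Tandy or Louthan prescribes.

    # Spin up a batch of EC2 instances, run a job, then terminate them
    # so billing stops. Requires AWS credentials to be configured.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    reservation = ec2.run_instances(
        ImageId="ami-xxxxxxxx",      # placeholder: your compute-node image
        InstanceType="c4.8xlarge",   # placeholder: a compute-optimized type
        MinCount=4,
        MaxCount=4,
    )
    instance_ids = [inst["InstanceId"] for inst in reservation["Instances"]]

    # ... submit work to the instances and collect results ...

    # Spin the instances back down when the job is done.
    ec2.terminate_instances(InstanceIds=instance_ids)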

But many of the center’s members don’t have much experience with parallel computing, so the center plans to employ three full-time support staff to maintain the infrastructure and work with members to plan and schedule their projects. The staff will also act as consultants, advising businesses and researchers on how to meet their data needs most efficiently.

Center members also benefit from a water cooler effect, in which researchers and data scientists can find new ways to collaborate simply by having a common meeting place.

“It all boils down to the vision of these (university) presidents that what we have to do now as a community, as a country, is collaborate,” Louthan said. “It’s OK to compete on the athletic field; it’s not OK to compete in research and the classroom. Let’s work together.”