Recently, DataBank announced the development of a new 94,000 sq. ft. data center in downtown Atlanta for anchor tenant Georgia Institute of Technology. Located in Midtown Atlanta and codenamed “CODA”, DataBank’s ATL1 data center will serve as a high-performance computing center (HPCC), set to house the Southern Crossroads and provide high-speed, high-bandwidth connectivity to research and education sites throughout the Southeast and across the nation. Georgia Tech’s premier academic and research programs will be the main tenant under a long-term lease for both the data center and the adjoining office tower.
The HPCC project at CODA is being spearheaded by Georgia Tech to create a public-private collaboration that leverages Georgia Tech students, faculty, and knowledge to solve real-world business problems. Researchers and industry participants working in the adjacent office tower will have direct, high-speed fiber access to Georgia Tech’s latest fleet of supercomputers and unique datasets located inside the HPCC, as well as the on-site expertise of Georgia Tech’s academic community.
When Georgia Tech approached DataBank with its plan to build an HPCC, our team partnered with them to create a data center environment designed specifically to accommodate their needs and resolve the challenges that often accompany HPC initiatives.
When it comes to higher education, particularly engineering and scientific research, the quest for information is unceasing. As technology advances, more extensive, complex questions surface, and the only way to analyze and answer those questions is with HPC systems (CIO from IDG).
In higher education (and virtually every other market, as technology continues to advance), staggering amounts of data are being gathered and stored. When it comes to researchers working within universities, billions of files and petabytes of data are the kind of volumes generated and dealt with on a daily basis. This data often represents the progress of incredibly important initiatives: climate change, cancer research and more.
Obviously, HPC is called high-performance computing for a reason: it requires extraordinary performance. Higher education institutions like Georgia Tech need compute environments with a tremendous amount of computing power, capacity, and storage. It’s also worth noting that while reliability is important, uptime isn’t necessarily the primary goal; sheer power is the resource necessary for extremely large data sets being generated by intensive research.
This creates very specific requirements for heating, cooling, space, and the management of massive workloads, which is why such an endeavor calls for an advanced data center provider well-versed in the IT challenges of HPC.
When design and development were underway, our engineering team worked with Georgia Tech to carefully define their specific requirements. This particular solution was engineered very precisely around their performance needs and involved multiple levels of differing designs.
One section of the HPCC doesn’t require significant UPS backup power. The platform runs restartable batch workloads: data is fed into it, and if an incident occurs, the batch job is simply restarted. Flywheel UPS units will be installed to provide thirty seconds of reserve time, enough to ride through any quick power hits from the utility company.
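A rough sizing sketch shows why a thirty-second reserve can suffice for this kind of workload: the flywheel only has to bridge brief utility dips, not sustain the load indefinitely. The load, inertia, and speed figures below are illustrative assumptions, not figures from the ATL1 design:

```python
import math

def ride_through_energy_kj(load_kw: float, seconds: float) -> float:
    """Energy (kJ) the UPS must deliver to carry a load for a given time."""
    return load_kw * seconds  # kW x s = kJ

def usable_flywheel_energy_kj(inertia_kg_m2: float,
                              rpm_full: float, rpm_min: float) -> float:
    """Kinetic energy (kJ) released as the rotor slows from full speed to
    the minimum speed at which it can still regulate its output."""
    w_full = rpm_full * 2 * math.pi / 60  # rad/s
    w_min = rpm_min * 2 * math.pi / 60
    return 0.5 * inertia_kg_m2 * (w_full**2 - w_min**2) / 1000.0

# Hypothetical 500 kW pod held up for 30 seconds vs. a hypothetical
# rotor (100 kg*m^2) spinning down from 10,000 to 6,000 rpm.
required = ride_through_energy_kj(500, 30)
available = usable_flywheel_energy_kj(100, 10_000, 6_000)
print(f"required {required:.0f} kJ, available {available:.0f} kJ")
```

If the usable kinetic energy exceeds the required ride-through energy, the flywheel covers the outage window with margin to spare.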
Another section, referred to as the OIT (Office of Information Technology), houses the data fed into the HPCC, on storage such as SAN equipment. This data requires additional backup capability, so we’ll install a UPS with full battery and generator backup. Ultimately, the OIT section of the HPCC will have full redundancy.
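A toy calculation illustrates what full redundancy buys: with two independent power paths in parallel, the section is down only if both paths fail at once. The 99% single-path figure here is chosen purely for illustration:

```python
def redundant_availability(single_path: float) -> float:
    """Availability of two independent parallel paths: the system is
    unavailable only when both paths fail simultaneously."""
    return 1 - (1 - single_path) ** 2

# A hypothetical 99%-available single path becomes roughly 99.99%
# available once a fully redundant second path is added.
print(redundant_availability(0.99))
```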
Where cooling is concerned, the HPCC has unique requirements in that it runs somewhere between 40-60 kW per rack. Consequently, the best option for meeting cooling demands is cold water back doors. This places a chilled-water coil on the back door of each rack; the hot exhaust air from the high-density servers passes through the coil, which removes the heat before the air re-enters the data center floor. The OIT section requires more standard data center cooling with perimeter CRAC units. This dual cooling approach requires two different temperatures of water being supplied to the data hall. The CRAC units will be fed with 52-degree (Fahrenheit) water, and the cold water back doors will be fed with water at around 60-70 degrees; anything colder would cause the coils to condense.
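Two back-of-the-envelope checks make these numbers concrete: the chilled-water flow a rear-door coil needs scales with rack power, and the warmer 60-70 °F loop sits above the room’s dew point, which is why colder water would sweat on the coils. This sketch uses the standard Magnus dew-point approximation; the room conditions (75 °F, 50% relative humidity) and the 10 K water-side temperature rise are assumptions for illustration:

```python
import math

WATER_CP = 4.186  # kJ/(kg*K), specific heat of water

def water_flow_kg_s(heat_kw: float, delta_t_k: float) -> float:
    """Water mass flow needed to absorb heat_kw at a given temperature rise."""
    return heat_kw / (WATER_CP * delta_t_k)

def dew_point_c(temp_c: float, rh_pct: float) -> float:
    """Magnus approximation of the dew point in Celsius."""
    a, b = 17.27, 237.7
    gamma = a * temp_c / (b + temp_c) + math.log(rh_pct / 100.0)
    return b * gamma / (a - gamma)

def f_to_c(f: float) -> float: return (f - 32) * 5 / 9
def c_to_f(c: float) -> float: return c * 9 / 5 + 32

# A 50 kW rack with a 10 K water-side rise needs about 1.2 kg/s of flow.
print(f"{water_flow_kg_s(50, 10):.2f} kg/s")

# At an assumed 75 F / 50% RH, the dew point is about 55 F: a 60-70 F
# supply stays dry, while the 52 F CRAC loop would condense on an
# exposed rear-door coil.
print(f"dew point: {c_to_f(dew_point_c(f_to_c(75), 50)):.1f} F")
```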
“High-performance computing has very different, very specialized environmental design demands in comparison to your typical data center. Unmistakably, Georgia Tech called for a very specialized approach to engineering. Fortunately, DataBank was up to the challenge.”
- Danny Allen, VP of Technical Operations at DataBank
At present, the HPCC-focused data center in Atlanta is still under construction, though the shell of the building is complete. The interior buildout is in process, including installing chilled-water piping, setting UPS equipment and electrical gear, and tying everything together. The data center is expected to be online in Q1 2019.
In the planning and development stages, the facility was laid out for Georgia Tech specifically, with the second and third floors functioning as colocation floors. These floors can host environments that connect to Georgia Tech and the HPC platform as typical colocation customers.
As Georgia Tech’s project demonstrates, HPC environments place extensive and very specific demands on design. If you’re part of an institution heavily engaged in scientific and engineering research, then you’re probably familiar with the IT challenges that accompany HPC environments. It’s important to empower your institution and its educators in their search for knowledge, but it’s equally important to protect the fruits of their labor. You can solve this problem by taking advantage of a data center that will give you the space, power, and cooling you really need to use HPC effectively.
DataBank specializes in providing such environments and the managed services that complement them, so that your university doesn’t have to lay out capital on IT infrastructure when that capital would be better spent elsewhere. If you’re seeking the best way to protect your HPC technology and support your researchers, get in touch with DataBank today, or call us at 1.800.840.7533 and speak to an expert.
Discover the DataBank Difference today:
Hybrid infrastructure solutions with boundless edge reach and a human touch.