It’s a Hot Market, but How Cool Are These Doors? DataBank Uses Unique Cooling Solution in Georgia Tech Data Center


DataBank and Georgia Tech Progress with Purpose-Built Data Center in Tech Square 

The DataBank ATL1 data center is not your average data center. 

When Georgia Tech set out to create a high-performance computing (HPC) center for its institution, it turned to DataBank to build a data center environment capable of meeting the center’s performance needs and overcoming the inherent challenges that often accompany HPC initiatives.

DataBank’s ATL1 facility will be located in the new CODA building in Georgia Tech’s Technology Square, in the middle of Atlanta’s tech hub, where academia meets innovation, research, and enterprise. ATL1 will not only be different from any other DataBank facility; it will be one of the most advanced data centers in the country.

Right out of the gate, ATL1 will possess two things no other data center in the region can claim:

1. Southern Crossroads (SoX) 

SoX serves as the Southeast’s connector to National Lambda Rail (NLR), Internet2, and other major U.S. and international research networks. It is a special network fabric that privately interconnects schools and federal institutions across the South, including those in Mississippi, Alabama, Georgia, Florida, and North and South Carolina.

2. The Georgia Tech Supercomputer  

Georgia Tech was awarded $3.7 million from the National Science Foundation to cover 70% of the cost of a new, state-of-the-art high-performance computing resource for the CODA building’s data center, ATL1. The new HPC system will support data-driven research in astrophysics, computational biology, health sciences, computational chemistry, materials and manufacturing, and more. It will also be used to research the energy efficiency and performance of HPC systems themselves.

“ATL1 is an ecosystem. A social compute family where we’re bringing in any company that wants to be a part of it. Anyone who wants to witness a supercomputer and real innovation is welcome.”

-Brandon Peccoralo, General Manager at DataBank

The Green Initiative and a New Generation of Data Center Cooling 

But there’s something else that makes ATL1 quite different: its cooling technique.

For the past thirty years, the data center industry has largely relied on the same cooling techniques and architectures and, with minor exceptions, cooling has not been a subject of great innovation. With ATL1, DataBank pushes the envelope with its new ColdLogik Dx Rear Door cooling solution from QCooling.

Data centers spend tremendous amounts of time seeking the most efficient use of power. The biggest factor in power consumption is one that data centers have little control over: a customer’s hardware. The second biggest factor is cooling overhead, which is captured by PUE (Power Usage Effectiveness), the industry-standard ratio of total facility power to the power delivered to the IT equipment itself. This is an area where data centers have plenty of room to innovate and improve.
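To make the metric concrete, here is a minimal sketch of the PUE arithmetic; the wattage figures are hypothetical, chosen only for illustration.

```python
# Minimal sketch of the PUE (Power Usage Effectiveness) arithmetic.
# PUE = total facility power / IT equipment power; 1.0 is the ideal.
# All wattage figures below are hypothetical, for illustration only.

it_power_kw = 1_000        # power drawn by customer hardware
cooling_power_kw = 400     # power spent cooling that hardware
other_overhead_kw = 100    # lighting, UPS losses, etc.

total_facility_kw = it_power_kw + cooling_power_kw + other_overhead_kw
pue = total_facility_kw / it_power_kw

print(f"PUE = {pue:.2f}")  # 1.50: every watt of compute costs another 0.5 W
```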

Traditionally, this has meant taking energy from the utility and running AC units within the data center as efficiently as possible to cool the overall environment. When you stop and think about it, this approach is comically inefficient: you take the heat generated by consuming energy (in the form of computers) and consume even more energy to remove it (in the form of AC).

QCooling’s solution for ATL1 takes a unique approach. Waste heat coming off the servers in a cabinet is transferred to naturally cold water running through a closed-loop system in each cabinet’s rear door. The heated water is then rejected out of the building as new, naturally cold water cycles back in to replace it, keeping the cabinet continuously cool.
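As a rough sanity check on the physics (these are illustrative assumptions, not DataBank’s published engineering figures), the water flow needed to carry a rack’s heat load follows from Q = ṁ · c_p · ΔT:

```python
# Back-of-the-envelope check of a rear-door water loop: Q = m_dot * c_p * dT.
# The 5 °C water temperature rise is an assumed figure for illustration,
# not a published ATL1 specification.

heat_load_w = 50_000    # 50 kW of server waste heat per rack
c_p = 4186              # specific heat of water, J/(kg*K)
delta_t_k = 5.0         # assumed water temperature rise across the door

flow_kg_per_s = heat_load_w / (c_p * delta_t_k)
flow_l_per_min = flow_kg_per_s * 60   # 1 kg of water is roughly 1 liter

print(f"Required flow: {flow_kg_per_s:.2f} kg/s (~{flow_l_per_min:.0f} L/min)")
# ~2.39 kg/s, roughly 143 L/min per rack under these assumptions
```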

That system alone would represent a major innovation in data center cooling. But in our mission to continuously evolve the data center experience, DataBank took the implementation one step further.

In a typical data center environment, heat removed by CRAC units is sent back to the central plant, exchanged into condenser water, and rejected off the roof. Instead of wasting that heat, the ATL1 facility sends it over to the CODA building’s high-rise boilers, allowing tenants to reuse and repurpose it to heat their offices in colder weather, further offsetting energy use.

Using QCooling’s ColdLogik rear door coolers, DataBank and Georgia Tech are cooling 50 kW per rack with 73-degree warm water. And the same rear door system can cool up to 100 kW per rack with only minor changes to the infrastructure (a rough sketch of that scaling follows the list below).

  • Cooling 50 kW per rack with 73-degree water, with the potential to double
  • 90% less energy consumption than traditional cooling
  • 80% more usable real estate than traditional cooling
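Here is a minimal sketch of why the same door can scale from 50 kW to 100 kW in the same footprint: with the loop’s temperature rise held fixed, capacity scales roughly linearly with water flow. The flow and temperature numbers are assumptions for illustration, not ATL1 specifications.

```python
# Illustrative scaling of rear-door capacity via Q = m_dot * c_p * dT.
# Holding the assumed 5 °C temperature rise fixed, doubling the water flow
# doubles the heat the door can reject in the same rack footprint.

c_p = 4186        # specific heat of water, J/(kg*K)
delta_t_k = 5.0   # assumed temperature rise (illustrative)

for flow_kg_per_s in (2.4, 4.8):   # baseline flow, then doubled flow
    capacity_kw = flow_kg_per_s * c_p * delta_t_k / 1000
    print(f"{flow_kg_per_s:.1f} kg/s -> ~{capacity_kw:.0f} kW per rack")
# 2.4 kg/s -> ~50 kW ; 4.8 kg/s -> ~100 kW
```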

Cool, right? 

“Being able to cool 50 kW with 73-degree water, and to go to 100 kW in the same footprint? That’s unheard of.”

-Neal Bryant, Facilities Manager, DataBank
