5 purposeful ways to tame 'accidental' data centers
Is that closet server sabotaging your building's best-laid energy goals? Here's how to keep that investment from skewing power consumption.

The best-laid plans for green office buildings are meaningless if the occupants ignore them. Remember the flap last August over how much power the Platinum LEED Bank of America tower in New York was using? Suffice it to say, way more than anticipated.
Few things can skew energy usage more insidiously than the energy for running and cooling random computer servers, storage arrays and network gear shoved into closets or conference rooms originally intended for other purposes. Unfortunately, outside of big Fortune 500 companies that can pay for data center gurus, this practice is pretty common: the Natural Resources Defense Council figures that at least half of U.S. servers are unmanaged, accounting for between 30 percent and 50 percent of all the electricity being used in small and midsize offices.
“Whether it’s accidental or not, it starts with one or two servers, which isn’t necessarily a big deal. But all of a sudden, you have a small server room that is far less efficient than what you might find in a big data center or colocation facility,” said Pierre Delforge, director for high-tech sector energy efficiency for NRDC.
For perspective, the amount of power we’re talking about is equivalent to the output of approximately 20 coal-fired power plants of 500 megawatts each.
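To see where a comparison like that comes from, here’s a quick back-of-the-envelope check (an illustrative sketch, not NRDC’s published methodology) that assumes the plants run at full output year-round:

```python
# Rough arithmetic behind the power-plant comparison. Assumes 20 plants of
# 500 MW each running flat-out all year; real plants run below full capacity.
plants, mw_per_plant, hours_per_year = 20, 500, 8760
annual_twh = plants * mw_per_plant * hours_per_year / 1e6  # MWh -> TWh
print(f"equivalent annual output: ~{annual_twh:.0f} TWh")  # roughly 88 TWh
```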
The fact that this is a surprise to many facilities or office managers is one reason it keeps happening. “Every company is vulnerable to this. Large companies usually have the resources to address the problem, but smaller ones might not have the wherewithal or the motivation to do so,” said Allison Bard, associate with sustainability business consulting firm Cadmus Group.
The reality is leases and space constraints might force small or midsize businesses to get creative about how they accommodate on-premises IT equipment. “Even a one-floor office can need a data center or computer room that requires support and cooling, backup power, the standard features you would expect in a big one,” said John Weale, associate with Integral Group.
The good news is it doesn’t take much effort to rectify the situation. Here are five best practices for businesses not quite big enough to have a dedicated data center, but growing fast enough to require sophisticated computing infrastructure.
1. Assess the magnitude of the issue
Cadmus classifies unplanned computing real estate in the following way: localized data centers with up to 1,000 square feet of “white space,” server rooms smaller than 500 square feet and server closets of less than 200 square feet. The tipping point seems to be when a business is running more than 10 servers, according to experts. “These small spaces are pretty ubiquitous,” noted Robert Huang, senior associate with Cadmus.
Short of walking around to peek into all the nooks and crannies of an office space, analyzing submeter data can help your organization gauge if it has a problem. If the amount of electricity used outside business hours is almost the same as when employees are around, there’s cause for concern.
“If the difference is relatively low, it means that something pretty large is running 24×7,” Delforge said. That’s simply because unmanaged or outdated IT equipment uses roughly the same amount of power when it’s idle as when it’s fully used.
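As a rough illustration of that check, here is a minimal sketch in Python that compares average off-hours load with business-hours load from hourly submeter readings. The 9-to-5 window, the sample numbers and the simple ratio are all illustrative assumptions, not a method prescribed by NRDC:

```python
# Minimal sketch: flag a building whose off-hours electrical load is nearly
# as high as its business-hours load, a hint that IT gear runs 24x7.
from statistics import mean

def after_hours_ratio(hourly_kwh):
    """hourly_kwh: iterable of (hour_of_day, kwh) tuples from a submeter."""
    business = [kwh for hour, kwh in hourly_kwh if 9 <= hour < 17]
    off_hours = [kwh for hour, kwh in hourly_kwh if hour < 9 or hour >= 17]
    return mean(off_hours) / mean(business)

# Hypothetical day of readings: nearly flat consumption around the clock.
readings = [(h, 42.0 if 9 <= h < 17 else 38.5) for h in range(24)]
ratio = after_hours_ratio(readings)
print(f"off-hours vs. business-hours load: {ratio:.2f}")  # ~0.92: cause for concern
```

A ratio near 1.0 is exactly the “relatively low” difference Delforge describes.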
2. Invest in outside expertise
Most U.S. utilities have a vested interest in helping businesses reduce power consumption, and some even offer incentives for moving on-premises equipment to off-site data centers optimized for energy efficiency. Increasingly, they are focusing attention on smaller spaces, Huang said.
The challenge is assigning someone to be accountable. “Quite often, the person paying the bill isn’t accountable for this,” Bard said. “So, maybe you should hire an outside expert to help you understand the potential impact of certain actions.”
3. Consolidate the equipment
Instead of buying a new server every time your organization adopts a new application, use virtualization technologies to run new workloads on existing equipment.
Depending on whose figures you cite, anywhere from 8 percent to 10 percent of the servers in use today are running for no apparent reason, and it would be relatively easy to decommission them or shut them off. Others could serve as the foundation layer for consolidation projects. “You don’t necessarily need to eliminate them, but you need to be aware of them,” Weale said.
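To make the consolidation math concrete, here’s a hypothetical back-of-the-envelope sketch: it flags near-idle machines as decommissioning candidates and estimates how many virtualized hosts could absorb the remaining workloads. The server names, utilization figures, 5 percent idle threshold and 70 percent target utilization are all invented for illustration:

```python
# Hypothetical consolidation estimate: which servers look "comatose," and
# how few virtualized hosts could carry the rest of the load?
import math

avg_cpu_utilization = {"app-01": 0.04, "app-02": 0.12, "db-01": 0.35,
                       "file-01": 0.02, "web-01": 0.18, "legacy-01": 0.03}

IDLE_THRESHOLD = 0.05      # below this, a server is a decommissioning candidate
TARGET_UTILIZATION = 0.70  # how hard to drive each consolidated host

comatose = [name for name, u in avg_cpu_utilization.items() if u < IDLE_THRESHOLD]
active_load = sum(u for name, u in avg_cpu_utilization.items() if name not in comatose)
hosts_needed = max(1, math.ceil(active_load / TARGET_UTILIZATION))

print(f"decommissioning candidates: {comatose}")
print(f"remaining load fits on ~{hosts_needed} virtualized host(s), "
      f"down from {len(avg_cpu_utilization)} physical servers")
```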
4. Organize the space thoughtfully, and don’t fret so much about the heat
Instead of sticking IT equipment wherever space happens to be available, smaller organizations can address this issue more strategically by putting the gear where plug loads can be managed centrally and where there’s already a concentration of heating, ventilation and air-conditioning equipment that can help control humidity and cooling.
Finding a way to bring in outside air for cooling and organizing the servers so heat is dispersed more efficiently can help, Huang said. Speaking of heat, the latest ASHRAE standards allow temperatures of up to 80 degrees Fahrenheit, something many facilities managers don’t realize.
“Actually design a room to support the equipment you’re putting into it,” Weale said. “Figure out what you need, and then design to support that without erring on the side of overkill.”
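Picking up on the ASHRAE point, a small script can flag rooms that are being overcooled relative to the allowable ceiling. This sketch assumes you already log inlet temperatures; the 80-degree ceiling reflects the guidance cited above, while the 68-degree “overcooled” floor is an assumption for illustration:

```python
# Flag server rooms cooled far below the ASHRAE allowable inlet temperature.
ASHRAE_MAX_F = 80.0   # allowable ceiling cited above
OVERCOOLED_F = 68.0   # illustrative threshold; below this, cooling is likely wasted

inlet_temps_f = {"closet-2F": 64.5, "server-room-3F": 72.0, "closet-5F": 66.0}

for room, temp in inlet_temps_f.items():
    if temp > ASHRAE_MAX_F:
        print(f"{room}: {temp} F -- above the allowable ceiling; add cooling")
    elif temp < OVERCOOLED_F:
        print(f"{room}: {temp} F -- likely overcooled; raising the setpoint saves energy")
    else:
        print(f"{room}: {temp} F -- within range")
```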
5. Consider the cloud instead
One option increasingly available to smaller companies is cloud infrastructure-as-a-service or software-as-a-service offerings, which allow them to adopt new applications or services without having to invest in on-site servers. Mind you, this might mean a deeper on-site investment in networking gear and switches to optimize connection speeds. “People are sometimes hesitant because of bandwidth, speed considerations or colocation costs, but it’s something more organizations should take time to consider,” Bard said.
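For organizations weighing that tradeoff, even a crude monthly cost comparison can frame the decision. The sketch below is deliberately simple; every figure in it is a hypothetical placeholder to replace with your own quotes and utility rates, and it ignores the networking upgrades mentioned above:

```python
# Crude monthly comparison: amortized on-premises server (hardware plus
# electricity, with cooling overhead via PUE) vs. a cloud instance fee.
# All numbers are hypothetical placeholders, not vendor pricing.
CLOUD_MONTHLY_USD = 150.0   # assumed IaaS instance price
SERVER_COST_USD = 4000.0    # assumed hardware cost
AMORTIZE_MONTHS = 48        # four-year depreciation assumption
SERVER_WATTS = 350          # assumed average draw
PUE = 2.0                   # small server closets often run near 2.0 (assumed)
USD_PER_KWH = 0.12          # assumed utility rate

monthly_kwh = SERVER_WATTS * PUE * 24 * 30 / 1000
onprem_monthly = SERVER_COST_USD / AMORTIZE_MONTHS + monthly_kwh * USD_PER_KWH
print(f"on-premises: ~${onprem_monthly:.0f}/month vs. cloud: ${CLOUD_MONTHLY_USD:.0f}/month")
```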
Top image via Dell Tech Page One, credited to Spiceworks
