Behind the Scenes – Data Centres

  • A complex business
  • Vitally important architecture
  • NOT terribly sexy

The infrastructure of the internet is poorly understood by most. Which is a shame, because a) the infrastructure of the net is fascinating, and b) it’s relatively simple and straightforward. The most fundamental way to understand it is that it works pretty much the same as how you wire up your home office. It’s wires, carrying electrical signals, from a computer of some type, to a modem. The modem then connects to the wall socket, and from there out to your provider’s network. Really, the public internet infrastructure works in a surprisingly similar way. It’s just much bigger.

A hard drive inside your computer is a reservoir for your data: you store it there and access it when you need it. It’s plumbing, basically. Out in the larger world of the Internet, these reservoirs are called Data Centres. They’re the building-sized hard drives of the World Wide Web. This guide applies to data centres used by medium to large businesses – your banks, insurers, large retailers, etc. – companies who use the web to reach customers without necessarily being ‘web-based’ services themselves.

Moo

[Image: server farm]

Not all Data Centres are built the same, but they adhere to some standard designs. There will generally be a room full of servers (often called a farm), used to receive, store and send data as needed. The servers are connected to each other using fibre optic leads, which have become cheap enough to deploy even over these small distances. In the past, Ethernet cables did the job – the thick blue and yellow cables you’d plug between a PC and a regular modem.

The server farm is usually in a very brightly lit room, separated into Hot and Cold zones. The HVAC (Heating, Ventilation and Air Conditioning) layout of a server farm is complex and serious business. The air has to be cool enough to stop the equipment overheating, but humid enough to stop the same machines from drying out and physically cracking.

The server racks are about 7 feet tall and lined up in rows, with ventilation provided from below. The backs of the racks are generally where all the heat ends up, so these are cooled with high-powered refrigerated air at about 18 degrees Celsius, while 25-degree air is pumped up at the front of the racks. Newer systems aim for greater power efficiency, and the ability to operate at very high temperatures without failing. It takes much more energy to cool than it does to heat.
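If you want a feel for why those airflow numbers matter, here’s a back-of-the-envelope sketch in Python of how much air has to move through a single rack to carry its heat away. The 10 kW rack load is an assumed figure for illustration; the 18 and 25 degree temperatures are the ones mentioned above.

```python
# Back-of-the-envelope: how much air you need to move to carry away one rack's heat.
# All figures are assumptions for illustration only.

RACK_HEAT_KW = 10.0        # assumed IT load per rack, in kilowatts
SUPPLY_TEMP_C = 18.0       # chilled air delivered at the front of the rack
EXHAUST_TEMP_C = 25.0      # warmer air at the back of the rack

AIR_DENSITY = 1.2          # kg per cubic metre, near sea level
AIR_SPECIFIC_HEAT = 1.005  # kJ per kg per degree Celsius

delta_t = EXHAUST_TEMP_C - SUPPLY_TEMP_C

# Heat removed = mass flow x specific heat x temperature rise,
# so mass flow (kg/s) = heat (kW) / (c_p x delta T).
mass_flow = RACK_HEAT_KW / (AIR_SPECIFIC_HEAT * delta_t)
volume_flow = mass_flow / AIR_DENSITY  # cubic metres per second

print(f"Roughly {mass_flow:.1f} kg/s (about {volume_flow:.1f} m^3/s) of air per rack")
# That's about 1.4 kg/s for one 10 kW rack -- multiply by a few hundred racks
# and you can see why the HVAC plant is serious business.
```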

The server farm, like other parts of a data centre, is protected by a wide range of fire suppression technologies. General water sprinklers are still used, though they’re on the way out. Usually the pipes are charged with air first, in what’s known as a Pre-Action System: if a detector picks up heat or smoke, the system charges with water before anything actually discharges, affording the operations team time to call it off. Water damage to hardware makes data recovery almost impossible – fire damage is actually less of an issue when it comes to data recovery.
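To make the pre-action sequence a bit more concrete, here’s a toy sketch (in Python) of the logic described above: pipes charged with air, a detector event charges them with water, and only a further trigger actually discharges – with a window for the ops team to abort. It’s purely illustrative, not how any real fire panel is programmed.

```python
# Toy model of a pre-action sprinkler sequence -- illustrative only.

class PreActionSystem:
    def __init__(self):
        self.state = "charged_with_air"   # pipes hold pressurised air, no water yet

    def detector_trip(self):
        """Heat or smoke detected: flood the pipes with water, but don't spray yet."""
        if self.state == "charged_with_air":
            self.state = "charged_with_water"
            print("Alarm! Pipes charging with water. Ops team has a window to abort.")

    def ops_abort(self):
        """Operations team confirms a false alarm before any heads open."""
        if self.state == "charged_with_water":
            self.state = "charged_with_air"
            print("Aborted. Draining pipes, back to air charge.")

    def sprinkler_head_opens(self):
        """A head fuses from heat: only now does water actually hit the hardware."""
        if self.state == "charged_with_water":
            self.state = "discharging"
            print("Water discharging. Say goodbye to the data on anything underneath.")

system = PreActionSystem()
system.detector_trip()   # someone burnt toast near a return vent
system.ops_abort()       # facilities confirms it's a false alarm in time
```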

Many farms use gas suppression, which is fine so long as there are no humans in the room. Using argonite, carbon dioxide or Inergen, these agents work by depriving a fire of oxygen – excellent for stopping a fire in its tracks with no damage to hardware, very bad for anything else that needs oxygen.

Portable extinguishers are generally filled with chemical foam, the most common and effective means for spot controlling a fire.

Flying the flag for Aussie innovation, most data centres around the world employ a Very Early Smoke Detection Apparatus (VESDA) system, a highly sensitive detection technology developed at the CSIRO. It’s not recommended to fart in a server farm.

CRACajack

[Image: CRAC unit]

Oh, more foreboding giant black rectangles. What a surprise.

All of this heating and cooling is provided by Computer Room Air Conditioning (CRAC) units – enormous, high-energy air conditioners usually kept in a room all by themselves, running off a completely separate plant room to the rest of the building. In fact, most parts of a data centre run off independent plant from, say, the lighting in the offices around it.

CRAC units are built by specialist manufacturers, generally dedicated to just that endeavour. They’re not terribly exciting, but they’re extremely important to the day-to-day operations of the centre.

Bzzzzt

[Image: generator]

All of this is powered by mains power, but no centre relies just on the utility companies. Bad weather can knock out a substation – that doesn’t mean it has to knock out your business. That’s where the generator comes in.

Big data centre generators are usually housed within 200 metres of the farm, and carry several thousand litres of diesel fuel. They’re not well advertised – their location is generally on a need-to-know basis. Why point out where thousands of litres of flammable fuel are kept? (Hint: they’re kept ALL over the place. You’d be surprised how many innocuous office buildings have data centres in them.)

Generators usually carry enough fuel to run your operation for 24 hours, which you'd hope is enough time for the utility supplier to get their act together. They are LOUD. They’re wired into all sorts of building automation systems, which are programmed to alert bleary-eyed facilities managers at 4am when they’ve been deployed.
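The fuel and runtime figures imply a burn rate you can sanity-check in a couple of lines. The 6,000-litre tank below is an assumed figure for illustration, not a spec from any real site.

```python
# Ballpark generator runtime check -- assumed figures for illustration.

tank_litres = 6000     # assumed on-site diesel storage ("several thousand litres")
runtime_hours = 24     # the window mentioned above

burn_rate = tank_litres / runtime_hours
print(f"Implied burn rate: {burn_rate:.0f} litres of diesel per hour")
# 250 L/h is in the right range for a megawatt-class standby set at full load;
# run at half load and the same tank stretches a lot further.
```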

Hummmmmmm

[Image: UPS]

Batteries. Nature's quitters.

If your mains power cuts out, it still takes a few seconds for the generator to take on the load. In those few seconds, your systems would just shut off, and BOOM goes several thousand purchases, downloads and inter-company emails. NOT. GOOD.

So in between the mains and the generators sits an Uninterruptible Power Supply (UPS) room. The UPS is the least glamorous of all the elements of the data centre. It’s a bunch of batteries.

And we don’t mean that as an exaggeration – a decent-sized data centre will literally have a room full of car batteries, in racks, all wired to the system and storing enough electricity to run the data centre for about 30 minutes on its own. This should give you some idea of just how little energy there is in a battery compared to, say, diesel fuel or mains power (which is mostly generated from coal in Australia).
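To put some very rough numbers on that comparison: a car-sized lead-acid battery holds about 1 kWh, while a litre of diesel holds roughly 10 kWh of chemical energy. The battery count and tank size in the sketch below are assumptions for illustration.

```python
# Rough energy comparison: a room of batteries vs a tank of diesel.
# All figures are ballpark assumptions for illustration.

BATTERY_KWH = 1.0             # ~12 V x 80 Ah lead-acid battery, about 1 kWh
DIESEL_KWH_PER_LITRE = 10.0   # chemical energy in a litre of diesel, roughly

batteries_in_room = 400       # assumed rack-mounted battery count
diesel_in_tank = 6000         # assumed litres in the generator tank

battery_energy = batteries_in_room * BATTERY_KWH
diesel_energy = diesel_in_tank * DIESEL_KWH_PER_LITRE

print(f"Battery room: ~{battery_energy:,.0f} kWh")
print(f"Diesel tank:  ~{diesel_energy:,.0f} kWh")
# Even before generator losses, the tank holds over a hundred times the energy
# of the battery room -- hence 30 minutes from the UPS versus 24 hours on diesel.
```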

But even a battery can’t take up a load instantaneously if the mains cuts out suddenly. That’s why the UPS is always on, serving a dual function. All the power that comes into the building goes through the UPS first, so really the UPS “serves” power to the data centre – it’s just being constantly fed. As soon as it detects no feed, it alarms – as if to say “GENERATOR, QUICK, GRAB THIS, I CAN’T HOLD IT BY MYSELF”. The generator throws away its sandwich and jumps into action.
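Here’s a crude sketch of that “always inline” arrangement: the load is fed from the UPS at all times, the mains just keeps the batteries topped up, and losing the mains triggers an alarm and a generator start rather than a switchover. It’s a toy model, not how any real UPS controller is built.

```python
# Toy model of an online (always-inline) UPS -- illustrative only.

class OnlineUPS:
    def __init__(self, runtime_minutes=30):
        self.minutes_left = runtime_minutes
        self.generator_running = False

    def tick(self, mains_ok, minutes=1):
        """One step of the loop: the load is ALWAYS fed from the UPS."""
        if mains_ok or self.generator_running:
            # Batteries are being topped up; stored runtime isn't being consumed.
            print("Load fed from UPS, batteries being topped up.")
        else:
            self.minutes_left -= minutes
            print(f"MAINS LOST. Running on batteries, {self.minutes_left} min left."
                  " Alarm raised: GENERATOR, QUICK, GRAB THIS.")

    def generator_online(self):
        self.generator_running = True
        print("Generator picked up the load. Batteries recharging.")

ups = OnlineUPS()
ups.tick(mains_ok=True)     # normal day
ups.tick(mains_ok=False)    # substation goes down at 4am
ups.generator_online()      # a minute or so later, once the diesel spins up
ups.tick(mains_ok=False)    # still no mains, but nobody downstairs noticed
```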

[Image: defibrillator]

Dammit UPS! Don't you quit on me!

The UPS also serves to condition the power coming into the building. If we imagine the power coming in like mains water, then the UPS is a tap that can set the flow the way you need it. Complex built environments like this don’t appreciate spikes in the flow of energy, so the UPS keeps it at a nice, steady stream. It’s a big version of surge protection, like the type provided by the more expensive power strips for your home.
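If it helps, the tap analogy boils down to something like a clamping function: whatever arrives from the street, the downstream equipment only ever sees values within a narrow band. The 230 V nominal and 6% band below are illustrative numbers only, not anyone’s actual spec.

```python
# Trivial sketch of "conditioning": clamp a noisy input to a steady band.
# Illustrative numbers only (230 V nominal, +/-6% band).

NOMINAL_V = 230.0
BAND = 0.06  # allow 6% either side of nominal

def condition(sample_v):
    low, high = NOMINAL_V * (1 - BAND), NOMINAL_V * (1 + BAND)
    return max(low, min(high, sample_v))

noisy_mains = [231, 228, 262, 230, 196, 229]   # spikes and sags from the street
clean_feed = [condition(v) for v in noisy_mains]
print([round(v, 1) for v in clean_feed])        # [231, 228, 243.8, 230, 216.2, 229]
```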

And the rest

[Image: facilities team]

"If we take out this wall, we can probably fit in more big black rectangles"


Of course, all of this hardware is manned, maintained, checked and re-checked by a large team of operations staff, going 24 hours a day. This is an industry that supports engineers, network technicians, fire techs, telecoms techs, plumbers, electricians, mechanics, steamfitters and more. Even carpenters are employed, to build access hatches to roofs and floors and to maintain fire-rated doors. Glaziers are called in to build hot and cold cells within the server room. Specialty cleaners are employed to keep the systems free of harmful substances. It all runs on a 52-week constant maintenance schedule, and must adhere to a battery of legislative guidelines and ratings covering energy efficiency, systems redundancy and hazardous material handling. Most of the effort in data centre management goes towards performance, but energy efficiency is a growing focus.

It’s a quiet, contemplative life, and the clothing tends to be pretty casual. It’s Nerdvana.