The Internet is a disparate, interconnected sprawl of otherwise unrelated networks. But have you ever taken a moment to consider where the machines that serve the Web pages and handle your email actually reside? High-specification buildings called data centers house most of today's business-critical Internet infrastructure. And the technology inside is like something out of a futuristic blockbuster movie.
It wasn't always so. In the early days, the Internet was nothing more than a few organizations with interconnected networks, and most of its associated equipment was kept in relatively low-specification server rooms. That equipment generally consisted of a few servers, a router and some relatively basic networking equipment. These days, however, even the smallest businesses can afford to keep their servers in the highest-specification environments, with features that previously only military installations or multinationals could boast. In other words, data centers.
So what is a data center? And what does it take to keep one running? Some of the answers to these questions may surprise you.
5 Factors of Data Center Uptime
A data center is all about one thing: uptime. Clearly, being able to process mammoth amounts of data is of no use if that data is unavailable to those who need it. There are several critical cogs involved in ensuring these systems run like clockwork. Essentially, these measures turn what seems like mission impossible into mission accomplished.
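Uptime like this is commonly quantified in "nines" of availability, a convention the article doesn't name but that the industry uses everywhere: 99.999% availability leaves only a few minutes of permitted downtime per year. A quick sketch of the arithmetic:

```python
# Convert an availability percentage ("nines") into the maximum
# downtime it permits per year. "Five nines" (99.999%) allows
# only a handful of minutes of outage annually.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def max_downtime_minutes(availability_pct: float) -> float:
    """Return the yearly downtime budget for a given availability."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.0, 99.9, 99.99, 99.999):
    print(f"{pct}% uptime -> {max_downtime_minutes(pct):.1f} min/yr")
```

Every extra nine shrinks the downtime budget tenfold, which is why the redundancy described below gets so elaborate.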
1. Physical Security

Picture the scene. At the side of a street there's an innocuous, unmarked warehouse sitting among many other large, white-sided warehouses. Like all the other buildings in the industrial zone there's an abundance of air conditioning vents protruding from the side of the building near the roof. Otherwise, this building might be storing planks of wood for all you can tell from the outside. Take a slightly closer look and you might see a few more security cameras dotted around its perimeter, some with infrared attachments. To a potential miscreant with nefarious intent, this building probably holds no interest whatsoever.
Step up to the front entrance, in full view of several day and night cameras, and you might be surprised to spot swipe card reader units. One further step inside the entrance would reveal retina scanners and hand-sized fingerprint scanning devices.
Even if you were somehow allowed over the threshold by security staff, there are very strict access controls here - both seen and unseen - as soon as you set foot through the outer perimeter. Even many of the site's authorized staff are limited to certain areas, and a variety of procedures split the building into very distinct sectors. Meanwhile, 24/7 security personnel strictly limit access to any part of the facilities to all but a select few.
The internal areas of the building are also sectioned off in case of physical attacks on the structure of the building itself, such as the roof. Noisy vibration-sensing alarm systems alert staff to such break-in attempts.
What exactly is under protection here? A data center, of course: literally a warehouse stocked with the high-powered computer systems, storage and related components upon which many companies and individuals - not to mention the Internet as a whole - rely for computing power.
With this level of sophistication, you'd almost expect to see Tom Cruise making an entrance through a hole in the ceiling! But all this security is hardly overkill. In fact, it's not uncommon for prospective thieves to drive vehicles into the external walls of data centers in an attempt to make off with expensive server equipment.
2. External Connectivity

The vast majority of data centers connect out to the Internet in one form or another (whether privately or publicly), so there's little point in their existence if external connectivity isn't top notch.
When it comes to the external connectivity used by data centers, several business models have been adopted, and these usually depend on the type of business that operates the data center. A common selling point is one of carrier neutrality. In other words, these data centers tend to house many different businesses of all sizes, each of which is offered a choice of bandwidth provider within the data center environment.
If one enterprise owns a data center outright and doesn't follow the co-location model, which is one of sharing the location with other businesses, external connectivity providers may be chosen deliberately, rather than being left up to the businesses using the data center. These choices might also be replicated across other data centers owned by the same enterprise.
A network operations center (NOC) will also exist in one form or another depending on the services fulfilled by the data center. Think of a room filled with bright, blinking and constantly updating screens staffed by engineers squinting relentlessly at them looking for any anomalous data event that might cause a problem.
There's even something happening under the floor: Because the network cabling may run for hundreds of miles, underfloor cables are meticulously labeled and documented within an engineering database. Sometimes data signals are amplified using switching equipment so that data can successfully travel the considerable distances around the facilities.
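The cable documentation mentioned above can be pictured as a simple lookup of labeled runs. The record format below is purely hypothetical - real facilities use dedicated DCIM (data center infrastructure management) software - but it shows the idea of tying a physical label to both endpoints:

```python
from dataclasses import dataclass

# Hypothetical minimal record for documented underfloor cabling.
# Field names and the port notation are illustrative assumptions.

@dataclass
class CableRecord:
    label: str        # physical tag printed on both ends of the cable
    from_port: str    # e.g. "RackA01:sw1:eth12"
    to_port: str      # e.g. "RackC07:patch3:port4"
    length_m: float   # documented run length under the floor

inventory: dict[str, CableRecord] = {}

def register(rec: CableRecord) -> None:
    """Add a cable to the database, refusing duplicate labels."""
    if rec.label in inventory:
        raise ValueError(f"duplicate label {rec.label!r}")
    inventory[rec.label] = rec

register(CableRecord("FLR-00421", "RackA01:sw1:eth12",
                     "RackC07:patch3:port4", 87.5))
print(inventory["FLR-00421"].to_port)
```

With thousands of identical-looking cables under one floor, a unique label on each end is what makes a fault traceable in minutes instead of days.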
How the external connectivity enters the building also needs to be carefully considered. Multiple external fiber duct points are usually designated to fiber providers. Once fiber is installed, the carriers can provide their bandwidth on top of the fiber entering the building via more than one point, so that a problem on one side of the building is less likely to cause a service outage everywhere.
3. Cooling Systems

If you understand how much the equipment in a data center is worth and how critical it is to the businesses that rely on it, the ultra-tight security measures make perfect sense. But there's another critical aspect to any data center that's less obvious: cooling systems.
The thousands of servers running within a data center generate a significant amount of heat. That makes HVAC (heating, ventilation and air conditioning) an essential system in keeping data centers running. If equipment gets hot or is subject to any rapid temperature change, it can fail without warning. It should go without saying that these complex environmental controls are of paramount importance to servers and their associated systems. That's why dry eyes and throat are an occupational hazard for any onsite engineer!
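Note that the paragraph above flags two distinct dangers: absolute heat and rapid temperature change. A monitoring sketch along those lines, with illustrative thresholds (real facilities tune these to guidelines such as the ASHRAE recommended envelope):

```python
# Illustrative thresholds only -- actual alarm limits vary by facility.
MAX_TEMP_C = 27.0          # absolute ceiling for inlet air
MAX_DELTA_C_PER_MIN = 2.0  # flag rapid swings, not just heat

def check_inlet(readings: list[float]) -> list[str]:
    """Scan per-minute inlet temperatures and return alarm messages."""
    alarms = []
    for i, temp in enumerate(readings):
        if temp > MAX_TEMP_C:
            alarms.append(f"minute {i}: {temp:.1f}C over ceiling")
        if i > 0 and abs(temp - readings[i - 1]) > MAX_DELTA_C_PER_MIN:
            alarms.append(f"minute {i}: swing of "
                          f"{abs(temp - readings[i - 1]):.1f}C")
    return alarms

print(check_inlet([22.0, 22.5, 26.0, 28.0]))
```

The sample trace trips both rules: a 3.5-degree jump at minute 2 (still below the ceiling, but dangerously fast) and an over-temperature reading at minute 3.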
4. Fire Protection
With millions of dollars of equipment in one building, data centers take serious precautions to prevent fire. That means strict protocols and serious consequences for anyone who breaks them, such as by leaving a cardboard box in a server room.
Fire walls are another measure here, and we're not talking about the fancy firewall box that sits within the racks and protects the data center from hackers. Nope, we're talking walls for fire here, the exceptionally thick, purpose-built walls that physically separate different facilities rooms (one room alone might host 10,000 servers). In the event of a fire in one room, the equipment in another should stay safe for a guaranteed number of hours, even if its surrounding walls are subjected to unfathomable levels of heat from a raging fire. Cool, huh?
5. Power Continuity
Clearly there's a lot more to the innards of that innocuous-looking building sitting in the middle of the otherwise gray and insipid industrial area. One key aspect that must be mentioned is power, because not much happens in a data center if there's no electricity available.
Maintaining power is no small task, as brownouts and blackouts are common across both business and residential zones in many of the world's major cities. That's why data centers employ several methods to try to ensure a continual power supply, no matter what.
First, utility power from the national grid will be fed into the building via multiple points from separate substations, so an alternative supply remains available if one feed is cut off. Then, an uninterruptible power supply (UPS) ensures that power can be seamlessly switched to a secondary supply if the primary one fails.
That secondary power supply usually comes from diesel generators with refillable fuel tanks, so they can be topped up during a prolonged power outage. The downside is that generators tend to be needed in an urgent situation, yet it takes a few minutes for them to get going and actually pump out power. The UPS, therefore, might need an entire basement full of batteries, similar to those you would find under the hood of a car, to power the affected facilities room for a whole five minutes until the generator is ready. Once the generator is up and running, the batteries slowly and dutifully begin recharging so that they'll be ready for the next outage or regularly scheduled maintenance test.
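The handoff just described is a timing problem: the battery bank must outlast the generator's spin-up. A sketch of that timeline, with the article's five-minute battery figure and an assumed (illustrative) two-minute generator start:

```python
# Sketch of the UPS-to-generator handoff. The battery figure comes
# from the article; the generator start time is an illustrative guess.

BATTERY_RUNTIME_MIN = 5.0   # how long the UPS bank can carry the room
GENERATOR_START_MIN = 2.0   # assumed spin-up time before taking load

def power_source(minutes_since_outage: float) -> str:
    """Which source carries the load at a given point in an outage."""
    if minutes_since_outage < GENERATOR_START_MIN:
        return "UPS batteries"
    return "diesel generator"

def handoff_is_safe() -> bool:
    """Batteries must outlast the generator's start-up time."""
    return BATTERY_RUNTIME_MIN > GENERATOR_START_MIN

print(power_source(0.5))   # UPS batteries
print(power_source(3.0))   # diesel generator
print(handoff_is_safe())   # True
```

If the margin between those two numbers ever shrinks to zero - aging batteries, a sluggish generator - the room goes dark, which is why the maintenance tests mentioned above are run so religiously.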
Data Center Design
With a firm footing in both military and traditional telecom backgrounds, data center design is not for the faint-hearted. In fact, unless you've worked in a data center, you probably had no idea there was so much going on inside. All that complexity is a good thing. It means that today's Internet is backed by some pretty serious technology, and for good reason: Vast amounts of money are at stake if a data center fails to deliver. You might say that it's as much mission impossible as it is mission critical to keep a data center running day and night. Fortunately, technology usually saves the day.