Is your web host’s data center primed for disaster recovery? A “disaster” can mean any number of things, and some data center operators never saw these whoppers coming. You know the basics: floods, fires and maybe a tornado, hurricane or earthquake if a data center sits in a region where that’s a possibility. But what about a wayward vehicle crashing into the data center? It’s happened, even though the relative scarcity of data centers in North America compared to other types of buildings makes this scenario improbable. There are just over 1,000 data centers in the US, many of them serving web hosts, and there’s at least one confirmed vehicular crash that caused a major disaster.
When Jonathan Bryce founded Mosso Cloud in the Dallas area, he was prepping for the winter holidays when disaster struck. On December 18, 2009, a diabetic driver near the Rackspace data center lost consciousness at the wheel and drove straight into the part of the facility that housed the electrical transformer hardware. Was Bryce prepared? “It’s just one of those things you have to cope with as best you can,” he says.
Here are a few more unexpected “surprises” that data center managers find tough to see coming:
A Fire Nobody Expected: A fire in a data center is a reasonable concern, but sometimes it can still blindside you. In February 2014, Iowa’s main data center experienced an electrical fire. Technically, this possibility was in the state’s disaster plan, but given the rough winter, everyone was armed and ready for blizzard recovery instead. Iowa’s CIO, Robert von Wolffradt, blogged on GovTech.com that when power went out and smoke drifted in at 3pm, the alarm triggered the facility’s fire suppression system.
Ultimately, the fire was contained to a small electrical box, but it took many hours to fully restore power. Staffers weren’t allowed back inside for almost four hours, and von Wolffradt was in the unfortunate position of making the payday call: there wasn’t much time left to make the requisite $162 million in payments to vendors, employees and citizens. Everything was up and running by 9pm, though powering the hardware back on carried real risk given how much of it had overheated, melted or been destroyed outright. A backup data center was secured temporarily just in case, and payments were posted by 11pm.
He was also charged with restoring service desk systems, the Department of Transportation cameras monitoring the snowstorm, virtualized applications and financial systems. Von Wolffradt explains, “We leveraged our Homeland Security’s voice notification system to update agency directors and key staff twice during the event.”
When Samsung Went Up in Flames: In April 2014, Samsung SDS’s data center in South Korea experienced a fire like no other. Staffer Jaehwan Cho captured images that quickly went up on his personal Twitter feed. The intense heat sent pieces of the building falling, but luckily nobody was seriously hurt. It wouldn’t have been such an unusual fire, except that it ultimately reached Samsung’s users: there were several accounts of tablets, smart TVs and smartphones losing data permanently, and users couldn’t access content for many hours, which led Samsung to formally apologize and possibly lose loyal customers for good.
If It Walks Like a Duct…: Fireworks began early on July 3, 2009, in the electrical vault at Seattle’s Fisher Plaza. When fire broke out in the vault, it took down Dotster’s domain registration service, Microsoft’s Bing Travel feature, Authorize.net payments, web host AdHost and a slew of other clients, and it took a full day to restore power. According to the Puget Sound Business Journal, both AdHost and Geocaching were back online by 10am the next day, but many other clients had to wait even longer, which was detrimental to some given the holiday. Fisher Communications paid $10 million to repair and replace equipment, but the damage to its client relationships was permanent in some cases.
The Sandy Damage Nobody Talks About: It’s no surprise that Hurricane Sandy impacted data centers. One of the less talked-about victims is Peer 1 Hosting, located in Lower Manhattan. The center was prepared for disasters, just not one of this magnitude. There were plenty of backup generators perched above the 18th floor, but they depended on an emergency fuel pumping system in the basement (it seems nobody thought that flood plan through). With that system underwater, its electrical circuits couldn’t function, and under a post-9/11 New York City rule, only a minimal amount of fuel may be stored in commercial buildings.
With no fuel reaching the generators, Peer 1 Hosting had to shut down all systems. Employees worked diligently to save data, and a bucket brigade was formed to hand-carry fuel to each of the generators. Step by step, fuel was hauled up to the 17th floor and poured into the generator tanks. The effort was enough to spare clients any long-term impact, but it took 25 people carrying fuel up the stairs from October 30 through the 31st, a Halloween scare nobody expected.
However, by noon on October 31 the tank was full and the team could finally take a break. The lunch, of course, was also hand-delivered from across the Brooklyn Bridge. A bucket brigade wasn’t part of Peer 1’s disaster recovery strategy, but who knows? It may be today, and Peer 1 remains one of the few companies that Sandy didn’t shut down.