"Snowtober” on the East Coast has raised many questions about the managerial reliability of several major utilities. It is overwhelmingly evident that these power companies were ill-prepared for snowfalls ranging from eight to more than 12 inches of heavy wet snow so early in the season that trees still had almost all of their leaves. However, it is the exceptionally poor response to the disaster that is most disconcerting.

The storm struck a large swath of the Northeast on Saturday, October 29, 2011, and I’m still without power a week later. Writing this column gives me reason to look back and reflect not just on how unprepared we all were, but on how much worse the response to the disaster has been.


I have to chastise my local utility. I routinely lose power monthly (sometimes weekly) for durations long enough to require resetting all of our electronic clocks. Because of this, I openly state that I did not have a very high opinion of the utility to begin with.

A review of the events relating to the October snowstorm highlights many of the errors in the utility’s response.

The storm was first forecast on Thursday, October 27th. From everything I have been able to learn, the local utility waited until Sunday morning after the storm had passed, and the massive destruction had occurred, to call other utilities for help.

Late Sunday afternoon, I passed the first truck caravans of crews driving up I-81 in West Virginia, but I thought it strange that we passed only four utility line-repair trucks and more than 50 independent contractor tree-trimming trucks. At the time, I could not help but think: if so many trees were down, where were the crews to repair the lines? I am still wondering.

Over the next three days, virtually no information came from the utility, only radio news reports. Worse yet, not a single utility truck was seen in or around our town until late Thursday (day four), but it was so late in the day that work did not start in earnest until Friday. Beginning Wednesday, the utility’s PR spokesmen started spinning the disaster.

Their daily announcements on local morning radio went something like this:

Wednesday morning: “All customers will be restored by midnight Thursday.”

Thursday morning: “95 percent of our customers will be restored by midnight Thursday and the remaining 5 percent by midnight Friday.”

Friday morning: “Our remaining customers will be restored by midnight Sunday.”


Wouldn’t it be great if we could just make up schedules in our data center world? Then, when the schedule turned out to be wrong, and every revision after it was wrong, there would be no repercussions. Our lives are not like that. We have to have integrity with our customers. Our world is one where honesty matters; unfortunately, the utility does not bear the same burden.

The outage reporting system was another fiasco. Obviously, I expected long waits to get a person. What I didn’t expect was that waits would be generally less than 10 min, and that the staff answering the phones so quickly would know absolutely nothing and have no resources to get any information. From what we could tell, these agents seemed to be taking written notes but not inputting information into any kind of computer system. Hopefully that was not the case. Most of the call center operators seemed to want to provide information; however, at least two were downright nasty.

Interestingly, around 11:00 p.m. on Thursday, the operator quoted, in the course of 5 min, five different figures for how many people were still without power in our area. We were told everything from 300 to 25,000 remained without power (including us). Obviously the utility was totally clueless.

So clueless, in fact, that the utility started calling customers to find out if their power was restored. The problem here was that it was an automated system: Press 1 for YES and 2 for NO.

The way the survey was worded led people to believe their power had been restored, so they checked out of their hotels and drove home, only to find the power still out and their hotel rooms gone.

This was just another good idea gone awry. If they had just added a third option, “Press 3 for I don’t know,” much of the confusion could have been avoided. Only when Governor Christie intervened did the utility stop the surveys.

In addition, people who had reported outages were mysteriously being relisted as having power restored. Several of our neighbors experienced this, so we made it a point to call at least once a day to keep our loss of power established in the system.

Perhaps the most ridiculous thing we heard was the utility’s PR suggestion to get updated information on the website. Hey Mr. Utility, wake up! No power means no cable and no internet for most of the world, especially after smartphone batteries went dead.

Once again we see a critical supplier rely on the internet as the panacea to answer questions.

What were they thinking?

Probably the strangest thing we heard all week was the message from the head of the utility telling us how hard 4,000 of the utility’s employees were working 24x7 to restore power to over 360,000 customers. Let’s do a little math here. If every employee were working as a lineman, each would have to individually restore power to 90 customers. But 10 percent of those workers were management (pencil pushers), 15 percent were call center employees (mentally the toughest job), and maybe another 10 percent were tree-trimmer contractors. That leaves about 2,600 workers actually making repairs, which represents a ratio of 138 customers for every worker. Obviously the head of the utility was just throwing out numbers to impress himself without thinking about reality.
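The back-of-the-envelope math above can be checked in a few lines. This is just a sketch: the headline figures (4,000 employees, 360,000 customers) are the utility’s own, while the workforce percentages are my rough estimates, not published numbers.

```python
# Sanity check of the restoration-workforce arithmetic.
# 4,000 employees and 360,000 customers are the utility's figures;
# the workforce breakdown percentages are rough estimates.
total_employees = 4000
customers_out = 360000

# Naive ratio if every employee were a lineman
naive_ratio = customers_out // total_employees          # 90 customers each

# Subtract non-repair staff (integer math to keep the counts exact):
management = total_employees * 10 // 100                # ~10% management
call_center = total_employees * 15 // 100               # ~15% call center
contractors = total_employees * 10 // 100               # ~10% tree trimmers
repair_workers = total_employees - management - call_center - contractors

actual_ratio = customers_out / repair_workers           # ~138 customers each
print(naive_ratio, repair_workers, round(actual_ratio))
```

Even under these generous assumptions, each repair worker would be responsible for well over a hundred customers.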

The best example of their ineptness came Sunday morning (a week later). Power had been restored the previous evening at about 8:30 p.m. The family rejoiced at having heat, lights, and well water back; however, at 9:55 a.m. Sunday, the power went out again.

After holding for up to 40 min to report the new outage, an operator argued with me that the original ticket was still open (it should have been closed the previous night). Finally she agreed to issue a new ticket and have a supervisor call me (a 4-hr wait). Two hours later, an out-of-state utility truck pulled into my driveway and started following the lines back. I caught up to him a block over, where he stopped to chat with a crew working on another line.

I found out that he had to start with my location because he had no drawings or maps of the electrical distribution system. He was effectively working blind. He was a nice guy, but gave me the usual line that another crew was “probably working upstream” of me. Like everyone else we talked to that week, he really did not know.

In summary, the utility lied about its statistics, provided non-information, missed the restoration-of-service schedule by days, and did not give its front-line people the tools they needed to work effectively. In short, the utility is so poorly managed that its franchise needs to be revoked and awarded to a company that cares about delivering reliable service.


The utility, however, is only part of the picture. As users, we have a responsibility to protect our own assets during a disaster. I learned some lessons in this regard.

For over 30 years, I have been involved in data center reliability and fostering continuous operations no matter what the disaster; however, my home preparations would remind anyone of the metaphorical “shoemaker’s kids.” You know, the ones who went shoeless.

In looking back, all the routine outages we experienced should have tipped me off that the utility service was weak. Instead, like most people, I bought a few consumer UPS systems to protect the family’s electronics (those loss-of-power beeps can drive you crazy) and called it a day. In 30 years, the longest we had been without power was 14 hrs, in the dead of winter, and we survived.

So how prepared were we? Not that great. We are in a semi-rural area with a well for water, a fireplace, an oil-fired boiler, hot water coils in the air handlers, cable, and no cell coverage without the extender that runs off the cable. For 15 years, we talked about getting a generator to run the house (half my neighbors have them) but always thought that we would be moving south before we would need one. Well, this storm blew that idea out the window.

By Tuesday, I started looking for a generator. Alas, they were all long gone. Fortunately, I had good resources, and my boss helped me locate a 6.5-kilowatt model at a nearby out-of-state location. Unfortunately for me, every retailer and industrial supply store I stopped at was sold out of the 30-amp L14 plug I needed to hook the generator up to the house.

I thought better of hotwiring the generator. A day later, after another trip out of state, I had the plug. Fortunately, I had also picked up a good voltage tester, because I miswired 240 V to the plug’s neutral. Without the voltage tester and a healthy respect for safety, I could have done some real damage. Finally I got it right: the generator switched in and fed power to half the house, including the well and the furnace.

It was the evening of day four, and we were jumping for joy just to have water, hot water, and some lights.

The generator just could not handle the motor load of the air handler. Fortunately, the weather remained mild. Last, the cable company was out of business without power, so we remained without cable, which also meant no cell phone coverage.
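A rough load budget shows why a 6.5-kW portable unit can carry a well pump and furnace yet stall on an air-handler motor: induction motors commonly draw several times their running watts at startup. The wattages and the 4x inrush factor below are illustrative assumptions, not measurements of my equipment.

```python
# Illustrative generator load budget. All wattages are assumed typical
# values; a motor's startup (locked-rotor) draw is commonly 3-6x its
# running watts, and 4x is used here as a round assumption.
GENERATOR_WATTS = 6500

loads_running = {
    "well pump": 1000,
    "oil burner": 500,
    "refrigerator": 700,
    "lights/electronics": 500,
}
air_handler_running = 1500     # assumed blower-motor running watts
startup_multiplier = 4         # assumed inrush factor

base = sum(loads_running.values())                          # steady load
peak_with_air_handler = base + air_handler_running * startup_multiplier

print(base, peak_with_air_handler)
# The steady load fits well within the 6,500-W rating, but the air
# handler's startup surge pushes the transient demand past it.
```

Under these assumptions the steady load is about 2,700 W, but the air handler’s startup surge pushes the transient demand near 8,700 W, which is why the generator tripped on that one load.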

So, this experience has added to the “honey-do” list. First, get a licensed electrician to install a fully rated transfer switch and quick connect for the portable generator. Second, buy a generator that can run the full house. Third, convert the fireplace to gas. Fourth, go back to free television over the airwaves. And fifth, start thinking about what other disasters could befall us and prepare.

How much can I get done before the next disaster?