The airport said it had back-up generators that had restored power relatively quickly following the brief power cut.
However, the cut affected the security systems that communicate with UK Border Force, as well as the baggage systems, neither of which is designed to be turned off, and both take time to get back up and running.
Mr Woodroofe said the airport would investigate what had caused the fault.
The fault? Meh. Why isn’t there a UPS on the systems that take time to reboot?
Baggage system I can believe. I would not be surprised if they need to clear the physical bags out before it restarts.
Shirley everything else is just computers and routers that just reboot?
No Otto, don’t be silly, you’ve seen systems in the past that screwed themselves because they were so badly written that they couldn’t reset and recover cleanly.
There’s going to be a lot of demand for UPS and power storage as the net zero madness gathers pace. A home diesel generator and log-burner will become must-haves.
Plus expect half of Brexit to disappear, starting with fishing, in 2026 when the EU’s demands in return for allowing Britain to buy electricity at vast cost kick in.
Early this morning, imported electricity was providing nearly 17% of our power needs, slightly more than renewables. That’s higher than usual but not as high as it is going to be when Labour ‘decarbonises’ the grid.
Early this morning, imported electricity was providing nearly 17% of our power needs
It’s 22% now.
When they had that power cut in London a few years back, the trains stopped, and couldn’t be restarted until someone went out with a laptop and rebooted them. It had never occurred to the train designer that there might be a power cut. D’oh!
Another issue with backup power (this wasn’t UPS, as UPS is uninterruptible, i.e. continuous) is that lights and motors typically have far higher startup demands than when running. So you cannot restart everything as-is.
When we did the monthly test on the backup diesel generators at a site I worked in, you had to run around (in the dark, by torchlight) turning everything off first, to get the startup surge down. As everything came back on gradually, you turned more and more on until you were back to square oneski.
The neighbouring companies hated the noise and the fumes (clean running once going, but the startup smoke clouds were impressive…)
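A back-of-envelope sketch of that staged-restart problem, using entirely invented load figures and a rough rule of thumb that motors draw several times their running load on start-up:

```python
# Back-of-envelope sketch (all figures invented) of why you switch
# everything off before the genset picks up the load, then bring it
# back in stages, big motors first. Assumes motors draw roughly six
# times their running load at start-up.

running_kw = {                 # hypothetical site loads, in start order
    "air_handler_1": 20,
    "air_handler_2": 20,
    "conveyor_1": 20,
    "conveyor_2": 20,
    "lighting": 40,
    "it_and_comms": 30,
}
inrush_factor = {              # rough start-up multipliers (assumed)
    "air_handler_1": 6.0,
    "air_handler_2": 6.0,
    "conveyor_1": 6.0,
    "conveyor_2": 6.0,
    "lighting": 1.5,
    "it_and_comms": 1.2,
}

running_total = sum(running_kw.values())          # 150 kW steady state
genset_capacity = running_total * 1.3             # ~30% headroom: 195 kW

# Everything left switched on when the genset closes onto the bus:
all_at_once = sum(kw * inrush_factor[n] for n, kw in running_kw.items())
print(f"Genset capacity:           {genset_capacity:.0f} kW")
print(f"Simultaneous restart draw: {all_at_once:.0f} kW (stalls the set)")

# Staggered restart: one load at a time, so only one inrush is ever
# stacked on top of whatever is already running.
already_on = 0.0
for name, kw in running_kw.items():
    peak = already_on + kw * inrush_factor[name]
    verdict = "OK" if peak <= genset_capacity else "OVERLOAD"
    print(f"start {name:<14} peak {peak:5.0f} kW -> {verdict}")
    already_on += kw
```

With these made-up numbers a simultaneous restart wants several times what the genset can supply, while switching things back on one at a time (motors first, while little else is drawing) stays within capacity.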
“A home diesel generator and log-burner will become must-haves.”
Tick, and Tick.
Our car is a diesel. Should we install a big diesel tank on our front lawn? Two birds with one stone.
But would the planning regs allow it? It’s not as if it’s easy to disguise a big tank.
There may well be a technical explanation but this is Manchester Airport so one can safely assume that the management doesn’t give a shit anyway.
TtC: It was worse than that. In those trains a previous software version did allow the driver to reboot the train but for some reason that option was removed in an update.
If we got another frequency wobble like we did, then I wonder how much local wind & solar would now be forced off the grid by their mandated safety mechanisms. It was several hundred MW at the time.
Anyone remember that site where the diesel gennies on the roof got their fuel from a tank in the basement. Fed by electric pumps…
Disaster recovery was always a tick box which was hard to get anyone to take seriously.
“It had never occurred to the train designer that there might be a power cut”
Which also applies to the numpties at Openreach who are determined to push everyone onto fibre broadband & VOIP telephony. Virtually all of which are routed via street corner cabinets, powered by the local mains supply with no diesel gensets as backup. O.K. during short-duration power outages, but in a longer cut – and once the back-up batteries die – customers quickly lose ALL communication. And don’t think your ubiquitous mobile phone will help, because their base stations use exactly the same technology…
“Anyone remember that site where the diesel gennies on the roof got their fuel from a tank in the basement. Fed by electric pumps…”
Or a certain nuclear power station where the emergency diesel generators were below ground level, and a tsunami flooded them?
“And don’t think your ubiquitous mobile phone will help, because their base stations use exactly the same technology…”
Indeed. Most base stations have a very small battery to guard against power glitches & switching transients: minutes at best.
But with 20K+ base station sites, deploying power-cut cover – incendiary bombs (3-hour battery packs) or diesel generators and fuel farms – would be impracticable. Think of all the ‘stop the mast’ protests!
And you’d have to use the diesel generators every week or two, to check they still work, and various types haven’t nicked them, or the fuel.
The first big power cut is going to be a real flight-of-fancy meets concrete-pavement-of-reality. The survivors will not be happy bunnies.
I have a nasty feeling that the firmware in my colour laser printer AND the boot drive in my PC were fucked over by a short-duration power glitch. Bastards.
@Marius
Early this morning, imported electricity was providing nearly 17% of our power needs,
… and early on Sunday morning it was 36.4%
Dave: A weather event the other year took out one of the LV circuits in our village to which our fibre cabinet is connected. We still had power and the battery in the cabinet lasted about 18 hours before it died. I was impressed as I expected it to die much earlier. I have a couple of UPSs on kit – one lets my server shut down gracefully. The other keeps the VDSL, firewall and one AP going for about 1/2 hour, enough to hopefully find out from UK Power Networks when we might be back on.
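For the “one lets my server shut down gracefully” part, a minimal sketch of the usual approach, assuming Network UPS Tools (NUT) is installed and the UPS has been configured under the made-up name “myups”. In practice NUT’s own upsmon daemon does this for you; this just shows the logic:

```python
#!/usr/bin/env python3
"""Sketch only: poll a NUT-managed UPS and shut the box down cleanly
once it has been on battery long enough for the charge to run low."""
import subprocess
import time

UPS = "myups@localhost"          # hypothetical UPS name in NUT's config
SHUTDOWN_BELOW_PCT = 40          # don't run the battery flat
POLL_SECONDS = 30

def upsc(variable: str) -> str:
    """Read one variable from the UPS via NUT's upsc client."""
    out = subprocess.run(["upsc", UPS, variable],
                         capture_output=True, text=True, check=True)
    return out.stdout.strip()

while True:
    status = upsc("ups.status")           # e.g. "OL", "OB DISCHRG", "OB LB"
    charge = int(upsc("battery.charge"))  # percent remaining
    on_battery = "OB" in status.split()
    low_battery = "LB" in status.split() or charge < SHUTDOWN_BELOW_PCT

    if on_battery and low_battery:
        # Shut down while there is still enough charge to do it cleanly.
        subprocess.run(["shutdown", "-h", "now"], check=False)
        break
    time.sleep(POLL_SECONDS)
```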
On the subject of mobile phone sites: in urban and suburban areas the cost of real estate alone precludes large battery banks and/or generators. However, the overlap in coverage means you’d probably stand a chance of service unless it’s a very wide-area outage, and if that happens there are bigger problems for the authorities to worry about.
In rural areas it really isn’t worth the cost.
Some critical EE sites will have backup for their emergency services contract! But don’t expect to get access to coverage from those sites in a hurry, unless something has changed in the past few years.
I keep seeing power company vans with stickers that say “Power cut? Dial 155”.
I always think “Good luck with that.”
@ Dave Ward “Which also applies to the numpties at Openreach who are determined to push everyone onto fibre broadband & VOIP telephony. Virtually all of which are routed via street corner cabinets, powered by the local mains supply with no diesel gensets as backup.”
Apart from the fact that (I think, prepared to be corrected) it’s BT, not BTOR, who want to turn off the POTS (plain old telephone service), the plan is for full fibre to everyone … eventually. That’s entirely passive from the exchange (or the big green cabinet up the road from us, in the case of Fibrus) to the home. I expect the street cabinets used by alternative providers might well have the same problem, but services running over OR fibres should have the benefit of the sizeable batteries in the exchange, which by then won’t be needing to keep all the POTS kit going.
But it is true, in the meantime, there will be “many” people on VDSL services which do rely on the local green cabinet to work.
Those stuck with an ADSL connection will still be OK as the ADSL kit lives in the exchange and runs off the exchange battery – so just a local UPS needed.
As to Manchester Airport …
A UPS to keep “everything” running would be commercially unviable – it would be MASSIVE. So what would normally be done would be to provide localised UPS for systems liable to damage or corruption if improperly shut down (i.e. typically computers and the like) – and these would be just to keep them going for a short time (in case the power comes back quickly) and then shut them down cleanly. All areas would have emergency lighting – either by individual batteries in selected luminaires, or in some cases, a centralised battery system just for the lighting.
But for most stuff, it’s a matter of having processes in place to bring them back up when the power comes back. And for something like the baggage system, that could mean waiting for the servers to come back, clearing everything that’s on all the conveyors (possibly by manually diverting flows as belts are started up in sequence), and then feeding stuff through again. That will take time to do – after the power comes back on (whether mains or genny).
Like most things there’s a tradeoff. If it’s an infrequent event, then you simply deal with it if it happens because it’s not worth spending much more than the events cost in (hopefully) mitigating them with massive UPS systems. I say hopefully as I think many of us know of situations when the backups didn’t work as intended, mostly because it wasn’t practical to properly test them.
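A toy sketch of the “clear the belts, then bring them back in sequence” process described above – belt names, checks and timings are all invented, and a real baggage hall runs on PLCs and a proper control system rather than anything like this:

```python
import time

# Toy illustration of the restart sequence: wait for the back-end
# systems, then start belts from the downstream end, clearing each one
# first and staggering the motor starts.

BELTS_DOWNSTREAM_FIRST = ["sorter_outfeed", "screening_line", "check_in_feed"]

def servers_are_up() -> bool:
    # Placeholder: in reality you'd wait for the sortation/host systems.
    return True

def belt_is_clear(belt: str) -> bool:
    # Placeholder: physically confirm stray bags have been removed.
    return True

def start_belt(belt: str) -> None:
    print(f"starting {belt}")

def restart_baggage_hall() -> None:
    while not servers_are_up():
        time.sleep(10)            # wait for the servers to come back

    # Downstream belts first, so restarted belts have somewhere to
    # discharge to; stagger the starts to limit inrush.
    for belt in BELTS_DOWNSTREAM_FIRST:
        while not belt_is_clear(belt):
            time.sleep(30)        # manual clearing takes as long as it takes
        start_belt(belt)
        time.sleep(5)             # stagger motor starts

restart_baggage_hall()
```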
Not sure I understand this. Large data centres don’t have this sort of problem. For obvious reasons, they DO have substantial instant generator kick-in capability (with whatever is needed for that blip interim transition).
“The airport said it had back-up generators that had restored power relatively quickly following the brief power cut.”
Hmmm… Cost? An airport going off-line/crashing perhaps isn’t that important. And maybe that’s simply true wrt frequency of the event?
“Large Data Centres” have two substations, on separate local feeders with no common path (said the salesman, anyway…). They also have a large battery bank to cover the gap before the (very large) diesel generators start, and a suitable fuel farm and leakage silo and fire-fighting kit. They will also have refuelling positions (because under full load the fuel tanks will empty frighteningly quickly) and mobile generators on a truck that can be brought to whichever site needs them within a day.
This all needs to be tested regularly, and the batteries replaced every few years.
It all costs. A lot.
How much premium are you prepared to pay on your £25 GrovelJet ticket to fund all this, given that airport disruption is usually caused by strikes, snow, crashes, brylcreem bottles, strikes or strikes? When was the last power-cut induced one?
It’s the same reasons Stockholm, Montreal, Geneva and Moscow Airport have lots of snowploughs, and Heathrow doesn’t. Shouldn’t need explaining on this blog!
NB Large Data Centres do not however have anything in place for alien invasion, EMP or zombie hordes in search of Brians. Cost vs. realistic threat.
“many of us know of situations when the backups didn’t work as intended, mostly because it wasn’t practical to properly test them.”
Exactly.
There is no point spending anything at all upon backup systems unless you regularly use them ‘as if it were real’.
False saving No.1 is a minimal-backup system of inferior service. It’s inferior, so you never dare use it for real, ‘cos it’s inferior. Hence it never works when needed. “Flat Spare-Tyre Syndrome”.
You need two fully capable systems, and you have a time-based routine for which one you will use today/this week/etc as ‘live’ and t’other being your backup.
And even then, once in a while, you come along without notice and pull the switch and watch what happens. It’s an education.
Or you buy a pale-blue flannel comfort blanket. It’s just as effective as an ‘economy’ backup and much much cheaper.
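For what it’s worth, a trivial sketch of that “rotate which fully capable system is live” idea, with invented system names – the point being that both systems get exercised as the real thing on a schedule rather than one rotting as an untested spare:

```python
import datetime

SYSTEMS = ["system_a", "system_b"]   # hypothetical pair of fully capable systems

def live_system(on_date=None):
    """Pick the 'live' system from the ISO week number, so the roles
    swap every week and each system regularly carries real load."""
    on_date = on_date or datetime.date.today()
    week = on_date.isocalendar()[1]
    return SYSTEMS[week % 2]

def standby_system(on_date=None):
    return [s for s in SYSTEMS if s != live_system(on_date)][0]

print("live this week:   ", live_system())
print("standby this week:", standby_system())
```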