Blinding Glimpse of the Obvious

Quote from today's papers.

"Cyber attack contingency plans should be put on paper, firms told"​

Can anyone think of a reason why they shouldn't be doing this already?
 
Somewhere along the line, hard-copy contingencies became obsolete.
Yeah, perhaps I should have phrased the question as "Can anyone think of a GOOD reason why they shouldn't be doing this already?"
 
Agreed. That is what I was trying to imply; it's hard to convey sarcasm in text!
 
Back in the 2000-2009 decade, the U.S. Navy started their Continuity of Operations plan which included building an alternate operations site, beefing up the network long-haul segments, setting up data replication to the alternate site, and arranging for alternate site consoles to be visible on an "inside" network so that both primary and secondary sites could be controlled from the same place. Of course, the contingency plans WERE formulated in Word and Excel. We had a Lt. Commander in charge of that project. He worked from paper only in the weekly status meetings.

MANY years ago, in my first "real" job, a co-worker named Wayne gave me a little bit of wisdom: "Paper has a better memory than people." I took that to heart while I was still in the petroleum industry.

Our design guide was always on paper and organized so that each major component could be updated easily without having to reprint the whole document. So... 3-ring binders and 3-hole pre-punched paper, plus card-stock separators with color-coded tabs between each chapter. However, once I moved to the Navy job, their rules were different and EVERYTHING was electronically transferred. "When in Rome, ...."
 
I remember a stint at the phone company back in the '90s. Every week they printed a hard-copy list of all active customers. It took 3 boxes of greenbar.
 
Major cost savings in terms of the machines. I used to work for a department at Wells Fargo Auto Finance called Document Management. And wow, the level of sophistication of the machines that were used for mass printing, mass copying, mass scanning - a lot of that was let go with the digital age. Those machines were SO troublesome; you could have the most expensive machine in the world, but the "scanner guy" still had to come out every week and sit there for hours troubleshooting, fixing, billing...
 
In the 1960s, my mother worked for Southern Bell (yes, that was the name back then, not BellSouth as it is now). Their equipment included some incredible machines for creating metal plates for bill addressing - as in, align the plates, press down, type a name, address, city, state, etc. Once you had the plates, you ran them into the bill printer that would check a number on the plate, press the plate on the bill, and make a carbon impression on the invoice - but do it a couple of hundred per hour. Pitney-Bowes made those office behemoths back then, and Friden made the calculators that could add, subtract, multiply, and divide to 15 or 16 places. They also had bill-cutter hydraulic presses that could cut a stack of bills a couple of feet thick in one swipe. They also had bill-enclosing machines that would take a stack of empty envelopes and a stack of bills and merge them, fold the bills, stuff the envelope, and moisten the glue on the flap. Literally hundreds of bills per hour. Talk about document management!
 
Paper is still a necessary evil because not everyone has an electronic device to receive emails and text messages. Many companies are starting to charge extra for sending printed bills in the mail, and signing up for paperless billing and paying by credit/debit card will soon be the only options available.
 
Actually, when I started this thread I was far more focused. I can tell many tales of times gone by, but I was looking at the logic of what you do when you have no system. Even if you have secure back-ups of your data and apps, who knows where and how to find them, and how do we work without the IT system?

And, of course, back-up and recovery systems are no good whatsoever if you haven't proved and tested them regularly!
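To make that concrete: "proving" a backup usually means actually restoring it somewhere and checking the result on a schedule, not just watching the backup job report success. Below is a minimal sketch in Python of that idea; the paths, archive format, and manifest layout are all assumptions for illustration, not anyone's actual setup.

```python
# A minimal restore-and-verify pass: unpack each backup archive into a
# scratch directory and compare file checksums against a manifest written
# at backup time. All paths and the manifest layout are illustrative.
import hashlib
import json
import tarfile
import tempfile
from pathlib import Path

BACKUP_DIR = Path("/backups/nightly")       # hypothetical archive location
MANIFEST_SUFFIX = ".manifest.json"          # hypothetical: {relative_path: sha256}

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large restores don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_archive(archive: Path) -> list[str]:
    """Restore one archive to a temp dir and return any problems found."""
    manifest_path = archive.with_name(archive.name + MANIFEST_SUFFIX)
    manifest = json.loads(manifest_path.read_text())
    problems = []
    with tempfile.TemporaryDirectory() as scratch:
        with tarfile.open(archive) as tar:
            tar.extractall(scratch)          # the "prove it actually restores" step
        for rel_path, expected in manifest.items():
            restored = Path(scratch) / rel_path
            if not restored.exists():
                problems.append(f"missing after restore: {rel_path}")
            elif sha256_of(restored) != expected:
                problems.append(f"checksum mismatch: {rel_path}")
    return problems

if __name__ == "__main__":
    for archive in sorted(BACKUP_DIR.glob("*.tar.gz")):
        issues = verify_archive(archive)
        print(f"{archive.name}: {'OK' if not issues else issues}")
```

The point isn't this particular script; it's that the restore path gets exercised regularly, so the first real restore isn't also the first test.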
 
Even for small orgs a business continuity plan is needed - not just for the IT systems. Fires and floods can affect physical records too! And then, who is the backup for performing roles if key personnel are lost without warning? And then you have to maintain the plan, check it is workable, and keep backup copies that are secure - because they contain or provide key information about how to access accounts and systems, and those copies themselves need to be physically and electronically secured.
 
Even if you have a disaster recovery plan on paper, how can you quickly get your business back up and running if you don't have the equipment, personnel and customers available? When Hurricane Katrina devastated New Orleans, how long did it take for businesses to resume operations? How many permanently went out of business, despite having a backup of all their records? It's not just the preservation of records; you have to have workers, customers, other business partners, suppliers, equipment, the infrastructure of your area, and the areas you service ALL FUNCTIONING. That's the big picture!
 
Actually, when I started this thread I was far more focused. I can tell many tales of times gone by, but I was looking at the logic of what you do when you have no system. Even if you have secure back-ups of your data and apps, who knows where and how to find them, and how do we work without the IT system?

And, of course, back-up and recovery systems are no good whatsoever if you haven't proved and tested them regularly!

The U.S. Navy Reserve, as part of the Continuity Of Operations Plan (COOP, and yes, that was the official acronym), performed a yearly week-long test of switching operations and running from our stand-by site. Our first line of backup was an online data replication setup in which some (very) smart disk controllers made copies of all of our disks on external RAID-1 sets (over and above the failsafe RAID-1, or "mirror" disks that had always been part of the main system). These controllers "knew" which disk blocks had been updated recently and would maintain the extra RAID-1 set, from which ANOTHER set of controllers would make remote copies over a network with decent speed, enough that it was rarely more than a few tens of seconds behind at busy moments. We actually ran tape backups off those extended RAID-1 sets.

When we were going to do a scheduled switch-over test, we told the main computers to gracefully shut down - but the extra RAID set became busy as all heck while the final catch-up replication occurred. Then we remote-rebooted the standby systems, changed some network pointers from the main to the standby system (using "alias" network names), and off we went. Then a week later we reversed everything and returned function to the primary site. When Hurricane Katrina popped up, our standby site in Ft. Worth was far enough away as to have clear skies while the main site got savaged. The total site replication took about 4 hours to shut down, play disk catch-up (the biggest part of the 4 hours), and reboot into the standby systems. We had a few minor issues that at the time were resolvable within about three days (a minor licensing capacity issue for long-term activation; all it took was money and a different license key). But the Navy Reserve computers worked just fine for the 9-10 months we ran "swapped."
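For anyone who hasn't run that kind of drill, the planned switch-over above boils down to a short, ordered sequence: stop writes on the primary, let replication drain, bring up the standby, and repoint the alias name that clients use. Here is a rough sketch of that sequence in Python; the host names, the `ops-application` service, and the `replication-status` / `update-alias` commands are all hypothetical stand-ins, since the real setup used vendor controller and network tooling rather than a script like this.

```python
# Illustrative planned-failover sequence, mirroring the steps described
# above. Every host name and remote command here is a hypothetical
# placeholder, not a real tool.
import subprocess
import time

PRIMARY = "primary.example.mil"      # hypothetical primary-site host
STANDBY = "standby.example.mil"      # hypothetical standby-site host
SERVICE_ALIAS = "ops.example.mil"    # the "alias" name clients connect to

def run(host: str, command: str) -> str:
    """Run a command on a remote host via ssh and return its stdout."""
    result = subprocess.run(
        ["ssh", host, command], capture_output=True, text=True, check=True
    )
    return result.stdout.strip()

def replication_lag_seconds() -> int:
    # Placeholder query: how far behind the primary is the standby copy?
    return int(run(STANDBY, "replication-status --lag-seconds"))

def switch_over() -> None:
    # 1. Gracefully shut down the application on the primary so writes stop.
    run(PRIMARY, "systemctl stop ops-application")

    # 2. Wait for the final catch-up replication to drain to the standby.
    while replication_lag_seconds() > 0:
        time.sleep(10)

    # 3. Bring the standby systems up.
    run(STANDBY, "systemctl start ops-application")

    # 4. Repoint the service alias so clients follow it to the standby site.
    run("dns-admin.example.mil",
        f"update-alias {SERVICE_ALIAS} --target {STANDBY}")

if __name__ == "__main__":
    switch_over()
    print("Switched to standby; reverse these steps to fail back after the test.")
```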

The aftermath was that our 25-project hosting site became an 80-project hosting site because we had shown that we KNEW how to maintain safe hosting for major projects. We more than doubled our staffing and our commanding officer, Capt. F. M., became a popular speaker on the U.S. Navy seminar circuit for a while, the lucky stiff. But for us, the benefit was relative job security for quite a while after that hurricane that nearly wiped out New Orleans in August, 2005.
 
Even if you have a disaster recovery plan on paper, how can you quickly get your business back up and running if you no longer have the equipment, personnel and customers available? When Hurricane Katrina devastated New Orleans, how long did it take for businesses to resume operations? How many permanently went out of business, despite having a backup of all their records? It's not just the preservation of records; you have to have workers, customers, other business partners, suppliers, equipment, the infrastructure of your area, and the areas you service functioning. That's the big picture!

@BlueSpruce - I live in the 'burbs to the west of the city of New Orleans and was intimately involved in Katrina issues as applied to the U.S. Navy Reserve - see post above for specific story.

The estimates made for New Orleans business recovery were that it would take 20 years minimum to return to "full activity" (whatever that meant). The fact is that due to infrastructure issues, it is taking longer. The COVID pandemic didn't help. What ALSO didn't help was that a LOT of folks just abandoned their homes as lost causes, too expensive to repair (even when dealing with insured properties). Between insurance companies folding and declaring bankruptcy, plus scammers who soaked up insurance payouts for essentially no work, a lot of folks in the New Orleans Lower 9th Ward just moved to another city. Houston caught a lot of folks, but that part of the city is STILL blighted by homes abandoned by folks who left and never came back.

Even in my area, where we had standing water 2 ft. deep for 3 weeks, rebuilding wasn't fast. It took us 14 months to get the house back to a livable state, and 12 of those months were spent looking for a contractor. While I was in Ft. Worth, my wife lived with her mother in our suburb. Her mom's house was on 3-ft. cinder-block piers and so was never flooded. Minor roof damage but otherwise livable. Wifey reported another problem that slowed everything down... building supplies. She was a frequent customer at a national building and hardware chain store. In idle conversation with one of the clerks, she learned that every day, 18 tractor-trailer rigs would drive up to the loading docks and drop off 18 full trailers' worth of stuff - but by about 6 PM that evening, 99% of what they dropped off would be sold, and people left every day disappointed that there was nothing left to sell. We were lucky because we had the ability to be patient and careful.

A lot of businesses didn't return. Some non-chain restaurants in the Gentilly area (it shows up on Google Maps) never came back due to having 8 feet of water in the buildings, which represented a total loss of gas & electric appliances and other types of infrastructure. Draining that water took so long that even stainless steel fixtures didn't fare so well due to chemicals that made the water toxic and acidic. Stainless steel cafeteria furniture had holes all through it. One of those places had been among the better Gentilly-area sandwich/lunch shops, with a really big daytime business.

The biggest "gotcha" has been that the failure of the drainage system led to persistent flooding that weakened the ground (turned firm soil into slush), thus changing the structural stresses on the water, sewerage, and other underground utilities. When pipes started floating rather than resting on solid ground, they FLEXED - which metal pipes don't like. For the first five years or more after Katrina left, it was estimated that well over half the purified water going into the water infrastructure was leaking out, leading to sinkholes and failing streets. Right now, due to a REALLY bad mayor, New Orleans is estimating a $160 million shortfall for next year's budget, including continued road and pipe repair. In the past five or six years we have seen quite a few buildings partially collapse, usually brick facades, due to "flexible" soil foundations. You might say that it should have been fixed by now, but there are only so many folks available to do the work, and we are talking about a whole city. Plus it was literally BILLIONS of dollars of infrastructure damage. And, of course, the construction scammers came by to pick up their share.

Hope that gives you a good overview.
 
From 1988 to 1994 I worked at Martin Marietta Data Systems in Orlando. We maintained an IMS DB and a VS COBOL MRP application, running on an IBM 3090 mainframe, that tracked over, short, and scrapped parts used for manufacturing missiles. We had 3 standby sites with identical 3090 mainframes, one of them at the Rocky Flats Plant near Denver, Colorado, which at the time was producing plutonium triggers. The other two sites were in locations that to this day I cannot disclose because we were all blindfolded when being transported to and from them. Image backups were flown to all 3 sites daily. I experienced many random drills where we had to fly out to each of these sites and continue where we left off, as if a nuclear attack had wiped out our Orlando Data Center.
 
You might say that it should have been fixed by now, but there are only so many folks available to do the work and we are talking about a whole city. Plus it was literally BILLIONS of dollars of infrastructure damage. And, of course, the construction scammers came by to pick up their share.
Hope that gives you a good overview.
Yes, good overview. I imagined the status quo being more or less how you described it. The affected area is so huge that the geological damage, such as the soil subsidence, may be impossible to fix. I would imagine many buildings are abandoned and the population has decreased. The fact that N.O. is situated on low-lying, flat coastal marshland at the river delta, and that sea levels continue rising, means the prognosis is not good.
 
Surprisingly, the New Orleans population went through a rebound just after COVID wound down, but the estimates are that for the city proper, we are still about 30,000 people short of the population before Katrina.

As to your sites in Colorado, obviously they were T.S. or T.S./C - but for us, non-operational personnel functions were never higher than Secret. I can say that my primary machine WAS very indirectly involved in Operation Desert Shield/Desert Storm, because my machine was the one that, among other things, implemented reserve mobilization and demobilization.

MOST of the time, what we did was no higher than "Public Trust" clearance. As to the actions leading up to and during that Iraqi war, I never knew (or cared to know) the numbers. But I could see system activity statistics that I monitored regularly. They reflected increased activities in the interactions with other agencies related to travel and to pay adjustment issues.
 
Surprisingly, the New Orleans population went through a rebound just after COVID wound down, but the estimates are that for the city proper, we are still about 30,000 people short of the population before Katrina.
I admire the resiliency and spirit of people who remain and rebuild in devastated places. Is N.O. now better equipped and prepared to sustain another event like Katrina?
As to your sites in Colorado, obviously they were T.S. or T.S./C - but for us, non-operational personnel functions were never higher than Secret.
Yes, they were "Top Secret/Special Access Required" that could only be accessed with a specific code word clearance. The two sites we were taken to, blindfolded and wearing headsets that played rock music, had no windows. For all I know, we could've been inside a mountain, deep down a mine, or under a lake.
 
