Blinding Glimpse of the Obvious

DickyP

Quote from today's papers.

"Cyber attack contingency plans should be put on paper, firms told"​

Can anyone think of a reason why they shouldn't be doing this already?
 
Somewhere along the line, hard-copy contingencies became obsolete.
 
Yeah, perhaps I should have phrased the question as "Can anyone think of a GOOD reason why they shouldn't be doing this already?"
 
Agreed. That is what I was trying to imply; it's hard to convey sarcasm in text!
 
Back in the 2000-2009 decade, the U.S. Navy started their Continuity of Operations plan which included building an alternate operations site, beefing up the network long-haul segments, setting up data replication to the alternate site, and arranging for alternate site consoles to be visible on an "inside" network so that both primary and secondary sites could be controlled from the same place. Of course, the contingency plans WERE formulated in Word and Excel. We had a Lt. Commander in charge of that project. He worked from paper only in the weekly status meetings.

MANY years ago, in my first "real" job, a co-worker named Wayne gave me a little bit of wisdom: "Paper has a better memory than people." I took that to heart while I was still in the petroleum industry.

Our design guide was always on paper and organized so that each major component could be updated easily without having to reprint the whole document. So... 3-ring binders and 3-hole pre-punched paper, plus card-stock separators with color-coded tabs between each chapter. However, once I moved to the Navy job, their rules were different and EVERYTHING was electronically transferred. "When in Rome, ...."
 
Major cost savings in terms of the machines. I used to work for a department in Wells Fargo Auto Finance called Document Management, and wow, the level of sophistication of the machines that were used for mass printing, mass copying, and mass scanning - a lot of that was let go with the digital age. Those machines were SO troublesome; you could have the most expensive machine in the world, but the "scanner guy" still had to come out every week and sit there for hours troubleshooting, fixing, billing...
 
In the 1960s, my mother worked for Southern Bell (yes, that was the name back then, not Bell South as it is now). Their equipment included some incredible machines for creating metal plates for bill addressing - as in, align the plates, press down, type a name, address, city, state, etc. Once you had the plates, you ran them into the bill printer, which would check a number on the plate, press the plate onto the bill, and make a carbon impression on the invoice - a couple of hundred per hour. Pitney-Bowes made those office behemoths back then, and Friden made the calculators that could add, subtract, multiply, and divide to 15 or 16 places. They also had bill-cutter hydraulic presses that could cut a stack of bills a couple of feet thick in one swipe. And they had bill-enclosing machines that would take a stack of empty envelopes and a stack of bills and merge them, fold the bills, stuff the envelope, and moisten the glue on the flap. Literally hundreds of bills per hour. Talk about document management!
 
Actually, when I started this thread I was far more focused. I can tell many tales of times gone by, but I'm looking at the logic of what you do when you have no system. Even if you have secure back-ups of your data and apps, who knows where and how to find them, and how do we work without the IT system?

And, of course, back-up and recovery systems are no good whatsoever if you haven't proved and tested them regularly!
 
Even for small orgs a business continuity plan is needed - not just for the IT systems. Fires and floods can affect physical records too! And who is the backup for performing roles if key personnel are lost without warning? Then you have to maintain the plan, check it is workable, and keep one or more backups that are secure - because they contain or provide key information about how to access accounts and systems, and those backups themselves need to be physically and electronically secured.
 

The U.S. Navy Reserve, as part of the Continuity Of Operations Plan (COOP, and yes, that was the official acronym), performed a yearly week-long test of switching operations and running from our stand-by site. Our first line of backup was an online data replication setup in which some (very) smart disk controllers made copies of all of our disks on external RAID-1 sets (over and above the failsafe RAID-1, or "mirror" disks that had always been part of the main system). These controllers "knew" which disk blocks had been updated recently and would maintain the extra RAID-1 set, from which ANOTHER set of controllers would make remote copies over a network with decent speed, enough that it was rarely more than a few tens of seconds behind at busy moments. We actually ran tape backups off those extended RAID-1 sets.

When we were going to do a scheduled switch-over test, we told the main computers to gracefully shut down - but the extra RAID set became busy as all heck while the final catch-up replication occurred. Then we remote-rebooted the standby systems, changed some network pointers from the main to the standby system (using "alias" network names), and off we went. Then a week later we reversed everything and returned function to the primary site. When Hurricane Katrina popped up, our standby site in Ft. Worth was far enough away to have clear skies while the main site got savaged. The total site switch-over took about 4 hours to shut down, play disk catch-up (the biggest part of the 4 hours), and reboot into the standby systems. We had a few minor issues that were resolvable within about three days (a minor licensing-capacity issue for long-term activation; all it took was money and a different license key). But the Navy Reserve computers worked just fine for the 9-10 months we ran "swapped."
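To make the "alias network names" step concrete, here is a toy illustration only - not the Navy's actual tooling, and the hostnames and service name are made-up placeholders. The point it shows: clients only ever resolve the alias, so a fail-over is a single repoint on the name side rather than a change on every client machine.

```python
# Toy illustration only - not the Navy's actual tooling. Hostnames and the
# service name are made-up placeholders. Clients always connect by the alias,
# so a fail-over is a single repoint of the alias.

SERVICE_ALIAS = "payroll-db"  # the stable name every client is configured with

# Stand-in for the real name service (DNS CNAME, WINS alias, etc.)
alias_table = {
    "payroll-db": "primary-site.example.org",
}

def resolve(name: str) -> str:
    """Look up where the alias currently points."""
    return alias_table.get(name, name)

def fail_over_to_standby() -> None:
    """The 'change some network pointers' step, in miniature."""
    alias_table[SERVICE_ALIAS] = "standby-site.example.org"

def fail_back_to_primary() -> None:
    alias_table[SERVICE_ALIAS] = "primary-site.example.org"

if __name__ == "__main__":
    print("Clients connect to:", resolve(SERVICE_ALIAS))  # primary site
    fail_over_to_standby()
    print("After switch-over:", resolve(SERVICE_ALIAS))   # standby site
    fail_back_to_primary()
    print("After fail-back:  ", resolve(SERVICE_ALIAS))   # primary again
```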

The aftermath was that our 25-project hosting site became an 80-project hosting site because we had shown that we KNEW how to maintain safe hosting for major projects. We more than doubled our staffing and our commanding officer, Capt. F. M., became a popular speaker on the U.S. Navy seminar circuit for a while, the lucky stiff. But for us, the benefit was relative job security for quite a while after that hurricane that nearly wiped out New Orleans in August, 2005.
 
Even if you have a disaster recovery plan on paper, how can you quickly get your business back up and running if you no longer have the equipment, personnel, and customers available? When Hurricane Katrina devastated New Orleans, how long did it take for businesses to resume operations? How many permanently went out of business, despite having a backup of all their records? It's not just the preservation of records; you have to have workers, customers, other business partners, suppliers, equipment, and the infrastructure of your area and the areas you serve all functioning. That's the big picture!

@BlueSpruce - I live in the 'burbs to the west of the city of New Orleans and was intimately involved in Katrina issues as applied to the U.S. Navy Reserve - see post above for specific story.

The estimates made for New Orleans business recovery were that it would take 20 years minimum to return to "full activity" (whatever that meant). The fact is that due to infrastructure issues, it is taking longer. The COVID pandemic didn't help. What ALSO didn't help was that a LOT of folks just abandoned their homes as lost causes, too expensive to repair (even when dealing with insured properties). Between insurance companies folding and declaring bankruptcy, plus scammers who soaked up insurance payouts for essentially no work, a lot of folks in the New Orleans lower 9th ward just moved to another city. Houston caught a lot of them, but that part of the city is STILL blighted by folks who left and never came back.

Even in my area, where we had standing water 2 ft. deep for 3 weeks, rebuilding wasn't fast. It took us 14 months to get the house back to a livable state, and 12 of those months were spent looking for a contractor. While I was in Ft. Worth, my wife lived with her mother in our suburb. Her mom's house was on 3-ft. cinder-block piers and so was never flooded. Minor roof damage but otherwise livable. Wifey reported another problem that slowed everything down... building supplies. She was a frequent customer at a national building and hardware chain store. In idle conversation with one of the clerks, she learned that every day, 18 tractor-trailer rigs would drive up to the loading docks and drop off 18 full trailers' worth of stuff - but by about 6 PM that evening, 99% of what they dropped off would be sold, and people left every day disappointed that there was nothing left to sell. We were lucky because we had the ability to be patient and careful.

A lot of businesses didn't return. Some non-chain restaurants in the Gentilly area (it shows up on Google Maps) never came back after having 8 feet of water in the buildings, which represented a total loss of gas and electric appliances and other infrastructure. Draining that water took so long that even stainless steel fixtures didn't fare well, due to chemicals that made the water toxic and acidic. Stainless steel cafeteria furniture had holes all through it. One of those places was once among the better Gentilly-area sandwich/lunch shops, with a really big daytime business.

The biggest "gotcha" has been that the failure of the drainage system led to persistent flooding that weakened the ground (turned firm soil into slush), thus changing the structural stresses on the water, sewerage, and underground utilities. When pipes started floating rather than resting on solid ground, they FLEXED - which metal pipes don't like. For the first five years or more after Katrina left, it was estimated that well over half the purified water going into the water infrastructure was leaking out, leading to sinkholes and failing streets. Right now, due to a REALLY bad mayor, New Orleans is estimating a $160 million shortfall for next year's budget, including continued road and pipe repair. In the past five of six years we have seen quite a few buildings partially collapse, usually brick facades, due to "flexible" soil foundations. You might say that it should have been fixed by now, but there are only so many folks available to do the work and we are talking about a whole city. Plus it was literally BILLIONS of dollars of infrastructure damage. And, of course, the construction scammers came by to pick up their share.

Hope that gives you a good overview.
 
Surprisingly, the New Orleans population went through a rebound just after COVID wound down, but the estimates are that for the city proper, we are still about 30,000 people short of the population before Katrina.

As to your sites in Colorado, obviously they were T.S. or T.S./C - but for us, non-operational personnel functions were never higher than Secret. I can say that my primary machine WAS very indirectly involved in Operation Desert Shield/Desert Storm, because my machine was the one that, among other things, implemented reserve mobilization and demobilization.

MOST of the time, what we did was no higher than "Public Trust" clearance. As to the actions leading up to and during that Iraqi war, I never knew (or cared to know) the numbers. But I could see system activity statistics that I monitored regularly. They reflected increased activities in the interactions with other agencies related to travel and to pay adjustment issues.
 
Several times I have needed to use a backup and realized that the backup file wasn't even workable. Very sad day.

Have toyed with several methods of backing up an Access back end, but thankfully I got rid of Access back ends at some point and didn't have to worry about that any more.
I always try to remind people that, generally speaking, if you work for a company with a network, backing up stuff on the network is the sysadmin's responsibility... all you have to do is know that there's a backup plan and have an idea of how to avail yourself of it should you need to restore an earlier version. Same thing with SQL Server. If you're in a big corporate environment, there are other people responsible for backing that up, and they'll know how to restore it. Hopefully.

One of the best things for a file server is when you right-click on a folder, go to the Previous Versions tab, and find it actually populated. Then you know you have an easy way to restore previous versions of files, including Access back ends.
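On the "backup file wasn't even workable" point above: it helps to verify the copy on backup day rather than discovering the problem on restore day. Below is a minimal sketch (the UNC paths and file names are hypothetical placeholders, not anyone's production script) that takes a timestamped copy of a back-end file and refuses to keep it unless the copy hashes identically to the source. As the next post explains, for an Access back end this should only run when the lock file is gone; otherwise you may be snapshotting a file that is mid-update.

```python
# A minimal sketch, not production code: copy a back-end file to a timestamped
# backup and verify the copy byte-for-byte via SHA-256 before trusting it.
# The UNC paths below are hypothetical placeholders.
import hashlib
import shutil
from datetime import datetime
from pathlib import Path

SOURCE = Path(r"\\server\share\Inventory_BE.accdb")  # hypothetical back end
BACKUP_DIR = Path(r"\\server\backups\Inventory")     # hypothetical backup area

def file_hash(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def backup_and_verify() -> Path:
    """Copy SOURCE to a timestamped file and confirm the copy matches."""
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    target = BACKUP_DIR / f"{SOURCE.stem}_{stamp}{SOURCE.suffix}"

    shutil.copy2(SOURCE, target)  # copy, preserving timestamps

    # A hash mismatch means the copy is not a faithful backup; delete it
    # rather than leave an unusable file you might reach for later.
    if file_hash(SOURCE) != file_hash(target):
        target.unlink(missing_ok=True)
        raise RuntimeError("Backup verification failed: hashes differ")
    return target

if __name__ == "__main__":
    print("Verified backup written to", backup_and_verify())
```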
 
One of the toughest lessons for an "average" system admin to learn (and tough for the department boss, too!) is that if you have an active SQL back-end, it probably cannot be active during backup operations. ORACLE, for example, originally didn't have this feature, but after (about) v5 of ORACLE Enterprise Server they had a "Backup Mode" such that if you put the DB into that mode, the normal updates would finish and a "transaction log" mode would kick in. You do your backup and it will be good for all transactions BEFORE the time at which you actually entered backup mode. In order to do this for Access BEs, the trick I used was to wait until the .LACCDB or .LDB file was deleted by the last user exiting from the app. Then I immediately MOVED the BE file to a holding area to do my work directly on the BE file. Since the linked files were at that time not in the folder named in the connection string, nobody could touch them, and nothing in the FE needed changing - any FE opened during that window would just bomb out and say it couldn't find the BE file. The key is that databases do random middle-of-the-file operations, but backups are essentially whole-file snapshots. If a part of the file is moving, the snapshot is blurred. Think of family reunion pictures made when there are little kids at the reunion. (There's a small sketch of this lock-file trick at the end of this post.)

That's why all systems HAVE to have a time set aside for crucial off-line work. For the Navy computers, we picked a time matching the western Pacific Ocean time zones to do our offline work.
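For anyone who wants to script that lock-file trick, here is a minimal sketch under the same hypothetical paths as the earlier backup sketch (the timings are arbitrary too): wait for the .LACCDB/.LDB file to disappear, MOVE the back end to a holding folder so no front end can re-open it, do the offline work, then move it back. A user could still sneak in between the lock check and the move, which is exactly why a set-aside offline window matters.

```python
# A minimal sketch, with hypothetical paths, of the "wait for the lock file,
# then move the BE out of the linked folder" approach described above.
import shutil
import time
from pathlib import Path

BE_FILE = Path(r"\\server\share\Inventory_BE.accdb")  # hypothetical back end
HOLDING = Path(r"\\server\maintenance\holding")       # hypothetical holding area
LOCK_FILE = BE_FILE.with_suffix(".laccdb")            # use .ldb for old .mdb files

def wait_for_lock_release(poll_seconds: int = 30, timeout_minutes: int = 120) -> None:
    """Block until the Access lock file is gone, i.e. the last user has exited."""
    deadline = time.monotonic() + timeout_minutes * 60
    while LOCK_FILE.exists():
        if time.monotonic() > deadline:
            raise TimeoutError("Users still connected; lock file never released")
        time.sleep(poll_seconds)

def take_offline() -> Path:
    """Move the BE out of the folder named in the connection strings."""
    HOLDING.mkdir(parents=True, exist_ok=True)
    parked = HOLDING / BE_FILE.name
    shutil.move(str(BE_FILE), str(parked))  # any FE opened now just errors out
    return parked

def put_back_online(parked: Path) -> None:
    """Return the BE so the front ends can find it again."""
    shutil.move(str(parked), str(BE_FILE))

if __name__ == "__main__":
    wait_for_lock_release()
    parked = take_offline()
    try:
        # Do the offline work here: copy for backup, compact, repair, etc.
        shutil.copy2(parked, parked.with_name(parked.stem + "_backup" + parked.suffix))
    finally:
        put_back_online(parked)
```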
 

We take constant backups of our active SQL database... but it also involves replication.
 
I can anticipate FEMA not providing any more financial aid, and insurance companies not covering damage in high-risk areas. They would probably suggest moving to a safer place that doesn't have a history of repeated calamities.

Obviously you've been subscribing to our newspapers.
 

As much sympathy as I have for anyone involved in a calamity, I DO really, really question these people who live on the side of a big river, keep building in the paths of hurricanes, and frankly anybody who decides to live in a hurricane path. It's not a pejorative viewpoint, more just struggling to understand it. I've thought of a lot of places we might move over the next few years, but crossed off a lot too for that reason. "A place without too many natural disasters" is just high on my priority list, and for the places that rank #1 for disasters, I question how they're so popular to live in. I think people just assume modern technology/engineering will save them, but... that's unfortunately not always the case. Mother Nature is one, powerful, mother-

You can't totally avoid everything, but some places are just so popular and they get blown away every 2 years like clockwork. I don't really blame Allstate, Progressive, etc. for pulling out. And unfortunately the percentage of places that are high risk seems to be growing. I suppose maybe I was privileged to grow up in Wisconsin, one place where not much of anything ever happens, other than a lot of cold weather. I even crossed Reno off my list after learning exactly where it stood on earthquake predictions. I'm quite interested in St Petersburg and Clearwater FL, but... not too sure about that.
 
I DO really, really question these people who live on the side of a big river, keep building in the paths of hurricanes, and frankly anybody who decides to live in a hurricane path.

Odd you should mention that. From the time I was about 3 years old until the time I was 39, I lived 1/2 block from the Mississippi River levee in "Old Jefferson" (unincorporated suburb west of New Orleans). Not EXACTLY the side of a river, but pretty close. Never had an issue with the river during that time, though we did have a couple hurricanes. "In a hurricane path" makes it sound like you think hurricanes are predictable. Despite what the weather guessers often tell us, there is ALWAYS a pretty wide cone of uncertainty that comes along with their prognostication. We don't have as many down here, but farther inland you have "tornado alley" that is more selective in terms of destruction - but also more likely to leave behind NOTHING that had been in its path.

Maybe it's just my Deep South hind end that doesn't like the cold, but to me a hip-deep snowdrift is just another disaster that has found a cozy corner to sit in.
 
