vba_php said:
and not one damn class had anything to do with what employers want!
Adam, you have touched on something that I can directly address, and I would like you to know that I'm with you on this topic, except that I have some exculpatory comments to take SOME of the heat off the colleges.
NOTE: Apologies because I see that I got a bit long-winded. But this turns out to be a hot-button topic for me.
Here is the story behind my viewpoint. When I started work for the U.S. Navy Reserve in 1988 as a contractor, we had a few briefings we had to attend yearly on topics of operational and document security. There were briefings on foreign travel and how to maximize your safety. These usually lasted about 30 minutes to an hour once per year and they were never scheduled in such a way as to make the whole day blocked off from productive work. But eventually the Navy put some of those briefings online using a training system that tracked who had taken which "refresher" courses. So we could do that at slack moments if we had any.
Then about 1997 or 1998 we started seeing an uptick in foreign hacking. The Navy started getting serious about security clearances and more formal training. We were required to obtain certifications in various topics. We had to study things that were not normally taught in college. Among other things I had to get a Security+ certificate (which I got through CompTIA) and a Certified Systems Administrator for OpenVMS certificate (which I got from HP). Again, not terrible. But the Navy didn't pay for my time to do the training. They made it a condition of continued employment AND all new candidates had to get those certificates (or the corresponding Certified Systems Administrator for Windows or UNIX or VMWare as appropriate to the position for which they were applying.) My company found a loophole so that I could at least be reimbursed for the exam if I passed.
In the mid-2000s, I'm thinking 2006 or 2007, stuff ramped up again and we had to renew our Security+ certificates every 3 years. Again, no pay for the study time but at least we got paid for the exam cost. That is also when I had to get my Secret clearance. More online training but we could do that at slack times at work and it was expected.
But the thing that finally got to me was that in about 2015 they started piling on a yearly requirement that every system admin and certain other technical types (DB Admins WERE included) had to obtain 40 CEUs in computer-related topics AND they still kept the 3-year Security+ cycle. That went into place as a hard requirement the year I was thinking about retiring, and it finalized my decision. That "requirements creep" would NOT have been something for which I could get ANY reimbursement for my time. In other words, my 8-hours-a-day job suddenly became a lot more because of the study requirements AND the cost was out-of-pocket. I was already two years beyond normal retirement age (which I did because of social security impact) and my liver issues, though improving, had taken entirely too much out of me. I just didn't have the stamina I had before the liver issues started and, to be flat honest, I could not face that ramp-up in educational requirements. I was too tired.
But let's look at the other side of that problem. The kids coming out of college have to get a general education because not everyone is going into the same disciplines. So the colleges have to skim over ALL of the possible disciplines. States require colleges to teach language, social studies, math, and some type of technical studies. Typical college programs are something like 120-140 credit-hours and you can only earn maybe 18 hours per semester (or at least it was that way when I went through college), and 18 hours was considered a heavy course load. You have nominally 8 semesters if you don't go to summer semesters, so a 15-hour load times 8 semesters is 120 credits. If your major required more, you had to either do summer classes or take more than 15 hours a semester. And you had to fit in something like 12 hours of English, 9 hours of math, 9 to 12 hours of history or geography or government, 6 hours of a foreign language, and a few miscellaneous courses. Over 30% of your available hours would be gone even before you started taking courses for your major program. But then, that program would involve secondary requirements. For me, it was at least another 6 hours of a foreign language, another 12 hours of math including at least two statistics courses, at least 12 hours of physics or biology even though I was in the chemistry program... it all added up. Less than 50% of my bachelor's program could concentrate on my major.
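If it helps to see that arithmetic laid out, here's a quick back-of-the-envelope sketch using the numbers above (the course-hour figures are the rough "typical" values from my example, not any particular school's catalog):

```python
# Rough credit-hour math for a "typical" bachelor's program as described above.
# All numbers are the illustrative figures from the post, not official requirements.

total_credits = 15 * 8  # nominal 15-hour load times 8 semesters = 120

# General-education requirements (low end of each range given)
gen_ed = {
    "English": 12,
    "Math": 9,
    "History/Geography/Government": 9,
    "Foreign language": 6,
}

gen_ed_hours = sum(gen_ed.values())          # 36 hours
share = gen_ed_hours / total_credits         # 0.30

print(f"General-ed hours: {gen_ed_hours} of {total_credits} ({share:.0%})")
# -> General-ed hours: 36 of 120 (30%)
```

That 30% is the floor; the "few miscellaneous courses" and the high end of the history range push it past 30% before a single major course is counted, and the secondary requirements (more math, more language, the physics/biology block) eat into what's left.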
Then, the kids have to have SOME foundation to "climb the ladder." To understand programming, you need SOME appreciation of what is going on in the hardware. Maybe not much - but some. So there goes a few more credit hours to learn the basics. I can tell you with absolute certainty that to a kid still in school and trying to learn, the details of such things as "Security Access Arbitration" and "Recursive Path Analysis" are totally foreign. Without a good foundation, kids flounder around and get lost in the haze. I know of several chemistry students who changed majors because of all the things they needed. UNO didn't have a Computer Science major back then, just some technical electives. I took everything they offered.
Here is the ultimate "gotcha." You never know what you are going to do at work until you actually get a job. My first "real" job was for a company that made Real-Time Supervisory Control and Data Acquisition systems, or SCADA for short. Because I had a background in operating systems from college (that's a long story in its own right), I became a device-driver writer for the SCADA company that made their own proprietary telemetry hardware. But there is no way I would have known where I was going to work so there was no way to prepare for the specifics. And that is why colleges HAVE to take the broad-brush approach. You never know what your job REALLY involves until you walk in the door.
Some places want security people. That subdivides into O/S security and network security - two different sets of skills. Some folks want database administrators or systems administrators - again, different skill-sets. Then there are web-site managers. Jon could comment on the skill sets he has had to pick up, and I'll bet it is a formidable list. Every employer for whom I have ever worked always understood that it usually took a minimum of six or seven months for a new employee to actually learn the job and become productive enough to not need too much more hand-holding. And that has been true for at least eight different employers. (Navy contracting is good work but the environment can be unstable, which is why so many different employers.)
Adam, think about it. How many different jobs have you had and how many different skills have you had to pick up? My point here is that you cannot expect a college to prepare someone for a job when the student him/herself doesn't even know yet where they will be working or on what. And employers want those skills but don't want to pay for them. THERE is a MAJOR disconnect. And that is why headhunters target disgruntled employees in hopes of getting them to hop to a job in a similar industry. Not for industrial espionage, but simply to get folks who have a more relevant skill-set in order to reduce the learning curve and get better results faster.
I don't think you can blame the colleges as much as you would blame the potential employers who hope to get "something for nothing" - by demanding that kids have more education than a bachelor's program has any chance of providing.