Step back a bit, son, and let me school you on the workings of college.
College is purely a business, established to create the expectation that a person needs a higher education, and god damn, the business is thriving. The talking heads of colleges and universities are doing their jobs well, constantly instilling this notion in incoming freshmen. The degree is just a piece of paper that says "Look at me, I spent a shitload of cash on classes, most of which don't even relate to my major." The degree, in my opinion, should mean almost nothing when it comes to getting a job. Unfortunately, the shitty notion that a degree does mean something has become so intertwined with hiring that it forces capable people without one into work beneath their potential, creating even more problems in this already shitty economy.
Back to final cramming.