It all started with one problem
I was helping a friend with a fluids problem in MATLAB, a programming language commonly used at our school. After struggling with the problem for about an hour, we discovered the code itself was roughly fifteen years old. We were both shocked by its age. How can schools prepare students to tackle tomorrow's problems with dated coursework?
Throughout my college career, several of my classes relied on aging equipment and course material. This wasn't a problem for first- and second-year students, since they were learning introductory material; they had to master the basics before anything else. But what about the juniors, seniors, fifth-year students, and those in advanced placement?
There seems to be a gap between what we are taught and what is relevant to learn. Fifteen years may not sound like a long time, but in the technology community, a great deal can change within that span. Many of my professors explained that extra funding and grants for new technology were directed toward specialized courses, which were offered only to master's, Ph.D., or research students. Shouldn't students be exposed to current, challenging problems early in their coursework, to entice them toward those higher education opportunities?