“Seriously? Why is This Not Taught in Computer Science at University?”

November 25, 2018

The benefit of tertiary education, and its seemingly continuous devaluation over the years, is a broad and heavily debated topic. Whether getting a university degree is a worthwhile endeavor from an educational point of view, or whether it's just effort that translates directly into a piece of paper employers expect you to have, is for every individual to decide. During my time on the computer science bench, I often asked myself: "why on earth do I have to study this when they could be teaching that instead?" The university's priorities did not seem to align with mine, or with workplace requirements for that matter. In this post, I'm going to explore a number of topics that weren't taught at university when they really should have been.

University cannot teach you everything! It's your own responsibility to learn the rest.

Coding Sucks!

Yes, yes, I agree. Let me explain. I have spent a considerable amount of time in university studying computer science and software engineering — roughly a quarter of my life — and I can still be found on campus for one reason or another every once in a while. Having had the chance to attend university in three different corners of the world (Europe, China, and Australia), I had the opportunity to experience different systems of tertiary education and form my own opinions on their quality and usefulness.

I chose to attend Universities of Applied Sciences that primarily focus on industry-relevant skills. Keep this in mind for the rest of this post.

In contrast to most of my peers, who were full-time students, I decided to ride in two lanes at once and study part-time while also working as a software developer. This had the invaluable advantage of gaining work experience along the way. It was also how I learned the hard way that what is taught at university doesn't always align with industry requirements, despite what the marketing departments claim.

Proper Teamwork

Let's start with arguably the most essential skill in software development: teamwork. Teamwork is hard. Working together in larger teams efficiently and effectively is challenging and requires lots of practice, something most students don't have. I've never seen a professor who does not emphasize the importance of teamwork, but I have yet to encounter a university assignment that actually requires students to work together in unison to achieve something greater than the sum of its parts. Generally, work was distributed among the members of a team when the assignment started, and things were then bolted together at the end to form an incoherent patchwork mess. More often than not, contributions were so atrocious that I found myself in the unfortunate situation of rewriting parts from scratch just to save my own grade.

The real issue here is the standard practice of giving every student in a group the same grade, regardless of time invested and contribution quality. Because hey, why wouldn't some students take this free pass? Someone's gonna do it and pull them through, right? This is a sad reminder that Price's Law, the observation that roughly the square root of the number of contributors produces half of the output, also applies in this context.

Error Handling

Almost all practical tasks at university focus solely on implementing the happy path and rarely force students to think about points of failure beyond basic, conceptual exception handling. This approach is generally followed in the name of simplicity, which makes sense when explaining an algorithm or keeping a piece of code concise. However, not learning how to handle and deal with errors properly will inevitably bite you in the rear, especially in the workplace.

I would have liked to see more emphasis on the importance of error handling and reporting. Maybe techniques like aspect-oriented logging?
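To make the contrast concrete, here is a minimal sketch in Java (the ConfigLoader class and the file name are invented for illustration): the same task written the way a typical assignment expects it, happy path only, and then with the kind of error handling and reporting you'd actually want once real users and real file systems are involved.

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.NoSuchFileException;
import java.nio.file.Path;
import java.util.Properties;

public class ConfigLoader {

    // Typical assignment style: only the happy path is considered.
    // Any I/O problem simply blows up with a raw stack trace.
    static Properties loadConfigHappyPath(Path path) throws IOException {
        Properties props = new Properties();
        props.load(Files.newBufferedReader(path));
        return props;
    }

    // A more defensive variant: distinguish between failure modes,
    // report them in a way that is actually useful, and decide
    // deliberately whether to fall back or to fail fast.
    static Properties loadConfig(Path path) {
        Properties props = new Properties();
        try (var reader = Files.newBufferedReader(path)) {
            props.load(reader);
        } catch (NoSuchFileException e) {
            // A missing config file may be acceptable: report it and use defaults.
            System.err.println("Config not found at " + path + ", using defaults");
        } catch (IOException e) {
            // An unreadable file is not: fail fast with context attached.
            throw new UncheckedIOException("Could not read config at " + path, e);
        }
        return props;
    }

    public static void main(String[] args) {
        Properties config = loadConfig(Path.of("app.properties"));
        System.out.println("Loaded " + config.size() + " settings");
    }
}
```

The defensive version isn't much longer, but it forces you to decide, case by case, what a failure actually means for the program, which is exactly the kind of thinking happy-path exercises never demand.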

Testing / Unit Testing

Software testing at university seems to be limited to simple yes/no answers: given input abc, is the visible result xyz computed? This is, of course, sufficient for grading an assignment. It's a different story for post-uni, production-grade software, for which proper testing is a necessity.

I remember that we briefly touched on the concept of unit testing and wrote basic test fixtures for simple pure functions. Things like faking data or mocking subsystems were never even mentioned, and neither was the actual purpose of unit tests, which is not verifying correctness but preventing the introduction of bugs into the existing code base (regression testing) when changes are made. Then again, we never worked with anything substantial…
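For what it's worth, here is roughly the kind of test that was never shown to us: a unit test that isolates a class from its collaborator by mocking the subsystem. This is a small sketch using JUnit 4 and Mockito; PriceService and ExchangeRateProvider are hypothetical names made up for the example.

```java
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.Test;

public class PriceServiceTest {

    // Hypothetical collaborator that would normally hit an external API.
    interface ExchangeRateProvider {
        double rate(String from, String to);
    }

    // Hypothetical class under test.
    static class PriceService {
        private final ExchangeRateProvider rates;

        PriceService(ExchangeRateProvider rates) {
            this.rates = rates;
        }

        double convert(double amount, String from, String to) {
            return amount * rates.rate(from, to);
        }
    }

    @Test
    public void convertsUsingTheProvidedRate() {
        // Mock the subsystem so the test is fast, deterministic, and offline.
        ExchangeRateProvider rates = mock(ExchangeRateProvider.class);
        when(rates.rate("EUR", "USD")).thenReturn(1.25);

        PriceService service = new PriceService(rates);

        // If someone later changes convert() and breaks this contract,
        // the test catches the regression immediately.
        assertEquals(125.0, service.convert(100.0, "EUR", "USD"), 1e-9);
    }
}
```

Once a code base has a few hundred of these, a change that silently breaks an existing contract gets flagged right away, which is the regression-prevention aspect mentioned above.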

Deployment & Maintenance

It’s always fun and straightforward to work on small, isolated snippets of code that have no connection to a larger system. Unfortunately, realistic scenarios generally involve working on a large pile of code that has been meticulously malformed over the years by a large number of developers with different preferences, practices, and skill levels.

I can't recall a single instance where we had to touch an existing code base at university. Everything was always clean and sanitized, ready for precision surgery. It would have been a great experience to dive into something real for once and implement features that outlast the assignment's deadline, work not to be forgotten as soon as the grades are in.

I think there’s a missed opportunity here for universities and students to participate in open source projects.

Thinking Out of the Box

I have noticed that professors tend to strictly adhere to the content of their lectures (which they’ve probably been teaching in the exact same way for the last five years) instead of encouraging students to go beyond and think out of the box — which is supposed to be the very purpose of universities.

The field of software engineering is moving forward at an incredible pace, and it is understandable that universities lag behind. Solid theoretical foundations are important and thankfully remain relevant over decades, so not using the newest-kid-on-the-block programming language is completely fine and appropriate. However, it's a different case when it comes to more practical and contemporary topics, and professors are crossing a line when they start talking about VBScript and classic ASP (without .NET) in a course on web development. This is just one example of many. As a professor, being unwilling to shift focus and move with the times sends a message. At least allowing the use of alternative technologies to achieve the same course goals would have been acceptable, but of course that is "too difficult to grade".

My home university pledged allegiance to object-oriented programming with Java and was obsessed with the Gang of Four design patterns to the point where I felt like part of a cult. The course was… good? A perfect balance of theoretical and practical aspects, good resources, and excellent teachers. So, what's the downside here? The downside is the opportunity cost we paid for never having been exposed to alternative paradigms and mindsets, at all.
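To illustrate what that opportunity cost looks like in practice, here is a small sketch (all names invented): the Strategy pattern the way the Gang of Four book presents it, followed by the functional equivalent that Java itself has supported since version 8. Neither version is wrong; the point is that we only ever saw the first one.

```java
import java.util.function.DoubleUnaryOperator;

public class Shipping {

    // Classic GoF Strategy: an interface plus one class per interchangeable algorithm.
    interface ShippingStrategy {
        double cost(double weightKg);
    }

    static class FlatRate implements ShippingStrategy {
        public double cost(double weightKg) { return 5.0; }
    }

    static class PerKilo implements ShippingStrategy {
        public double cost(double weightKg) { return 1.2 * weightKg; }
    }

    static double quoteWithStrategy(double weightKg, ShippingStrategy strategy) {
        return strategy.cost(weightKg);
    }

    // The same idea with a first-class function: behaviour passed as a value, no ceremony.
    static double quote(double weightKg, DoubleUnaryOperator costFn) {
        return costFn.applyAsDouble(weightKg);
    }

    public static void main(String[] args) {
        System.out.println(quoteWithStrategy(3.0, new PerKilo())); // per-kilo pricing via a class
        System.out.println(quote(3.0, w -> 1.2 * w));              // the same result via a lambda
        System.out.println(quote(3.0, w -> 5.0));                  // flat rate, no new class needed
    }
}
```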

Miscellaneous

To finish up, here's a loose list of a few more topics I think would be valuable additions to a university course in computer science / software engineering. Or well, at least they could be mentioned.

  • The intricacies of Unicode and internationalization caveats.

  • Correct handling of dates, times, and time zones (a short sketch follows at the end of this post).

  • Real-time concepts and applications (e.g. games or streaming).

  • Interoperability, getting heterogeneous systems to work well together.

  • Optimization and reduction of overhead.

…gotta learn it the hard way, I guess.
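
Since the second bullet is the one that seems to bite people most reliably, here is a small sketch of what getting it right with java.time looks like (the dates and zones are arbitrary examples): store points in time as UTC instants, convert to a time zone only at the edges, and remember that calendar arithmetic is not the same as adding hours.

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class TimeZoneBasics {
    public static void main(String[] args) {
        // Store a point in time as an Instant (UTC); this is what belongs in the database.
        Instant meeting = Instant.parse("2018-10-27T12:00:00Z");

        // Convert to local wall-clock time only when presenting it to a user.
        ZonedDateTime sydney = meeting.atZone(ZoneId.of("Australia/Sydney"));
        ZonedDateTime berlin = meeting.atZone(ZoneId.of("Europe/Berlin"));

        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm zzz");
        System.out.println(sydney.format(fmt)); // 2018-10-27 23:00 AEDT
        System.out.println(berlin.format(fmt)); // 2018-10-27 14:00 CEST

        // Calendar arithmetic is zone-aware: Berlin leaves daylight saving time
        // that night, so "one day later" and "24 hours later" are different times.
        System.out.println(berlin.plusDays(1));   // 2018-10-28T14:00+01:00[Europe/Berlin]
        System.out.println(berlin.plusHours(24)); // 2018-10-28T13:00+01:00[Europe/Berlin]
    }
}
```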