
Mike Taulty: …when I went to University (here, starting around 15 years ago) we did some practical computing work but the emphasis was on computing theory. So, we did compilers, languages, databases and so on but we spent more time on computability, coding theory, linear algebra, automata, grammars, algorithms and complexity analysis and so on.

The thing that’s always struck me about this is that during my career these things have remained constant whilst the technologies around them change continually - I wonder what happens “tomorrow” if you spend 70% of your time learning how to do today’s technologies?

When I was at university (for the record here), one of the criticisms I had was the lack of practical teaching. It felt like few of the lecturers had real-world experience and that they were churning out graduates with pure mathematics degrees and little actual programming skill. I often wondered how most of the students would fare in a real software development role. I thought I gained more practical knowledge on my own time in the labs than I ever did in lectures.

Now, after more than 10 years in this industry, I find myself regularly referring back to topics covered in those lectures. Understanding that an algorithm is O(n^2), or knowing the theory underlying concurrency issues with threading or databases, is something I now treat as common knowledge/sense. It isn’t though - it’s that theoretical background shining through. Today I’m grateful for the time spent studying things that seemed somewhat pointless back then.
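As a small illustration of the kind of thing I mean (my own hypothetical sketch, not anything from the lectures): knowing that one approach is O(n^2) and another is O(n) is exactly the sort of theory that quietly shows up in everyday code, for example when checking a list for duplicates.

```python
# Minimal sketch: two ways to check a list for duplicates,
# showing why knowing an algorithm is O(n^2) matters in practice.

def has_duplicates_quadratic(items):
    # O(n^2): compares every pair of elements.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    # O(n): a set membership test is (amortised) constant time.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

On a handful of items the difference is invisible; on a few hundred thousand it's the difference between milliseconds and minutes, which is the sort of instinct that theoretical background gives you for free.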
