
Operating Systems 25 Years On


The fondest memories I have of IT relate to installing Linux and *BSD in the 1990s. Everything was just different back then: Windows 3.1 came on six 1.44MB floppy disks, Windows NT 3.1 came on twenty-two, and Red Hat Linux was in its infancy. My parents had replaced the old 486 with a new Intel Pentium 75, and we had Windows 95 on a CD that featured the Weezer video for "Buddy Holly". Nevertheless, installing Linux was always a battle. The 1.x series kernels were challenging, and although things changed with the release of the 2.0.x kernel series, the battles simply moved on to XFree86, LILO, or re-compiling a kernel with additional modules - all challenges I relished at the time. Red Hat Linux was always a prominent figure in the Linux community, and the 2.4.x era really pushed Linux forward, despite the controversy over some of the decisions made at the time, which led to flame wars the likes of which mere mortals from this century can only imagine. I certainly found my love for computer architecture and software engineering by experimenting and developing under *BSD and Linux all those years ago.

Now, as we look at Windows 11, the 5.x kernel series is stable and even Junos OS version numbers are reaching the dizzy heights associated with seasons of Family Guy. I feel slightly melancholic about the fact that whilst things are "easier now", they are not necessarily "better".

I liberated some older hardware, including around 15 Supermicro servers and a few Cisco 3750G switches. I wanted to build out an OpenStack deployment to test some Machine Learning code I had been working on (read about it here). Utilising Canonical's MaaS (Metal-as-a-Service) was a dream (as it was all those years ago): Ubuntu 20.04 installed perfectly on the server, the MaaS install was literally one or two commands, and configuration was minimal. I was up and running within the hour.
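For reference, the bootstrap really is that short. A minimal sketch of a snap-based MaaS install on Ubuntu 20.04 follows - package names and flags are as I remember them from the snap-era releases, so check Canonical's documentation against your MaaS version before running it:

```shell
# Install MaaS itself plus the bundled test database
# (fine for a lab; production would point at its own PostgreSQL)
sudo snap install maas
sudo snap install maas-test-db

# Initialise a combined region + rack controller against that database
sudo maas init region+rack --database-uri maas-test-db:///

# Create the first admin account for the web UI and API
sudo maas createadmin
```

After that, the web UI on port 5240 walks you through importing images and enlisting machines via PXE - which is how the remaining nodes get provisioned.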

With MaaS I could then provision Ubuntu on the remaining 14 nodes within 10 minutes. This type of automation and provisioning is nothing new - Coreix has used this for years via various means, but have we lost something in doing that?

I met one of my colocation customers in the Enfield datacentre. I always pose as another customer - just force of habit - but I am always prepared to help, and since I have that "helpful type of face", I inevitably get the question: "do you know much about this?"

In this instance, the gentleman in question was significantly younger than me and was stuck in the FreeBSD installer, in particular at the partitioning stage. His predecessor, who had recently left, had been a FreeBSD advocate, and when he took the job he had "some FreeBSD experience" - but not with installation, since he had only ever provisioned it from a private cloud environment (and rarely used it). He had been thrown in at the deep end and needed a server running FreeBSD with ZFS in a raidz1 configuration. Naturally, I helped him help himself: we went through it together and successfully built the system.
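For anyone in the same boat: the FreeBSD installer's guided "Auto (ZFS)" option lets you pick raidz1 directly, but it demystifies things to know what it is doing underneath. A hand-rolled equivalent looks something like this - the pool name and disk device names (da0, da1, da2) are illustrative, so substitute your own:

```shell
# Create a three-disk raidz1 pool (single-parity, survives one disk failure)
zpool create storage raidz1 da0 da1 da2

# Confirm the vdev layout and pool health
zpool status storage

# Carve out a dataset on the new pool
zfs create storage/data
```

Once you have seen it done by hand, the installer's partitioning screen stops being intimidating and starts being a convenience.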

I would bet my job that there is a whole host of administrators sitting in remote offices around the world who have never even installed the majority of their operating systems on bare metal - many of whom use public cloud and have never even seen a datacentre or server architecture!

Historically, installing a specific operating system and turning it into a stable, up-to-date, fully working solution was a "rite of passage"; now it's a few clicks in the relevant orchestration platform. This old knowledge keeps one of my feet in the dinosaur camp and helps me keep the other in the next-generation technologies camp, because at the end of the day new technology is rarely more than old technology principles "version 2.0". When you break down the most complicated modern technology, it rarely ventures far from fully relatable classic concepts with a new twist.

Let's face facts: until we start seeing quantum processors and truly next-generation hardware, regardless of what you want to do, all processors do is add up numbers, and all memory and disks do is store bits - true or false, one or zero.

Computers are still just big calculators, and distributed filesystems are just RAID arrays over a network! I feel that what we have lost is the "requirement by default" to get your hands dirty. I can install nginx with Python in one or two commands - and it works out of the box!
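That is not an exaggeration. On a Debian or Ubuntu box the whole exercise is a couple of lines - sketched below for illustration; package names vary slightly on other distributions:

```shell
# Install nginx and Python, then start nginx and enable it at boot
sudo apt install -y nginx python3
sudo systemctl enable --now nginx

# Sanity check: the default welcome page answers on localhost
curl -s http://localhost | head -n 5
```

Compare that with the 1990s ritual of compiling from source, hand-editing config files, and hoping your libc was new enough - and you can see exactly where the dirty hands went.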

I think I miss the frontiers, and whilst I do work with fringe technologies including blockchain, machine learning and quantum languages such as Q#, I do feel we have lost some of the romanticism associated with IT and technology - it all feels so clinical now! It is like comparing the first mountaineers on Mt. Everest to more modern expeditions - I will never take anything away from anyone who manages such a feat, but it isn't the same as the first brave souls who ventured up that mountain!

So, I will sit here with a cup of tea and my photo album of an old 486 or my AMD K6, and remember the painful days fighting with "Yet Another Setup Tool" (YaST - amazing name) and rebuilding kernels to try and get USB to work! While I am sitting here I will wait for quantum computers and other true next-generation technologies, and mountaineers can wonder what it's like to climb Olympus Mons on Mars!

 
