
Unexpected Linux Adventures

Ubuntu Studio 22.04 on Computer, for a young artist

The young artist in the house, who will soon have an upgraded PC for the new year, will then be running Ubuntu Studio 22.04, made for creative people.

The current system runs Ubuntu Studio 20.04 on a very old machine, based on an AMD Phenom II X4 and an Asus M4A89GTD Pro/USB3 from 2010. In preparation for the upcoming upgrade, the process starts by installing Ubuntu Studio 22.04 on an old SSD in my own PC (Rapture), which can later be transplanted into the new PC.

The weirdest corrupted video on an NVidia card

This is the kind of thing that makes you think, this really only happens to me.

Back in June, when the availability and price of graphics cards finally approached relatively normal values, I got myself a new ASUS GeForce TUF Gaming RTX 3070 Ti OC Edition (to replace the old ASUS GeForce GTX 1070 STRIX from 2017). It was still nearly $800, but it was clearly never going to come down to the $570 the old one cost back in August 2017.

Then, in September, the new card died. Somewhat surreptitiously...

Undead Yes ─ UnRAID No

My only NAS is my PC. At least, what people would usually do with a NAS, or build a NAS for, I just do with my PC.

Most of my disk storage space is a BTRFS RAID 1 using two 6TB WD BLACK 3.5″ HDDs. This setup offers block-level redundancy, which is better than the classic device-level redundancy offered by Linux software RAID or hardware RAID. To keep BTRFS file systems healthy, it is strongly recommended to run a weekly scrub to check everything for consistency. For this, I run the scrub script from crontab every Saturday night (it usually finishes around noon the next day).
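
For illustration, a weekly scrub job of this kind could be set up roughly as sketched below; the mount point, log path, and schedule are assumptions for the example, not the actual script from the post:

    #!/bin/bash
    # Minimal sketch of a weekly BTRFS scrub script.
    # Example crontab entry, every Saturday at 23:00:
    #   0 23 * * 6  /usr/local/bin/btrfs-scrub-weekly.sh

    MOUNTPOINT="/mnt/storage"          # assumed BTRFS RAID 1 mount point
    LOG="/var/log/btrfs-scrub.log"     # assumed log location

    {
        echo "=== scrub started $(date) ==="
        # -B keeps the scrub in the foreground so the script can record
        # when it finishes and what its exit status was
        btrfs scrub start -B "$MOUNTPOINT"
        STATUS=$?
        echo "=== scrub finished $(date), exit status $STATUS ==="
    } >> "$LOG" 2>&1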

One Sunday morning, after many successful scrubs, I woke up to both disks failing, each in a different way. But this was not the end of it. At the end of this adventure, the disks emerged victorious.

Keep reading to find out how the disks came back from the dead.

Illustration by Paul Kidby: Zombie leads a small parade of undead citizens with a wooden sign that reads UNDEAD YES - UNPERSON NO

Low-effort homelab server with Ubuntu Server on Intel NUC

Need. More. Server. Need. More. POWER!!!

But only a little bit, maybe just enough to run a Minecraft server, which refuses to start on my Raspberry Pi 4 because it has only a meagre 2 GB of RAM.
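
For a sense of scale, a vanilla Minecraft server is usually launched with explicit JVM heap flags along these lines (the jar name and heap sizes are illustrative defaults, not taken from my setup):

    # The JVM alone is asked for 2 GB of heap here, which already exceeds
    # what a 2 GB Raspberry Pi 4 can spare once the OS is running.
    java -Xms2048M -Xmx2048M -jar server.jar nogui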

I had known about Intel NUC tiny PCs for a while, and how handy they can be as a dedicated physical PC for experimentation. There was a very real possibility that I would have to set one up as a light gaming PC in the near future, so I thought cutting my teeth on a simpler server setup would be a good way to get acquainted with this hardware platform and its Linux support.

Detailed system and process monitoring

Never got the hang of telegraf; it was all too easy to cook up my own monitoring...

Humble Beginnings

In fact, when I started building detailed process monitoring I knew nothing about telegraf, influxdb, grafana or even Raspberry Pi computers.

It was back in 2017, when pondering whether to build my next PC around an Intel Core i7-6950X or an AMD Ryzen 5 1600X, that I started looking into measuring CPU usage of a specific process. I wanted to better see and understand whether more (but slower) CPU cores would be a better investment than faster (but fewer) CPU cores.

At the time my PC had an AMD Phenom II X4 965 BE C3 with 4 cores at 3.4GHz, and I had no idea how often those CPU cores were all used to their full extent. To learn more about the possibilities (and limitations) of fully multi-threading CPU-bound applications, I started running top commands in a loop and dumping lines into .csv files to then plot charts in Google Sheets. This was very crude, but it did show the difference between rendering a video in Blender (not multi-threaded) and using the pulverize tool to fully multi-thread the same task (a sketch of that sampling loop follows the charts below):

Chart of CPU usage over time showing a single Blender process never using much more than one CPU core

Chart of CPU usage over time showing how pulverize, running 4 Blender processes, keeps most CPU cores busy most of the time
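
A minimal sketch of that kind of sampling loop could look like this; the PID argument, five-second interval, and CSV layout are assumptions for the example, not the original scripts:

    #!/bin/bash
    # Crude per-process CPU sampler: poll top in batch mode and append
    # timestamped %CPU readings to a CSV for charting elsewhere.
    PID="$1"
    OUT="cpu-usage.csv"

    echo "timestamp,cpu_percent" > "$OUT"
    while kill -0 "$PID" 2>/dev/null; do
        # -b: batch mode; -n 2 -d 1: two samples one second apart, so the
        # second sample gives an interval-based %CPU reading
        CPU=$(top -b -n 2 -d 1 -p "$PID" | \
              awk -v pid="$PID" '$1 == pid {cpu=$9} END {print cpu}')
        echo "$(date +%s),$CPU" >> "$OUT"
        sleep 5
    done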

This early ad-hoc effort resulted in a few scripts to measure per-process CPU usage, overall CPU usage with thermals, and even GPU usage.
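
On the GPU side, for an NVIDIA card a similar loop can sample utilization and temperature through nvidia-smi; the query fields are standard, but the interval and file name below are again just illustrative:

    #!/bin/bash
    # GPU usage sampler using nvidia-smi's CSV query output.
    OUT="gpu-usage.csv"

    echo "timestamp,gpu_util_percent,gpu_temp_c" > "$OUT"
    while true; do
        READING=$(nvidia-smi --query-gpu=utilization.gpu,temperature.gpu \
                             --format=csv,noheader,nounits)
        # READING looks like "42, 65"; strip the space after the comma
        echo "$(date +%s),${READING/, /,}" >> "$OUT"
        sleep 5
    done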