Replacing my Synology NAS - Shaving my home lab Yak

Yak Shaving

I've heard the term for years, and Wiktionary traces the English etymology of "yak shaving" as follows:

Coined by Carlin Vieri in his time at the MIT AI Lab (1993–1998) after viewing a segment at the end of a 1991 episode of The Ren and Stimpy Show.[3] The segment featured “Yak Shaving Day,” a Christmas-like Holiday where participants hang diapers instead of stockings, stuff rubber boots with coleslaw, and watch for the shaven yak to float by in his enchanted canoe.

Wiktionary link

The MIT CSAIL project notes a member's invention of the term and expands on it with an example:

You see, yak shaving is what you are doing when you're doing some stupid, fiddly little task that bears no obvious relationship to what you're supposed to be working on, but yet a chain of twelve causal relations links what you're doing to the original meta-task.

Jeremy H. Brown - MIT CSAIL email

In the case of technology, it's exemplified by a chain of unanticipated tasks, each an impediment standing between you and the original goal.

My attempt to reduce the duplicated and hard-to-find data scattered across the user $HOME directories of the multiple systems in my home lab has been a recent yak-shaving event of my own. Trying to use a NAS as a proper network storage location has left quite a trail.

Background

My first Synology NAS was a DS416+ (the "16" denotes the 2016 model). It has been a great NAS since I first installed it - much more reliable than if I had been "rolling my own" as I had done in the past. The Synology RAID option - Synology Hybrid RAID (SHR) - is built on Linux LVM and md constructs under the hood, so rolling my own would have been possible in this case, but I knew the Spousal Acceptance Factor would come into play: I would be forever in trouble if it lost family documents (photos, etc.).

Over time, I started using it for more than storage. I played with running containers on it (Pi-hole) and using it as a print server, but the CPU and RAM of the DS416+ were just limited enough that the NAS duties started having issues. I thought about adding RAM, but it was not designed as a user-serviceable part, so I left it alone. What I DID start using it for was as an NFSv4 server, so my user $HOME directory was consistent across any Linux workstation I set up.

I don't know why I resisted doing this for so long. My home directory on each physical or virtual machine was laid out basically the same, but often one system would end up with a handy script or a file that I needed elsewhere, and I then had to search across multiple systems to find it. And once I got the AutoFS feature of my systems set up, I didn't have to fiddle with the NFS mounts manually as I added other test user accounts to my network.
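That AutoFS-over-NFS arrangement can be sketched with a pair of map files. This is a rough sketch, not my exact config: the hostname `nas` and export path `/volume1/homes` are placeholders for whatever the Synology share actually exports, and mount options will vary.

```conf
# /etc/auto.master.d/home.autofs - hand /home over to autofs
/home  /etc/auto.home  --timeout=300

# /etc/auto.home - wildcard map: any lookup under /home mounts the
# matching directory from the NAS on demand ('&' expands to the key,
# i.e. the username; 'nas' and the export path are placeholders)
*  -fstype=nfs4,rw  nas:/volume1/homes/&
```

With something like that in place, a simple `cd /home/alice` triggers the mount automatically, which is what makes adding new test accounts painless.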

So it was going well enough... Until. (You knew that was coming - otherwise this would be a short post.)

A few months ago I started having problems with the $HOME mount. I had done some upgrading to my main workstation and thought that I had messed up the NFS configuration, so I spent quite a bit of time troubleshooting that. And at some point a couple of weeks ago it "Just Started Working(tm)" again. SUCCESS! Yay! And for once I didn't muck around with it.

We had a storm one night that flickered the power for a minute. I don't have a UPS on the NAS - I probably should - but it hasn't been a high priority. In the past the Synology system would recover just fine. I had it set up to stay powered off until I pressed the power button (just in case the power was flickering a lot), so when I turned it on a day later, its power button blinked blue a few times. That's normal.

And continued blinking...even as I let it sit in this "booting" phase for a day.

I tried pulling the power completely, pulling and re-seating the HDDs, etc. (There's not a lot you can really troubleshoot from the outside.)

It was dead. I guess after nearly a decade of work, I couldn't complain. But now that all my files were "safe" on the NAS, my $HOME folder with all my 'stuff' was unavailable as well...

Thankfully my wife (our Chief Financial Officer of the house) had some funds allocated for my "lab" and I was given the green light to purchase a replacement.

I'll take a quick sidebar for a moment. Within the past year Synology has updated their devices' firmware and/or operating system and changed which drive models their latest NAS devices will support. I haven't read the details, but the 2025 models now require "Synology"-branded drives. In the past I had always purchased NAS-rated Western Digital "Red" or Seagate "IronWolf" drives. They are "NAS rated" (apparently) because they run at a slower RPM to reduce heat and are designed to run 24x7. Because of this rating, their price per GB is higher than that of a normal drive that would be used in a desktop system. And until this Synology update, you could put nearly any HDD into their chassis and it would work. The Synology system might put up a warning that the exact drive hadn't been validated by them as a "NAS rated" device, but you'd get the storage space. I had been tempted, but always bought NAS-rated drives from the list that Synology produced - slowly upgrading from the initial 3x500GB to the 3x8TB+1x4TB system I have today. (I am fine on free disk space at the moment. The single 4TB will be replaced when it starts showing any signs of failure - and the SHR will expand and I'll have even more space.)

So, when I went looking for a new Synology, I chose to stay back a model year and went with the DS923+. I actually thought I was buying the DS423+, but I somehow ended up on the wrong page at Newegg and, well... Thankfully, the price was nearly the same (about a $60 difference), and for that I got a few other features I probably won't use.

To keep this shorter - the other details are in separate blog posts - once I moved the drives to the new DS923+ system and powered up the NAS, there was a brief wait while it upgraded the OS image on the HDDs (probably 20-30 minutes), and then the NAS came back just fine. And my $HOME on the NAS is working again.

Next parts

While waiting for the new NAS to arrive, I decided to use the time to get a few more things done in my lab network that I had been putting off as too disruptive: setting up the NAS and switch to support bonding (2 x 1Gb), moving my lab to a separate IP range and VLAN, etc.
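On the Linux side, the bonding and VLAN half of that list might look roughly like the following systemd-networkd sketch. The interface names, bond mode, and VLAN ID are all assumptions for illustration; the Synology and switch ends are configured in their own UIs and must agree on the bonding mode.

```ini
# /etc/systemd/network/10-bond0.netdev - define the bond device
[NetDev]
Name=bond0
Kind=bond

[Bond]
Mode=802.3ad            # LACP; the switch ports must be configured to match

# /etc/systemd/network/20-bond0.network - bring up the bond, attach lab VLAN
[Match]
Name=bond0

[Network]
DHCP=yes
VLAN=vlan10             # hypothetical lab VLAN device, defined below

# /etc/systemd/network/25-vlan10.netdev - the tagged lab VLAN
[NetDev]
Name=vlan10
Kind=vlan

[VLAN]
Id=10                   # hypothetical VLAN ID for the separate lab IP range

# /etc/systemd/network/30-uplinks.network - enslave the two 1Gb NICs
[Match]
Name=enp1s0 enp2s0      # actual NIC names vary per machine

[Network]
Bond=bond0
```

The same shape applies whatever tool manages the network (NetworkManager, netplan, or the DSM UI on the NAS itself); the key is that both ends of the bond speak the same mode.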

I plan to write up each of these in separate articles I'll link to here - during the NAS recovery I got most of that set up.