
I think I’m just a man with expensive hobbies, because two years ago I got really into skiing — finally cashing out on my own gear and a season pass — and this year I’ve decided to start homelabbing.
And no, I don't mean the type of homelabbing Walter White was doing, I mean more the kind that Lain does:

A homelab is a personal setup of computers and networking devices that you own and operate entirely yourself! It can provide services or conveniences on your home network, such as Network Attached Storage (NAS), a network-wide ad blocker, personal streaming services, and much, much more. It's basically sysadmin work — done not for a paycheck — but for the satisfaction of relying less on expensive cloud platforms and corporate-owned services and more on your own technical skills and hardware.
There are many small projects you can build to start homelabbing, but a personal NAS server is one of the most common places to start. It's like having your very own Google Drive, one that you control entirely. The only storage limit you'll run into is governed by the hardware you have on hand. And the best part is that your files will no longer live in the "cloud" -- a fancy term for someone else's computer -- they'll live on your drives, served by your own server.
First step of any homelab setup is hardware. No need to break the bank here, just start with whatever you’ve got laying around or source some for cheap (Dell Micro PCs are a great option).
I’m currently living in a place where my Dad used to live, and he left a 2010 Mac Pro 5,1 in the closet. It’s been collecting dust for years. I wanted to start my homelab with a smaller footprint and use something like a Dell Optiplex to run my NAS — but I don’t have one of those laying around — I have this huge cheese grater!

But don’t let its size fool you! Although this thing was a workhorse back in its day, today it’s sporting rather modest specs:
Intel Xeon W3520 CPU (4 cores / 8 threads, 2.66 GHz)
ATI Radeon HD 4870 GPU
8GB DDR3 RAM
This won't be enough compute to do as much as I want, but here's why I find it an enticing machine to start with (aside from the fact that I have it at my disposal).
Modularity: These old Mac Pro towers are extremely modular. They were the last line of computers Apple built that were truly for the computer hobbyist. They are extremely easy to work on and upgrade (oftentimes requiring no screws, since things just slot into place), and they have 4 hard drive bays! That's pretty good for a starter NAS setup.
Build Quality: It's Apple. Need I say more?
But of course, with a machine this old, it is not without its drawbacks.
Dated Hardware: This machine is old. Really old. It's also the single-processor model (dual-processor models offer more robust upgrade paths). I could upgrade the processor to something like a Xeon W3690 (a 6-core, 3.46 GHz chip) and go up to 48GB of RAM, but that's not in my current plans. Maybe someday.
Power Draw: At idle this machine currently draws roughly 120-150W! That's a lot of power for something with so little compute! For reference, it's about 3x what a modern mini PC (like a Dell Optiplex) will draw. Here's some insane napkin math for ya (power is expensive where I live, roughly $0.45/kWh):
Assuming the machine is running 24/7:
| Power | Cost | Timespan |
|---|---|---|
| 3.6 kWh | $1.62 | per Day |
| 25.2 kWh | $11.34 | per Week |
| 109.6 kWh | $49.31 | per Month |
| 1,314 kWh | $591.30 | per Year |
That's insanely expensive for a homelab server! A more modern, power-efficient machine (like an Optiplex) would be way more cost-effective (and more powerful) in the long term.
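That table is just watts × hours ÷ 1000 for kWh, times the rate for dollars. If you want to run the napkin math for your own hardware, here's a quick one-liner sketch (the 150 W idle draw and $0.45/kWh rate are the figures from above; swap in your own):

```shell
# Estimate 24/7 running costs: watts * hours / 1000 = kWh; kWh * rate = dollars
awk -v watts=150 -v rate=0.45 'BEGIN {
  split("Day Week Month Year", label, " ")
  split("24 168 730.5 8760", hours, " ")   # hours per day/week/avg month/year
  for (i = 1; i <= 4; i++) {
    kwh = watts * hours[i] / 1000
    printf "per %-5s: %7.1f kWh  $%.2f\n", label[i], kwh, kwh * rate
  }
}'
```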

Quick Note: I didn't compute this until after I had already been tinkering with this machine for a bit. Oops. Don't make the same mistake as me! If you live somewhere with expensive electricity, be sure to consider power draw when picking out homelab hardware.
For the time being, and since I realized this so late in my journey, I will still set up a basic NAS server on this machine to get my feet wet. But I won't be running this thing 24/7. Once I pick up an Optiplex, I'll take what I've learned here and migrate my NAS over to that machine. Maybe someday I'll upgrade this tower and use it for something else fun. I wonder if any of the upgrade paths available for this model could lessen the power draw?
As it currently stands, this stock Mac Pro 5,1 is not in my long-term plans for my homelab. It doesn't offer enough compute for how much power it draws. However, it can serve as a good starting machine to get my hands dirty. My current plan is to set it up as a NAS server so I can get familiar with administering tech like Samba shares or NFS. Once I pick up something like an Optiplex, or hell, even a Raspberry Pi, I'll migrate the server over to the new machine and decommission the tower — where it will return to collecting dust once more. More seriously though, we'll likely keep it around as long as we have the space to store it. If I ever find myself with money burning a hole in my pocket, it would sure be fun to substantially upgrade it.
Thus, my journey to build the MacNAS began ... with a rough start. The system booted up okay, but wouldn’t successfully log in to any user account. It would simply load and load and keep on loading, never dropping me into a desktop environment. I wanted the satisfaction of interacting with an ancient OS X desktop and opening the “About” page to see the system’s specs. More importantly though, my first mission was to ensure no important data still resided on the machine’s internal HDD. When my dad was using this machine, he was working on movies he made with his best friend, and I wouldn't want to risk losing those files. He left behind two external HDDs, which I was pretty certain contained all his important data, but I wasn’t willing to wipe the internals until I had double-checked myself (I couldn’t reach him at the time because it was late, though I did shoot him a message).
I followed a few common troubleshooting steps — booting into safe mode, various options in recovery mode, disk-repair, resetting NVRAM/PRAM, single-user mode, and even tried reinstalling the OS — none of that worked. I was fairly certain that the operating system had been corrupted beyond self-repair after running various commands in the terminal resulted in “cannot find this library”-type errors. On the other hand I was confident that the drive was still functional, not only because the disk-repair output seemed fine, but also because it wasn’t making any obvious noises that would point to HDD failure.
After torturing myself with this for far too long, I realized I was being silly. I could just flash a live Ubuntu image onto a flash drive, boot into it on the tower, and mount the drive inside my live Ubuntu environment. That way, I could explore the contents of the drive before performing a full wipe.
Flashing the live Ubuntu image took nearly 20 minutes (it's a 6GB ISO), so I decided to call it a night. In the morning I woke up to a text from my dad. He confirmed my suspicion that the OS install on the internal drive was corrupted; his OS was actually installed on one of the two external HDDs. He also told me he has backups of all the important files, which he pulled off the externals a while ago, so I was clear to wipe everything.
But ... I realized there was another movie project he'd worked on: a documentary about my grandfather, made for his 70th birthday. I asked him once more if that was included in the backups he has on hand. He wasn't sure. So I had to check the contents of all the drives before wiping anything. I was back to my original plan, and booted into the live Ubuntu environment the next morning.
Within the Ubuntu live environment, there are a few things we need to do in order to inspect the contents of the internal HDD and the external HDDs. For now, I can only connect one external HDD at a time (starting with the G-Drive), since I only have one FireWire 800 cable. I also have no idea where the power cable for the WD drive is, so we'll set that one aside for now and focus only on the 4TB external and the corrupted internal.
First thing I like to do when preparing to mount drives is to run lsblk to see what I'm working with and how the drives are laid out:
ubuntu@ubuntu:~$ lsblk
NAME   MAJ:MIN RM   SIZE RO TYPE MOUNTPOINT
sda      8:0    0 596.2G  0 disk             # <-- This is the Internal Drive
├─sda1   8:1    0   200M  0 part
├─sda2   8:2    0 595.4G  0 part
└─sda3   8:3    0 619.9M  0 part
sdb      8:16   0   3.6T  0 disk             # <-- This is the External Drive
├─sdb1   8:17   0   200M  0 part
├─sdb2   8:18   0   3.6T  0 part
└─sdb3   8:19   0 619.9M  0 part
sdc      8:32   1  14.9G  0 disk             # <-- This is the Live USB
├─sdc1   8:33   1   5.9G  0 part /cdrom
├─sdc2   8:34   1     5M  0 part
├─sdc3   8:35   1   300K  0 part
└─sdc4   8:36   1     9G  0 part /var/crash
This tells me the internal drive is /dev/sda (I know it's the ~600GB drive), the external drive is /dev/sdb (I know it's ~3.6TB), and the live USB I'm booted into is /dev/sdc (mounted at /cdrom). Next I can mount both the internal and external drives to inspect their contents, noting which partitions I need to mount (the root partitions) -- these are /dev/sda2 and /dev/sdb2 respectively. I create mount points for each drive and mount them:
sudo mkdir /mnt/{internal,external}
sudo mount /dev/sda2 /mnt/internal
sudo mount /dev/sdb2 /mnt/external

Then, in the Ubuntu file explorer, I can manually navigate to /mnt/external and /mnt/internal to inspect the contents of each drive. Going through /mnt/external I was quickly able to find the documentary about my grandfather! I also found other photos and stuff that seemed important to my dad.
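One caveat worth noting: since these drives came out of a Mac, their data partitions are almost certainly HFS+, and Linux mounts journaled HFS+ read-only by default — which is actually ideal while we're just inspecting. If you want to be explicit about it (a sketch, assuming the same device names as above):

```shell
# Mount the Mac-formatted partitions explicitly read-only,
# so nothing can be modified while we poke around
sudo mount -t hfsplus -o ro /dev/sda2 /mnt/internal
sudo mount -t hfsplus -o ro /dev/sdb2 /mnt/external
```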
There were a few ways I could go about backing up the data. I could:
Buy a large enough external HDD to copy all the important files onto. (Cost: ~$100+)
Buy one USB-B to USB-C cable, one USB-3.0 to USB-C cable, and one SATA to USB-C cable, then connect both externals and the internal to my laptop and copy the files over that way. (Cost: ~$60)
Connect the Mac tower to my home network via Ethernet (already done), expose an SSH server on the live Ubuntu environment, and then copy the files over the network to my laptop. (Cost: $0)
Tip: Always Consider All Possible Paths!
When trying to solve a problem, it's always good to brainstorm all possible solutions before committing to one. This way you can weigh the pros and cons of each method and pick the one that best suits your needs. In this case, it's pretty clear that option 3 is the way to go. You don't have to go buy anything when good ol' SSH and SCP can get the job done. It's the most elegant solution by far.
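For what it's worth, rsync over the same SSH connection would work just as well as scp here, and it handles huge video files more gracefully because it can resume an interrupted transfer. A sketch, using placeholder paths:

```shell
# -a preserves permissions and timestamps; -P shows progress and keeps
# partial files so an interrupted copy can resume where it left off
rsync -aP ubuntu@<ubuntu-local-ip>:/mnt/external/path/to/files/ /path/to/local/destination/
```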
First step is to install and start the SSH server:
sudo apt install openssh-server
sudo systemctl start ssh
# we set a password for the `ubuntu` user so we can log in remotely
sudo passwd ubuntu # live sessions have no password by default

Now, before we connect from our laptop, we need to know the IP address of the Mac tower. Thus we run:
ip addr # make note of the Mac tower's private IP (ifconfig works too, if net-tools is installed)

Now that we have an SSH server running on the Mac tower, and we know its IP address, we can use scp from our laptop to remotely copy the important files over the network.
scp -r ubuntu@<ubuntu-local-ip>:/mnt/external/path/to/files \
/path/to/local/destination

Oh, something I forgot to mention: my main display (and only in-use monitor at the moment) is a 3440x1440 ultrawide. The Mac does not like this. My guess is it's due to how old the GPU and its firmware are, but it caused a lot of annoyance when the picture would render halfway off the screen and I couldn't see what I was doing. Even adjusting the aspect ratio on my monitor from "Full-wide" to "Original" didn't help. If I were installing Arch I could blind-type my way through the install, but I'm not familiar enough with Ubuntu to reliably do this. I unearthed an old Asus 24" monitor I used to game on and hooked it up. This worked much better, but I had removed the base of the monitor when transporting it and have no idea where it ended up, so I have it propped up against the wall :P
Pretty janky, but it works, and I'll only need it for a bit longer. Now that I'm done copying the important files off the external drive, my next step is to wipe the internal and external drives, install Ubuntu, and set up the NAS functionality.
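For the wipe itself, one simple approach is to clear the partition-table and filesystem signatures before handing the drives over to the installer. This is destructive, so treat the device names below as examples and double-check them against lsblk first:

```shell
# DESTRUCTIVE: erase all partition-table and filesystem signatures.
# Verify the device names with `lsblk` before running!
sudo wipefs -a /dev/sda   # internal drive
sudo wipefs -a /dev/sdb   # external drive
```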
Once Ubuntu is installed, one of the first things I'll do is set up SSH so I can remote into it from my laptop and do all the setup/administration headless. But this'll have to wait until our next post. See you then!