A server to rule them all
In my preface article, I described the approach I took to turn my home into a modest laboratory for a simple, intelligent, adapted habitat. Let's start with the brain of the system: the home server.
The home server answers two needs of mine: to regain some autonomy and sovereignty over my data from the GAFAM, and to control my home efficiently.
Before starting, a caveat: I am not a sysadmin and I don't claim to master every rule and practice of cybersecurity and system administration. I invite you to question the choices I have made, and don't hesitate to contact me if you spot an error. This article is aimed at readers with no prior knowledge, which explains the shortcuts and simplifications.
The initial need, the server's reason for being, was to host Home Assistant and a cloud storage service. I first leaned towards a Raspberry Pi 5 with 4 GB of RAM, but in order to host other services, I finally chose the 16 GB version.
Let’s talk about hardware
For those unfamiliar with them, Raspberry Pis are single-board computers combining a processor, soldered RAM, a WiFi/Bluetooth chip and various interfaces (depending on the model: USB, PCIe, Ethernet…). Storage is usually handled by an SD card.
However, SD cards are not reliable enough to serve as a storage medium, because they endure a limited number of write cycles. Services that write heavily to disk (logs, data…) significantly shorten their lifespan. Using an SSD is therefore crucial for a home server.
I thus bought an M.2 HAT that connects to the Raspberry Pi's PCIe port and recycled a 512 GB SSD from my old laptop. I also bought a metal case sized for the HAT, with a fan for cooling.
Software side
I then flashed the SSD with Raspberry Pi OS Lite, a Debian-derived operating system, in its version without a graphical interface (console only).
I wanted to host several services, isolated from one another for security, that could easily be added, started, stopped and removed. I therefore chose Docker for the simplicity of Compose. Compose, now built into Docker, lets you define and manage a set of services with .yml files.
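To give an idea of what this looks like, here is a minimal, hypothetical Compose file — the service (ntfy, one of those I host), the port mapping and the volume path are illustrative examples, not my actual configuration:

```yaml
# compose.yml — minimal sketch of a single self-hosted service
services:
  ntfy:
    image: binwiederhier/ntfy:latest   # example image from Docker Hub
    command: serve
    restart: unless-stopped
    ports:
      - "8080:80"                      # host port 8080 -> container port 80
    volumes:
      - ./ntfy/cache:/var/cache/ntfy   # persist data outside the container
```

With such a file, `docker compose up -d` creates and starts the service in the background, and `docker compose down` stops and removes it — which is exactly the easy add/start/stop/delete lifecycle I was after.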
Network architecture
I wanted to hide my home's IP address and to be able to install my server anywhere without redefining my domain's DNS records. I tested Cloudflare Tunnel, but discarded it for reasons of autonomy, sovereignty and privacy. I finally settled on a €1/month VPS that relays the traffic.
In detail, my domain's DNS records point to the IP of the VPS, the only one visible from the outside. A reverse proxy (Caddy) then handles the TLS layer (the S in HTTPS) and forwards the packets to the Raspberry Pi through a WireGuard VPN. Finally, another reverse proxy, Traefik, routes the traffic to the Docker services. For non-HTTP protocols (raw TCP/UDP), simple port forwarding is used.
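The Caddy side of this chain can be sketched as follows — a hedged example where the domain and the WireGuard peer address are placeholders, not my real setup:

```
# Caddyfile on the VPS (sketch; example.org and 10.0.0.2 are placeholders)
home.example.org {
    # Caddy obtains and renews the TLS certificate automatically,
    # then forwards plain HTTP over the WireGuard tunnel to Traefik
    # running on the Raspberry Pi.
    reverse_proxy 10.0.0.2:80
}
```

TLS is thus terminated on the VPS, and everything between the VPS and the Pi travels inside the encrypted WireGuard tunnel.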
Services
The first three services hosted were those behind the original need: control and monitoring of the house, a cloud drive, and photo synchronization.
Home automation
Home Assistant is the leading open-source home automation manager. It supports almost every protocol and brand of connected device. It allows controlling devices (lights, heaters, fans…), monitoring them via sensors (light, presence, air quality…) and automating them.
My main dashboard thus displays weather information, public transport status (via a service I developed in Go), energy consumption, the status and controls of my connected devices, and information about the state of the home server.
It is, however, heavy and resource-hungry (500–600 MB of RAM in my case).
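For reference, running Home Assistant under Compose commonly looks like the sketch below — a typical setup rather than my exact configuration; the config path is an example:

```yaml
# Home Assistant as a Compose service (common pattern, paths illustrative)
services:
  homeassistant:
    image: ghcr.io/home-assistant/home-assistant:stable
    restart: unless-stopped
    network_mode: host              # eases device and mDNS discovery on the LAN
    volumes:
      - ./homeassistant:/config     # persistent configuration
      - /etc/localtime:/etc/localtime:ro
```

`network_mode: host` trades network isolation for easier discovery of devices on the local network, which is why it is the commonly recommended mode for Home Assistant in Docker.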
Cloud storage (files and photos)
After testing Seafile (discarded for its feature limitations) and ownCloud, I finally opted for Nextcloud. I am not fully satisfied with it: it is written in PHP and bundles more features than I need.
For photo synchronization, I had initially chosen ente, seduced by its end-to-end encryption, but that very encryption also made me dependent on the service. So I chose Immich, very similar to Google Photos, with a very ergonomic interface and many features (face recognition, location map, smart search…). Since the machine-learning workload requires significant resources, I offloaded it to my PC (equipped with an Nvidia GTX 1650 graphics card) by connecting it to the VPN.
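Offloading the machine learning comes down to pointing the Immich server at a machine-learning container running on the GPU machine, reachable over the VPN. A hedged excerpt — the image tag and the 10.0.0.3 WireGuard address are illustrative:

```yaml
# Immich server excerpt (sketch; 10.0.0.3 stands in for the PC's VPN address)
services:
  immich-server:
    image: ghcr.io/immich-app/immich-server:release
    environment:
      # Point the server at the ML container on the GPU machine
      # instead of a local one on the Raspberry Pi.
      IMMICH_MACHINE_LEARNING_URL: http://10.0.0.3:3003
```

The Raspberry Pi thus only coordinates and stores, while face recognition and smart search run on the PC's GPU.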
Authentication
I then added Authelia to centralize authentication via OIDC, with support for time-based codes (TOTP) and security keys (WebAuthn). Most services use this protocol, except for some freemium ones (partially open-source services that reserve OIDC for a paid tier) and Home Assistant, whose OIDC support is too patchy; for it, I use the built-in authentication.
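Registering a service with Authelia means declaring it as an OIDC client in Authelia's configuration. A hedged sketch — the client ID, redirect URI and secret are placeholders, and the exact keys vary between Authelia versions:

```yaml
# Authelia configuration excerpt (sketch; all values are placeholders)
identity_providers:
  oidc:
    clients:
      - client_id: immich
        client_name: Immich
        client_secret: '$pbkdf2-sha512$...'      # hashed secret (placeholder)
        redirect_uris:
          - https://photos.example.org/auth/login
        scopes:
          - openid
          - profile
          - email
```

Each compatible service then delegates its login page to Authelia, which enforces the TOTP or WebAuthn second factor in one place.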
All services
- AdventureLog: a logbook service to record and share your travels
- AnySync: synchronization of Anytype clients (an encrypted, local alternative to Notion)
- Authelia: centralized authentication via OIDC
- Autorestic: encrypted incremental backups
- Beszel: monitoring of the resources used by services on the servers
- Gramps: family tree
- Grist: spreadsheet and database
- Home Assistant: control and monitoring of the house
- Immich: photo and video synchronization
- Music Assistant: music management
- n8n: automation, workflows and integrations
- Nextcloud: cloud file storage
- ntfy: push notifications
- Paperless: document management
- Plausible: analytics and statistics
- Portainer: container management interface
- Sablier: automatic container shutdown
- SeaweedFS: S3 storage
- Sure: financial management and personal budgeting
- Traefik: reverse proxy
- Uptime Kuma: service availability monitoring
- Watchtower: automatic container updates
Backups
Backups are crucial for a home server, especially one hosting important data. I have configured Autorestic to make daily backups. The advantage of Restic (on which Autorestic is based) is that its backups are incremental and encrypted: only the files modified since the last backup are saved, and the data is encrypted before being sent to the destination storage.
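An Autorestic setup is a single YAML file mapping locations (what to back up) to backends (where it goes). A hedged sketch — the paths, location name and schedule below are examples, not my real configuration:

```yaml
# .autorestic.yml (sketch; paths and schedule are illustrative)
version: 2
locations:
  data:
    from: /srv/data            # directory to back up
    to:
      - external-drive         # backend name, defined below
    cron: '0 3 * * *'          # daily at 03:00, run via `autorestic cron`
backends:
  external-drive:
    type: local
    path: /mnt/backup/restic   # Restic repository on an external drive
```

A single `autorestic backup -a` (or the cron entry) then snapshots every location, encrypting and deduplicating through the underlying Restic repositories.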
The recommended strategy is the 3-2-1 rule: three copies of the data, on two different media, with one off-site (not in the same building, for example). I thus keep backups on several external hard drives stored in different locations. In the future, I will also back up to online archiving services, in particular cold storage (long-term storage, cheaper than traditional storage services), for better resilience.