

I’m quite new to docker for NAS stuff - how many pulls would the average person do? like, i don’t think i even have 10 containers 🤨
I don’t have much issue with email as a technology. It does what it needs to do, and does it well. The client-side software is what hasn’t budged in years: search barely works, files and attachments are cumbersome, and spam is still rampant.
It would be much cheaper and easier if users weren’t centralised under a few big providers, which prefer to bar any and all access to said users if you’re self-hosting, making it almost mandatory to use a private service.
Yeah it’s pretty much seamless. You just spin them up bare metal or docker (both are fine honestly) and follow any old tutorial for setup.
If using Docker, make sure you mount qBittorrent’s download folder to /config/Downloads
with a capital D, or you’ll get a warning about paths being set up wrong.
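As a sketch, that mapping in a compose file might look like this (image and host path are just examples; adjust to your setup):

```yaml
services:
  qbittorrent:
    image: lscr.io/linuxserver/qbittorrent:latest
    volumes:
      # capital D on the container side, or the *arrs complain about paths
      - /mnt/pool/downloads:/config/Downloads
```

The key point is that the same host folder is what you'd also hand to Sonarr/Radarr, so everything sees the downloads at a consistent path.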
Also, I assume this isn’t really an issue for you unless you mess with the downloads after the fact, but the *arrs expect the torrented media to sit inside a folder named after the media. They pick through torrent naming conventions fine, but when I migrated some movies yesterday I noticed they wouldn’t pick up any video files that weren’t inside a directory.
Small note: the *arr stack (at least when running in Docker) will prefer you mount qBittorrent’s download folder to /config/Downloads
(case sensitive). Otherwise it whines about paths in the health menu.
My current plan once the new migration is completed:
- Primary pool: 1x drive ZFS (couldn’t afford redundancy, but no different to my RPi server). Goal is to get a few more drives and set up a RAIDZ1/2.
- Weekly backup of critical data (e.g. Nextcloud) from the primary pool to a secondary pool. Goal here is a mirror, but it’ll only be one drive for now.
- Weekly upload of the secondary pool to a Hetzner storage box via rsync.

Current server:
- 1x backup to secondary drive (RPi)
- 1x backup to Hetzner storage box via rsync
Quick Sync should let the i3 handle Jellyfin just fine if you’re not going beyond 1080p for a couple of concurrent users, especially if you configure the nice values to prefer Jellyfin over Immich.
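If both are running as systemd services, the niceness can be set with drop-in files like these (unit names are assumptions; check what your install actually calls them):

```ini
# /etc/systemd/system/jellyfin.service.d/nice.conf
[Service]
Nice=-5

# /etc/systemd/system/immich.service.d/nice.conf
[Service]
Nice=10
```

Lower values get scheduled first, so Jellyfin wins CPU contention during transcodes. Run `systemctl daemon-reload` and restart the services for the drop-ins to take effect.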
I’m not familiar with the platform the N300 sits on, but it might be worth the initial setup if the board leaves you some room to upgrade the CPU later should it cause trouble.
If OP is going for multiple systems, I’d definitely agree on making one of them a pure NAS and let a more upgradable system run the chunky stuff.
You can, but it’ll be the distilled versions, which are significantly less impressive.
https://apxml.com/posts/gpu-requirements-deepseek-r1
Newer versions also introduce memory mapping so they’re technically only bound by your storage capacity (but they’re way slower in practice).
My laziness does though! I’ll keep that service in mind :)
I spent far too much on my domain (£3.86 for the year) to change course now!
I use Nginx Proxy Manager but I’m barely getting by. There’s zero useful documentation for setting up custom paths, so everyone uses subdomains. I ended up buying my own domain just so I didn’t feel guilty spamming FreeDNS lmao.
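Under the hood a “custom path” is just an nginx location block; NPM’s Custom Locations tab generates roughly this (host, port, and path are made up for illustration):

```nginx
# forwards https://yourdomain.example/jellyfin/ to an internal service
location /jellyfin/ {
    proxy_pass http://192.168.1.10:8096/;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
```

The trailing slashes on both the location and the proxy_pass matter for how the path gets rewritten, and many apps still emit absolute links to / anyway, which is a big part of why everyone falls back to subdomains.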
I’ve got a basic workflow for Nginx Proxy Manager now so this isn’t super useful to me, but good god, that’s exactly what I wish nginx was.
Docker is a bit of a pain to learn but it really helps once you start figuring things out!
Anyone know how to set up NPM on TrueNAS SCALE? I’ve spent all day trying to get my SSL certs and it fails every damn time. Just says the domain is unknown or that it can’t find my NPM install 😮💨
I’m using a FreeDNS domain though, so maybe I’m gonna need to try buying a domain.
IMO you should stick with a local device store only. If you’re worried about the state getting hold of the data, having any backups is gonna be a liability.
Still learning TrueNAS. I think I’ve figured out Nextcloud, which has basically been a nightmare every time I’ve had to install it.
Yeah, both nginx and Plex handle making themselves public for me already, but I have a handful of other services that I’d like to move behind a reverse proxy too.
Yeah, port forwarding just isn’t the same. I pretty heavily rely on Nextcloud and Plex doing the port forwarding for me.
Yeah I’ve given some recommendations but it’s really good to just start small and pick up new stuff as you go, then you can identify your needs and do a big upgrade.
Prowlarr, Sonarr, Radarr.
These services let you find, download, and manage TV shows/movies from multiple trackers. You can even start tracking a TV show that’s still running and it’ll download new seasons as and when they’re released. Grabs are then forwarded to your torrent client.
It’s awesome, lets my non-technical GF add movies and TV shows without me, and means we’re up to date on Severance!
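A rough compose sketch of the stack, using the LinuxServer.io images (ports are the usual defaults; host paths are examples):

```yaml
services:
  prowlarr:
    image: lscr.io/linuxserver/prowlarr:latest
    ports: ["9696:9696"]
  sonarr:
    image: lscr.io/linuxserver/sonarr:latest
    ports: ["8989:8989"]
    volumes:
      - /mnt/pool/media/tv:/tv
      - /mnt/pool/downloads:/downloads
  radarr:
    image: lscr.io/linuxserver/radarr:latest
    ports: ["7878:7878"]
    volumes:
      - /mnt/pool/media/movies:/movies
      - /mnt/pool/downloads:/downloads
```

Prowlarr syncs your indexers to Sonarr and Radarr, which then hand grabs off to your torrent client and sort the finished files into the media folders.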
I’d personally recommend a second hard drive of at least 500GB. You’ll quickly fill that 250GB drive, and it’s good practice to keep your data and applications separate (if the drive fails or gets upgraded, your services won’t need to go down!). You can also set up a ZFS pool so you can add drives later into one big pool that’s treated as a single drive by your applications, though most of those services support multiple storage locations, so ZFS isn’t too urgent if you expand to a new drive.
I can personally attest that the SU630 is a good SSD though; it serves my Raspberry Pi well! You don’t need SSDs for your bulk storage, as you won’t need the speed.
I’ve kept a Raspberry Pi 4B with a mild OC to 1900 MHz in my boiler cupboard for a year, and all it’s needed to keep it below 50°C is: