Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Rules:
- Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
- No spam posting.
- Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.
- Don't duplicate the full text of your blog or github here. Just post the link for folks to click.
- Submission headline should match the article title (don't cherry-pick information from the title to fit your agenda).
- No trolling.
Resources:
- selfh.st Newsletter and index of selfhosted software and apps
- awesome-selfhosted software
- awesome-sysadmin resources
- Self-Hosted Podcast from Jupiter Broadcasting
Any issues with the community? Report them using the report flag.
Questions? DM the mods!
Your use case and situation seem very close to mine, except that I specifically do not host communities.
First of all, you can run as many services behind a single nginx instance as you want (or can handle). The usual approach is to give each service its own (sub)domain, point them all at the same IP, and have nginx proxy each request to the corresponding service running locally on its own port (see nginx reverse proxy).
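As a rough illustration, a minimal server block for one such service might look like this (the domain, port, and certificate paths are placeholders, not anything from my actual config):

```nginx
# Hypothetical example: forward lemmy.example.com to a service on local port 8536.
server {
    listen 443 ssl;
    server_name lemmy.example.com;

    # Placeholder certificate paths.
    ssl_certificate     /etc/letsencrypt/live/lemmy.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/lemmy.example.com/privkey.pem;

    location / {
        # Each service gets its own server block with a different
        # server_name and local port.
        proxy_pass http://127.0.0.1:8536;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```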
I would definitely recommend the Docker images unless you have specific needs; afaik the Ansible recipe installs and manages a Docker Compose project too (unless they've also added an official bare-bones Ansible setup). I might be wrong here - I use Docker and manage it myself; updating is usually a file edit and two commands away.
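For a standard Docker Compose deployment, that flow is roughly the following (file name and whatever you edit inside it depend on your setup):

```sh
# Bump the image tag(s) in the compose file, e.g. the Lemmy version.
nano docker-compose.yml

# Pull the new images and recreate the containers.
docker compose pull
docker compose up -d
```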
About the VPS being enough: from my monitoring, every subscribed foreign community increases the load, with bigger/more active communities increasing it more.
The main limiting resource for my setup is disk space. Some time ago I calculated that my database grows by about 1 GB per month with about 500 subscribed communities, and that's only the PostgreSQL database size without any media. The stats from my S3 provider (you can host images locally too) hint that I'm gaining 1-5 GB of media per month.
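If you want to track the same thing, you can check the database size periodically (assuming the database is called lemmy; run this on the host or inside the postgres container, depending on your setup):

```sh
# Report the on-disk size of the Lemmy database.
sudo -u postgres psql -c "SELECT pg_size_pretty(pg_database_size('lemmy'));"
```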
I don't have any metrics on how much active users drain the server, since my instance is intentionally small, but I can imagine that having 10/100/1000 users active at the same time would drastically increase the load on at least postgres, as well as increase the bandwidth.
And about my setup, for comparison: I'm renting a dedicated server from Hetzner (AX41-NVMe) that also runs a bunch of other services (Minecraft server, Factorio server, file sharing service, ...). Over the last 30 days my monitoring reports the "average" load average (same for 1/5/15 m) at around 1 core, out of a 12-thread processor (6 cores * 2 SMT).
Memory sits at about 50% monthly average out of 64 GB.
Though most of the services are really under-utilized (Minecraft) or don't require much (Factorio).
Rule of thumb: if your users subscribe to a lot of outside communities, expect at least increased disk space consumption and, at worst, also increased bandwidth and load.
If any of your hosted communities gets popular on the wider fediverse, definitely expect increased bandwidth and load - more servers hitting your server with more data (upvotes, comments, edits, ...) means nginx, Lemmy, and postgres also need to process more.
At baseline there will be a lot of spiky but small chatter from other instances, and the biggest resource drain will be postgres.
I personally wouldn't go into this with anything less than 4 vCPUs, 32 GB of RAM, and non-shared, non-virtualized storage (disk latency kills postgres performance).
Disk space is definitely an issue, but I think I've got my single-user instance dialed in on a 2 vCPU / 4 GB RAM / 30 GB disk Hetzner VPS; a cron job that runs on the first of every month deletes pictrs files over 30 days old (rough sketch below). Currently at 74% disk usage.
A lot of bean memes died the day that job first ran.
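For anyone wanting to do something similar, a minimal sketch of such a job as a crontab entry (the pictrs storage path and the 30-day window are assumptions - check where your setup actually stores media before pointing find at it):

```sh
# At 00:00 on the 1st of every month, delete pictrs files that haven't
# been modified in the last 30 days.
0 0 1 * * find /var/lib/pictrs/files -type f -mtime +30 -delete
```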