lemmyvore

joined 1 year ago
[–] lemmyvore@feddit.nl 5 points 10 hours ago

If you don't already know the benefits, it's unlikely it solves a problem you have.

Even among its users, many are using it because it's cool rather than because they actually need it.

It's a declarative system, meaning you describe how it should be set up (using magic strings you have to look up online) and then it "sets itself up" according to the description.

It's normally something you'd use for mass and/or repetitive deployments.

Its usefulness for a single system is debatable, considering you can get very close to 100% "reproducibility" anyway by copying /home and /etc and keeping a copy of the package list.

Where the declarative approach is supposed to help is when you try to reproduce the system a long time later, after things like config files and packages have changed. But it doesn't help with /home, it hasn't been tested over long intervals, and in fact nobody guarantees long-term compatibility for Nix state.

[–] lemmyvore@feddit.nl 2 points 10 hours ago

If someone gets access they can delete your keys, or set up something that can intercept your keys in other ways.

The security of data at rest is just one piece of the puzzle. In many systems, controlling access to the data is considered much more important than whether the data itself is encrypted in one particular scenario.

[–] lemmyvore@feddit.nl 3 points 11 hours ago (2 children)

Now replace "signal" in your comment with "ssh" and think it over.

[–] lemmyvore@feddit.nl 2 points 11 hours ago (1 children)

If any part of the data gets corrupted you lose the whole thing. Recovery tools can't work with partially corrupted encrypted data.
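The all-or-nothing behavior is easy to demonstrate with a deliberately simplified encrypt-then-MAC toy (the keystream here is NOT a real cipher, it's just for illustration): flip a single byte anywhere and the tag check fails, so decryption refuses to produce anything at all.

```python
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream for illustration only -- not a real cipher.
    return hashlib.shake_256(key + nonce).digest(length)

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    # One corrupted byte anywhere invalidates the tag -> reject everything.
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("corrupted: refusing to decrypt")
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))
```

Real authenticated ciphers (AES-GCM, ChaCha20-Poly1305) behave the same way by design: there is no "partial" plaintext to hand to a recovery tool.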

[–] lemmyvore@feddit.nl 29 points 11 hours ago (8 children)

You. Don't. Store. Secrets. In. Plaintext.

SSH stores the secret keys in plaintext too. In a home dir accessible only by the owning user.

I won't speak for Windows, but on Linux and other Unix systems the presumption is that if your home dir is compromised you're fucked anyway. Effort should be spent on actually protecting access to the personal files in your home dir, not on security theater.

[–] lemmyvore@feddit.nl 3 points 12 hours ago

Oh no, tell that to SSH.

[–] lemmyvore@feddit.nl 4 points 1 day ago (1 children)

An AUR package is basically just a script that describes the best-case scenario for building something under Arch. There are no specific quality rules it has to meet.

It's super easy to make and publish an AUR script compared to a regular distro package (including Arch packages).

[–] lemmyvore@feddit.nl 3 points 1 day ago

The ads are added in the app. If you cast, the Chromecast can't insert ads (yet), so they'd have to create separate ad streams instead and switch between streams (show, ad, show), which would take several seconds of loading screen each way, and so on. Which is a level of fuckery even they shied away from.

TLDR they can't (easily) show ads during casting.

[–] lemmyvore@feddit.nl 1 points 1 day ago

No, unfortunately. 🙂 They're not talking about secure paths inside the office, they're talking about all their processes being built around the floppy disks. They use a common format which is proprietary, can only be produced with a proprietary PC program, and can only be submitted on floppy disks.

[–] lemmyvore@feddit.nl 4 points 1 day ago

I want to know why Mozilla won't add it to Firefox Focus.

[–] lemmyvore@feddit.nl 1 points 1 day ago (2 children)

Sure, but forcing an entire country to go through physical handling of each and every request is crazy. I really don't understand how they managed for so long.

 

I took some photos at an event and I need to go through them and get rid of the bad ones (eyes closed, things in the shot, out of focus, blurred, etc.). I'm not a pro photographer, so I have no idea where to begin with photo apps. I've used RawTherapee and GIMP a bit.

What app will let me quickly browse the photos and handle (delete/tag) photo formats together (both the RAW and the JPG)?

 

I'm posting this in selfhosted because Gandi increasing prices actually helped me get more serious about selfhosting: it made me look into things like DNS, reverse proxies, VPNs and Docker, and it also ended up saving me money by making me re-evaluate my service needs.

For background, Gandi.net is a large and old (25 years) domain registrar and hosting provider in the EU which, after two successive rounds of being acquired by investment funds, has hiked prices across the board for all its services.

In July 2023, when they announced the changes for November, I was using their services for pretty much everything because I manage domains for friends and family. That meant a wide selection of domains registered with them (both gTLDs and European ccTLDs), LAMP hosting, and their free email hosting for multiple domains.

For the record, I don't hold the price hike against them; it was just unsustainable for us. Their email prices (~5€/mailbox/mo) are in line with market prices, and so are their hosting prices. Their domain prices are however exaggerated (€25-30/yr is their lowest price now). I also think they could've been smarter about email: they could've offered lower prices if you keep domains registered with them. [These prices include the VAT for my country, btw. They will appear lower in USD.]

What I did:

Domains: looked into alternative registrars with decent prices, support for all the ccTLDs I needed, DNSSEC, enforced whois privacy, and representative services (some ccTLDs require a local contact). Went with INWX.com (Germany) and Netim.com (France). Saved about €70/yr. Could have saved more for .org/.net/.com domains with an American registrar but didn't want to spread too thin.

DNS: learned to use a dedicated DNS service, especially now that I was using multiple registrars since I didn't want to manage DNS in multiple places. Wanted something with support for DNSSEC and API. Went with deSEC.io (Germany) as main service and Bunny.net (Slovenia) as backup. deSEC is free, more on Bunny pricing below. Learned a lot about DNS in the process.
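The "API" part is what makes a dedicated DNS service worth it: record updates become a single authenticated HTTP call instead of clicking through a panel. A sketch of building such a call against deSEC's REST API (endpoint path and token auth are per their docs; domain, token and values here are placeholders):

```python
import json
import urllib.request

API = "https://desec.io/api/v1"  # deSEC's REST endpoint; check their docs

def build_rrset_update(domain, subname, rtype, records, ttl, token):
    """Build (but don't send) a PUT request replacing one RRset."""
    url = f"{API}/domains/{domain}/rrsets/{subname}/{rtype}/"
    body = json.dumps({"subname": subname, "type": rtype,
                       "ttl": ttl, "records": records}).encode()
    return urllib.request.Request(
        url, data=body, method="PUT",
        headers={"Authorization": f"Token {token}",
                 "Content-Type": "application/json"})

# Sending it would be: urllib.request.urlopen(build_rrset_update(...))
```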

Email: having multiple low-volume mailboxes forced me to look into volume-based providers who charge for storage and emails sent/received, not per mailbox. I found Migadu (Swiss, with servers in France at OVH), MXRoute (self-hosted in Texas) and PurelyMail (location unknown). Fair warning: they're all 1-2 man operations. But their prices are amazing because you pay a flat fee per year and can have any number of domains and mailboxes, instead of monthly fees for one mailbox at one domain. Saved €130/yr. Learned a lot about MX records and SPF/DKIM/DMARC.
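For reference, those three mechanisms are all just DNS records on your domain: SPF lists the servers allowed to send for it, DKIM publishes the mail server's signing key, and DMARC tells receivers what to do when the first two fail. Roughly (all values here are illustrative placeholders, not any provider's actual ones):

```
example.com.                  IN TXT "v=spf1 include:spf.mailhost.example -all"
key1._domainkey.example.com.  IN TXT "v=DKIM1; k=rsa; p=MIIBIjANBg..."
_dmarc.example.com.           IN TXT "v=DMARC1; p=quarantine; rua=mailto:postmaster@example.com"
```

Your email provider gives you the exact values; you just publish them at your DNS service.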

Hosting: had a revelation that none of the webpages I was hosting actually needed live dynamic services (like PHP and MySQL). Those that were using a CMS like WordPress or PHP photo galleries could be self-hosted in docker containers because only one person was using each, and the static output hosted on a CDN. Enter Bunny.net, who also offer CDN and static storage services. For Europe and North America it costs 1 cent per GB with a $1 minimum/mo, so basically $12/yr since all websites are low traffic personal websites. Saved another €130/yr. Learned a lot about Docker, reverse proxies and self-hosting in general.
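The "author locally, publish static" pattern means the CMS only ever runs on your own machine. A rough docker-compose sketch of such a throwaway local WordPress (image tags, ports and credentials are illustrative, not my actual setup):

```yaml
services:
  db:
    image: mariadb:11
    environment:
      MYSQL_ROOT_PASSWORD: example   # placeholder; local-only instance
      MYSQL_DATABASE: wordpress
  wordpress:
    image: wordpress:latest
    ports:
      - "127.0.0.1:8080:80"          # bound to localhost, never exposed publicly
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_NAME: wordpress
      WORDPRESS_DB_USER: root
      WORDPRESS_DB_PASSWORD: example
```

You edit at `localhost:8080`, export the rendered pages with a static-site plugin or a crawler, and push only that output to the CDN.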

Keep in mind that I already had a decent PC for self-hosting, but at €330 saved per year I could've afforded to buy a decent machine and some storage either way.

I think separating registrars, DNS, email and hosting was a good decision because it allows a lot of flexibility should any of them have any issues, price hikes etc.

It does complicate things if I should kick the bucket – compared to having everything in one place – which is something I'll have to consider. I've put together written details for now.

Any comments or questions are welcome. If there are others that have gone through similar migrations I'd be curious what you chose.

 

I'm thinking of putting all my email archive (55k messages, about 6 GB) on a private IMAP server but I'm wondering how to access it remotely when needed.

Obviously I'd need a webmail client, but is there one that can deal with that amount of data and search To, From, Subject and body efficiently?

I can also set up a standalone search engine of some sort (the messages are stored one per file in regular folders) but then how do I view the message once I locate it?

I can also expose the IMAP server itself and see if I can find a mobile app that fits the bill but I'd rather not do that. A webmail client would be much easier to reverse proxy and protect.
