

Absolutely, then people go and delete the other copies leaving just the cloud, and think that it’s somehow fine.


Probably worth storing the key in another place as well, like KeePass on your phone, or just printing it out on paper and storing that somewhere safe.


Not much you can do; if it’s on the internet, it is public.
You can block some scrapers with proof-of-work challenges and that sort of thing, but you’ll never block all of them.


Well that was horrifying, a bit much.


I wouldn’t be exposing any management consoles to the internet either way; too much risk with something that has Docker socket access.


Komodo is the best Portainer alternative I’ve found. I read through the Arcane info but it doesn’t seem as good. Komodo’s editor also works great.


My favorite is ‘fast and lightweight’ followed by ‘RAM required: >500 MB’ for some kind of basic server.


If you want automatic updates across major versions, most images have a :latest tag for that.
It doesn’t actually bypass the firewall.
When you tell Docker to expose a port on 0.0.0.0, it’s just doing what you asked of it.
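To make the scope explicit, here’s a minimal docker-compose sketch (the service name and image are placeholders, not anything from this thread): binding to 127.0.0.1 publishes the port on loopback only, while the bare form publishes on all interfaces, which is exactly what you’re asking Docker to do.

```yaml
services:
  app:                       # hypothetical service name
    image: nginx:latest      # placeholder image
    ports:
      - "127.0.0.1:8080:80"  # loopback only: unreachable from other machines
      # - "8080:80"          # binds 0.0.0.0: reachable from the whole network
```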


I can’t imagine we currently produce enough electricity for every car to be electric.
Plus there are all the production processes for the cars themselves, and the energy used to power them ends up as waste heat. Even solar panels benefit from running cooler, by having heat removed from them.


It feels like Bazzite tells you a million times over that you absolutely should not layer packages. It scared me off for sure, since I’m new to immutable systems and don’t fully understand how they work.


Depends on which protocols you need.
If you use SMB, install the Samba server package. If you use WebDAV, install a WebDAV server like SFTPGo, etc…
If you want a Google Drive-like replacement there’s Nextcloud, ownCloud, Seafile, and others.
For the drives themselves you can have traditional RAID with mdadm, or ZFS for more reliability and neat features, or go with MergerFS + SnapRAID, or just mount the disks directly, store files on some, and back up to the others with Restic or something.
Lots of options!
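To illustrate the MergerFS + SnapRAID route, here’s a minimal hypothetical snapraid.conf; all the paths and disk names are placeholders, just to show the shape of it:

```
# one (or more) dedicated parity disks
parity /mnt/parity1/snapraid.parity

# content files track the array state; keep copies on several disks
content /var/snapraid/snapraid.content
content /mnt/disk1/snapraid.content

# the data disks being protected
data d1 /mnt/disk1
data d2 /mnt/disk2
```

With that in place you run `snapraid sync` after adding or changing files, and `snapraid scrub` periodically to check for silent corruption.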
Yeah, I guess these days the majority of users have connections fast enough that it’s not worth it. It sucks if you have crappy internet though, hah.


That’s how I describe Jellyfin: it works fine, it’s just inconvenient to use.
Interesting, so it doesn’t work like rsync, where it compares the new files to the old ones and only transfers the parts that have changed?
A 6 GB download is wild. Is that re-downloading the entire package for each one that needs an update? Shouldn’t it be more efficient to download only the changes and patch the existing files?
At this point it seems like my desktop Linux install needs as much space and bandwidth as Windows does.
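For anyone curious what “only transfer the parts that changed” looks like, here’s a toy sketch of block-level delta detection. It’s a deliberate simplification: real rsync uses rolling checksums so it can also handle insertions that shift data, while this fixed-block version only catches in-place changes and appends.

```python
import hashlib

BLOCK = 4  # tiny block size, just for the demo


def block_hashes(data: bytes) -> list[str]:
    """Hash each fixed-size block so unchanged blocks can be skipped."""
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]


def changed_blocks(old: bytes, new: bytes) -> list[int]:
    """Indices of blocks in `new` that differ from `old` (or are brand new)."""
    old_h, new_h = block_hashes(old), block_hashes(new)
    return [i for i, h in enumerate(new_h)
            if i >= len(old_h) or h != old_h[i]]


old = b"aaaabbbbccccdddd"
new = b"aaaaXXXXccccddddeeee"
print(changed_blocks(old, new))  # block 1 was modified, block 4 was appended
```

Only the listed blocks would need to be transferred; everything else can be patched in from the copy the client already has.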


This one is clearly AI, there’s no PCIe card edge on the lower GPU, the fan blade width and spacing isn’t consistent, and the phone camera lens borders have artifacting.


It doesn’t really graph over time; it only does so while open, and loses the data if you close it.


Here’s an actual answer, a system monitor with historical data: https://beszel.dev/
It’s a web UI, but that shouldn’t really matter compared to an app with its own GUI.
Yeah, stuff like that, but I also wouldn’t trust the locally synced copy no matter what, since any sync software can suddenly delete or corrupt files. Best to have at least two actual backups in place, versioned and done daily or every few hours.
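As a sketch of what “versioned and done every few hours” can look like in practice, here’s a hypothetical crontab entry using restic (the repo path and source directory are placeholders, and the repo would need to be created first with `restic -r /mnt/backup1 init`):

```
# every 6 hours: take a versioned restic snapshot of the synced data
0 */6 * * * restic -r /mnt/backup1 backup /home/user/synced-data
```

Because restic snapshots are versioned, a sync tool silently corrupting files today doesn’t destroy yesterday’s good copies.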