Dozzle, log forge is a new one I’ve seen but not tried.
Mint or Ubuntu is like Windows but better.
The metadata server isn’t updated and hasn’t been for a long time, so it stops at a point in time and certain authors with lots of books just don’t come up.
Given the Readarr team is looking for maintainers to take over, I doubt it’ll be resolved soon.
https://www.reddit.com/r/selfhosted/comments/16d7gyo/since_readarr_seems_down_for_the_count_is/
I’d suggest integrating with LazyLibrarian over Readarr, as Readarr’s metadata and its ability to add recent books are borked, probably forever.
I’d rather not have them probing my website at all. I’m not Facebook, my data is not unlimited and free.
I’m trying to block the most likely attack vectors, which at this point in time are definitely VPS providers. I figure that if I block their subnets, plus any additional ranges I identify, it will force attackers onto vectors where I can report abuse more effectively.
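As a rough sketch of what that subnet blocking looks like with nftables (the table, set name, and CIDR ranges here are placeholders, not my actual list):

```shell
# Hypothetical nftables rules: drop traffic from known VPS provider ranges.
# Substitute your own researched CIDRs for the example ones below.
nft add table inet filter
nft add chain inet filter input '{ type filter hook input priority 0; }'
nft add set inet filter vps_ranges '{ type ipv4_addr; flags interval; }'
nft add element inet filter vps_ranges '{ 203.0.113.0/24, 198.51.100.0/24 }'
nft add rule inet filter input ip saddr @vps_ranges drop
```

Keeping the ranges in a named set means you can add or remove provider subnets without touching the rule itself.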
Here check out my analysis.
I didn’t think OP was going the ZFS route so it wouldn’t matter on that point.
His Server 2 will be running on the red line, imho, so any overhead would have an impact.
QuickSync should let the i3 handle Jellyfin just fine if you’re not going beyond 1080p for a couple of concurrent users, especially if you configure the nice values to prefer Jellyfin over Immich.
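To illustrate the nice-value idea (the echo commands stand in for the real service processes; with Docker you’d weight CPU with `--cpu-shares` instead), the kernel gives the lower-niced process more CPU time under contention:

```shell
#!/bin/sh
# Toy sketch: run the latency-sensitive service at normal priority and the
# batch-heavy one deprioritized. Higher nice value = lower priority.
nice -n 0  sh -c 'echo "jellyfin: normal priority"'
nice -n 15 sh -c 'echo "immich: deprioritized (nice 15)"'
```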
Most of my content is 4K H.264. You may be right about 1080p, but I generally don’t have content at that resolution.
Worst case scenario, he can always keep the N300 for other stuff if it doesn’t work out.
Personally I would keep it simple: run a separate NAS and run all your services in containers on whichever devices suit them best. The i3 is not going to manage Jellyfin while sharing those other services. I tried running it on an N100 and had to move it to a beefier machine (i5). Immich, for example, will use a lot of resources when performing operations, just a warning.
If you mount NAS storage for the container data, you can move containers between machines with minimal issues. Just make sure you run each service from a docker-compose file and keep those files on the NAS.
You completely negate the need for VMs and their overhead, and you can still snapshot the machine: if you run Debian as the OS there is Timeshift, and other distros have similar tools.
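As a rough sketch of that layout (the IP, export path, and service here are examples, not my setup), a compose file living on the NAS share might look like:

```yaml
# docker-compose.yml, stored on the NAS share so any host can run it.
# 192.168.1.10 and /volume1/docker are illustrative values.
services:
  jellyfin:
    image: jellyfin/jellyfin
    volumes:
      - appdata:/config
    restart: unless-stopped

volumes:
  appdata:
    driver: local
    driver_opts:
      type: nfs
      o: addr=192.168.1.10,rw,nfsvers=4
      device: ":/volume1/docker/jellyfin"
```

Because both the compose file and the volume data live on the NAS, bringing the service up on a different host is just running `docker compose up -d` there.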
Not all heroes wear macros.
I’ve done BYO TrueNAS and Unraid, and eventually went for a pair of 8-bay Synology NAS units for bulletproof hardware and out-of-the-box working backups, replication, etc.
I run containers on machines that also use the NFS storage supplied by them.
Too much dialogue is pretty much the point of a space opera like Dune.
Given the timeframe, the Tleilaxu were still organ farmers, using industrial-scale cloning to keep up the supply.
They could have been doing all sorts of things out of sight. They would still be illicitly using machines; the axolotl tanks were a surrogate for machines.
After Wheel of Time I was skeptical but it shows promise.
I use Plane for my personal project ticketing, and Git is built into all my IDEs. Why make it harder because of a random FOSS project with opinions?
Are you sure you’re not getting butt hurt by valid critiques?
We do Excel roughly, but invest our surplus.
I have a bunch of web scrapers that check for items on sale; certain ones trigger a purchase and others send me an alert.
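A minimal sketch of one such checker (the URL, price parsing, thresholds, and notify step are all invented; the curl lines are commented out so the sketch runs offline):

```shell
#!/bin/sh
# Toy sale-checker: compare a scraped price against two thresholds and
# decide whether to buy, alert, or do nothing. The price is stubbed here;
# a real checker would fetch and parse it, e.g.:
#   price=$(curl -s "https://example.com/widget" | grep -oE '[0-9]+' | head -1)
price=1299
alert_below=1500
buy_below=1000
if [ "$price" -lt "$buy_below" ]; then
  echo "buy: $price"          # real version would trigger the purchase flow
elif [ "$price" -lt "$alert_below" ]; then
  echo "alert: $price"        # real version would POST to a notifier
else
  echo "no action: $price"
fi
```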
The one upside is that if the game flops, it gives lots of users a chance not to buy it. But for any game with multiplayer, it effectively kills the MP mode.
MW5 and Satisfactory amongst others were limited to Epic for years.
The backup is a self-hosted Splunk.