

So, ChatGPT can’t match any function of a Casio wristwatch. I’m concerned that when it can, it will consume the power of microwaving a turkey just to tell a user what time it is.


Ahh. That’s usually among the red stuff in dmesg. I’m glad to hear you solved it, but a failing hard drive is a pricey thing to replace these days.


Just start listening to dubstep and you’ll stop noticing 😆.
Maybe run lm-sensors and make sure the CPU/GPU isn’t being thermally throttled? I’d also check dmesg and look for red stuff. Any hardware issues are usually pretty obvious there.
Try other apps. If YouTube or VLC behaves the same, the problem may be outside of Jellyfin. If not, that narrows it down.
It could even be the server not being able to transcode in realtime. Try watching a file known to already be in a suitable format: it should direct stream, which is much less load on the server. I’ve seen server-side encode CPU saturation, and it does kinda look the same as client decode stutter. If it’s the server, you’ll probably see the same stutter from another device, such as a phone.
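For what it’s worth, the first two checks boil down to a couple of commands. This assumes the lm-sensors package is installed (apt install lm-sensors); skip that part if it isn’t:

```shell
# Thermals: look for CPU/GPU temps sitting near their "high"/"crit" marks.
if command -v sensors >/dev/null 2>&1; then
    sensors
else
    echo "lm-sensors not installed (apt install lm-sensors)"
fi

# Kernel log, warnings and errors only -- the "red stuff" -- with timestamps.
dmesg --level=err,warn -T || echo "dmesg may need root on this system"
```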


Headless. The word you’re looking for is “headless”.
Some idiot who obviously wants stale xarray tags after writeback.


Mine would go years without changing. The last few changes were caused by things like the upstream DHCP server failing and being replaced.


Make it a subdomain on a wildcard cert if you’re concerned about that.


Just expose it on single-stack IPv6. Nobody ever knocks. The address space is not scannable.


Bah. Hans Reiser wrote filesystems all day and he turned out fine.
It went from INI to JSON real quick.


Example.com recently had an issue where its traffic was found being routed to the wrong place (traffic to it is supposed to be discarded).
I use it for email accounts on test data in environments with a live mail server configured. The point of this domain is that it doesn’t work.


This is how I do it. No VPN. No NAT nonsense. You can open an IPv6 address to the public internet and nobody is going to stumble across it. You don’t even disclose your address to servers you connect to.
100% of shady connections come from bots scanning address space on IPv4.
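The “not scannable” claim is easy to sanity-check with back-of-the-envelope arithmetic: a single /64 holds 2^64 addresses, so even a scanner firing a million probes per second would need over half a million years to sweep one subnet:

```shell
# Seconds to probe 2^64 addresses at 1e6 probes/s, converted to Julian years.
awk 'BEGIN { printf "%.0f years\n", 2^64 / 1e6 / 31557600 }'
# prints: 584542 years
```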


I don’t get how a single person would have that much data. I fit my whole life, from the first shot I took on a digital camera in 2001, onto a 4 TB drive.
…and even then, two thirds of it is just pirated movies.


I deliberately haven’t used Docker at home, to avoid complications. Almost every program is in a Debian/apt repo, and I only install frontends that run on LAMP. I think I only have 2 or 3 apps that require manual maintenance (apart from running “apt upgrade”). Nextcloud is 90% of the butthurt.
I’m starting to turn off services on IPv4 to reduce the network maintenance overhead.


Checkout, followed by 400 build errors because your entire toolchain and build pipeline have changed since you last touched it.


Optus is barely an internet connection at this point. I’m using about 10 features on Aussie Broadband that simply don’t exist on the Optus network.


Telstra (Australia’s largest telco) now provides IPv6-only to mobile handsets by default. They’ve deployed 464XLAT.


The main benefit is not having to deal with NAT. You get your own address and your traffic is not conflated with other people’s.
You also get privacy extensions. Your device generates a temporary address for making outgoing connections. The address has no listening sockets. This means that you cannot get portscanned by every website you visit.
You don’t need to try and figure out your external IP address. There’s no differentiation between internal/external addresses. They’re all global, as the internet was intended.
You can throw as many IP addresses on an interface as you want. If you want to run two web servers from one machine, you can have multiple addresses with different services on port 443.
Enforcing TLS filters out a lot of spam connections too. Every legit provider has a cert these days.
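If it helps anyone, the privacy-extensions and multiple-address points above come down to a few lines of standard Linux config. This is a sketch: “eth0” and the 2001:db8:: prefix are placeholders for your own interface and prefix, and both tools need root:

```shell
# Prefer RFC 4941 temporary addresses for outgoing connections
# (persist these in /etc/sysctl.d/ to survive reboots).
sysctl -w net.ipv6.conf.all.use_tempaddr=2
sysctl -w net.ipv6.conf.default.use_tempaddr=2

# Stack extra global addresses on one interface; each web server
# then binds its own address on port 443.
ip -6 addr add 2001:db8::443:1/64 dev eth0
ip -6 addr add 2001:db8::443:2/64 dev eth0
```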


I wonder if anyone has ever written an update aggregator that finds all your package managers, containers, git repos and whatnot, and just updates all of them.
Some are a right pain to update, such as Nextcloud. Installing a monthly update should not feel like an enterprise prod deployment.
It’s kinda ironic that package managers have caused the exact problem that they are supposed to solve.
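A rough sketch of what that aggregator could look like, as a dry-run shell script. Everything here is an assumption about your setup (which tools exist, git checkouts living under ~/src); it only prints the plan, and you’d swap the echoing run() for real execution once you trust it:

```shell
#!/bin/sh
# Dry-run "update everything": detect what's installed, print the plan.
run() { printf '+ %s\n' "$*"; }   # swap for: run() { "$@"; } to execute

command -v apt-get >/dev/null 2>&1 && run sudo apt-get update
command -v apt-get >/dev/null 2>&1 && run sudo apt-get upgrade -y
command -v flatpak >/dev/null 2>&1 && run flatpak update -y
command -v docker  >/dev/null 2>&1 && run docker compose pull

# Fast-forward every git checkout under ~/src (adjust the path to taste).
for d in "$HOME"/src/*/.git; do
    if [ -d "$d" ]; then
        run git -C "${d%/.git}" pull --ff-only
    fi
done
```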