

No, not on porpoise.
🅸 🅰🅼 🆃🅷🅴 🅻🅰🆆.
𝕽𝖚𝖆𝖎𝖉𝖍𝖗𝖎𝖌𝖍 𝖋𝖊𝖆𝖙𝖍𝖊𝖗𝖘𝖙𝖔𝖓𝖊𝖍𝖆𝖚𝖌𝖍


No, not on porpoise.


I have no idea! It seems to be the human material. Have you ever heard of a solution? I can be aware of it and resist it, but what I hate is that instinctive, negative impulse, and I don’t think wishing it away is going to help.


Ok, so preface: this isn’t about you. Your comment just coalesced something I’ve been ruminating about recently.
I wish we, as humans, didn’t have this knee-jerk tendency to make everything a zero-sum competition. Vi vs Emacs. x86 vs ARM. Windows vs Mac vs Linux vs FreeBSD. C vs Go vs Rust vs Clojure vs JavaScript. Arch vs the world.
It really is a zero-sum game, with real consequences. If your favorite distro becomes unpopular enough, it might die, and then you have to give up something you love. Windows winning the OS market for decades meant countless people had to suffer using Windows because the company they worked for mandated it. If I crusade for V(lang) enough, it might become popular enough for jobs to open for it.
The downside is that we’re constantly fighting against diversity, and that’s bad.
I suffer from this as much as anyone, and I hate that my first impulse is to either tear down “the opposition”, which at some point is nearly everyone, or schadenfreude.
“It is not enough that I succeed, but that others should fail.” It can’t be healthy.


I miss the days when every package came with a man page.
Every respectable package; don’t come at me, pedants.


groan
My recommendation is to put all of the variables in an environment file, and use systemd’s EnvironmentFile directive (in the [Service] section) to point to it.
One of my backup service files (I back up to disks and cloud) looks like this:
[Unit]
Description=Backup to MyUsbDrive
Requires=media-MyUsbDrive.mount
After=media-MyUsbDrive.mount
[Service]
EnvironmentFile=/etc/backup/environment
Type=simple
ExecStart=/usr/bin/restic backup --tag=prefailure-2 --files-from ${FILES} --exclude-file ${EXCLUDES} --one-file-system
[Install]
WantedBy=multi-user.target
FILES is a file containing files and directories to be backed up, and is defined in the environment file; so is EXCLUDES, but you could simply point restic at the directory you want to back up instead.
My environment file looks essentially like
RESTIC_REPOSITORY=/mnt/MyUsbDrive/backup
RESTIC_PASSWORD=blahblahblah
KEEP_DAILY=7
KEEP_MONTHLY=3
KEEP_YEARLY=2
EXCLUDES=/etc/backup/excludes
FILES=/etc/backup/files
If you’re having trouble, start by looking at how you’re passing in the password, and whether it’s quoted properly. It’s been a couple of years since I had this issue, but at one point I know I had spaces in a passphrase and had quoted the variable, and the quotes were getting passed in verbatim.
My VPS backups are more complex and get their passwords from a keystore, but for my desktop I keep it simple.
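One quick way to debug that kind of quoting problem is to source the environment file in a shell and print what the variables actually contain. This is only an approximation of how systemd parses an EnvironmentFile, but stray quotes and spaces show up immediately; `ENV_FILE` here just points at the file from the example above:

```shell
#!/bin/sh
ENV_FILE=/etc/backup/environment   # the file referenced by the unit above

set -a          # export everything the file defines
. "$ENV_FILE"
set +a

# Angle brackets make leading/trailing quotes or whitespace visible.
printf 'repo=<%s>\n'     "$RESTIC_REPOSITORY"
printf 'password=<%s>\n' "$RESTIC_PASSWORD"
printf 'files=<%s> excludes=<%s>\n' "$FILES" "$EXCLUDES"
```

If the password prints as `password=<"blahblahblah">`, the quotes are part of the value, and that’s your bug.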


I hope this isn’t a step towards replacing the native app with an SPA.


It absolutely would be. It is, on the other hand, occasionally useful to be able to pop in and change a config file, many of which are actually Turing-complete languages. What I do far more often, though, is SSH into remote, headless servers and write code there, which is exactly the same as doing it from a phone, only much more comfortable.



That’s the plain-text editor Helix. In a terminal. Over SSH. On my phone. Which I can do because I’m not using a dumb IDE.


This seriously got an out-loud chuckle from me. It’s funny, because it’s true! Thanks!


There is negligible server overhead for a tarpit. It can be merely a script that listens on a socket and never replies, or it can reply with markov-generated html with a few characters a second, taking minutes to load a full page. This has almost no overhead. Implementation is adding a link to your page headers and running the script. It’s not exactly rocket science.
Which part of that is overhead, or difficult?
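The trickle half of that fits in a few lines of shell. This is a sketch, not a hardened server: `trickle_page`, the filler text, and the 5-iteration bound are mine for demonstration; a real tarpit would loop indefinitely, emit markov-generated text, and hand stdout to whatever owns the socket (an inetd-style service, or a netcat that supports listening).

```shell
#!/bin/sh
# Emit an HTTP response one line per second, so a bot waits minutes per page.
trickle_page() {
  printf 'HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n'
  i=0
  while [ $i -lt 5 ]; do        # bounded here for demo; a real tarpit never stops
    printf '<p>filler paragraph %s</p>\n' "$i"   # real version: markov text
    sleep 1
    i=$((i + 1))
  done
}
trickle_page
```

The hidden link in your page headers does the rest: humans never see it, and well-behaved crawlers honoring robots.txt never follow it.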


From the Anubis project:
The idea is that genuine people sending emails will have to do a small math problem that is expensive to compute,
“Expensive” in computing means “energy intensive,” but if you still dispute that, the same document later says
This is also how Bitcoin’s consensus algorithm works.
Which is exactly what I said in my first comment.
The design document states
Anubis uses a proof-of-work challenge to ensure that clients are using a modern browser and are able to calculate SHA-256 checksums.
This is the energy-wasting part of the algorithm. Furthermore,
the server can independently prove the token is valid.
The only purpose of the expensive calculation is so the server can verify that the client burned energy. The work done is useless outside of proving that the client performed a certain amount of energy-consuming work. In particular, there are other, more efficient ways of generating verifiable hashes; they aren’t used, because the whole point is to make the client incur a cost, in the form of electricity use, to generate the token.
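For anyone unfamiliar with the mechanism, here is a hashcash-style sketch of that asymmetry (this is not Anubis’s actual code; the `challenge` string and the two-zero difficulty are mine, chosen so it finishes in a second or so): the client grinds through nonces, while the server verifies with a single hash.

```shell
#!/bin/sh
challenge="example-token"   # hypothetical challenge issued by the server
difficulty="00"             # require 2 leading zero hex digits (~256 tries)

# Client: burn CPU until sha256(challenge:nonce) matches the difficulty.
nonce=0
while :; do
  hash=$(printf '%s:%s' "$challenge" "$nonce" | sha256sum | cut -d' ' -f1)
  case $hash in "$difficulty"*) break ;; esac
  nonce=$((nonce + 1))
done
echo "client found nonce=$nonce"

# Server: one hash call independently proves the work was done.
check=$(printf '%s:%s' "$challenge" "$nonce" | sha256sum | cut -d' ' -f1)
[ "$check" = "$hash" ] && echo "token valid"
```

Raise the difficulty by one hex digit and the client’s expected work goes up 16x, while the server’s verification cost stays at exactly one hash. That lopsidedness is the entire design, in Anubis and in Bitcoin alike.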
At this point I can’t tell if you honestly don’t understand how proof of work functions, are defensive of the project because you have some special interest, or are just trolling.
Regardless, anyone considering using Anubis should be aware that the project has the same PoW design as Bitcoin, and if you believe cryptocurrencies are bad for the environment, then you want to stay away from Anubis and sites that use it.
Also note that the project is a revenue generator for the authors (check the bottom of the GitHub page), so you might see some astroturfing.
I’m not talking about your application at all; I was responding to @chrash0’s comment that JSON may not be great, but it’s better than YAML, and TOML is better than both, for configuration.
I was agreeing with them and adding more reasons why YAML stinks.
Nothing at all directly to do with your project, just having a convo with @chrash0.


Sourcehut also supports Mercurial, so you have an alternative to the herd mentality.
Sourcehut also has zero, or almost zero, JavaScript in the interface, so it doesn’t suck.
Sourcehut is also componentized, so you can mix and match the pieces you want or need.
Sourcehut is by far the best hosted VCS option at the moment. The Mercurial support alone puts it miles ahead of the others, which are all hobbled by tight coupling to git.


So, you’re basically running the KDE infrastructure, just not using the KDE WM? Have you done a ps and counted the number of KDE services that are running, just to run KDE Connect?
Here are the (KDE) dependencies on the Arch KDE Connect package:
kcmutils
kconfig
kcoreaddons
kcrash
kdbusaddons
kdeclarative
kguiaddons
ki18n
kiconthemes
kio
kirigami
kirigami-addons
kitemmodels
kjobwidgets
knotifications
kpeople
kservice
kstatusnotifieritem
kwidgetsaddons
kwindowsystem
pulseaudio-qt
qqc2-desktop-style
qt6-base
qt6-connectivity
qt6-declarative
qt6-multimedia
qt6-wayland
When you run KDE Connect, you’re running most of the KDE Desktop and Qt; you’re just not using it.
Have you ever tried running it headless? I have; it doesn’t work.


Yeah, tarpits. Or even just intentionally, fractionally lagging the connection, or putting a delay on the response for some MIME types. Delays don’t consume nearly as much processing as PoW. Personally, I like tarpits that trickle out content like a really slow server, behind hidden URLs that users are unlikely to click on. These are about the least energy-demanding solutions that have a chance of fooling bots; a true, no-response tarpit would use less energy, but is easily detected by bots and abandoned.
Proof of work is just a terrible idea, once you’ve accepted that PoW is bad for the environment, which it demonstrably is.


Everything a computer does uses power. The issue is the same very valid criticism of (most) cryptocurrencies: the design objective is only to use power. That’s the very definition of “proof of work.” You usually don’t care what the work is, only that it was done. An appropriate metaphor: for “reasons,” I want to know that you moved a pile of rocks from one place to another, and back again. I have some way of proving this - a video camera watching you, a proof of a factorization that I can easily verify, something - and in return, I give you something: Monopoly money, or access to a web site. But moving the rocks is literally just a way I can be certain that you’ve burned a number of calories.
I don’t even care if you go get a GPU tractor and move the rocks with that. You’ve still burned the calories, by burning oil. The rocks being moved has no value, except that I’ve rewarded you for burning the calories.
That’s proof of work. Whether the reward is fake internet points, some invented digital currency, or access to web content, you’re still being rewarded for making your CPU burn calories to calculate a result that has no intrinsic informational value in itself.
The cost is at scale. For a single person, say it’s a fraction of a watt. Negligible. But for scrapers, all of those fractions add up to real electricity bills. However - and this is the crux - it’s always at scale, even without scrapers, because every visitor contributes to the total, global cost of that one website’s use of this software. The cost isn’t noticeable to individuals, but it is being incurred; it’s unavoidable, by design.
If there were no cost in the aggregate of 10,000 individual browsers performing this PoW, then it wouldn’t cost scrapers anything, either. The cost has to be significant enough to deter bots; and if it’s enough to be too expensive for bots, it’s equally significant in the global aggregate; it’s just spread out across a lot of people.
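To put rough numbers on “spread out across a lot of people” (every figure here is assumed for illustration, not measured from any real deployment):

```shell
#!/bin/sh
# Assumed numbers: a 1-second PoW solve drawing an extra 10 W of CPU
# power costs each visitor about 10 joules.
joules_per_visit=10
visits_per_day=10000

total_joules=$((joules_per_visit * visits_per_day))
echo "${total_joules} J/day"    # prints "100000 J/day"

# 1 kWh = 3,600,000 J, so that's roughly 0.028 kWh/day for one site's
# visitors; multiply by every site behind a PoW wall, every day.
```

Any single line of that ledger is trivial; the sum, by design, is not.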
But the electricity is still being used, and heat is still being generated, and it’s yet another straw on the environmental camel’s back.
It’s intentionally wasteful, and as such, it’s a terrible design.


Are China Parties like Tupperware Parties, where friends get together and one shills a pyramid scheme? That’s what CP is, right?
You also don’t get runtime errors with TOML when an invisible tab turns out to be invisible spaces.
Huh.
`tar tf` and `unzip -l`. I’m not sure I’d even bother to write a shell function to combine them, much less install software.
Zips just exploding to files is so common that if you just `mkdir unzpd ; unzip -d unzpd file.zip` it’s going to be right nearly all of the time. Same with tarballs always containing a directory; it’s just so common it’s barely worth checking.
You write the tools you need, don’t get me wrong. This seems like, at most, a 10-line bash function, and even that seems excessive.
function pear() {
  case "$1" in
    *.zip)   unzip -l "$1" ;;
    *.tar.*) tar tf "$1" ;;
  esac
}