

Except that the numbers are also prone to change, like if it’s been stolen. They’re technically not supposed to be an identification code anyhow.
Yes, I was poking fun at Ed's only error message being a relatively unhelpful ?
.
?
It’s also self-explanatory, which is great if you’re new.
Ed and Vim are basically arcane by comparison.
Does it count as user error if the user has to micromanage the compiler?
?
Not just that, but he literally advertised himself as not being technical. That seems like declaring open season on yourself.
Out of memory/overheating on 60k rows? I’ve had multi-million-row databases that fit into a few gigs of memory, and most modern machines have that much RAM. A 60k-row query that overheats the machine would only happen if you’re doing something weird with joins.
Plus a lot of reads really is nothing for basically any database, unless you’re doing something unsmart with how you’re reading it (like scanning the whole database over and over). If you’re not processing the data, it’d be I/O-bottlenecked anyway.
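To put rough numbers on it, here’s a quick sketch with made-up data (the filenames and row contents are hypothetical): generating 60k rows and doing a full scan over all of them, the worst-case read pattern with no index at all, still finishes near-instantly with standard coreutils.

```shell
# Generate 60k hypothetical rows and full-scan them with grep.
# Even this worst-case "read everything" pattern is trivial at this scale.
seq 1 60000 \
  | awk '{print $1 ",user" $1 "@example.com"}' \
  | grep -c 'user59999@'    # prints 1, near-instantly
```

A real database with an index would do strictly less work than this, since it wouldn’t even touch most of those rows.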
“Software Engineer” was literally right next to it.
If they went into uni straight out of high school, they could. A lot of bachelor’s degree holders would be around that age, since most start at 18.
In my experience, the only time that I’ve taxed a drive when doing a database query is either when dumping it, or with SQLite’s vacuum, which copies the whole thing.
For a pretty simple search like OP seems to be doing, the indices should have taken care of basically all the heavy lifting.
Hard drives might do it if the enclosure is poorly designed (no ventilation), but I can’t imagine a situation where it would overheat like that, that quickly, even in a sealed box. 30k is nothing in database terms, and if the query were that heavy, it would bottleneck on the CPU and barely heat the drive at all.
Unless the database was designed by someone who only knows of data as that robot from Star Trek, most would be absolutely fine with 60k rows. I wouldn’t be surprised if the machine they’re using caches that much in RAM alone.
Unless they actually mean the hard drive, and not the computer. I’ve definitely had a cheap enclosure overheat and drop out on me before when hammering the drive with seeks, although it’s more likely the enclosure’s own electronics overheating. Unless their query was rubbish, a simple database scan/search like that should be fast, and not demanding in the slightest. Doubly so if it’s a dedicated database, and not some embedded thing like SQLite. A few tens of thousands of queries should be basically nothing.
Not quite that, but more the ${variable##.*} sort of thing.
Bash substitution is regex-level wizardry.
Probably downloaded it from ChatGPT; that’s why it’s in the Downloads folder.
Specifying paths with - would be its own special brand of hell.
Ah yes, I, too, program in the Language programming language.
That’s the customer answer, where they give an age in leap years, and everything goes to pot.