

Different mindset. A user doesn’t want to find bugs but get shit done.
And we absolutely wanted to shoot the tester who gave us this use case.
Why? Because he tested well and broke the software? A user changing their mind during a guided activity absolutely is a valid use case.
Yeah, the people at Pixar have no clue how to use a computer. Lol
Do you really expect their artists to be IT experts? You seem to be stuck in the early 90s mindset when “knowing how to use a computer” covered all disciplines.
But it can also be a matter of (inexperienced) devs just deciding, fuck it, I won’t try to merge it, I’ll just copy my changes elsewhere and throw away the repo.
Pretty sure that’s actually it. Git has a learning curve and, for example, some naive rebase not working out as intended can be scary if you don’t know what you’re doing.
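And the "copy files out, delete the repo" move is almost never necessary. A minimal sketch of why (repo path and commit messages here are made up for illustration): mid-rebase you have `git rebase --abort`, and even after a finished-but-botched rebase the old tip is still reachable via `git reflog`.

```shell
# Throwaway demo repo just to show the recovery move.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "good state"
before=$(git rev-parse HEAD)   # in real life you'd find this hash in `git reflog`

# ...pretend a botched rebase rewrote history here...
git -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "mangled history"

# Recovery: hard-reset back to the recorded old tip. No repo deleted.
git reset -q --hard "$before"
git log -1 --format=%s   # prints: good state
```

Still scary the first time, but it's a few commands, not a fresh clone.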
It works well - for a Windows subsystem. It is well-integrated but also separate, which can be annoying sometimes.
For example, you might code in Python in VSC against a WSL folder but write a script that will eventually run in Windows. You then need to install and update Python twice - a Linux and a Windows version (obvious, but can be annoying).
WSL is also really slow, especially for filesystem-heavy stuff - above all when crossing over into the Windows filesystem via /mnt/c. You know how on Linux programs sometimes run faster via Wine/Proton than on Windows itself? Yeah, this is the other way around.
That was earlier: when they released Windows 7, it was the first (and only) release where management gave development a largely free hand, and they could pay down some technical debt.
But apparently that didn't work out for Microsoft, and now we get one piece of dystopian news after another.
Isn’t the POSIX incompatibility a major roadblock when scripting?
It must be a really deeply integrated part of the Windows kernel, because it has never been able to show progress properly.
Back in the days of floppy disks it always felt like the actual copying only started when the progress bar showed 100%.
Makes sense. It's like having your personal undergrad hobby coder. It may get something right here and there, but for professional coding it's still worse than the gold standard (googling Stack Overflow).
Obfuscators probably use it though, so no spec will ever be able to get rid of this crap.
BASIC. At least VB.
Debian, Mint, Arch (by the way).
Had Ubuntu as my daily driver for about 2 years but didn't like GNOME and had more trouble with an Nvidia card than on Mint or Arch.
Fedora is at the top of my to-try list, but I'm not a distro-hopper, so who knows when I'll have a use case.
It’s Proton VPN. Lack of IPv6 support is a downer but I wouldn’t call them shit.
Edit: maybe elaborate on why you deem IPv6 so crucial? As I said: everything works just fine without it.
What the fuck are you talking about? My ISP supports IPv6 just fine, but following my VPN’s advice I disable it (on certain devices at least) for privacy concerns. And it makes exactly zero difference in functionality.
Auto-“correct”. Thanks, fixed.
Why should we care? So address space may run out eventually - that’s our ISPs’ problem.
Other than that, I actually don't like every device having a globally unique address - it makes tracking even easier than fingerprinting.
That's also why my VPN provider recommends disabling IPv6, since they don't support it.
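For what it's worth, on a Linux box the usual way to do that is a small sysctl config fragment - the two keys below are real kernel knobs, but the file name is just a convention I picked:

```
# e.g. /etc/sysctl.d/40-ipv6-off.conf (name is arbitrary)
net.ipv6.conf.all.disable_ipv6 = 1
net.ipv6.conf.default.disable_ipv6 = 1
```

Load it with `sudo sysctl --system` (or reboot), and delete the file to re-enable IPv6.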
In the long term it works quite a bit better as a valuation tool. Bubbles tend not to last very long.
More like:
Computer scientist: We have made a text generator
Everyone: tExT iS iNtElLiGeNcE
Best to join a C++ community on some social media then. They’ll tell you immediately why “C with classes” isn’t C++.