

And as soon as you enter corporate stuff, LLMs are useless again, because most things are integrated into existing ecosystems the LLMs don’t know, and/or the libraries are only used in closed-source code.
Are three lettuce heads enough for you?
Accurate
PowerfulTurtle and (Not) BurningTurtle meetup
M$ and Google “products” are also widely used, also by “smart” folk.
I’d guess that, like all tech that’s highly locked down, it will be very hard to do anything with - like Apple devices.
So the only thing to do right now is to not update.
So we need to be careful with upper- and lowercase. Meanwhile the docs: > settiings
Ollama is simple too, I meant that containers make everything a nightmare to maintain.
Huh? I’m talking about existing code being in a dir, then initting a git repo there, creating a counterpart repo on your hoster of choice and then pushing to it. Wouldn’t cloning the repo from step 3 onto the code from step 1 overwrite the contents there?
I’ve never heard of that, but if you also have a Readme, that will be even more complicated, I imagine. So just adding -f is the easier option.
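For what it’s worth, a plain pull from the hoster’s repo doesn’t overwrite your local code; git refuses with “unrelated histories”, and `--allow-unrelated-histories` merges the two instead. A minimal sketch, with the hosted repo simulated locally (all names, paths, and the `main` branch are placeholders, not anyone’s actual setup):

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"

# Simulate the hoster's repo, which auto-generated a README (step 3):
git init -q -b main seed
cd seed
git config user.email you@example.com && git config user.name you
echo "# project" > README.md
git add README.md && git commit -qm "Initial commit"
cd ..
git clone -q --bare seed remote.git   # stands in for the hosted repo

# Your pre-existing code dir, turned into a repo (steps 1 + 2):
mkdir code && cd code
git init -q -b main
git config user.email you@example.com && git config user.name you
echo 'print("hi")' > main.py
git add main.py && git commit -qm "Import existing code"

# A plain pull is refused ("refusing to merge unrelated histories");
# with the flag it merges, so README.md and main.py both survive:
git remote add origin ../remote.git
git pull -q origin main --allow-unrelated-histories --no-edit
ls
```

So `-f` (force-push, discarding the autogenerated README) and the merge above are the two usual ways out; the merge keeps the README, the force-push doesn’t.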
Yeah, a coworker (also a trainee) spent 2 days trying to debug some C# MVC thing. It took me around 5 mins, despite not having seen C# code in 7 years, to realize that the quotes were part of the string literal and needed to be checked too.
Well he did literally everything with the internal ChatGPT instance (or so a coworker said, I don’t know which model actually runs there). I asked if he wrote JS code, he said no. Well even though there was JS in the cshtml file, he technically didn’t lie, as he didn’t write it.
Tbf you have to do that for the first push, if a Readme file was autogenerated
“Developer”
“my” 4 months of “work”
Those are the ones easily replaced by AI. 99% of stuff “they” did was done by AI anyway!
Really easy to start running it
Then everything goes wrong, from configuration over logs to cuda. And the worst fucking debugging ever.
Why not throw that into a VM with VFIO passthrough, plug the GPU in via an external dock and if we are already at abstracting shit away for unnecessary complexity and non-compatibility do all that on windows?
Schadenfreude
Obviously Atlantis!
My fucking events table in my Synapse DB in Postgres is nearly ten times as large, and I ported it over from SQLite not long ago, in a matter of minutes. All of the data sits on a 2*3 cluster of old 256GB SSDs, equaling about 1.5TB with RAID 0. That’s neither really fast nor cool, but it’s stable.
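The SQLite-to-Postgres port is done with Synapse’s own migration script rather than a generic dump/restore. An invocation sketch only — the paths are placeholders, Synapse has to be stopped first, and the flag names should be checked against your installed version’s docs:

```shell
# Port an existing SQLite homeserver DB into the Postgres database
# already configured in homeserver.yaml (paths are examples).
synapse_port_db \
  --sqlite-database /var/lib/matrix-synapse/homeserver.db \
  --postgres-config /etc/matrix-synapse/homeserver.yaml
```

The script copies tables incrementally, which is why even a multi-GB events table goes over in minutes rather than hours.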
The thing is, people apparently need to get hurt - a lot - for something to change. So let some Nazis kill themselves in Nazimobiles. The only issue is the anticipated deaths of innocents, but those will accelerate resistance just more.
Stop calling “Vibe Coding” “Coding”. It’s as much coding as shitting on a plate is cooking.