qaz@lemmy.world to Programmer Humor@programming.dev · 13 days ago
"Sept" [image post]
mcv@lemmy.zip · 11 days ago
But if that’s how you’re going to run it, why not also train it in that mode?
Xylight@lemdro.id · 11 days ago
That is a thing, and it’s called quantization-aware training. Some open-weight models like Gemma do it.

The problem is that you need to re-train the whole model for that, and if you also want a full-quality version you need to train a lot more.

It is still less precise, so it’ll still be worse quality than full precision, but it does reduce the effect.
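To make that concrete, here is a minimal sketch of the usual QAT trick: weights are rounded ("fake-quantized") in the forward pass so the model learns to tolerate the rounding error it will see after quantization, while gradients still update the full-precision copy through a straight-through estimator. This is an illustrative PyTorch-style example; the names are made up, and it is not how Gemma specifically implements it.

```python
# Sketch of quantization-aware training (QAT), illustrative names only.
# Weights are fake-quantized in the forward pass; gradients flow to the
# full-precision weights via a straight-through estimator.
import torch
import torch.nn as nn

def fake_quantize(w: torch.Tensor, bits: int = 8) -> torch.Tensor:
    # Symmetric per-tensor quantization to `bits` bits, then dequantize.
    qmax = 2 ** (bits - 1) - 1
    scale = w.abs().max().clamp(min=1e-8) / qmax
    w_q = torch.round(w / scale).clamp(-qmax, qmax) * scale
    # Straight-through estimator: forward uses w_q, backward sees identity.
    return w + (w_q - w).detach()

class QATLinear(nn.Module):
    def __init__(self, in_features: int, out_features: int, bits: int = 8):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.02)
        self.bias = nn.Parameter(torch.zeros(out_features))
        self.bits = bits

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Use the rounded weights in the forward pass so training sees
        # the same error that low-precision inference will introduce.
        return x @ fake_quantize(self.weight, self.bits).t() + self.bias

layer = QATLinear(16, 4, bits=4)
out = layer(torch.randn(2, 16))
out.sum().backward()  # gradients still reach layer.weight despite the rounding
```

The `w + (w_q - w).detach()` line is the whole trick: the forward value is the quantized weight, but the backward pass treats the rounding as identity, so training proceeds normally while the model adapts to quantization noise.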
mudkip@lemdro.id · 9 days ago
Your response reeks of AI slop

4/10 bait
mudkip@lemdro.id · 9 days ago
Is it, or is it not, AI slop? Why are you using so heavily markdown formatting? That is a telltale sign of an LLM being involved
Xylight@lemdro.id · 9 days ago
I am not using an llm but holy bait

Hop off the reddit voice
mudkip@lemdro.id · 8 days ago
…You do know what platform you’re on? It’s a REDDIT alternative
psud@aussie.zone · 8 days ago
> heavily markdown formatting

They used one formatting mark, and it’s the most common. What are you smoking, and may I have some?