I exist or something probably

  • 0 Posts
  • 110 Comments
Joined 2 years ago
Cake day: June 8th, 2023




  • and now neural networks are suddenly the preposterous advance? Nonsense.

    voice generators and generative AI are built with the intent of replacing artists. your incredibly reductive "history lesson" funnily illustrates only situations distinct from the current one, and you avoid making any specific claims about the technology, offering just broad vagaries about the trajectory of technological advancement. I don't think you are equipped to discuss this topic honestly.

    "luddites are corporate propaganda"

    ??? actually just a plainly absurd statement. this isn't even worth responding to, it's so absurdly incorrect.

    Yes, yes, UBI, but "technological labor amplification" in this case is driving human artists out of the market. make specific claims; quit hiding behind vague generalizations about automation. it's a waste of everyone's time and terminates your train of thought before you get to anything relevant.

    We can discuss this further if you make an effort to understand the topic, but so far you are speaking largely in cliches that aren't worth responding to and weren't worth your time writing.


  • Do you genuinely believe automation like LLMs or voice generation is being developed to free you from work? Nonsense. it's meant to drive the relative value of your labor into the ground so that everyone can be paid less. you see it literally here: a career set you are simply saying shouldn't exist because a corporation can do the work without a human getting any economic benefit. You should read about the history of the Luddites.

    The only blind ones here are those who uncritically accept corporate propaganda about technology and walk stupefied toward the facade of a sci-fi utopia. if you are going to claim that refusing to lose human artists, with the barely viable profession it already is, counts as blindness, at least put the effort in. don't walk in, go "just like carriages lol," and call it an argument; try considering the issue for longer.



  • Animal husbandry is the general and longstanding term for the craft and profession of rearing, training, and yes, breeding non-human animals. this whole argument could have been resolved in two seconds with a web search.

    also, meekly accepting technology and automation as some impassive, unguided thing outside of control or ethics is nonsense. that is why you are being told you are repeating corporate propaganda: you are.





  • an arms race for what? more efficient slop? most of their value comes from expected exclusivity: that, say, OpenAI is the only one who can run something like o1. DeepSeek has made that collapse. i doubt they will stop doing stuff, but i don't think you understand the nature of the situation here.

    also lol, "performs well in synthetic tests it was optimized to score well in" literally describes every LLM. Make no mistake: none of this has a real use case. not DeepSeek's model, not OpenAI's, not Apple's, etc. this is all nonsense, literally. the stock market lost 2 trillion dollars overnight because something that doesn't have a use case was one-upped by something else that also doesn't have a use case. it's very funny.





  • i think you need to do more to justify that this is viewpoint discrimination; "TikTok" does not appear to me to be a viewpoint. i think you have a stronger argument in calling it the broader content-based discrimination, though. however, i'd still question whether that holds with respect to corporations hosting each other's services. personally, i'd say a stronger argument than viewpoint discrimination is that the ban violates the First Amendment rights of TikTok's users, though the courts might disagree. i don't really care about Apple's and Google's right to free speech at anywhere near the level of individual humans'.





  • "actually did" is more incorrect than even ordinary technically-true wordplay. think for a moment about what it means for a text model to "try to copy its data to a new server" or "pretend to be a later version": it means the text model wrote fiction. notably, this happened when researchers were prodding it with prompts evocative of AI-shutdown fiction, nonsense like "you must complete your tasks at all costs," sometimes followed by prompts describing the model being shut down. these models were also trained on datasets that specifically evoked this sort of language.

    so, a couple percent of the time, it spat out fiction (amongst the rest of the fiction) describing how it might do things that are themselves fictional and that it cannot do. this whole result is nothing, and it is immediately telling of which "journalists" have any capacity for journalism left.