Profile pic is from Jason Box, depicting a projection of Arctic warming to the year 2100 based on current trends.

  • 0 Posts
  • 24 Comments
Joined 1 year ago
Cake day: March 3rd, 2024


  • Rhaedas@fedia.io to Programmer Humor@lemmy.ml · You don't need the mouse
    67 points · 5 months ago

    Steve Rogers: Big man in a suit of armour. Take that off, what are you?

    Tony Stark: Genius, billionaire, playboy, philanthropist.

    Tony was being snarky, but he’s not wrong that the suit is just an extension. Yes, it’s important to his abilities; otherwise he wouldn’t have needed to build it in the cave in the first place. But it’s not a crutch either, as Iron Man 3 showed and taught him, and he’s trying to show Peter that it begins with the person.

    Also keep in mind what he said to Peter in that same scene: when Peter said he just wanted to be like Tony, Tony came back with “Yeah, and I wanted you to be better.” Tony knows Peter doesn’t truly need the suit because he is that powerful; it’s once again just an extension that enhances his abilities, and if Peter thought the suit was what made him special, he wouldn’t grow.



  • The flaw of the question is that it assumes a clear dividing line between species. Evolutionary change is a continuous process: we only draw dividing lines where we see enough differences, either between long-dead specimens in the fossil record or between living populations. The question has no answer, only a long explanation of why that isn’t how any of this works.


  • I tried it with my abliterated local model, thinking its alteration might help, and it gave the same answer. I asked if it was sure, and it then corrected itself (maybe re-examining the word in a different way?). I then asked how many Rs are in “strawberries”, thinking it would either see a new word and give the same incorrect answer or, since the first word was still in context, say that it also has 3 Rs. Nope. It said 4 Rs! I then said “really?”, and it corrected itself once again.

    LLMs are very useful as long as you know how to maximize their power and don’t assume whatever they spit out is absolutely right. I’ve had great luck using mine to help with programming (basically as a Google that formats things far better than if I looked the stuff up myself), but I’ve found some of the simplest errors in the middle of a lot of helpful output. It’s at an assistant level, and you need to remember that an assistant helps you; they don’t do the work for you.
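    For context on why this trips models up: the failure is commonly attributed to tokenization, since an LLM sees subword tokens rather than individual characters. The same task is trivial for ordinary code. A minimal sketch (the function name here is just for illustration, not from the comment above):

    ```python
    # Counting a letter in a word is a character-level operation,
    # which plain code handles directly, unlike a token-based model.
    def count_letter(word: str, letter: str) -> int:
        return word.lower().count(letter.lower())

    print(count_letter("strawberry", "r"))    # 3
    print(count_letter("strawberries", "r"))  # 3
    ```

    This is also a decent sanity check to run alongside an LLM answer: let the model explain, but let deterministic code do the counting.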