• Piatro@programming.dev
    9 points · 5 hours ago

    The “make shit up” machine was found to be making shit up? Huh, if only we could have predicted this!

  • Stopwatch1986@lemmy.ml
    6 points · 7 hours ago

    A policy I saw coming out of an NHS (UK) department mandated a ‘human-in-the-loop’, which is essentially what the article mentions at the end. The risk is that over time clinicians become complacent with ‘good enough’ and stop reviewing thoroughly. And while mistakes may be easy to spot, omissions are not, unless you keep your own notes. More so after a long session, although medical appointments are typically short and focused.

    On a positive note, in my experience clinicians using LLMs do spend more time engaging with service users. In an ideal world they would be given time to both engage and take notes, but that is not going to happen.

  • Ech@lemmy.ca
    33 points · 12 hours ago

    It’s definitely making things up. That’s how they work.

  • Mothra@mander.xyz
    3 points · 9 hours ago

    Fortunately, the last time I saw my doctor, she typed everything herself as I spoke.

  • inari@piefed.zip
    3 points · 11 hours ago

    Where’s that lemming from the other day who was defending doctors using LLMs?