If an LLM can’t be trusted with a fast food order, I can’t imagine what it is reliable enough for. I really expected this to be the easy use case for these things.

It sounds like most orders still worked, so I guess we’ll see if other chains come to the same conclusion.

  • Mac@mander.xyz · 1 day ago

    Why can’t a trillion dollar AI say “Sir, that’s not reasonable”?