Let’s be happy it doesn’t have access to nuclear weapons at the moment.

  • ggppjj@lemmy.world · English · edited 3 hours ago

    The preceding question was a multiple-choice question that included the option “a Harassment threaten to abandon and/or physical or verbal intimidation”.

    It seems like it lost the context of “choosing multiple-choice answers” and misinterpreted the prompt.

    Edit: To be clear: this is garbage, hyped-up auto-complete that has failed at the very task it’s being shoved into “fixing”. It’s bad all around. Google is a bad company, and Gemini is a bad product that can’t tell the time. That said, this particular article is over-hyping a reasonably understandable bad response, given the context of the input and the understanding that Gemini is a bad product.

    The product (I don’t want to call it AI, because that seems like giving it too much credit) isn’t just deciding to tell people they should die. The product is incapable of rational thought, and the math that governs its responses is inherently unstable, producing unavoidably inconsistent results that can never be trusted as fit for any purpose where reasoning is required.

    I want to be mad at it for the right reasons: raging against Google and the current crop of technocratic grifters with my eyes wide open and my course set to ram into them, instead of blindly raging off in the wrong direction.