• sp3ctr4l@lemmy.zip
    4 hours ago

    It is not that it responded “Sorry, I cannot find anything like what you described, here are some things that are pretty close.”

    It affirmatively said “No, no such things as you describe exist, here are some things that are pretty close.”

    There’s a huge difference between a coworker saying “Dang man, I dunno, I can’t find a thing like that.” and “No, nothing like that exists; the closest to it is x, y, z.”

    The former is honest. The latter is confidently incorrect.

    Combine that with “Wait what about gamma?”

    And the former is still honest, while the latter, who now describes gamma in great detail and how it meets my requirements, is now an obvious liar, having just told me that nothing like that exists.

    If I know I am dealing with a dishonest interlocutor, I am forced to consider tricking it into being honest.

    Or, if I am less informed or more naive, I might just, you know, believe it the first time.

    A standard search engine that is not formatted to resemble talking to a person does not prompt a user to expect it to act like a person, and thus does not suffer from this problem.

    If you don’t find what you’re looking for, all that means is you did not find it.

    If you are told that no such thing exists, a lot of people are going to believe that no such thing exists.

    That is typically called spreading disinformation, when the actor knows what they are claiming is false.

    It’s worse than unhelpful; it actively spreads lies.

    Anyway, I’m sorry that you don’t see humor in multi-billion-dollar technology failing at achieving its purported abilities; I laugh all the time at poorly designed products, systems, and things.

    Finally, I did not use the phrase in contention in my original post.

    I used it in my response to you, specifically and only within a single sentence which revolved around incompetent executives.

    It appears that reading comprehension is not your strong suit, maybe you can ask Gemini about how to improve it.

    Err, well, maybe don’t do that.

    • archomrade [he/him]@midwest.social
      3 hours ago

      reading comprehension

      Lmao, there should also be an automod rule for this phrase, too.

      There’s a huge difference between a coworker saying […]

      Lol, you’re still talking about it like it’s a person that can be reasoned with bud. It’s just a piece of software. If it doesn’t give you the response you want you can try using a different prompt, just like if google doesn’t find what you’re looking for you can change your search terms.

      If people are gullible enough to take its responses as given (or scold it for not being capable of rational thought lmao) then that’s their problem - just like how people can take the first search result from Google without scrutiny if they want to. There’s nothing especially problematic about the existence of an AI chatbot that hasn’t already been raised with the advent of every other information technology.