Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis

Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

  • intensely_human@lemm.ee · 10 months ago

    What do you mean it has no world model? Of course it has a world model, composed of the relationships between words in the language that describes that world.

    If I ask it what happens when I drop a glass onto concrete, it tells me. That’s evidence of a world model.

    • fidodo@lemmy.world · 10 months ago

      A simulation of the world that it runs to do reasoning. It doesn’t simulate anything; it just takes a list of words and produces the next word in that list. When you’re trying to solve a problem, do you just think, “well, I saw these words, so this word comes next”? No, you imagine the problem and simulate it in both physical and abstract terms to come up with an answer.
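
      Purely for illustration (a toy sketch of my own, not anything from Gemini or from the commenter): the “take a list of words, produce the next word” mechanism can be caricatured with a bigram counter over a made-up corpus. The corpus and every name below are assumptions; a real LLM uses a learned neural network over tokens, but the point that output is driven by word statistics rather than any simulation of glasses or concrete is the same.

      ```python
      # Toy sketch: predict the "next word" purely from word co-occurrence counts.
      # The corpus is invented for illustration; nothing physical is simulated.
      from collections import Counter, defaultdict

      corpus = (
          "the glass falls onto the concrete and the glass shatters . "
          "the ball falls onto the concrete and the ball bounces ."
      ).split()

      # Count which word follows which word: a purely statistical "world model".
      follows = defaultdict(Counter)
      for prev, nxt in zip(corpus, corpus[1:]):
          follows[prev][nxt] += 1

      def next_word(words):
          """Return the most frequent continuation of the last word seen."""
          options = follows.get(words[-1])
          return options.most_common(1)[0][0] if options else None

      # Generate by repeatedly appending whichever word is statistically likeliest.
      sentence = ["the", "glass"]
      for _ in range(5):
          w = next_word(sentence)
          if w is None:
              break
          sentence.append(w)
      print(" ".join(sentence))
      ```

      The toy model happily extends “the glass” with whatever followed those words most often in its corpus; it never represents a glass, concrete, or gravity, which is the gap being pointed at here.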

    • EpeeGnome@lemm.ee · 10 months ago

      I can see the argument that it has a sort of world model, but one made purely of word relationships is a very shallow sort of model. When I’m asked what happens when a glass is dropped onto concrete, I don’t just recall what I’ve heard about those words and produce a correlation; I can also draw on my experience with those materials and with falling objects and reach a conclusion about how they will interact. That’s the kind of world model it’s missing. Material properties and interactions are written about thoroughly enough that it can appear to simulate this, but adding a few extra details can really throw it off.

      I asked Bing Copilot “What happens if you drop a glass of water on concrete?” It went into excruciating detail about how the water would splash, mentioned how water can absorb into or affect the curing of concrete, and completely failed to notice that the glass itself would strike the concrete, instead describing the chemistry of how using “glass (such as from the glass of water)” as aggregate could affect the curing process. Having a purely statistical/linguistic world model leaves some pretty big holes in its “reasoning” process.