In today’s episode, Yud tries to predict the future of computer science.

  • dr2chase@ohai.social · 1 year ago

    @corbin I got a 96GB laptop just so I could run (some) LLMs w/o network access; I’m sure that will be standard by 2025. 🤪

    • corbin@awful.systems (OP) · 1 year ago

      Let me know (@corbin@defcon.social) if you actually get LLMs to produce useful code locally. I’ve done maybe four or five experiments, and they’ve all been grand disappointments. This is probably because I’m not asking questions easily answered by Stack Overflow or existing GitHub projects; LLMs can really only model the trite, not the novel.
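
For anyone curious what dr2chase’s setup actually involves: running an LLM with no network access mostly comes down to downloading the weights ahead of time, then forcing the runtime into offline mode. Here is a minimal sketch using the Hugging Face transformers library; the model name is an illustrative assumption (not whatever dr2chase actually runs), and it presumes the weights are already in the local cache.

```python
# Minimal offline-inference sketch. Assumptions: the model was downloaded to
# the local Hugging Face cache beforehand, and the model name below is
# illustrative only.
import os

os.environ["HF_HUB_OFFLINE"] = "1"  # set before import: forbids any network access

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # hypothetical model choice
tok = AutoTokenizer.from_pretrained(model_id, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(model_id, local_files_only=True)

prompt = "Write a Python function that merges two sorted lists."
inputs = tok(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=200)
print(tok.decode(out[0], skip_special_tokens=True))
```

The memory math explains the hardware flex: a 7B-parameter model at 16-bit precision needs roughly 14 GB for the weights alone, so bigger models (or several at once) eat RAM fast, which is why quantized builds and 96GB machines show up in local-inference setups.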