• hackris@lemmy.ml · 1 year ago

    I’ve had this idea for a long time now, but I don’t know shit about LLMs. GPT-style models can be run locally though, so I guess only the API part is missing.

    • boonhet@lemm.ee · 1 year ago

      I’ve run LLMs locally before; it’s the unified API for digital assistants that would be interesting to me. Then we’d just need an easy way for laypeople to acquire models, but any bigger DE or distro could probably ship a setup wizard for that. Something like the sketch below is roughly what I mean.
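
      For illustration only, a minimal sketch of what calling such a unified local-assistant API could look like, assuming a local LLM server (e.g. llama.cpp’s llama-server or Ollama) is already running and exposing an OpenAI-compatible /v1/chat/completions endpoint; the port, model name, and helper function here are placeholders, not an existing desktop API.

```python
# Sketch: talk to whatever local LLM server is running, assuming it exposes
# an OpenAI-compatible /v1/chat/completions endpoint (llama.cpp's llama-server
# and Ollama both do). Port and model name below are placeholders.
import requests


def ask_assistant(prompt: str, base_url: str = "http://localhost:8080") -> str:
    """Send a single chat prompt to the local model and return its reply."""
    resp = requests.post(
        f"{base_url}/v1/chat/completions",
        json={
            "model": "local-model",  # placeholder; the server picks the actual model
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask_assistant("Set a reminder for tomorrow at 9am."))
```

      A DE setup wizard would then only need to download a model, start the server, and point every assistant-aware app at that one endpoint.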