• arthurno1@fediverser.communick.dev
    11 months ago

    I think it is a really bad example. What it showcases is that LLMs are basically a hardcoded web search. Given some tokens (words in human language) they can generate some data in some form. Which is a good thing in itself. But what is not so good here is that you still have to know the CSS syntax, still have to know what you are looking for, still have to type most of it, if not more, and have to learn how to generate that from the llm. It is a bit like typing a skeleton or a yasnippet template in a buffer and then immediately calling the expansion from the minibuffer to generate some code for you. Perhaps it is a good automation, IDK yet, haven’t used it myself, but I don’t think the particular example you have used illustrates it well.