corbin@infosec.pub to Lemmy Shitpost@lemmy.world · English · 9 days ago · "can't beat the classics" (image, infosec.pub)
lmuel@sopuli.xyz · English · 8 days ago
Well, in some ways they are. It also depends a lot on the hardware you have, of course: a normal 16 GB GPU won't fit huge LLMs.
The smaller ones are getting impressively good at some things, but a lot of them still struggle with non-English languages, for example.
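As a rough back-of-the-envelope sketch of why 16 GB runs out (weights only, ignoring KV cache and other overhead, and assuming typical parameter counts and precisions rather than any specific model):

```python
# Rough VRAM estimate for holding model weights only (ignores KV cache,
# activations, and framework overhead, which add more on top).
def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / (1024 ** 3)

for name, params in [("7B", 7), ("13B", 13), ("70B", 70)]:
    fp16 = weights_gb(params, 2)   # 16-bit weights
    q4 = weights_gb(params, 0.5)   # ~4-bit quantized weights
    print(f"{name}: ~{fp16:.0f} GB at fp16, ~{q4:.0f} GB at 4-bit")

# A 70B model needs ~130 GB at fp16 and ~33 GB even at 4-bit,
# well past a 16 GB card; the smaller 7B/13B models fit, especially quantized.
```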