corbin@infosec.pub to Lemmy Shitpost@lemmy.world, English · 2 months ago: "can't beat the classics" (image post, 43 comments)
lmuel@sopuli.xyz · 2 months ago:
Well, in some ways they are. It also depends a lot on the hardware you have, of course; a normal 16 GB GPU won't fit huge LLMs.
The smaller ones are getting impressively good at some things, but a lot of them still struggle with non-English languages, for example.
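As a rough back-of-the-envelope sketch of why a 16 GB card runs out of room (assuming ~2 bytes per parameter for fp16 weights and ignoring KV cache and other runtime overhead, so real usage is higher):

```python
# Rough VRAM estimate for holding the model weights only.
# Real inference also needs KV cache, activations, and framework overhead.
def weight_vram_gb(params_billion: float, bytes_per_param: float = 2.0) -> float:
    """Approximate GiB needed just to store the weights."""
    return params_billion * 1e9 * bytes_per_param / 2**30

for size in (7, 13, 70):  # common open-weight model sizes, in billions of parameters
    print(f"{size}B @ fp16: ~{weight_vram_gb(size):.1f} GiB")

# 7B  @ fp16: ~13.0 GiB  -> just about squeezes onto a 16 GB card
# 13B @ fp16: ~24.2 GiB  -> already too big without quantization
# 70B @ fp16: ~130.4 GiB -> nowhere close to fitting
```

In practice that's why people lean on quantized weights (e.g. 4-bit) to get the mid-size models onto consumer GPUs.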