The title is not a rhetorical question, and I’m not going to bury an answer. I don’t have an answer. This post is my exploration of the question, and why I think it is a question.

  • KingRandomGuy@lemmy.world · 3 days ago
    I’m fairly certain blockchain GPUs have very different requirements than those used for ML, especially LLMs. In particular, they don’t need anywhere near as much VRAM, generally don’t require floating-point math, and don’t need features like tensor cores. Those “blockchain GPUs” likely didn’t turn into ML GPUs.

    ML has been around for a long time. People have been using GPUs in ML since AlexNet in 2012, not just after blockchain hype started to die down.
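The comment's distinction between the two workloads can be made concrete with a toy sketch. This is not a real miner or ML kernel (the function names `toy_pow` and `toy_matmul` are mine): proof-of-work is integer/bitwise hashing with essentially no memory footprint, while ML inference is floating-point matrix math, which is exactly what tensor cores and large VRAM pools accelerate.

```python
import hashlib
import struct

def toy_pow(header: bytes, difficulty_bits: int) -> int:
    """Integer workload: find a nonce whose double-SHA-256 hash
    falls below a target -- pure bitwise/integer ops, tiny state."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(header + struct.pack("<I", nonce)).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def toy_matmul(a, b):
    """Float workload: dense matrix multiply, the core ML primitive.
    Real models keep billions of these weights resident in VRAM."""
    n, k, m = len(a), len(b), len(b[0])
    return [[sum(a[i][p] * b[p][j] for p in range(k)) for j in range(m)]
            for i in range(n)]

nonce = toy_pow(b"block header", difficulty_bits=12)
c = toy_matmul([[1.0, 2.0]], [[3.0], [4.0]])  # [[11.0]]
```

The point of the contrast: a GPU tuned for the first loop (raw integer hash throughput, minimal memory) is not automatically well suited to the second (wide floating-point units and lots of fast memory), which is the commenter's argument for why mining cards didn't simply become ML cards.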