The title is not a rhetorical question, and I’m not going to bury an answer. I don’t have an answer. This post is my exploration of the question, and why I think it is a question.

  • DasFaultier@sh.itjust.works · 3 days ago

    If you mean from an energy/climate/water/resource consumption perspective, then no. But if you’re looking at it from a labor perspective, then also no. From a copyright perspective? Nope as well. Okay, but surely from a correctness perspective? Very clear no. Okay, but there’s still the aspect of showing recipients respect and not wasting their time by giving them something to read/view/process that you didn’t care to write/think through yourself in the first place? Well, you guessed it, hard no as well.

    The things that AI was made for are:

    • finding a use for superfluous blockchain GPUs when that hype began to die (initially), and more importantly…
    • replacing human workers who demand nasty things like wages and vacation and sick days and rights, in order to redistribute wealth to the wealthiest.
      • trevor@lemmy.blahaj.zone · 2 days ago

        Yes, because private property is theft. But unequal enforcement of copyright law is worse. Right now, LLMs are just lying machines trained on pirated data, and the companies that run them act with impunity while doing something that would land a normal person in jail.

        Copyright is immoral, but as long as it exists, the laws should be extra strict on companies that steal others’ works.

    • KingRandomGuy@lemmy.world · 2 days ago

      I’m fairly certain blockchain GPUs have very different requirements than those used for ML, especially LLMs. In particular, they don’t need anywhere near as much VRAM, generally don’t require floating-point math, and don’t need features like tensor cores. Those “blockchain GPUs” likely didn’t turn into ML GPUs.

      ML has been around for a long time. People have been using GPUs for ML at least since AlexNet in 2012, not just after the blockchain hype started to die down.
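
For what it’s worth, the two criteria that last commenter names are easy to check on your own card. Here’s a minimal sketch, assuming a CUDA build of PyTorch and at least one NVIDIA GPU (the device index 0 is arbitrary), that reports VRAM capacity and whether the card has tensor cores, which NVIDIA first shipped with compute capability 7.0 (Volta):

```python
# Minimal sketch: query the two GPU properties the comment above calls out
# for LLM work, VRAM capacity and tensor-core support. Assumes a CUDA
# build of PyTorch and at least one NVIDIA GPU; device 0 is arbitrary.
import torch

if not torch.cuda.is_available():
    raise SystemExit("No CUDA device visible to PyTorch.")

props = torch.cuda.get_device_properties(0)
vram_gib = props.total_memory / 2**30
# Tensor cores first appeared with compute capability 7.0 (Volta).
has_tensor_cores = (props.major, props.minor) >= (7, 0)

print(f"{props.name}: {vram_gib:.1f} GiB VRAM, "
      f"tensor cores: {'yes' if has_tensor_cores else 'no'}")
# Rough rule of thumb: a 7B-parameter model in fp16 needs ~14 GiB just
# for the weights, before activations and KV cache.
```

The rule of thumb at the end is just parameter count times two bytes per fp16 weight; it’s why VRAM, not raw hash-rate-style throughput, is usually the binding constraint for running LLMs.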