• FaceDeer@kbin.social
    11 months ago

    Sounds like nothing particularly unusual or alarming. Researchers found a few thousand images referenced by the dataset that could be illegal, told LAION about it, and LAION pulled the database down temporarily to check for and remove them. A few thousand images out of five billion is not significant.

    There’s also the persistent misunderstanding of what the LAION database is, which is even perpetuated by the paper itself (making me suspicious of the researchers’ motivations, since they surely know better). The paper says: “We find that having possession of a LAION‐5B dataset populated even in late 2023 implies the possession of thousands of illegal images—not including all of the intimate imagery published and gathered non‐consensually, the legality of which is more variable by jurisdiction.” But the LAION-5B dataset doesn’t actually contain any pictures at all. It’s purely a list of URLs pointing at images on the Internet, each with text describing them. Possessing the dataset doesn’t put you in possession of any of those images.

    Edit: Yeah, down at the bottom of the article I see the researcher state that in his opinion LAION-5B shouldn’t even exist and use inaccurate emotionally-charged language about how AI training data is “stolen.” So there’s the motivation I was suspicious of.

    • Zarxrax@lemmy.world
      11 months ago

      While I get what you are saying, it’s pretty clear that what he was saying was that if you actually populate the dataset by downloading the images the links point to (which anyone who is actually using the dataset to train a model would need to do), then you have inadvertently downloaded illegal images.

      It is mentioned repeatedly in the article that the dataset itself is simply a list of urls to the images.

    • General_Effort@lemmy.world
      11 months ago

      Makes one wonder if there is some lobby org behind this. The benefits to major corporate interests are obvious, and it feels a little campaigny.

    • SineSwiper@discuss.tchncs.de
      11 months ago

      This new “journalism” site is not doing itself any favors with bullshit headlines like this. And this is not the first wildly inaccurate article I’ve seen from 404 Media.

    • LWD@lemm.ee
      11 months ago

      inaccurate emotionally-charged language about how AI training data is “stolen.” So there’s the motivation I was suspicious of.

      Conversely, it is a strongly contested opinion that taking intellectual property without consent is acceptable. You can have that strongly contested opinion, but I don’t think you should present it as default, neutral, or moderate.

        • LWD@lemm.ee
          11 months ago

          What would be the most politically correct word to use? To most people, taking without consent or credit is considered theft.

          But apparently to AI enthusiasts, such an insinuation is offensive.

          • FaceDeer@kbin.social
            11 months ago

            “Copyright violation” is probably the wording you’re looking for. Copyright violation is not taking or theft or stealing or any of those other words - it’s copyright violation.

            Whether training an AI on a copyrighted work without the copyright holder’s permission violates copyright is debatable. But it most definitely is not stealing or theft; theft is covered by completely different laws.