Building on an anti-spam cybersecurity tactic known as tarpitting, he created Nepenthes, malicious software named after a carnivorous plant that will “eat just about anything that finds its way inside.”

Aaron clearly warns users that Nepenthes is aggressive malware. It’s not to be deployed by site owners uncomfortable with trapping AI crawlers and sending them down an “infinite maze” of static files with no exit links, where they “get stuck” and “thrash around” for months, he tells users. Once trapped, the crawlers can be fed gibberish data, aka Markov babble, which is designed to poison AI models. That’s likely an appealing bonus feature for any site owners who, like Aaron, are fed up with paying for AI scraping and just want to watch AI burn.
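As a rough illustration of the mechanism described above — a sketch only, not Aaron's actual implementation; the corpus, the /maze/ paths, and the port are invented — an "infinite maze" of Markov babble can be served with nothing more than Python's standard library:

```python
# Sketch of an infinite-maze tarpit: every request returns a page of
# Markov-chain babble plus links that lead only deeper into the maze,
# never out of it. Illustrative only; not Nepenthes itself.
import random
import re
from http.server import BaseHTTPRequestHandler, HTTPServer

CORPUS = (
    "the crawler follows every link it finds and the maze offers links "
    "without end so the crawler keeps requesting pages that lead to more "
    "pages of text that reads almost like language but means nothing at all"
)

# Build a first-order word-level Markov chain from the corpus.
WORDS = re.findall(r"[a-z]+", CORPUS.lower())
CHAIN = {}
for a, b in zip(WORDS, WORDS[1:]):
    CHAIN.setdefault(a, []).append(b)

def babble(n_words=120):
    """Generate n_words of Markov babble."""
    word = random.choice(WORDS)
    out = [word]
    for _ in range(n_words - 1):
        word = random.choice(CHAIN.get(word, WORDS))
        out.append(word)
    return " ".join(out)

class MazeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every page links only to more randomly named pages inside the
        # maze: no exit links, so a naive crawler never runs out of URLs.
        links = "".join(
            '<a href="/maze/{:x}">more</a> '.format(random.getrandbits(64))
            for _ in range(10)
        )
        body = "<html><body><p>{}</p>{}</body></html>".format(babble(), links)
        data = body.encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):
        pass  # keep the demo quiet

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), MazeHandler).serve_forever()
```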

      • vrighter@discuss.tchncs.de · 1 day ago

        What part of “they do not repeat” do you still not get? You can put them in a list, but you won’t ever get a hit, so it’d just be wasting memory.

        • LovableSidekick@lemmy.world · 8 hours ago

          I get that the Internet doesn’t contain an infinite number of domains. Max visits to each one can be limited. Hel-lo, McFly?

          • vrighter@discuss.tchncs.de · 55 minutes ago

            It’s one domain, with infinite pages under it. Limiting max visits per domain is a very different thing from trying to detect loops that aren’t there. You are now making a completely different argument. In fact it sounds suspiciously like the only thing I said they could do: have some arbitrary threshold beyond which they give up… because there’s no way of detecting it otherwise.

            • LovableSidekick@lemmy.world · 36 minutes ago

              I’m a software developer responding to a coding problem. If it’s all under one domain, then avoiding infinite visits is even simpler: I would keep a list of known huge websites like Google and Wikipedia, and limit the visits to any domain that is not on that list. That would eliminate having to track where the honeypot is deployed.
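A rough sketch of the crawler-side mitigation being proposed in this thread — a per-domain page cap with an allowlist of known huge sites exempted — might look like the following. The domain names, the cap value, and the function names are illustrative, not taken from any real crawler:

```python
# Cap pages fetched per domain unless the domain is on a short allowlist
# of known-huge sites. No visited-URL set is needed, because the counter
# works even when no URL ever repeats.
from collections import Counter
from urllib.parse import urlparse

ALLOWLIST = {"google.com", "wikipedia.org"}   # sites exempt from the cap
PER_DOMAIN_CAP = 10_000                       # arbitrary threshold per domain

pages_fetched = Counter()

def should_fetch(url: str) -> bool:
    """Return True if the crawler should still fetch this URL."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")  # Python 3.9+
    if domain in ALLOWLIST:
        return True
    if pages_fetched[domain] >= PER_DOMAIN_CAP:
        return False   # give up on this domain: likely a tarpit, or just too big
    pages_fetched[domain] += 1
    return True

# Usage: a tarpit serving endless unique URLs under one domain gets cut off
# after PER_DOMAIN_CAP fetches, even though no URL is ever requested twice.
for i in range(10_005):
    url = f"https://tarpit.example/maze/{i}"
    if not should_fetch(url):
        print(f"stopped after {pages_fetched['tarpit.example']} pages")
        break
```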