• pinkapple@lemmy.ml · 2 days ago

    Kay, and that has nothing to do with what I said. Scrapers and bots =/= AI. It’s not even the same companies that make the unfree datasets. The scrapers and bots that hit your website are not some random “AI” feeding on data lol. This is what some models are trained on; it’s already free, so it doesn’t need to be individually rescraped, and it’s mostly garbage-quality data: https://commoncrawl.org/ Nobody wastes resources rescraping all this SEO-infested dump.

    Your issue has more to do with SEO than anything else. Btw, before you diss Common Crawl: it’s used in research quite a lot, so it’s not some evil thing that threatens people’s websites. Add a robots.txt maybe.

    • TheOneCurly@lemm.ee · 2 days ago

      Oh ok, I’ll just ignore the constant requests from GPTBot, ByteSpider, and the hundreds of others that very plainly, sometimes right in their user agent, tell you they’re grabbing content for training data. Robots.txt is nice and all, but manually adding every single up-and-coming AI company is impossible. Like I said, Anubis is the first thing that’s gotten them all to even remotely calm down.

        • pinkapple@lemmy.ml · 8 hours ago

        Bots only identify themselves and their organization in the user agent; they don’t tell you specifically what they do with the data, so stop your fairytales. They do give you a really handy URL, though, with user agents and even IPs in JSON, if you want to fully block the crawlers but not the search bots sent by user prompts.

        Your ad revenue money can be secured.

        https://platform.openai.com/docs/bots/
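        As a sketch of what you can do with those published lists: the snippet below turns a crawler IP-range JSON blob into nginx `deny` directives. The exact JSON shape here is an assumption modeled on the format crawler operators typically publish (a "prefixes" array of CIDR entries), and the sample data uses placeholder documentation ranges, not real crawler IPs.

```python
import json

# Hypothetical sample mirroring the typical published format for a
# crawler's IP ranges: a "prefixes" array of CIDR entries.
SAMPLE = json.dumps({
    "prefixes": [
        {"ipv4Prefix": "192.0.2.0/24"},   # placeholder documentation range
        {"ipv6Prefix": "2001:db8::/32"},  # placeholder documentation range
    ]
})

def to_nginx_deny(raw: str) -> list[str]:
    """Turn a crawler IP-range JSON blob into nginx `deny` directives."""
    data = json.loads(raw)
    rules = []
    for entry in data.get("prefixes", []):
        cidr = entry.get("ipv4Prefix") or entry.get("ipv6Prefix")
        if cidr:
            rules.append(f"deny {cidr};")
    return rules

print("\n".join(to_nginx_deny(SAMPLE)))
```

        You’d feed the real JSON from the docs page into something like this and drop the output into an nginx config, which blocks by IP regardless of what user agent the crawler claims.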

        If for some reason you can’t be bothered to edit your own robots.txt (because it’s hard to tell which bots are search bots for muh ad money), then maybe hire someone.
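        For the record, a minimal robots.txt along these lines would do it. GPTBot and OAI-SearchBot are user agent tokens OpenAI documents on that page; this blocks the training crawler while leaving the search bot alone:

```
# Block OpenAI's training crawler
User-agent: GPTBot
Disallow: /

# Allow OpenAI's search crawler
User-agent: OAI-SearchBot
Allow: /
```

        Same pattern works for any other operator that publishes its crawler’s user agent token, assuming the bot actually honors robots.txt.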

            • TheOneCurly@lemm.ee · 1 hour ago

          Lmao, you linked to the same page I did, where this text appears:

          GPTBot is used to make our generative AI foundation models more useful and safe. It is used to crawl content that may be used in training our generative AI foundation models.

          Also, you’re so capitalism-brained that you assume anyone running a website must be doing so for profit. My hobby projects (a personal homepage and a personal git forge) were getting slammed by bots while I just paid the bills. I could have locked them both behind an auth portal, but then I might as well take them off the internet and run everything on my LAN.