• palordrolap@kbin.social
    5 months ago

    Put a path in robots.txt that isn’t supposed to be hit and is hard to hit by non-robots (a honeypot entry). Log and ban all IPs that hit it.

    Imperfect, but can’t think of a better solution.
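    A minimal sketch of that idea; the trap path, log format, and ban step are placeholder assumptions, not anything site-specific. First, the honeypot entry in robots.txt:

    ```
    User-agent: *
    Disallow: /robot-trap/
    ```

    Then a small script that scans a common/combined-format access log and prints every client IP that requested the trap path, which you can feed into whatever ban mechanism you use:

    ```python
    import re
    import sys

    # Hypothetical trap path; it's Disallowed in robots.txt so polite crawlers
    # never touch it, while scrapers that ignore (or mine) robots.txt will
    # request it and identify themselves.
    TRAP_PATH = "/robot-trap/"

    # Client IP and request path from a common/combined-format access log line, e.g.:
    # 203.0.113.7 - - [01/Jan/2024:00:00:00 +0000] "GET /robot-trap/ HTTP/1.1" 404 123
    LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "\S+ (\S+)')

    def offending_ips(log_path):
        """Return the set of client IPs that requested the trap path."""
        ips = set()
        with open(log_path) as log:
            for line in log:
                m = LOG_LINE.match(line)
                if m and m.group(2).startswith(TRAP_PATH):
                    ips.add(m.group(1))
        return ips

    if __name__ == "__main__":
        # One IP per line; pipe this into your ban mechanism
        # (firewall drop list, server deny file, etc.).
        for ip in sorted(offending_ips(sys.argv[1])):
            print(ip)
    ```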

  • KillingTimeItself@lemmy.dbzer0.com
    5 months ago

    Hmm, I thought websites just blocked crawler traffic directly? I know one site in particular has rules about it, and will even go so far as to ban you permanently if you continually ignore them.