This may sound like a weird thing to do, but I realised that many crawlers and bots are somehow still able to get past my Anubis. I presume they have gotten smarter and are capable of using JavaScript.

To counter this, I want to link my Anubis to an Iocane setup such that:

Internet > nginx reverse proxy > Anubis > Iocane > my site/app
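For anyone wanting to picture the chain, here is a rough sketch of how I imagine wiring it up. All ports, hostnames, and paths are placeholders, and I haven’t verified Iocane’s exact upstream settings, so treat this as an assumption, not a tested config:

```nginx
# nginx terminates TLS and forwards everything to Anubis (assumed on :8080).
# Anubis's TARGET would then point at Iocane (e.g. :9090), and Iocane
# forwards surviving traffic to the actual app (e.g. :3000).
server {
    listen 443 ssl;
    server_name example.com;  # placeholder

    location / {
        proxy_pass http://127.0.0.1:8080;       # Anubis
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```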

My hope is that two different filtering mechanisms (one of which will actively poison and waste the bot’s resources) will protect my system better.

I thought I’d ask before actually trying out something like this.

  • Black616Angel@discuss.tchncs.de · 2 hours ago

    Have you tried fucking with the status codes?

There is a great DEF CON talk about that:

    Slides

Video on YouTube

So you could e.g. return a 401 while still serving the full page. Most automated systems will probably see the ‘unauthorized’ status and discard the response without ever reading the body, while a browser still renders it.
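A minimal sketch of that trick, using Python’s stdlib HTTP server (the handler name, port, and page content are made up for illustration): the server reports 401, but the body is the normal page.

```python
# Sketch: lie about the status code. Naive crawlers that only check
# response.status will treat this as "unauthorized" and move on;
# browsers still render the HTML body.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"<html><body><h1>Hello, human!</h1></body></html>"

class DecoyStatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Send 401 instead of 200...
        self.send_response(401)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        # ...but still serve the full page body.
        self.wfile.write(PAGE)

    def log_message(self, *args):
        pass  # keep the demo quiet

# To run it:
# HTTPServer(("127.0.0.1", 8401), DecoyStatusHandler).serve_forever()
```

Note that well-behaved clients like `curl` or Python’s `urllib` treat the 401 as an error by default even though the body is all there, which is exactly the effect you’d be exploiting.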