OurAirports is coping well with the traffic from the AI bots (about 45K hits/hour), but since page views are database-intensive, I've future-proofed it by adding a Varnish cache in front.
Since it's a high-view/low-update site, I'm putting the AI bots to work seeding the cache for me. I've set a 6-hour max-age, plus a lot of fancy invalidation for updates (which I hope will work).
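The TTL side of a setup like this is only a few lines of VCL. A minimal sketch, assuming a PURGE-based invalidation scheme; the 6-hour figure is from this post, but the backend address, ACL, and invalidation method are placeholders, not the actual OurAirports config:

```
vcl 4.0;

backend default {
    .host = "127.0.0.1";   # origin app server (assumed)
    .port = "8080";
}

acl purgers {
    "127.0.0.1";           # hosts allowed to invalidate (assumed)
}

sub vcl_recv {
    # One common invalidation pattern: the app sends a PURGE
    # request for a URL after an update. The actual scheme on
    # OurAirports may differ.
    if (req.method == "PURGE") {
        if (!client.ip ~ purgers) {
            return (synth(405, "Not allowed"));
        }
        return (purge);
    }
}

sub vcl_backend_response {
    # Cache every cacheable response for 6 hours, matching
    # the max-age mentioned above.
    set beresp.ttl = 6h;
}
```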
Logged-in users will skip the cache, while the bots stay busy seeding it for each other.
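The logged-in bypass is the standard cookie check in vcl_recv. A sketch, with a placeholder cookie name rather than OurAirports' real one:

```
sub vcl_recv {
    # Logged-in users carry a session cookie, so send them
    # straight to the backend for fresh pages.
    if (req.http.Cookie ~ "session_id=") {
        return (pass);
    }
    # Everyone else (including the bots) is anonymous: drop any
    # stray cookies so their requests are cacheable and they can
    # seed pages for each other.
    unset req.http.Cookie;
}
```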
I don't see any reason to poison the well for the bots in this specific case. OurAirports.com is #publicDomain #openData, and everyone (even AI) is welcome to use it. I just want to make sure they don't slow it down for human users.
After the first 15 minutes, the cache hit rate is running at only around 40%, but this is a huge site (probably close to a million URLs, all told), so it will take a while for the AI bots to seed all the pages into the cache; at 45K hits/hour, even if every request landed on a new page, touching a million URLs would take nearly a day. I'll check again after my tea break to see how it's coming along.
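For anyone following along at home, a stock Varnish install exposes the counters behind that number, and the hit rate is just cache_hit / (cache_hit + cache_miss):

```
# Print the two counters once and compute the ratio by hand
varnishstat -1 -f MAIN.cache_hit -f MAIN.cache_miss
```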