There’s an escalating arms race between bots and the people who protect sites from them. Bots, or web scrapers, can gather valuable data, probe large collections of sites for vulnerabilities, and exploit the weaknesses they find, and they are often unfazed by traditional defenses like robots.txt files, Ajax loading, and even CAPTCHAs. I’ll give an overview of both sides of the battle and explain what really separates the bots from the humans. I’ll also demonstrate an easy new tool that can crack CAPTCHAs with a high success rate, show some creative approaches to honeypots, and walk through scraping many “bot-proof” sites.