According to research done by the AWS Shield Threat Research Team, up to 51% of traffic heading into typical web applications originates from scripts running on machines, also known as bots. A wide variety of bots – some wanted, some unwanted – are hitting your endpoints.
Wanted bots crawl your sites to index them and make them discoverable by your customers; others monitor your site's availability or performance. But most bot traffic is generated by unwanted bots: scripts probing for vulnerabilities, or copying your content to republish it elsewhere without your consent. Beyond the security risk, serving this traffic puts unnecessary pressure on your infrastructure and drives up its costs.
Protecting your website from this unwanted traffic is time-consuming and error-prone. Managing a set of rules is complex, with the risk of blocking legitimate traffic or allowing traffic that should be blocked.