
A bot is an automated piece of software that systematically works its way through the internet to collect information about websites. Bots are also known as spiders or web crawlers, and search engines use them to discover pages online, which are then added to the search engine's database for indexing.

Most bots are good bots that complete repetitive tasks to gather information. However, bad bots can substantially harm a website by inflating traffic figures, causing server downtime (e.g. through DDoS attacks), and scraping content.

Search engine bots follow internal links to navigate your site architecture, determining which pages to crawl next and how they relate to one another.
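To illustrate how a crawler discovers internal links, here is a minimal sketch using only the Python standard library. The domain, page markup, and class name are hypothetical; real search engine bots are far more sophisticated (handling robots.txt, canonical tags, JavaScript rendering, and so on).

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class InternalLinkParser(HTMLParser):
    """Collects links that stay on the same domain -- the raw material
    a crawler uses to decide which pages to visit next."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.base_domain = urlparse(base_url).netloc
        self.internal_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        # Resolve relative URLs against the page's own address
        absolute = urljoin(self.base_url, href)
        # Keep only links on the same domain (internal links)
        if urlparse(absolute).netloc == self.base_domain:
            self.internal_links.append(absolute)

# Hypothetical page with one internal and one external link
html = '<a href="/about">About</a> <a href="https://other.com/">Other</a>'
parser = InternalLinkParser("https://example.com/")
parser.feed(html)
print(parser.internal_links)  # -> ['https://example.com/about']
```

A full crawler would add each discovered internal link to a queue, fetch it, and repeat, which is how following internal links lets a bot map out an entire site.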

