Directives

Robots meta directives (occasionally called “meta tags”) are pieces of code within a web page that instruct crawlers on how to crawl or index that page’s content. There are many different directives to choose from, but a handful are far more common than the rest. […]
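As a brief sketch of what such a directive looks like in practice, a robots meta tag sits in the page’s <head>; the noindex, nofollow values below are only an example of what a site might choose, not a recommendation:

<!-- Ask crawlers not to index this page and not to follow its links -->
<meta name="robots" content="noindex, nofollow">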

Canonical URL

A canonical URL indicates the preferred URL you would like search engines to display for a particular keyword or query. Canonicals can be used to resolve a variety of cannibalisation and duplicate content issues. Simply put, canonicals allow you to specify which URL/domain/version of a web page you would like search engines to index. It is considered […]
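As a minimal sketch, using hypothetical example URLs, a canonical is declared with a link element in the <head> of each duplicate or parameterised version of the page:

<!-- On https://example.com/shoes?sort=price and similar variants, point search engines at the preferred URL -->
<link rel="canonical" href="https://example.com/shoes">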

Bot

A bot is an automated piece of software that systematically works its way through the internet to collect information about websites. Bots are also known as spiders or web crawlers, and search engines use them to crawl the pages they find online so that those pages can be added to the search engine’s database, ready for indexing. […]
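Page-level directives can also be addressed to an individual bot by the name it identifies itself with; a brief sketch, using Googlebot purely as a well-known example:

<!-- A directive aimed only at Google’s crawler; other bots ignore it -->
<meta name="googlebot" content="noindex">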
