BERT is described as a “pre-trained deep learning natural language framework that has given state-of-the-art results on a wide variety of natural language processing tasks.”
It is a Google algorithmic update, currently impacting around 10% of queries in search results, and it also affects featured snippet results in the countries where those appear.
Google describes BERT as the largest change to its search system since the company introduced RankBrain.
What Does It Affect?
It’s about semantics: words that are similar “types of things” are considered to have semantic similarity. For example, a car and a lorry have semantic similarity because both are types of vehicle.
Relatedness is different from semantic similarity. A car is similar to a lorry because both are vehicles, but a car is related to concepts like “road” and “driving.” So you might expect to find a car mentioned on a page about roads and driving, or on a nearby page (linked, or in the same category or subcategory) to a page about cars.
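This distinction can be illustrated with word embeddings, the kind of vector representations that models like BERT build on. Below is a minimal sketch using cosine similarity over made-up toy vectors (these are illustrative numbers, not real BERT embeddings):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 3-dimensional embeddings. The dimensions might loosely
# capture "is a vehicle", "road-related", and "action-related".
embeddings = {
    "car":     [0.9, 0.6, 0.1],
    "lorry":   [0.8, 0.5, 0.1],
    "road":    [0.1, 0.9, 0.3],
    "driving": [0.2, 0.8, 0.9],
}

# "car" and "lorry" score high: semantically similar (both vehicles).
print(cosine(embeddings["car"], embeddings["lorry"]))
# "car" and "road" score lower: related, but not the same type of thing.
print(cosine(embeddings["car"], embeddings["road"]))
```

With these toy values, car–lorry similarity comes out much higher than car–road, mirroring the distinction above: similarity groups things of the same type, while relatedness connects concepts that co-occur.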
What does BERT mean for SEO?
Many articles have been written about BERT recently, but essentially it is designed to improve machines’ understanding of human language.
From a search perspective, this could be in written or spoken queries entered by search engine users. BERT is mostly about resolving linguistic ambiguity in natural language.
BERT is not an update like Penguin or Panda, since BERT does not judge pages negatively or positively. Instead, it improves Google Search’s understanding of human language: Google understands more about the meaning of content on pages, and about the queries users enter, by taking each word’s full context into consideration.
So, Can We Optimise For It?
It is highly unlikely that any search engineer, if asked, could explain why something like BERT makes the decisions it does with regard to rankings (or anything else).
BERT itself is thought not to always know why it makes the decisions it does. How, then, are SEOs expected to “optimise” for it?
So What Can We Do?
BERT is designed to understand natural language – so keep it natural.
We should continue to create compelling, engaging, informative and well-structured content, and build our website architectures the same way we would for humans. This is a positive change.
Google has simply got better at understanding the context of sentences and phrases, and it will continue to improve over time.