BERT is described as a “pre-trained deep learning natural language framework that has given state-of-the-art results on a wide variety of natural language processing tasks.”
It is a Google algorithmic update, currently impacting 10% of queries in search results, and it affects featured snippet results in the countries where featured snippets appear.
Google describes BERT as the largest change to its search system since the company introduced RankBrain.
What Does It Affect?
It’s about semantics. Words that are similar “types of things” are considered to have semantic similarity. For example, a car and a lorry are semantically similar because both are types of vehicle.
Relatedness is different from semantic similarity. A car is similar to a lorry because both are vehicles, but a car is related to concepts such as “road” and “driving.” So you might expect to find a car mentioned on a page about roads and driving, or on a page sitting nearby (linked, or in the same category or subcategory).
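The distinction can be sketched with word vectors. The numbers below are invented purely for illustration (real systems learn embeddings from text and use hundreds of dimensions), but the idea holds: words denoting similar types of things land close together in vector space, as measured by cosine similarity.

```python
import math

# Toy 3-dimensional word vectors, invented purely for illustration.
# Real embeddings are learned from data, not hand-written.
vectors = {
    "car":   [0.9, 0.8, 0.1],
    "lorry": [0.8, 0.9, 0.2],
    "road":  [0.3, 0.2, 0.9],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# "car" and "lorry" (both vehicles) score higher than "car" and "road"
# (related concepts, but not the same type of thing).
print(cosine_similarity(vectors["car"], vectors["lorry"]))
print(cosine_similarity(vectors["car"], vectors["road"]))
```

With these toy values, car/lorry scores around 0.99 while car/road scores around 0.44: similar things cluster, related things merely sit nearby.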
What does BERT mean for SEO?
There have been many articles written about BERT recently, but essentially it is designed to improve human language understanding for machines.
From a search perspective, this could be in written or spoken queries entered by search engine users. BERT is mostly about resolving linguistic ambiguity in natural language.
BERT is not an algorithmic update like Penguin, since BERT does not judge pages negatively or positively. Instead, it improves Google Search's understanding of human language: Google understands more about the meaning of content on pages, and about the queries users enter, by taking a word's full context into consideration.
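One way to picture "taking a word's full context into consideration" is word-sense disambiguation. The sketch below is deliberately crude, counting overlapping context words in the spirit of the classic Lesk algorithm; BERT itself does nothing this simple, but the example shows how surrounding words can resolve an ambiguous term like "bank". The sense "signature" word lists are hypothetical.

```python
# Crude word-sense disambiguation in the spirit of the Lesk algorithm.
# BERT is far more sophisticated, but the principle is the same: the
# words around an ambiguous term resolve its meaning.

SENSES = {
    # Hypothetical "signature" words for two senses of "bank".
    "bank (finance)": {"money", "account", "loan", "deposit", "cash"},
    "bank (river)":   {"river", "water", "shore", "fishing", "mud"},
}

def disambiguate(context_words):
    """Pick the sense whose signature overlaps most with the context."""
    scores = {
        sense: len(signature & set(context_words))
        for sense, signature in SENSES.items()
    }
    return max(scores, key=scores.get)

print(disambiguate("i opened a bank account to deposit money".split()))
print(disambiguate("we sat on the bank of the river fishing".split()))
```

The same surface word resolves to different meanings depending on its neighbours, which is exactly the kind of ambiguity BERT was built to handle at scale.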
So, Can We Optimise For It?
It is highly unlikely that any search engineer, if asked, could explain why something like BERT makes the decisions it does with regard to rankings (or anything else). BERT is thought not always to know why it makes its own decisions. How, then, are SEOs expected to "optimise" for it?
So What Can We Do?
BERT is designed to understand natural language – so keep it natural.
We should continue to create compelling, engaging, informative and well-structured content, and build our website architectures the same way we would for humans. This is a positive change.
Google simply got better at understanding the context of sentences and phrases, and it will continue to improve over time.
BERT and Full Coverage
In September 2020, Google announced the changes it had made to Search over the previous year, including changes to the way BERT is used.
Google is leveraging BERT, one of its language AI models, to better understand whether stories shown in the Google News Full Coverage area are reliable in terms of the facts being used and posted online.
In short, this means Google can now use the connections between articles and its fact-check database to match facts more effectively with stories, and can tell whether a fact check relates to a story's main topic.
Google will surface fact-checking articles more prominently in Google News Full Coverage. According to Pandu Nayak, Google Fellow and Vice President of Google Search: “Full Coverage allows you to see top headlines from a range of different sources, videos, local news reports, FAQs, social commentary and a timeline of stories that have played out over time.”