Google algorithm updates are now increasingly frequent, so when a "big" update comes, everyone tends to stand at attention.
The latest important update to come down the pipeline is fondly named BERT. The announcement came on October 24, 2019, but like previous releases, it had already hit the net a few days before.
Why is Google BERT so important?
In some ways, BERT is a follow-up to some of the updates relating to voice search and an improvement on the 2015 RankBrain update, which used machine learning to determine the most relevant results for search queries by inferring searcher intent. BERT is the latest in a series of attempts to use AI and machine learning algorithms, which always whip up a frenzy of interest.
What is BERT, and do you need a new strategy to accommodate it? Keep reading to learn more about the latest SEO update.
BERT is a cute acronym that stands for "Bidirectional Encoder Representations from Transformers," which is decidedly less catchy.
The essence of BERT is that it's a machine learning tool that uses neural network-based techniques for natural language processing (NLP). That's a sentence written for neuroscientists and quantum computing engineers. If you want a breakdown for marketers, check out this resource for a layman's explanation.
In short, these tools try to interpret the intent of human speech like a human would.
Why is this so important?
Usually, Google's algorithms look at search queries on a word-by-word basis. In recent years, long-tail keywords have become more prominent. But these don't always represent what the searcher is looking for exactly. So, Google wants to use BERT to understand the motivation behind the search query to provide better answers.
Is Google trying to read your mind? No, that's ESP - not NLP. Instead, Google wants to help get around human errors or small qualifiers that aren't keywords to better provide results.
For a quick understanding of what this means, let's look at one of Google's own examples and one of the English language's biggest nemeses: prepositions.
In case you need a refresher, a preposition is a part of speech that expresses a relationship between a noun or pronoun and something else in the clause (often another noun or pronoun).
For example, in Harry Potter and the Sorcerer's Stone, Hermione says to Ron and Harry, "I hope you're pleased with yourselves. We all could have been killed - or worse, expelled."
The preposition "with" connects the boys to the object of their supposed satisfaction: themselves. More importantly, "with" delivers Hermione's devastating sarcasm. That connection is exactly what matters to BERT, and we can all admit the sentence would mean something very different as just "I hope you're pleased."
Of course, this example also involves tone, which is far beyond the kind of search intent currently possible with machine learning or AI, but you get the point (and you got a funny Hermione quote).
Prepositions are the kinds of things that Google BERT wants to learn about.
Why? Because "on the moon" and "over the moon" are very different things. If you search for "on the moon," you might be looking for the latest NASA news. If you were searching "over the moon," you might be an English nerd.
Google also gave an example of how it plays out in search queries.
Let's say you enter the term "2019 Brazil traveler to USA need visa."
On the U.S. Google page, it would previously bring up a page telling you whether American citizens need a visa to go to Brazil.
But that's not what the query said. The word "to" is a huge giveaway that the algorithm ignored before BERT. After BERT, the top result is the U.S. Embassy in Brazil, which shows Brazilian nationals who want to go "to USA" how to apply for the required visa.
This isn't a case of "potato, potato"; this is a hash brown: the first link only matched the keywords in the search term, while the second understood those keywords in context by reading the preposition.
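To make the contrast concrete, here's a toy sketch in plain Python of why pure keyword overlap can't tell these two pages apart. This is emphatically not Google's algorithm; the stop-word list, the page wordings, and the scoring function are all illustrative assumptions.

```python
# Toy illustration: keyword-overlap scoring drops "to" as a stop word,
# so a page about the WRONG travel direction scores just as well as
# the right one. This is a sketch of the idea, not Google's algorithm.

def keyword_overlap(query_words, page_words, stop_words):
    """Score a page by counting shared non-stop-words -- prepositions vanish."""
    q = {w for w in query_words if w not in stop_words}
    return len(q & set(page_words))

STOP_WORDS = {"to", "a", "the", "for"}  # pre-BERT style: prepositions dropped

query = "2019 brazil traveler to usa need visa".split()

# Page A: U.S. citizens traveling TO Brazil (wrong direction for this searcher)
page_a = "visa requirements for usa citizens traveling to brazil 2019".split()
# Page B: Brazilian citizens traveling TO the USA (what the searcher meant)
page_b = "brazil citizens apply visa travel to usa embassy 2019".split()

# Both pages share the same "big" keywords, so keyword overlap
# scores them identically -- the direction carried by "to" is lost.
print(keyword_overlap(query, page_a, STOP_WORDS))
print(keyword_overlap(query, page_b, STOP_WORDS))
```

A context-aware model like BERT keeps "to" and reads the words around it in both directions, which is how it can tell "traveler to USA" from "traveler to Brazil."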
Another good example provided by Google is the "parking on a hill with no curb" term.
In the past, Google focused on the big words "parking" and "hill" but ignored "curb" when it chose the snippets. With the new update, it acknowledges the whole string and provides a much more relevant snippet.
The update impacts 10 percent of all search queries, which is a significant share compared to past updates. But what does it mean for your SEO strategy?
Let's start with who BERT will and won't help.
BERT won't do any work for a site that's got little direction or has a nonsensical structure. It also won't help poorly written sites. It's machine learning - not a miracle worker.
BERT is also like RankBrain in that it's not something to organize an entire SEO strategy around. Google specifically says optimizing for RankBrain is rather fruitless.
However, if you lean into the concept of search intent and apply it to your on-page SEO and your content marketing, you can make the Google BERT update work for your site, and maybe even climb past competitors who still think of BERT as Bert and Ernie. The update will also eventually help eCommerce product pages, but that seems to be further down the pipeline.
What does this mean in real terms? It means optimizing for your users and the use of natural language.
Now that Google is trying to speak like a human (in all its complexities), you have less room to play around with keywords.
Instead, you need to refocus both your keyword strategy and your content (both on-page and in your blogs).
Neil Patel provides a stellar example of what that means.
According to his example, if you run a weight loss clinic and you want to target "how to lose weight without taking pills," you need to steer clear of things that look like pills, like supplements or shakes.
But supplements and shakes aren't the same as weight loss pills! We know, but BERT might still count it against you, because a searcher who wants to go "without pills" probably doesn't want fat-burning shakes either. So your best bet is to steer clear of those topics if you're going to keep that focus keyword.
As you help BERT along, you should be focusing on several principles:
You should be prioritizing these over past worries like:
You can still use these technical approaches, but it's important to remember that BERT is query-focused.
It wants to join together the naturalness of the search query with the human element of your content. That means you can play by all the keyword rules, but if you don't answer the question in a fresh, humanistic way, you won't earn any points with BERT.
You know what straightforward, useful content and specific keywords are, but until now, you might not have considered search intent.
Search intent became prominent with the Hummingbird and RankBrain updates, which attempted to understand what the user was looking for and provided things like the Knowledge Panel.
Broadly, searchers generally have one of four intents in a query: informational, navigational, transactional, or commercial investigation.
To meet these searchers, you use the interrogatives (the questions) in places like your titles, headers, and descriptions.
These interrogatives can also help you create topic clusters that better cater to searchers.
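One lightweight way to see this in practice is to bucket candidate search queries by their opening question word to seed topic clusters. This is a sketch: the interrogative list and the sample queries below are made up for illustration, not pulled from any keyword tool.

```python
from collections import defaultdict

# Sketch: group candidate queries by their leading interrogative to seed
# topic clusters. Queries that don't open with a question word fall into
# an "other" bucket (often navigational or transactional searches).
INTERROGATIVES = ("how", "what", "why", "where", "when", "which", "who")

def cluster_by_interrogative(queries):
    clusters = defaultdict(list)
    for q in queries:
        first = q.lower().split()[0]
        key = first if first in INTERROGATIVES else "other"
        clusters[key].append(q)
    return dict(clusters)

queries = [
    "how to lose weight without taking pills",
    "what is a calorie deficit",
    "how long does it take to lose 10 pounds",
    "best weight loss clinic near me",
]

for interrogative, group in cluster_by_interrogative(queries).items():
    print(interrogative, "->", group)
```

Each "how" or "what" bucket then suggests a cluster page whose title, headers, and description answer that question directly.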
What are the benefits?
They filter the right pages to the correct search queries. Someone looking for a free trial doesn't want to end up on your premium product page, so you will see lower bounce rates.
You should also score:
Optimizing for intent is probably the only way to optimize for BERT (even though Google says you can't optimize for it directly). It will also land you more leads in the process.
Google BERT is here. While it's not exactly game-changing for your SEO strategy, it should improve the search experience for everyone. Plus, it could end up sending poorly-written and unfocused websites down the rankings, which is an opportunity for you.
At Siren Digital, we stay on top of all the latest Google algorithm updates, so you don't have to. Do you have a question about SEO, Google ads, or local optimization? Get in touch, and we'll answer your question in 24 hours.