Check out Google’s BERT if you want to keep your website Ernie-ing money

GOOGLE HAS announced the biggest change to its search algorithm in five years. The update is expected to deliver better results for end-users, but it could see up to ten per cent of search results shift in their ranking.

The update is based on BERT (Bidirectional Encoder Representations from Transformers), the neural-network technique Google developed to parse natural language.


By applying the same technology to search strings, Google can better infer the context of a query, rather than just matching keywords.

That means that in a nonsense search like "Inquirer Google News stories", it will (hopefully) work out that the word "news" isn't just a redundant echo of "The INQUIRER" part, but that we actually want to find stories about Google News. That's the theory, anyway.

At launch, the changes apply to searches in US English, but the rest of the world will catch up soon enough.


As ever, the trick has been in how to train the AI. The secret is to give BERT millions of text snippets with a chunk hidden, and let it learn to predict the missing words from the surrounding context, over and over again until it gets it right.
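To make the idea concrete, here's a toy Python sketch of that masking step: hide a random fraction of words and record the originals the model would have to recover. This is an illustration of the principle only, not Google's actual training code; the token list, mask rate, and `[MASK]` placeholder are just the conventions BERT is publicly described as using.

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=1):
    """Hide a random fraction of tokens, BERT-style.

    The training objective is to predict each hidden original from the
    visible words on BOTH sides of it - hence 'bidirectional'.
    """
    rng = random.Random(seed)  # seeded so the demo is repeatable
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append(mask_token)
            targets[i] = tok  # what the model must learn to recover
        else:
            masked.append(tok)
    return masked, targets

sentence = "inquirer google news stories about the search algorithm".split()
masked, targets = mask_tokens(sentence)
print(masked)   # the corrupted input the model sees
print(targets)  # the hidden words it is scored on predicting
```

Repeat that over an enormous corpus and the model is forced to learn how words relate to their neighbours, which is exactly the contextual understanding Google now wants to apply to your search strings.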

Google has batted away concerns that the old adage of "garbage in, garbage out" could leave us with some Tay-alike ruining our Google searches. It says it has worked tirelessly to ensure that no biases creep in: not all searches will be affected by the change, and there are human beings keeping an eye out for any erroneous responses.

While this is great news for users, there'll now be a complex ballet among website publishers like ours across the world, as they attempt to re-optimise their SEO for the new algorithm.


The problem is that Google keeps the recipe for its search rankings a closely guarded secret, so it's only down to the fact that we're (well, Carly is) utterly skillz at SEO that we're not worrying more. µ