The company is incorporating new software that better understands subtleties of language, with the biggest changes for queries outside the US.
Google says it has enhanced its search-ranking system with software called BERT, or Bidirectional Encoder Representations from Transformers to its friends.
Pandu Nayak, Google's vice president of search, said at a briefing Thursday that the Muppet-monikered software has made Google's search algorithm much better at handling long queries, or ones where the relationships between words are crucial. You're now less likely to get frustrating responses to queries dependent on prepositions like "for" and "to," or negations such as "not" or "no."
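A toy sketch can show why word relationships matter. This is not Google's actual ranking code; it's a minimal, hypothetical illustration of how matching that ignores word order treats two directionally opposite queries as identical, while even a simple order-aware representation tells them apart:

```python
# Minimal sketch (NOT Google's system): why order-blind keyword
# matching fails on preposition-heavy queries.
from collections import Counter

def bag_of_words(text):
    # Counts words and ignores their order entirely,
    # like naive keyword matching.
    return Counter(text.lower().split())

def ordered_bigrams(text):
    # Keeps adjacent word pairs, so direction like "to usa" survives.
    words = text.lower().split()
    return set(zip(words, words[1:]))

q1 = "brazil traveler to usa"
q2 = "usa traveler to brazil"

print(bag_of_words(q1) == bag_of_words(q2))        # True: same words, same counts
print(ordered_bigrams(q1) == ordered_bigrams(q2))  # False: "to usa" vs "to brazil"
```

BERT goes far beyond bigrams, reading every word in the context of all the others at once, but the failure mode it fixes is the same: representations that discard how words relate to each other.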
"This is the single biggest positive change we've had in the last five years," Nayak said, at least according to Google's measures of how ranking changes help people find what they want.
One illustration of BERT's power offered up by Google is how it helped its search engine interpret a query from a Brazilian traveler asking whether they need a visa for the US. To a human, that's a clear attempt to discover the requirements for Brazilians heading to the US, but pre-BERT Google misunderstood the crucial "to" and returned an article about US citizens traveling to Brazil as the top result.
Google says it receives billions of searches per day and that the BERT upgrade will affect rankings on one out of every 10.
Anyone who has tried to switch search engines knows that the way Google's ranking burrows into your expectations of the internet can be extremely powerful.