Google Could Transform Its Search with Machine Learning Technology


In a bid to improve the accuracy of results returned by its search engine, Google is now employing machine learning algorithms to determine a user’s intent.

Running on neural networks called transformers, the search and advertising giant's technique for natural-language processing examines the words in a query in context rather than parsing them one by one.

That way, the search engine is better able to divine a questioner’s purpose when generating links to relevant web pages.

Google named its innovation Bert – short for Bidirectional Encoder Representations from Transformers – and says it can improve accuracy by 10% over the algorithms currently in use when responding to queries sent in the form of questions.

Given that Google's servers receive up to 5.6 billion queries each day, the reduction in processing volume and allied energy costs promises to be significant by the time Bert is rolled out for all the languages used on the service.

While the company has typically kept mum on such statistics, a blog post announcing the new algorithm described the advance as one of the most significant developments in search-engine technology. That's thanks both to its handling of inputs and to the architecture on which it operates.

Sequential deficiencies

The search engine has relied on recurrent neural networks, or RNNs, whose algorithms look for broad-brush subjects as they move sequentially through an input string. Encoding inputs and decoding outputs, these algorithms identify qualifiers that indicate how best to narrow the suggested responses.

A technology called Long Short-Term Memory helps the recurrent neural networks retain information as they proceed sequentially through the words in a search field.
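That sequential constraint can be illustrated with a toy recurrent step (a minimal sketch for illustration, not Google's implementation): the hidden state is updated one token at a time, so word order matters, and the influence of early tokens decays as the sequence grows, which is exactly the problem LSTM gating was designed to mitigate.

```python
def step(hidden, token_val, w_h=0.5, w_x=0.5):
    # One recurrent step: the new state mixes the previous state
    # with the current input.
    return w_h * hidden + w_x * token_val

def encode(sequence):
    hidden = 0.0
    for x in sequence:  # strictly one token at a time
        hidden = step(hidden, x)
    return hidden

# Order matters: the same tokens in a different order yield a
# different final state, and with w_h < 1 the contribution of
# early tokens shrinks as the sequence grows.
print(encode([1.0, 2.0, 3.0]))  # 2.125
print(encode([3.0, 2.0, 1.0]))  # 1.375
```

Because each step depends on the previous one, nothing here can be computed in parallel across the sequence, a point that matters for the transformer comparison below.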

The decisions become more complicated when queries are posed in forms that separate subjects and qualifiers, like written sentences. This explains the preference for keywords that help algorithms provide links to web pages.

Complicating matters further, their black-box construction makes it difficult to root out errors arising from biases that can occur in RNN training, such as the passing over of prepositions that indicate intent and of pronouns that replace subjects in longer sentences. Even with Long Short-Term Memory technology, the algorithms often are unable to detect those relationships.

Attention to detail

Bert's transformers address those problems with attention mechanisms. These encode the words in a search field with numeric scores, and the transformers then assign each word a weighted average used to pinpoint its meaning.

Meanings are refined by comparing the average to the initial score in a parallel process that considers words in context rather than in sequential order. This allows Bert to move in both directions along the search field to work out user intent.
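That scoring-and-averaging step can be sketched in a few lines of pure Python (a simplified illustration of self-attention; the learned query/key/value projections of a real transformer are omitted for brevity):

```python
import math

def softmax(scores):
    # Turn raw scores into positive weights that sum to 1.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(embeddings):
    """For each position, score it against every position in the
    field, in both directions at once, then replace its vector with
    a softmax-weighted average of all the vectors."""
    refined = []
    for query in embeddings:
        # Numeric score for every word, not just the preceding ones.
        scores = [sum(q * k for q, k in zip(query, key)) for key in embeddings]
        weights = softmax(scores)
        refined.append([
            sum(w * vec[d] for w, vec in zip(weights, embeddings))
            for d in range(len(query))
        ])
    return refined
```

Note that every position sees every other position in a single pass, so the outer loop can run in parallel, unlike the RNN's token-by-token scan.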

Instead of skipping over prepositions and pronouns, Bert's natural-language processing incorporates them into the decoded links on its results pages. This means that users can pose questions in sentence form and expect similar rates of search success to those achieved with keywords.

Structural Support

To meet the demand for parallel processing, Google is using its tensor processing unit (TPU) chips to form Bert's neural network. Each chip has a pair of cores configured for machine learning, cutting the time needed to train networks to handle the linear algebra that underpins those models from days to hours.

Google uses the textual corpus of Wikipedia to train Bert's models for searching conversational queries that its other algorithms have difficulty processing. However, the company says anyone can train a question-answering system using the technology that it opened for community development last year.
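Those models read queries as subword units rather than whole words, following the WordPiece scheme, which works by greedy longest-match lookup against a fixed vocabulary. The sketch below illustrates the idea with a tiny hand-picked vocabulary (Bert's real one has roughly 30,000 entries):

```python
def wordpiece_tokenize(word, vocab):
    # Greedy longest-match: peel off the longest vocabulary entry
    # from the front of the word, marking continuations with "##".
    tokens, start = [], 0
    while start < len(word):
        end, match = len(word), None
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece
            if piece in vocab:
                match = piece
                break
            end -= 1
        if match is None:
            return ["[UNK]"]  # word cannot be built from the vocabulary
        tokens.append(match)
        start = end
    return tokens

vocab = {"search", "play", "##ing", "##ed", "##er"}
print(wordpiece_tokenize("playing", vocab))   # ['play', '##ing']
print(wordpiece_tokenize("searcher", vocab))  # ['search', '##er']
```

Splitting rare words into common pieces keeps the vocabulary small while still letting the model represent words it never saw whole during training.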

Initially implemented for use with American English, Bert will augment search in other languages and locales over time, Google says.

With natural-language processing fast becoming a part of the human-machine interface, experts foresee the day when Bert might rival Apple's Siri and Amazon's Alexa in the market for personal digital assistants.