
RankBrain Helps Google Understand Queries

Google's search engine hasn't always been very smart. In its early years, Google simply found the pages that matched the words in your query and ranked them. It's hard to answer a question without understanding it, but that's what Google did.

Google constantly improved its algorithms, added personalization options, and learned to match synonyms and expand abbreviations, but the Knowledge Graph and Hummingbird were the greatest leaps: they put machine learning to work and made Google smarter. Google started to understand the meaning behind a question, to disambiguate words, and to find answers, not just pages that include the words from the query.

Bloomberg reports that Google uses even more artificial intelligence to answer questions and rank results. RankBrain is a new AI system that has been used for the past few months to improve search results. "If RankBrain sees a word or phrase it isn't familiar with, the machine can make a guess as to what words or phrases might have a similar meaning and filter the result accordingly, making it more effective at handling never-before-seen search queries."

About 15% of the queries Google gets every day are new, and RankBrain helps Google understand them. Here's an example of a complicated query: "What's the title of the consumer at the highest level of a food chain?" RankBrain finds words and phrases that have a similar meaning and highlights them (for example: predators). "In the few months it has been deployed, RankBrain has become the third-most important signal contributing to the result of a search query."
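The idea behind handling a never-before-seen query can be sketched with word vectors: represent known terms as vectors, then find the known terms closest to the query's vector. This is only an illustrative toy, not Google's actual system; the tiny hand-made vectors and the `closest_terms` helper below are hypothetical.

```python
import math

# Toy "embeddings": made-up 3-dimensional vectors for a few known terms.
# Real systems learn vectors with hundreds of dimensions from huge corpora.
EMBEDDINGS = {
    "predator":      [0.90, 0.80, 0.10],
    "apex predator": [0.95, 0.85, 0.05],
    "food chain":    [0.60, 0.70, 0.40],
    "herbivore":     [0.20, 0.90, 0.70],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction (similar meaning here)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def closest_terms(query_vector, k=2):
    """Rank known terms by similarity to an unfamiliar query's vector."""
    scored = sorted(EMBEDDINGS.items(),
                    key=lambda item: cosine(query_vector, item[1]),
                    reverse=True)
    return [term for term, _ in scored[:k]]

# A made-up vector standing in for the unseen phrase
# "consumer at the highest level of a food chain".
unseen_query = [0.92, 0.82, 0.08]
print(closest_terms(unseen_query))  # → ['predator', 'apex predator']
```

Because the unseen phrase's vector points in nearly the same direction as "predator", the system can fall back on results for that familiar term, which matches the behavior Bloomberg describes.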


Google's CEO, Sundar Pichai, says that "machine learning is a core transformative way by which we are rethinking everything we are doing". Machine learning has already helped Google improve image search, automatic translation, and speech recognition, and deep learning is already showing promising results, such as smarter photo search with object recognition.

"In tandem with other researchers at Google, Andrew Ng is building one of the most ambitious artificial-intelligence systems to date, the so-called Google Brain. This movement seeks to meld computer science with neuroscience — something that never quite happened in the world of artificial intelligence," reports Wired. "Deep Learning is a first step in this new direction. Basically, it involves building neural networks — networks that mimic the behavior of the human brain. Much like the brain, these multi-layered computer networks can gather information and react to it. They can build up an understanding of what objects look or sound like."
