As machine learning and neural networks make results more useful and personalised, we take a look at how smart Search has become – and how brands should prepare for the future.
From day to day, the shifts in Search can be tricky to spot. Maybe you remember the first day you saw autocomplete working, or the time you realised you could type a query directly into Chrome’s address bar – or maybe not – but not all progress in Search is that glaring. Most changes are subtle: results arrive a fraction faster, become a touch more personalised, a tiny bit more relevant to your interests. And then, somehow, two decades after you first told Google you were Feeling Lucky, you’re working with an engine that’s vastly more powerful than those early efforts, even if it looks much the same on the surface.
“It’s easiest to think of progress in Search like progress in Formula One racing,” says Kevin Mathers, Google UK sales director. “The tracks have been basically unchanged for decades, and the cars look pretty much the same, but everything under the hood has improved an insane amount.” The analogy doesn’t end there: behind the scenes, engineers constantly make thousands of tiny adjustments to the system, tweaking and testing to compound success.
“Using data to refine results is one reason Search results are getting better even as the volume of data around the world continues to explode”
– Andrew McAfee, co-director of the MIT Initiative on the Digital Economy
Machine learning (ML) – the subset of artificial intelligence (AI) currently helping to optimise Search, alongside everything from world-beating chess computers to face and speech recognition – is key to many of the most recent leaps in the field. For everyday users, it makes results more personal and useful, drawing on everything from previous Searches to location to fine-tune what appears. For brands and businesses, it opens up new ways of connecting with customers, by spotting behaviour patterns that would be all but impossible to tease out of the data manually.
“Humans are good at finding patterns in three dimensions plus time – it’s why we’re so good at catching balls, for instance. But if I asked you to find patterns in hundreds of different dimensions – parameters like age, locations, likes, and so on – you’d struggle,” explains Daniel Hulme, CEO of London-based AI solutions company Satalia. “Machine learning is good at finding patterns in data. You give your system lots of examples to train on, and eventually it gets really smart.” Until recently, this was tough to do because of the sheer volume of data involved, but it’s getting easier and more efficient. “One of the key things is we got better at applying it and at iterating through the development process,” explains Google Machine Learning Specialist Erwin Huizenga. “A lot of the value is not in adding features that weren’t there before, but in improving accuracy and performance – subtle, small things that normal users often don’t notice.”
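To make Hulme’s point concrete, here’s a minimal sketch of learning a pattern across hundreds of dimensions, using scikit-learn on synthetic data. Everything here – the features, the labels, the choice of model – is illustrative, not anything Google or Satalia actually uses:

```python
# A minimal sketch of the pattern-finding Hulme describes: train a
# model on thousands of labelled examples with far more feature
# "dimensions" than a human could eyeball. Synthetic data throughout.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_examples, n_dimensions = 5_000, 300   # hundreds of dimensions
X = rng.normal(size=(n_examples, n_dimensions))
# The hidden "pattern": the label depends on just a few of the features.
y = (X[:, 0] + 0.5 * X[:, 17] - X[:, 42] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)   # "give your system lots of examples to train on"
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```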
For example, consider how Google has become better at responding to even the vaguest of queries. Once upon a time, a question like “Why does my TV look strange?” would have been enough to confuse most search engines. Now, thanks to a technique known as neural matching, fuzzy queries like this can be parsed into terms that Search understands, without falling back on exact keywords. (In this case, for the record, the answer is the so-called Soap Opera Effect, linked to a form of motion smoothing that’s on by default in many modern TVs.) Search is also constantly getting better at understanding synonyms (think about all the different ways you’d use the word “change”) in the way that we do naturally – and as it gets smarter at these human-level tasks, it can effortlessly handle the stuff we’re terrible at. RankBrain, the AI system introduced as part of Google Search in 2015, works by monitoring the semantics of user queries – and users’ behaviour when they’re presented with results.
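Google’s neural matching itself is proprietary, but the general idea – scoring queries against results by meaning, as vectors in a shared space, rather than by shared keywords – can be sketched. In this toy version the embeddings are hand-made stand-ins for what a real system would learn:

```python
# Toy illustration of matching by meaning rather than by keywords:
# represent a query and candidate results as vectors, then rank by
# cosine similarity. Real systems learn their embeddings; these are
# hand-made stand-ins.
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical dimensions: [tv, motion, picture-quality, cooking]
query = np.array([0.9, 0.7, 0.8, 0.0])   # "Why does my TV look strange?"
candidates = {
    "Turn off motion smoothing (the Soap Opera Effect)": np.array([0.8, 0.9, 0.9, 0.0]),
    "Classic TV dinner recipes":                         np.array([0.6, 0.0, 0.0, 0.9]),
}

for title, vec in sorted(candidates.items(),
                         key=lambda kv: cosine(query, kv[1]), reverse=True):
    print(f"{cosine(query, vec):.2f}  {title}")
```

Note that the best match shares no keywords with the query – the overlap is in the vectors, which is the point.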
“Companies that do Search best ultimately win in every vertical”
– Kevin Mathers, Google UK sales director
“Search is a case where you can look at the kind of thing people said they were looking for and then look at what they actually clicked on,” explains Andrew McAfee, co-director of the MIT Initiative on the Digital Economy. “Using that data to refine future results is one reason that Search results are getting better even as the volume of data around the world continues to explode.” Improvements in natural language processing, for instance, will help our Google Assistants – the new voice of Search on many devices – have conversations with us, learning from experience to properly parse a sentence like “I fancy a weekend in the country” and cross-referencing that with our diaries and preferences to offer up a list of suggested destinations.
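A toy version of the feedback loop McAfee describes: blend each result’s prior relevance score with its observed click-through rate, so that what people actually click on reshapes future rankings. The scores, counts and weighting below are invented for illustration:

```python
# Sketch: refine rankings with click data. All numbers are invented.
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    base_score: float    # relevance estimate before any click data
    impressions: int
    clicks: int

    def ctr(self, prior_clicks=1, prior_impressions=20):
        # Smoothed click-through rate, so sparse data can't dominate.
        return (self.clicks + prior_clicks) / (self.impressions + prior_impressions)

def rerank(results, click_weight=0.5):
    return sorted(results,
                  key=lambda r: (1 - click_weight) * r.base_score
                              + click_weight * r.ctr(),
                  reverse=True)

results = [
    Result("example.com/a", base_score=0.90, impressions=1000, clicks=30),
    Result("example.com/b", base_score=0.80, impressions=1000, clicks=250),
]
for r in rerank(results):
    print(r.url)   # b overtakes a on the strength of its clicks
```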
But there’s more to ML than simply making a computer that can hold a conversation or understand simple queries. In fields where there are too many signals for mere humans to comprehend – which can include everything from predicting natural disasters to amplifying the effect of advertising – improving technology can cut through the complexity, refining the way it works to achieve superhuman levels of pattern recognition. Or, to put it another way: “We don’t have to tell computers the rules and strategies any more,” explains McAfee. “That’s the old approach. Now we can just give them a bunch of properly labelled examples, and say ‘OK, you figure out the patterns for yourself here, we’re not going to point them out.’”
In ad auctions, for instance, the world of average bids has all but disappeared. “Machine learning allows us to use data from Google, from the user, and from the advertiser to make real-time decisions on what the right bid is for that user at that time,” explains Gemma Howley, Google automation lead for the UK and Ireland. “People are searching at different times of day from different devices and locations, all with different relationships to the brand – and we can make real-time informed decisions based on that context, for every auction.”
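Google doesn’t publish the internals of its bidding systems, but the shape of the idea can be sketched: start from a base bid and adjust it, per auction, according to contextual signals like device, time of day and relationship to the brand. In this hypothetical sketch the signals and multipliers are invented, and a real system would learn them from conversion data rather than read them from a table:

```python
# Hypothetical context-aware bidding: adjust a base bid, per auction,
# from signals about the user's context. All values are invented.
BASE_BID = 1.00   # base bid in pounds

MULTIPLIERS = {
    "device":       {"mobile": 1.2, "desktop": 1.0, "tablet": 0.9},
    "time_of_day":  {"commute": 1.3, "daytime": 1.0, "late_night": 0.7},
    "relationship": {"past_customer": 1.5, "new_visitor": 1.0},
}

def bid_for(context: dict) -> float:
    bid = BASE_BID
    for signal, value in context.items():
        bid *= MULTIPLIERS.get(signal, {}).get(value, 1.0)
    return round(bid, 2)

print(bid_for({"device": "mobile",
               "time_of_day": "commute",
               "relationship": "past_customer"}))   # 2.34
```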
ML can also automate the creation of the ads themselves, replacing traditional split-testing with Responsive Search Ads, currently in beta, which are assembled to match each consumer’s likely interests. “Rather than write full ads, we ask advertisers to give us a bunch of assets – five different titles, 10 different descriptions – that allow us to look at the data and pick the right bits,” says Howley. “So you’re giving us different ways to put across your USPs or your prices, and you’re showing the most relevant possible ad in the best possible position.” You’ll also be able to feed all of this back into future campaigns, via improved Data-Driven Attribution. “Before, DDA was something that was nice to know, but which you couldn’t do much with,” says Howley. “Now, it all goes back into the bidding algorithms, so as more data comes in, the better it gets.” And though Google’s tools can improve clicks without any intervention, it’s still possible to get feedback on what’s working and act accordingly. The new Optimisation score uses machine learning to distil a huge amount of data, including trends in the ads ecosystem, into recommended actions – with the predicted uplift given in percentage points.
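Conceptually, what Howley describes is a search over combinations of supplied assets, scored by a learned relevance model. In this stripped-down sketch a simple word-overlap function stands in for that model, and the assets and interests are invented:

```python
# Sketch of the responsive-ads idea: enumerate title/description
# combinations and pick the pairing a relevance model scores highest
# for this user. The scoring function here is a crude stand-in.
from itertools import product

titles = ["Free next-day delivery", "Winter sale 30% off", "Rated 5 stars"]
descriptions = ["Shop the full range online.", "Prices drop this week only."]

def predicted_relevance(title, description, user_interests):
    words = set((title + " " + description).lower().split())
    return len(words & user_interests)   # stand-in for a learned model

user_interests = {"sale", "delivery", "prices"}

best = max(product(titles, descriptions),
           key=lambda combo: predicted_relevance(*combo, user_interests))
print(best)   # the title/description pair scored highest for this user
```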
The benefits are undeniable. The speed at which ML can move and the volume of data it can process mean that it’s increasingly the most efficient way of doing things. “Companies that do Search best ultimately win in every vertical,” says Mathers.
How is this going to change the role Search plays in growing or transforming businesses? It’s likely to mean that companies can spend less time obsessing over small details and more time concentrating on the bigger picture. As we get more used to algorithms making everyday decisions, we’re likely to trust them more, and give them more responsibility – if you’ll trust your life to a self-driving car, for instance, trusting your ad spend to an algorithm isn’t too much of a stretch. But, assuming advertisers don’t need to take a hand in these recommendations, where do humans fit in? By building the best company possible, and letting it shine. “I certainly don’t think there’s any way of gaming Search any more,” says Hulme. “You have to be creating products that work. You should have a purpose, and put all your efforts into creating the best company that you can.” As Search continues to improve, it’ll only get better at directing consumers to the sorts of businesses that will serve their needs best – so focusing on becoming one of those businesses is the only smart move.
Why Visual Search will change the way we look at the world
Search is great if you can articulate what you’re looking for – but what if you can’t? Visual search, as enabled by Google Lens, offers something completely new: the ability to take inspiration (and information) from our surroundings. “We’ve all had the feeling of ‘I know what I want, but I won’t be able to tell you until I see it’,” says digital strategist Clark Boyd. “With Google Lens, I can look at an armchair and have Google tell me not only that it’s a 16th-century Scandinavian make and find places that I can buy one, but even find other pieces of decor that might complement it. The value of that is absolutely huge.”
Google Lens is ever-improving: it can already assist you on a self-guided nature walk by recognising the flowers that you pass, or showing you reviews of a cafe you stroll past. It’ll help you find pairs of shoes similar to the ones your friends are wearing, or identify a street artist from one of their works. While fashion and beauty – where visuals are all-important – are driving the technology, it’s only going to become more important in food, travel and beyond. And while consumers are ever-ready to snap photos of dogs and desserts, emerging tech trends could prompt another sea change in behaviour. “When augmented reality starts to actually move into the mainstream, the UX becomes much easier,” says Marc Ferrentino, chief strategy officer at digital knowledge management platform Yext. “Brands should be structuring their information now, so that they’re ready for the move.” When consumers can search anything they can see, it’ll be worth standing out.
How voice recognition makes saying “Yes” easier
Humans have been writing, it’s generally agreed, for just over 5,000 years – and now, in what barely counts as an evolutionary eyeblink, we’ve become able to Search via our oldest means of communication: voice. What does that mean for brands? It all starts with how voice changes behaviour. At the most basic level, for instance, people might interact with Search differently when they’re speaking rather than typing. “We’re more natural in the way we structure our requests,” says Sina Kahen, co-founder of voice strategy start-up Vaice. “SEO-wise, you’re going to benefit from an online presence that’s complementary to how people put questions to their Google Assistants.”
Going beyond simple searching, Google Assistant also offers Actions – specific brand-linked tasks that open up new opportunities to engage with consumers, when they’re done right. “The Argos Action, for instance, works well when you know exactly what you need,” says John Campbell, founder of voice experience agency Rabbit & Pork. “Other Actions work well for repeatable tasks – re-ordering food, for instance, or checking in for flights online.”
“The key isn’t looking for use cases directly related to your product, but finding ones that match up with what your product is trying to do,” says Kahen. A fitness brand, for instance, might aim to help users construct their own bespoke workout – then use that to direct them towards a gym where they can carry it out.
Perhaps the most important point to remember about voice is that nothing else about your brand goes away. Your web presence and mobile experience will still be there, and each of these channels will offer its own style of experience. That means having a strong brand identity across all of them – so that you’re ready for whatever comes next.