The highest point for any tech company isn’t when it launches its first product, or closes a big round of funding. It’s the moment when the company name makes that magical transition from noun to verb.
- Think about when “Have you heard of Uber?” became “I’m Ubering home.”
- Or, when “Look it up on Google” became “Google it.” That is true success.
It’s easy to assume that companies that have completed the transition from noun to verb are untouchable — that they will never be toppled from their position of market dominance. But this isn’t always the case. Uber’s market share has shrunk by nearly 20 percent since 2014 — a fall that can be attributed more to corporate scandals and declining public opinion than any flaw in its product.
Companies that enjoy near-monopolistic power over their market become vulnerable when their reputation begins to decline. This is precisely why the search engine market is poised for disruption. Google — which today enjoys a cool 75 percent share of the search engine market — has seen its reputation shaken over the past year by a series of issues. And this should not be a surprise: Search is not a solved problem; there’s still plenty of room for innovation and disruption.
Here are three reasons why the search industry is ripe for disruption.
Filter bubbles
Filter bubbles are the byproduct of the personalization algorithms used by Google and most other search platforms. When a user enters a term into a search engine, the engine's algorithms draw on what the platform knows about that user's preferences and interests, based on his or her online activity. The algorithms then prioritize search results tailored to that particular user.
The result? Conservatives are fed conservative content, liberals are fed liberal content, millennials are fed millennial content and, in general, everyone lives in a personalized bubble where his or her own ideas and prejudices are continually reinforced and never challenged.
The term “filter bubble” was coined back in 2011 by Eli Pariser, chief executive of Upworthy and board president of MoveOn.org. “Even if you’re logged out,” said Pariser in his now-famous TED Talk, “there are 57 signals that Google looks at — everything from what kind of computer you’re on to what kind of browser you’re using, to where you’re located — that it uses to personally tailor your query results.”
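The personalization loop described above can be sketched in a few lines. This is a hypothetical, deliberately simplified model, not Google's actual algorithm (which, as Pariser notes, draws on dozens of signals): final rank is a blend of a result's base relevance and its affinity with topics the user has previously engaged with.

```python
# Hypothetical sketch of preference-weighted re-ranking, the mechanism
# behind "filter bubbles". All field names and weights are illustrative.

def personalize(results, user_interests, weight=0.6):
    """Re-rank results by blending base relevance with affinity
    to the user's inferred interests (0 = no personalization)."""
    def score(result):
        base = result["relevance"]
        topics = result["topics"]
        # Affinity: fraction of the result's topics the user already engages with.
        affinity = len(topics & user_interests) / len(topics) if topics else 0.0
        return (1 - weight) * base + weight * affinity
    return sorted(results, key=score, reverse=True)

results = [
    {"url": "a.example", "relevance": 0.9, "topics": {"politics-left"}},
    {"url": "b.example", "relevance": 0.8, "topics": {"politics-right"}},
]
# A user profiled as right-leaning sees b.example first, even though
# a.example has higher base relevance -- the bubble in miniature.
ranked = personalize(results, {"politics-right"})
print([r["url"] for r in ranked])
```

Even in this toy version, the effect is visible: a modest personalization weight is enough to invert the relevance ordering, which is exactly why users with different histories see different "truths" for the same query.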
Moreover, the relevance of these “filter bubbles” has increased over the past 12 months as their dangers have begun to manifest more seriously in public life. Leading figures like Bill Gates and Angela Merkel have spoken up about the problem, which has been blamed for increasing political polarization and for harming civic conversation in democratic countries.
Sanjay Arora, founder and CEO of Million Short, a customizable search engine, believes, however, that filter bubbles can be popped if people are given more options and control over how they search for information online.
“Why do we let a tiny number of companies control the information that we access on the internet?” Arora asked during a phone interview we had when I was prepping for my new podcast (In The Trenches with Andrew Medal). “The world,” Arora said, “needs more search engine options.”
Rather than feeding users results from opaque algorithms, Million Short lets them tailor their own search results, he pointed out. It does this through a variety of custom filters, including the option to remove the top 100 to 1,000,000 most popular sites on the internet from the results.
The results can be eye-opening for the user, Arora claimed, noting: “Every day, we get feedback from users telling us how Million Short allows them to discover content online that they couldn’t find with traditional search engines.”
Fake news
“Fake news” was one of the highest-profile social issues of 2017. Malicious groups, including political extremists and Russian hackers, used the internet’s biggest platforms to circulate untrue stories in order to stoke partisan conflict and influence national elections.
Google’s search algorithms carry much of the blame for the widespread circulation of those stories. At various points last year, Google actually displayed a neo-Nazi Holocaust denial website at the top of its search results, promoted fake news tweets that mischaracterized the Texas church shooter as a Muslim extremist and suggested that President Obama was planning a communist coup d’état.
The real problem is that Google’s algorithms are vulnerable: They can be taken advantage of by fake news propagators. “I think the reason fake news ranks is the same reason why it shows up in Google’s autocomplete,” said Joost De Valk, founder of Yoast. “They’ve been taking more and more user signals into their algorithm.
“If something resonates with users, it’s more likely to get clicks,” De Valk continued. “Google is quick to promote you to number one . . . Nothing in their algorithm checks facts.”
Of course, this isn’t an easy problem to solve. Even if algorithms get better at separating fact from fiction, the humans who design those algorithms bring their own biases into the equation. Are we comfortable giving one company the authority to decide what news is worthy of distribution? “They’re really skating on thin ice,” said Michael Bertini, a search strategist at marketing agency iQuanti. “They’re controlling what users see. If Google is controlling what they deem to be fake news, I think that’s bias.”
So, who’s out there working on this? Any company that can solve this puzzle is certain to create a serious buzz, considering that the issue of fake news continues to be a hot topic for 2018.
Data security concerns
Fake news isn’t the only issue: As digital platforms like Google become increasingly ubiquitous and sophisticated, concerns about data security and privacy are also on the rise.
Google has been criticized for the various ways that it tracks and stores information about online behavior. And the search giant hasn’t exactly been apologetic about this practice. Its former CEO Eric Schmidt once said, “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place. If you really need that kind of privacy, the reality is that search engines — including Google — do retain this information for some time.”
Other search platforms disagree: They’ve rejected the inevitability of user data tracking. DuckDuckGo, for example, built its entire platform around the idea that search engines shouldn’t track or store user data. This principled stance has won DuckDuckGo a loyal following and the seventh-largest market share in search.
So, these are some of the reasons why search is far from a solved problem. They’re why the number of players in the space — Quora, Million Short, DuckDuckGo and the revamped Ask.com, to name a few — is increasing.
There are still serious challenges that need to be solved, and bold entrepreneurs are out there trying to solve them, taking advantage of opportunities for innovation that Google seems unable or unwilling to pursue.