What will replace Google search?

#1

Obviously, Google has a million products, but search is the one I use the most, and I’d like to see alternatives, for a few reasons:

  1. I don’t personally like their privacy practices, or the lack of perceived privacy when using Google. In particular, I don’t like my data being used against me, in advertising, to filter my results, or otherwise.
  2. I don’t like being in a “filter bubble”. Show me stuff that I disagree with! Show me stuff I haven’t seen before! Get me out of my echo chamber.
  3. Competition is good. It’ll force Google to up their game, if nothing else.

Unfortunately, I haven’t seen any solid, usable alternatives, and starting a search company now is like starting a real estate business: you have to have money and resources to start, it seems (correct me if I’m wrong, please!).

What do you want to see in a Google search alternative? What would you build? How would you rank pages? The more details, the better! Let’s talk about a search revolution!

5 Likes

#2

There are some options here: The Best Private Search Engines — Alternatives to Google.

Right now, Google has the best tech and the best brand. Their brand is so sticky that “to google” has become a verb in the dictionary.


But Google’s privacy practices will probably catch up with them, and I think competition could force them to improve here. Search gives the engine so much information about intent that there’s no need to personalize ads, imho; ads could be 100% contextual to what is searched. The DuckDuckGo CEO has a good Quora answer about this.
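To make that concrete, here’s a rough sketch of what purely contextual ad matching could look like, where the only input is the query itself; the inventory and keywords are made up for illustration:

```python
# Toy sketch of purely contextual ad selection: the only input is the
# query text itself, with no user profile, history, or tracking involved.
# The inventory and keywords below are made up for illustration.

AD_INVENTORY = {
    "hotels": ["Expedia", "Travelocity", "Booking.com"],
    "flights": ["Kayak", "Skyscanner"],
    "laptops": ["Dell", "Lenovo"],
}

def contextual_ads(query: str) -> list:
    """Return ads matched only against the words in the query."""
    words = {w.strip(".,!?").lower() for w in query.split()}
    ads = []
    for keyword, advertisers in AD_INVENTORY.items():
        if keyword in words:
            ads.extend(advertisers)
    return ads

print(contextual_ads("cheap hotels in Berlin"))
# ['Expedia', 'Travelocity', 'Booking.com']
```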

Our device usage will also greatly impact how the search market changes. From desktop to phone to audio, the dominant search integration can gain market share.

3 Likes

#3

Very true, there are alternatives. They just kind of suck. I actually used DuckDuckGo for about a year, and made a point to use the “!g” Google shortcut when their engine failed (which was fairly often, about half the time). Google definitely has the best search engine on the market right now, which is what prompted me to switch back.

I’ve had to consciously make an effort to tell people “search X” as opposed to “Google X”. It’s one of the stickiest brands I can think of.

I agree, ads don’t have to be personalized in this way. DDG did ads well, without any tracking of user behavior. For example, if I searched “hotels”, I would get ads for Expedia, Travelocity, etc. I think tracking has a deeper purpose than advertising alone, but that’s another story. :wink: Perhaps an HN story in the near future…but I’ve already got 2 in the queue that I’m working on, so it’ll be a bit.

In the end, I’m hoping that things like the GDPR will force Google to better itself on the privacy front. I’m hoping that real, useful competitors will arise to give them a run for their money. But honestly, I don’t see it happening unless there’s some sort of antitrust lawsuit against them (which may happen!).

I’m not saying I’m in favor of an antitrust suit against Google, or regulation of tech companies in general, as these regulations are typically written without any knowledge of the underlying systems and how they work. I’m just trying to be optimistic about the 800-pound gorilla that is Google. I don’t like any one company or entity to have too much power, in general.

2 Likes

#4

I agree. I think Google would too:

The world’s information includes information about you and everyone else.

My current solution is to use Startpage combined with Firefox and privacy-enhancing extensions.

I’m satisfied with the results I get. I switched from DDG also due to frustration with the quality of results. I don’t think I’m 100% immune to their tracking - I’m guessing I’ll have quite a unique fingerprint - but it’s better than the default.


I like the question that you used as the topic title. I don’t have an answer, but I’ve recently been exploring RDF, linked data, and the semantic web, and I think the area holds a lot of potential, although it seems like it has never really migrated from the academic realm to the business realm.

With the semantic web, it will be easier to create smarter applications and the “smartness” gap between Google and other search providers might become smaller.
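As a toy example of what I mean, here’s a sketch using the rdflib Python library (my choice purely for illustration) of how explicitly stated facts can be queried directly instead of guessed at from keywords; the data and vocabulary are invented:

```python
# Tiny sketch of how machine-readable data could make search "smarter":
# facts are stated explicitly, so an engine can answer structured queries
# instead of inferring from keywords. Uses the rdflib library; the data
# and vocabulary here are invented for illustration.
from rdflib import Graph

TURTLE = """
@prefix ex: <http://example.org/> .

ex:DuckDuckGo  a ex:SearchEngine ; ex:tracksUsers false .
ex:Startpage   a ex:SearchEngine ; ex:tracksUsers false .
ex:BigEngine   a ex:SearchEngine ; ex:tracksUsers true .
"""

g = Graph()
g.parse(data=TURTLE, format="turtle")

# SPARQL: which search engines claim not to track users?
results = g.query("""
    PREFIX ex: <http://example.org/>
    SELECT ?engine WHERE {
        ?engine a ex:SearchEngine ;
                ex:tracksUsers false .
    }
""")

for row in results:
    print(row.engine)
```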

But maybe this is just wishful thinking :sweat_smile:

2 Likes

#5

I was just about to suggest the same thing :grin:. “!g” was an OK solution.

I believe that if a search engine wants to really compete with Google, it should also have a unified set of services to offer. It is really convenient to have everything (or most things) in one account: email, cloud storage, etc. On the other hand, maintaining your privacy is never easy, but someone who could find a business model that offers all of the above without selling people’s privacy (at least not at this level) would bring about a revolution.

2 Likes

#6

Thanks for the link! I hadn’t looked up Google’s mission statement or anything, so I wasn’t aware of this. I’m not a conspiracy theorist or anything – it’s highly likely that my information is just sitting on Google’s servers somewhere, and no one cares about it except for advertisers. Still, that doesn’t sit well with me, and as I’ve mentioned before, I don’t like the consolidation of power in any one place.

Thanks for the recommendations! I currently use Firefox with uBlock Origin and Privacy Badger (I’d highly recommend both). Firefox has really upped their game in the past couple years – I’m very much looking forward to more Rust-powered improvements. I just wish they hadn’t taken away tab groups…that was a killer feature.

That would be a dream come true :slight_smile: Unfortunately, I wouldn’t say it’s very feasible, given that most of the web doesn’t implement RDF, or any semantic standards. The web is a mess.

That said, I don’t think backwards compatibility is 100% necessary to move the web forward. Maybe we need something new, something different. Maybe the web has accrued too much cruft, and needs to be retired in favor of something more modern, machine-readable, private, and secure. I still have sketches and notes from when I was 17 or 18, detailing my dream system that would replace the web…maybe I’ll clean them up, type them out, and share them here sometime. :slight_smile:

2 Likes

#7

I’d have to disagree with this. While it would be convenient to have all your web services in one place, I believe in tools that do one thing and do it well, and I don’t think consolidation of information and power in one entity is something to strive for. I’d much rather have a wide array of services to choose from. But that’s just my 2 cents. :slight_smile: Different strokes for different folks.

What if these separate services could operate on a common, open protocol that allows them to communicate with each other (say, for single sign-on)?
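As a rough sketch of that idea, here’s a toy example where one identity provider issues a signed token and independent services verify it without sharing a database or a parent company; real systems would use an open standard like OpenID Connect, and the PyJWT library and shared secret here are purely for illustration:

```python
# Toy sketch of the single-sign-on idea: one identity provider issues a
# signed token, and independent services (mail, cloud storage, search)
# verify it without sharing a database or a parent company. Real systems
# use open standards like OpenID Connect; the shared secret below is
# purely for illustration.
import time
import jwt  # pip install PyJWT

SHARED_KEY = "demo-secret"  # in practice, asymmetric keys would be used

def issue_token(user_id: str) -> str:
    """Identity provider: sign a short-lived identity assertion."""
    claims = {"sub": user_id, "iss": "id.example", "exp": int(time.time()) + 3600}
    return jwt.encode(claims, SHARED_KEY, algorithm="HS256")

def verify_token(token: str) -> str:
    """Any independent service: check the signature and expiry, get the user."""
    claims = jwt.decode(token, SHARED_KEY, algorithms=["HS256"])
    return claims["sub"]

token = issue_token("alice")
print(verify_token(token))  # 'alice' -- accepted by mail, cloud, and search alike
```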

Any ideas on what that business model might look like?

1 Like

#8

I’m not sure it would make business sense for Google to do this, but I think Google could squash its competition even more by creating a secondary search tool that doesn’t require users to log in, doesn’t track user searches, and at least adds some anonymization. It may seem like a step backwards, as it would probably remove some of the amazing contextual abilities that Google has; it’s these context clues that make Google so much better than other search engines.

If Google were to offer a privacy-focused product…even if it did so in a way that isn’t 100% transparent, it could probably win some business back from DuckDuckGo and other private search engines.

But the people who are willing to use other search engines would also go to the trouble of verifying that this “new private search engine” is actually owned by Google, and as a result they wouldn’t use it. Also, before anyone switches away from Google, someone needs to answer the question: Why Does Using a Private Search Engine Matter?

2 Likes

#9

Great question. I believe the days of a “one search fits all” approach are gone. The types of queries I used to rely on Google for have gone down over the years. Here are the two most common kinds of queries I make on a daily basis that aren’t powered by Google.

  1. Any query where recency is more important than relevance. I find that making these queries on a social site such as Twitter just makes a lot more sense to me and the results are often far better. An example of this is getting the latest on the Mueller investigation.

  2. Apps are getting a lot better at providing their own search for queries related to their service. Algolia makes it easier than ever for a service to provide a personalized search experience on a smaller corpus. So at this point, I’m far more likely to use app search to find API documentation than doing something like site:app.com <query> in Google.
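To illustrate why search over a small, known corpus can be so precise: the service controls its own schema, so it can weight the fields it knows matter. Here’s a toy scorer along those lines; it’s just a sketch, not a description of how Algolia or any real service ranks results:

```python
# Rough sketch of searching a small, app-specific corpus. Because the
# service knows its own schema (titles, bodies, tags), it can weight
# the fields that it knows matter. A toy scorer for illustration only.

DOCS = [
    {"title": "Authentication", "body": "How to obtain and refresh API tokens."},
    {"title": "Rate limits", "body": "Requests are limited to 100 per minute."},
    {"title": "Webhooks", "body": "Subscribe to events and verify signatures."},
]

def search(query: str, docs=DOCS):
    terms = query.lower().split()
    scored = []
    for doc in docs:
        score = 0
        for term in terms:
            score += 3 * doc["title"].lower().count(term)  # title matches weigh more
            score += 1 * doc["body"].lower().count(term)
        if score:
            scored.append((score, doc["title"]))
    return [title for _, title in sorted(scored, reverse=True)]

print(search("api tokens"))  # ['Authentication']
```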

Google is still a fantastic tool but I hope search continues to fragment and become more personalized. That future is far more appealing to me than a new behemoth replacement.

1 Like

#10

Interesting insight! I could see more specialized search engines becoming the norm (Wolfram Alpha comes to mind). I’d much rather see engines like that, especially ones that make assumptions about the content and the query, than yet another monolith.

When I say the search engine should make assumptions, I mean it should assume certain queries are error messages, or assume that “node” means Node.js in certain contexts (if you couldn’t tell, I’ve been thinking about a search engine for programmers lately :wink:). If a search engine is specialized, it can use assumptions like these to give you extremely relevant results. Google tries to do this with a “filter bubble”, but I don’t like the idea of missing out on content because Google thinks I fit into a given demographic or something. I’d much rather navigate to a programming-specific search engine, for example, and know which assumptions are being made for me.
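As a sketch of what I mean by explicit assumptions, here’s a toy query classifier for a programmer-focused engine; the rules are invented, and in a real engine they’d be surfaced to the user so nothing happens behind your back:

```python
# A sketch of the kind of explicit, inspectable assumptions a specialized
# (programmer-focused) search engine could make about a query. The rules
# are invented examples; a real engine would show them to the user.
import re

def classify_query(query: str) -> dict:
    hints = {"query": query, "assumptions": []}

    # Looks like a pasted error message? Prefer Q&A sites and issue trackers.
    if re.search(r"(error|exception|traceback|undefined is not a function)", query, re.I):
        hints["assumptions"].append("treat as error message; boost Q&A and issue trackers")

    # Ambiguous tech terms, resolved the way a programmer would expect.
    if re.search(r"\bnode\b", query, re.I):
        hints["assumptions"].append('interpret "node" as Node.js, not graph nodes')
    if re.search(r"\bpython\b", query, re.I):
        hints["assumptions"].append('interpret "python" as the language, not the snake')

    return hints

print(classify_query("node TypeError: undefined is not a function"))
```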

2 Likes

#11

This is an excellent question, and it’s the right one to ask, because let’s face it: no one outside of tech is going to switch search engines because of privacy, barring some earth-shattering revelations. Even then, most people don’t think they have anything to hide, so why worry about privacy?

I think that in order for a competitor to beat Google at search, they have to give better results. I especially like @Dane’s idea of specialized search engines. If there were a programming-specific search engine, or a search engine just for gifs, I believe it could give results an order of magnitude better, because it could make assumptions about the query that Google can’t make without putting you in a filter bubble.

This raises an interesting question: how would you consolidate these separate search engines into something comparable to Google? No one (again, no one outside of tech) is going to use 10 different search engines. People are used to typing a search into the address bar, simple as that. How do we replicate that experience without sacrificing the specialization aspect? DuckDuckGo’s “!bangs” could be one option, although a bit heavy-handed…maybe the landing page is a list of topics you’ve searched recently, and clicking one “activates” it and directs your search to that engine? I don’t know the answer, exactly. But I’d love to hear ideas!
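To sketch the “!bangs”-style option, here’s a toy router where an explicit, user-visible prefix decides which specialized engine handles the query; the engines and URL templates are placeholders:

```python
# A sketch of a "!bang"-style router: one search box, but an explicit,
# user-visible prefix decides which specialized engine handles the query.
# The engines and URL templates below are placeholders for illustration.
from urllib.parse import quote_plus

ENGINES = {
    "!code": "https://code-search.example/?q={}",   # hypothetical programming engine
    "!gif":  "https://gif-search.example/?q={}",    # hypothetical gif engine
    "!w":    "https://en.wikipedia.org/w/index.php?search={}",
}

DEFAULT = "https://general-search.example/?q={}"

def route(raw_query: str) -> str:
    """Return the URL to send the query to, based on an optional leading bang."""
    parts = raw_query.strip().split(maxsplit=1)
    if parts and parts[0] in ENGINES and len(parts) == 2:
        return ENGINES[parts[0]].format(quote_plus(parts[1]))
    return DEFAULT.format(quote_plus(raw_query.strip()))

print(route("!code node streams backpressure"))
print(route("weather tomorrow"))
```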

0 Likes

#12

I really don’t think consolidation is necessary. Sure, people have some pretty ingrained habits when it comes to search, but that doesn’t mean behavior can’t change over time.

Think about the people you know in real life. Do you ever think, “Awww, it’s too much work talking to all these people. Why can’t I just combine all of my friends into a single person…”? That probably seems like a crazy idea for real-life relationships, but for some reason we keep thinking that technology should all ultimately occupy the same mental space.

I think the technology that we bring into our lives is somewhat analogous to our human relationships. As we invest time into these problem-solving tools, it’s OK that they occupy different mental spaces. It’s actually easier to compartmentalize functions when they have different visual identities (brands).

It could be unhealthy if people start completely replacing their human connections with artificial connections. But human connections are not necessarily more meaningful…

2 Likes

#14

Full disclosure: I work on the Search team at Google. This post is purely my own opinion and doesn’t necessarily reflect the views of my employer.

I think the idea of specialized search is an interesting one. It gets a bit tricky when you deal with queries that are inherently cross-domain. In a more perfect world, I’d love to see something like the Semantic Web, but what I suspect will happen is that machine learning will act as a retrofit or shim for the plaintext web and as a necessary bridge to a multimedia web.

Extending from that, I would love to see on-device machine intelligence systems that understand query intents, scrape the web, and parse results for human consumption.

Remember when the web was first starting to grow and you found results because other people would share links on message boards? Or people would start up linking circles to share traffic with each other? Farther back still, do you remember talking to a librarian about a research topic and getting help from them? I can see intelligence systems replicating and replacing this, ideally with more democratized indices as starting points. Instead of guessing at keywords, being able to say, “I’m getting a new pet python and would like to know about their nesting requirements.” The system would disambiguate Python (the programming language) from python (the animal), decide that it needed evergreen information (i.e., recency is not as important), search from a few good starting points or guesses, read into those, find additional references from that, do the research for you, and summarize the results. Perhaps if it’s smart enough it could recognize changes in litigation (a very different topic domain) and say, “Be careful, due to recent changes in public policy, python ownership is now restricted where you live. If you want to build a container for a python, though, you can consider these pages and these excerpts.”
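To make the shape of that concrete, here’s a toy skeleton of the flow I’m imagining: disambiguate the intent, decide whether recency matters, pick starting points, and summarize. Every step is a stub with hard-coded logic, purely for illustration:

```python
# A toy skeleton of the flow described above: disambiguate the intent,
# decide whether recency matters, pick starting points, and summarize.
# Every step is a stub with hard-coded logic, purely for illustration.

def disambiguate(question: str) -> str:
    # "pet" and "nesting" point at the animal, not the programming language
    if "pet" in question or "nesting" in question:
        return "python (snake)"
    return "Python (language)"

def needs_recent_sources(question: str) -> bool:
    # husbandry advice is evergreen; news or policy changes would not be
    return any(word in question for word in ("news", "latest", "law", "policy"))

def pick_starting_points(topic: str) -> list:
    # a real system would draw on a democratized, decentralized index
    return ["https://example.org/" + topic.split()[0] + "/care-guide"]

def summarize(sources: list) -> str:
    return "Summary drawn from %d source(s): %s" % (len(sources), sources)

question = ("I'm getting a new pet python and would like to know "
            "about their nesting requirements.")
topic = disambiguate(question)
print(topic, "| evergreen:", not needs_recent_sources(question))
print(summarize(pick_starting_points(topic)))
```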

It’s a future I’d really like to see. Benevolent AI with a democratized, decentralized web. Maybe that’s a pipe dream, but maybe we can get there if we’re careful with our choices as consumers, diligent with our politics, and lucky.

7 Likes

#15

Thanks for sharing, Joseph! The librarian comparison is a fascinating one…what I’d really like to have is the “Librarian” program from Snow Crash, by Neal Stephenson. The future you describe is definitely the ideal: a decentralized, democratic web with benevolent AI. On-device intelligence systems are interesting too – do you mean systems that essentially work offline? How do we avoid malevolent AI?

Ultimately, I like the way you’re thinking about gathering information, and how it used to work on message boards and such. This post gives me hope for the future. :slight_smile:

1 Like

#16

Ah, offline in this context means “resident on hardware that a single person owns, unobstructed by DRM.” I’m just slightly fearful of people no longer owning their own devices or becoming fully reliant on cloud resources. (Edit: I think it’s fine if the AI connects to the web. I just think the processing should be done locally if at all possible.)

I think the threat of malevolent AI is very overblown, especially when compared with what I consider to be the real threat: wealth disparity and automation. Think back to the industrial revolution: people working 16-hour days, 6 days per week, toiling on machines owned by the aristocracy, starving amidst abundance. I’m fearful that we’ll see companies use AI as a tool of subjugation and oppression, manipulating people and playing against our interests via impossibly calculated twists and pokes and jabs on our society. That’s more of a threat than malevolent AI for a long way into the future. But I think that’s a separate conversation, and I’m not sure I’m wholly qualified to speak on it. (I worked as an ML researcher for a few years and wrote some libraries, but that’s about it.)

4 Likes

#17

Gotcha, thanks for clarifying. I agree, having software that works offline (in your sense of the word) is important to the future of the internet.

I couldn’t agree more. I’m much more worried about malevolent people than some Terminator-like scenario. The social credit system in China comes to mind…

I’m not 100% sure what the solution is to wealth disparity and issues arising from automation, but I think it’ll involve some form of universal basic income. I’ll start a new thread on the topic of automation, now that you mention it.

2 Likes

#18

I would pay for a search engine that is less vulnerable to SEO, doesn’t use its own leadership’s subjective assessment of credibility to populate the first page of results on controversial matters, and doesn’t customize results to its assessment of what I want without transparency and consent. I’m afraid the next, or maybe current, horizon is going to be results that are generated uniquely for each user, perhaps with intent to manipulate them. This would be less likely if Google had real competition.

1 Like

#19

Do you think we’ve been trained to write queries that are stripped of context? If I were to write that query, I’d probably search for “python nesting”, which is possible to disambiguate, but the challenge is much more difficult than with a full sentence.

Maybe the problem here is that we treat our queries as disposable things that give us an answer and go away. Wouldn’t it be more helpful if we could write more thoughtful queries and “subscribe” to the results for a period of time? This way the current and near-future queries could all benefit from the additional context.
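As a rough sketch of what I mean by subscribing to a query (the backend here is just a stand-in):

```python
# A sketch of the "subscribe to a query" idea: a thoughtful, full-context
# query is saved, re-run on a schedule, and only *new* results are pushed
# to the user. The fetch function is a stand-in for whatever engine would
# actually serve the results.
import time
from dataclasses import dataclass, field

@dataclass
class QuerySubscription:
    query: str                      # e.g. "nesting requirements for a pet ball python"
    expires_at: float               # subscription context is temporary by design
    seen: set = field(default_factory=set)

    def check(self, fetch_results) -> list:
        """Re-run the query and return only results not shown before."""
        if time.time() > self.expires_at:
            return []               # context expired; stop influencing results
        fresh = [r for r in fetch_results(self.query) if r not in self.seen]
        self.seen.update(fresh)
        return fresh

# Toy stand-in for a search backend.
def fake_engine(query: str) -> list:
    return ["care-guide.example/nesting", "forum.example/enclosure-sizes"]

sub = QuerySubscription("nesting requirements for a pet ball python",
                        expires_at=time.time() + 7 * 24 * 3600)
print(sub.check(fake_engine))  # both results, first time
print(sub.check(fake_engine))  # [] -- nothing new since the last check
```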

Using behavior as context is probably an easier solution in a lot of cases but people are generally paranoid and feel uneasy about being “spied” on. Maybe a more explicit subscription model could be a nice middle ground for privacy junkies.

1 Like

#20

Do you think we’ve been trained to write queries that are stripped of context? If I were to write that query, I’d probably search for “python nesting”, which is possible to disambiguate, but the challenge is much more difficult than with a full sentence.

Possibly! Though even if we’ve been trained to query in a specific way, there’s still context that can be relevant to us that we don’t encode in our search queries. Geographic location, for example, can and should be used to tailor content that’s relevant to users. Example: for people in the Indian state of Goa, it’s okay to deliver Marathi or Hindi content. For people in Delhi, though, you don’t want to serve Marathi content. Local idiom also comes into play when disambiguating intent. “Local Chemist” is a different intent in the UK vs the US.

Maybe the problem here is that we treat our queries as disposable things that give us an answer and go away. Wouldn’t it be more helpful if we could write more thoughtful queries and “subscribe” to the results for a period of time? This way the current and near-future queries could all benefit from the additional context.

This is something a few companies are trying. Google calls it ‘proactive search’. If you go to the Google home page on your mobile device you’ll see recommended content based on your user archetype. Google Discover/Now also shows this content. Example: https://i.imgur.com/P3y2fsq.png

I hear you about privacy concerns. It’s a tricky situation. People are quite averse to paying for most anything, but they also don’t want to be the product. One shouldn’t need to pay for privacy, but there’s really not much else that we can passively monetize, unless people decide they want companies to start mining Bitcoin on their devices. I wish there were a good solution for this.

1 Like

#21

Oh yeah. I forgot about Google Now. I’ve tried using it a few times. It’s a nice idea in principle but something is just missing for me. The last time I tried it, I essentially ended up getting sports scores and help optimizing my commute.

The sports scores weren’t helpful because, while Google was able to discover most of the teams I follow, it wasn’t smart enough to understand that I usually don’t care about seeing score updates. If I’m very into a team, then I’m probably watching the game live or following along with a more dedicated tool, so a score update is too superficial. I’m not sure how Google would discover this, but what I’m really interested in are prop bets, statistical anomalies, and fantasy stats.

The commute info was also not helpful because, while I live in SF, I walked to work at the time. So traffic congestion didn’t really matter to me.

But what I really wanted from Google Now was a much deeper understanding of the things I care about professionally. Here are a few sample notifications I’d love to see:

  • It looks like you are starting to learn about GraphQL. There is a Meetup 4 blocks away from your office tomorrow from 5-7. Would you like to RSVP?
  • You seem to be struggling with Javascript promises. Here is a new 10 minute video that takes a deep dive on the topic. Would you like to add it to your watch list?
  • A local dev conference is looking for a speaker to give a 5m talk on rapid prototyping next month. Are you interested?

All hard problems to figure out…and when the algorithm gets it wrong, trust is diminished and you risk losing access to the user’s attention. I’ll give Google Now another shot, though. I’m excited to see improvements since my last attempt.

1 Like