Use AI to Search, But Don’t Stop Thinking: The Risks of AI-Driven Convenience
How Apple’s New “Search the Way You Talk” Feature Highlights the Convenience and Risks of Natural Language Search
Apple recently unveiled a feature in its App Store inviting users to “search the way you talk.” It’s part of the company’s broader embrace of natural language search, a tool designed to feel as intuitive as talking with a friend.
Need a new fitness app? Just type, “Apps that help me work out,” and let the algorithms do their thing. It’s meant to be easy and accessible: no need to strategize about keywords or guess what the search engine wants from you.
But here’s what I can’t help but worry about…
As search becomes more conversational, we may be unwittingly giving up something valuable. The ability to craft effective searches using precise keywords or logical operators might fade into obscurity, much like the once-essential skill of navigating a library. And while it’s easy to shrug off this shift as progress, it raises a critical question: what happens when the tech fails? When the internet’s down, the power’s out, or the AI model misunderstands your request, will we remember how to navigate the analog world?
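For readers who never picked up the habit, the skill in question looks something like this. These are a few classic operators that Google and most traditional search engines support (the example queries themselves are just illustrations):

```text
"strength training" app -subscription    exact phrase, exclude a term
fitness tracker site:reddit.com          restrict results to one site
running OR cycling filetype:pdf          boolean OR, restrict file type
```

Each query forces you to decide, up front, what you actually want and what you want filtered out. That small act of deliberation is exactly what conversational search removes.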
The Encyclopedia Syndrome
Remember encyclopedias? If you do, you probably also remember the mini sense of accomplishment that came with finding exactly what you needed in those meticulously indexed volumes. Before the internet, research required not only effort but a different level of skill. You couldn’t just type, “What were the top fashion trends of 1845?” and get instant gratification. Instead, you had to deduce: What subject might this fall under? Which volume should I check? It was a slower, more manual process that demanded critical thinking every step of the way.
Search engines democratized access to information but, in doing so, also reshaped how we think about finding it. We went from active participants in the quest for knowledge to more passive recipients. Natural language search takes that a step further by removing the need for precision altogether.
If you use a GPS, you might get to your destination faster, but if the signal drops, could you still read a map?
It does seem like natural language search is a win for accessibility. It lowers barriers for people who find traditional search frustrating. But I’m afraid it might introduce a creeping dependency. If this becomes the norm, future generations might never learn the mechanics of searching effectively.
When Convenience Backfires
Apple’s natural language updates are a microcosm of this shift. The system works, sort of. Wes Davis tested the new search feature by typing “emulators that feature multiple consoles,” which returned Delta, a solid match. But when he asked for “Video games that can help me work out,” he got suggestions like Twerk Race 3D. Results like these make you realize how much context is still missing.
It’s tempting to laugh off these failures, but they highlight a deeper issue: reliance on algorithms that don’t always understand us. Today, it’s an App Store glitch. Tomorrow, it could be a critical misunderstanding in medical diagnostics or emergency response. And when we look at how companies like Google are rushing out AI products, the stakes become even more unnerving.
Google’s AI-powered “answers” are highlighted in the UI to appear as the very first result in a search. It sounds magical: you type in a question, and voilà! The answer is right there, bold and confident, so you don’t even have to scroll. But those results are often wrong, or at the very least, wildly misleading.
The problem isn’t just bad answers, it’s the culture driving them. We’re in the midst of an AI arms race, with tech companies rushing to slap the “AI-powered” sticker on anything that moves. Google, Apple, Microsoft, you name it, they’re all falling over themselves to appear innovative.
But innovation without care is just chaos in a shiny wrapper.
Google, in particular, seems so desperate to stay ahead that it's been pushing features into production before they’re ready. Remember Bard, their much-hyped chatbot? It made headlines for giving false information during its debut demonstration. Ouch. If they can’t trust their AI enough for a staged event, why are we trusting it to run the internet’s primary gateway to knowledge?
This isn’t to say AI has no place in search, it can be an incredible tool when paired with human oversight. But we’re not there yet, and the industry’s rush to commercialize AI is creating more problems than solutions.
Right now, it feels like we’re beta-testing features that affect billions of lives.
The Trust Pendulum
As more users become disillusioned with AI search tools, there’s a risk the pendulum swings too far the other way, and people might reject AI altogether, throwing out the baby with the buggy bathwater. This is why companies need to stop emphasizing “first-to-market” and start focusing on reliability and transparency. Clearly tell us when an AI answer is uncertain. Let us toggle between AI suggestions and traditional search.
And most importantly, the onus is on us to critically evaluate the information we’re given. Even when we encounter misleading results, we need the skills to spot inaccuracies and avoid being manipulated. A solid understanding of basic math and statistics can go a long way in helping us navigate the noise.
Without these safeguards, Google’s AI search tools risk undermining the very trust that built the company into a verb.
And for what? A marketing gimmick that wasn’t ready for prime time?
Relearning Old Skills
The solution isn’t to reject progress but to integrate redundancy into our systems and ourselves. Just as we back up important files, we should back up our knowledge. Teach kids how to navigate a library. Encourage workers to practice offline workflows. Normalize keeping a paper map in your glove compartment.
Natural language search is a tool, not a replacement for human critical thinking. If we approach it with that mindset, it can serve as a bridge rather than a crutch. But if we let it, it may become just another layer of dependency in a world that already feels precariously balanced on its technology.
So yes, go ahead and “search the way you talk.” Just don’t forget how to search the way you think.
That’s a skill that will save you when everything else fails.