From search to execution: how AI is replacing the interface itself

From online shopping to business workflows, this blog explores how tools like ChatGPT are transforming digital interactions—not just answering queries, but understanding intent and taking actions. Will UX need a search box in the future?

Reading time: 4 minutes

Years ago, I was involved in building one of the first central European mobile shopping apps for a large retail chain. One of the biggest challenges was designing the UI so that users could find what they were looking for as easily as possible.

It may seem straightforward, but consider products such as “cheese” or “pencil” - both can fall into multiple categories. “Cheese”, depending on the kind, under delicatessen, dairy, or spreads; “pencil” under school supplies or stationery… Going through menus of categories, subcategories, and tags to find the right variety of cheese was, surprisingly, almost more painful than walking to that familiar open chiller in the left corner of the local supermarket.

The solution to the problem seemed obvious: a search bar would help users find the right products. But what if, in that critical mobile shopping moment, you’re just blanking on the name of that soft, stretchy Italian cheese? You’ll still have to fight through rows and rows of edamers, emmentalers, and cottage cheeses before you finally find it.

We conducted a UX usability test of the app, and one user, when challenged to find a product based on a description, said: “I wish I could just type a description into the search box when I can’t remember the name of something, like that Italian cheese, and the app would know what I am looking for. Now that would be a feature.”

Intents known and unknown

No menus, checkboxes, or scrolling. Just a description, and you get the right hit right away. 

Back then, it sounded radical, but today we’re pretty familiar with that kind of UI. ChatGPT, we are looking at you. 

Given enough context, LLMs have already proven to be the “answer whatever the question” service we’re looking for. Ask, “What is the name of that soft, white Italian cheese, the one that’s stretchy when melted and used on pizza?”, and one will readily reply, “The cheese you're thinking of is mozzarella.” But even before LLMs, one novelty had started to shape users’ expectations of digital experiences. We began expecting apps and services to know what we need, whatever the “game”:

  • In the “attention game” – that’s your YouTube or Netflix feed displaying the content curated for you without any searching. 
  • In the “transaction game” – that’s when a webshop welcomes you with clothing items in the right size and style preferences without the need to go through menus and categories. 
  • In the “productivity game” – the home screen of your productivity suite knows that on Fridays you’re working on that report, so on Fridays it’s pushed to the top of the document list…


So, we already use search rarely, more rarely than ever before. But when we do, it can still be frustrating, simply because you cannot find that jacket you saw that would work well for your upcoming Copenhagen trip. You try filtering by color, but the shop’s “green” palette spans a wide range, from sage to lime to teal. You try sorting by “newest,” “best sellers,” “jackets”... and end up scrolling aimlessly. No preview photos match what you remember. Sometimes your intent seems out of reach even for the most advanced algorithms and the most elaborate search tools.

AI makes intent-driven search exciting

For a long time, search bars were the ultimate tool for users to find what they needed, and the entire user experience revolved around that function. But a user’s real intent is rarely to search. It is to purchase, to experience, or to create.

With the help of AI and machine learning, providers of digital services are trying to go even further and capture users’ intent, making search itself obsolete. When was the last time you used the search on YouTube? And by actively interpreting intent, i.e., not just learning from past behavior but listening to users in real time, services come one step closer to their users. According to its 2025 AI roadmap, Walmart is phasing out traditional search bars entirely. Customers will eventually be able to express high‑level goals, like furnishing a new apartment or creating a grocery plan, and let Sparky, the AI agent, autonomously figure out and handle the tasks needed to accomplish those goals. Users no longer have to think in search terms; they express their intent, and the AI agent does the heavy work of comparing, calculating, finding, and presenting what best meets that need.
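To make the contrast with keyword search concrete, here is a minimal sketch of description-driven matching, using the cheese example from earlier. The catalog, descriptions, and scoring are all made up for illustration: a production system would score the query against product descriptions with embedding similarity from an ML model, while this toy version stands in with plain word overlap.

```python
# Toy description-driven search: score each catalog entry's description
# against the user's free-text intent instead of matching the product name.
# Word overlap is a hypothetical stand-in for embedding similarity.

def tokenize(text: str) -> set[str]:
    """Lowercase the text and split it into a set of punctuation-stripped words."""
    return {w.strip(".,?!").lower() for w in text.split()}

def best_match(query: str, catalog: dict[str, str]) -> str:
    """Return the product whose description shares the most words with the query."""
    q = tokenize(query)
    return max(catalog, key=lambda name: len(q & tokenize(catalog[name])))

# Hypothetical mini-catalog keyed by product name.
catalog = {
    "mozzarella": "soft white Italian cheese, stretchy when melted, used on pizza",
    "edamer": "semi-hard Dutch cheese with a mild nutty taste",
    "cottage cheese": "fresh curd cheese with a mild flavour",
}

print(best_match("soft stretchy Italian cheese for pizza", catalog))
# With this toy catalog, the overlap favours "mozzarella".
```

The point is the interface, not the scoring: the user never names the product, yet the query still resolves, which is exactly the shift from search terms to intent described above.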

But an AI agent operating only within the “walled garden” of a single store cannot help you with the things outside those walls. Imagine an AI agent you can prompt with the following: “Plan a 3-day business trip to Copenhagen on the 15th with a hotel near the conference venue Janice mentioned in her last email. Check with her whether my participation fee is cleared. Prepare a ‘lessons learned’ presentation using our last year’s initiatives slides and notes from that TED talk on sales I watched recently. I’ll need a seasonal jacket for the trip, and plan some time for sightseeing…”, and your personal agent takes it from there, as a real assistant would, making contextual decisions and asking questions when unsure. You don’t even need to open an app; you can communicate via WhatsApp. Remember those movie scenes where businesspeople have assistants to handle preparations like this? Today we are witnessing the emergence of virtual agents that “behave” exactly like that: not just answering your questions, but taking initiative, understanding even vague instructions, and executing entire workflows across multiple services.

One such agent is ChatGPT Agent, which takes a high-level user request, written in natural language, and breaks it down into specific tasks it can complete across multiple tools and services. It uses reasoning and memory to comprehend context, accesses external data via APIs or the web, performs calculations or summarization as needed, and leverages its own virtual computer to interact with other apps or UIs, enabling actions such as booking, drafting, or organizing. Instead of returning “search results” or merely answering questions, it acts like an intelligent assistant that understands your goals and works toward them autonomously.
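The decompose-and-execute pattern described above can be sketched in a few lines. Everything here is hypothetical: the tool names, the keyword-based “planner” table, and the stub tools are stand-ins, since a real agent would use an LLM for both task decomposition and tool selection, and each tool would call an actual service.

```python
# Minimal sketch of an agent loop: a high-level request is decomposed into
# tasks, and each task is dispatched to a registered tool. The keyword-based
# planner below is a hypothetical stand-in for LLM-driven planning.

from typing import Callable

# Tool registry: each tool is a function the agent can invoke.
# Real tools would call booking APIs, email services, etc.
TOOLS: dict[str, Callable[[str], str]] = {
    "book_hotel": lambda arg: f"booked hotel: {arg}",
    "send_email": lambda arg: f"emailed: {arg}",
    "draft_slides": lambda arg: f"drafted slides: {arg}",
}

# Toy planner: trigger phrases in the request mapped to tool names.
PLAN_RULES = [
    ("hotel", "book_hotel"),
    ("email", "send_email"),
    ("presentation", "draft_slides"),
]

def run_agent(request: str) -> list[str]:
    """Decompose the request into tasks and execute each matching tool."""
    results = []
    for phrase, tool in PLAN_RULES:
        if phrase in request.lower():
            results.append(TOOLS[tool](phrase))
    return results

print(run_agent("Book a hotel in Copenhagen and prepare a presentation"))
```

However toy-like the planner, the structure mirrors the description above: one natural-language goal fans out into several concrete tool invocations, which is what lets the agent work across services instead of answering a single query.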

Curious how an agent like this could streamline your operations? Let’s talk – we’ll show you what’s possible today.