Do AI Products Even Need Navigation?
Navigation helped us find features. What happens when features find us?
For decades, it’s been the quiet scaffolding of digital life: tabs, sidebars, menus, dashboards. Not just entry points, but the way software showed its shape—revealing what was possible and how to get there. But as AI begins to reshape how we interact with digital products—responding to intent, surfacing tools just-in-time, adapting moment to moment—the familiar landmarks start to dissolve. And when features begin to find you, instead of you finding them, it raises a new kind of design challenge: how do users stay oriented in a world without obvious paths?
When my son was six, he asked for help with a tricky word. Perfect parenting opportunity, I thought, time to introduce him to dictionaries. I grabbed a paperback Oxford English Dictionary we keep on the bookshelf alongside the Gruffalo and other favourites and watched as he flipped the cover, paused, and stared into the sea of tiny black text.
He frowned.
“How do I find the word?”
“It’s alphabetical,” I said.
He squinted at the page.
I smiled. We had work to do.
We sat on the floor, and I explained the system—how it’s organised, how the little letters on the page help you know where you are. As he learned, his hands grew more confident. His frown gave way to curiosity. He started skipping around, checking words just to see what they meant. He had a map now. And with it, the satisfaction of finding his way.
A few days later, I saw him try something else.
“Hey Google,” he asked, “what does conspicuous mean?”
The reply came instantly. Polished. Precise. Just what he needed.
And in its own way, it was wonderful.
No thumbing through pages. No uncertainty. Just… the answer.
This is the promise of intelligent interfaces: frictionless, conversational, responsive to context. You don’t need to know where to go—the system meets you where you are. It anticipates. It adapts. It feels like magic.
But magic has its tradeoffs.
When the path disappears, so can the sense of direction. When tools shape-shift on the fly, users may lose track of what’s possible, or of how to return to what mattered a moment ago. The experience becomes more fluid—but more fluid can also mean more slippery.
For decades, software has relied on systems like the dictionary—orderly, spatial, structured. We borrowed metaphors from the physical world: windows, folders, dashboards, tabs. These weren’t just visual tropes; they were cognitive scaffolding. They helped users build a mental map of where things lived and how to get there.
Navigation has been the backbone of the experience. It’s let us move through screens, find tools, and return to where we’d been. But more than that, it has made software legible. A sidebar isn’t just a menu—it’s a promise: here’s what this product can do, and where to find it. Even if you didn’t understand everything yet, you could see the shape of the thing.
This model has shaped how we design, build, and understand software. Code mirrors it. Design systems align to it. Users learn the layout one screen at a time, like rooms in a house. The structure gives them confidence. Predictability. A sense of place.
But that structure is starting to dissolve.
Not because it’s broken—but because AI gives us a different way.
AI disrupts the spatial model not by moving things around, but by removing the need to go looking in the first place. Instead of waiting in a menu, features can emerge in the moment. Tools don’t need to live in fixed locations—they can arrive just-in-time. The interface becomes dynamic, responsive, even conversational. It’s not about where you are in the app anymore—it’s about what you’re trying to do, and what the system thinks might help.
In many ways, it’s a leap forward: more immediate, more personal, more fluid. But something important gets lost when the map disappears.
Because here’s what’s becoming ever more evident: navigation isn’t just about getting from A to B. It’s how users understand where they are, what’s possible, and how they declare what they need. The “Reports” tab isn’t simply a location—it’s a signpost: this is something you can do. And selecting it is more than an act of navigation; it’s a signal of intent: I want to understand something.
In this light, navigation has always been conversational. It was our way of speaking to the product—of saying, this is what I need, this is what I’m here for. It gave voice to our needs, long before our tools could interpret language or anticipate behaviour. And now that interfaces are learning to listen, these signals are more important than ever.
As interfaces evolve, the core function of navigation comes into sharp relief: helping users know where they are, understand what’s possible, and express what they want.
We don’t need to preserve navigation’s form. We need to carry forward its function.
If AI is going to reshape the structure of software—to replace finding features with features that find you—we’ll need new patterns that do what navigation always did: orient, guide, and respond to intent.
Not because we’re nostalgic for menus, but because people still need to feel at home in the systems they use.
Why Navigation Worked
Navigation worked because it made software knowable.
From early GUIs to mobile apps, we organised interfaces like places. Tabs, menus, sidebars—they weren’t just methods of access. They were structure. They told us where things lived. They made complexity feel manageable by carving the experience into named spaces: “Inbox,” “Settings,” “Reports.” Each one a room in the house.
This approach aligned with how we think. Our brains are spatial by default—we remember locations, build maps, rely on landmarks. Traditional navigation let users orient themselves in unfamiliar terrain. You didn’t need to memorise every feature—just remember the general neighbourhood.
But navigation did more than organise. It grounded users in three essential ways:
Orientation – You knew where you were, and how you got there. Breadcrumbs, tabs, and headers made the interface feel stable.
Possibility – A quick scan of the nav bar revealed what the product could do. It gave off an information scent—a sense of which paths lead to what you’re looking for.
Intent – Clicking into a section wasn’t just movement. It was meaning. You were declaring focus, shifting context, setting the agenda.
This scaffolding shaped more than just the interface. It defined how we built and maintained software: information architecture, routing, even component design. Navigation, and the structure it represented, has been the backbone of how users organise their mental models—and how teams organise their code.
What AI Changes
If navigation helps users learn the layout of our software like rooms in a house, AI doesn’t just move the furniture around—it knocks down the walls.
In traditional software, functionality lived in fixed locations. If you wanted a report, you went to the “Reports” section. If you wanted to update your profile, you might navigate to “Settings.” The system waited for you to find what you needed.
But AI-native products flip that dynamic. They don’t just wait—they respond. Tools and content can appear on the fly, tailored to what the user’s doing or trying to achieve. Instead of navigating through flows, users are met by the right feature at the right moment—sometimes before they even ask.
This shift opens the door to experiences that are:
More immediate – No menus, no searching—just the right thing, right now.
More adaptive – The interface reshapes around context, elevating what matters.
More natural – Conversations replace commands. Goals replace paths.
It’s not just convenient. At its best, it’s exhilarating. It feels alive.
But when the structure disappears, so does the map.
And when the interface keeps changing, it can start to feel like it’s slipping through your fingers.
What Gets Lost When Structure Disappears
When the system reshapes itself in real time, users can lose their footing. Without stable landmarks, it becomes harder to answer basic questions:
Loss of orientation – Without visible pathways or sections, users can’t easily tell where they are within the product—or how they got there.
Hidden possibility – When functionality appears only when triggered by context, users may not realise what’s available. They can’t explore what they can’t see.
Unclear recovery – If a user closes a view, shifts focus, or triggers an unintended state, there may be no obvious way to return to what they were doing.
Weak intent expression – In traditional navigation, choosing a section signals what the user wants. Without that signal, systems rely entirely on prediction—and when predictions fail, users are left without a fallback.
Reduced learnability – Adaptive interfaces can feel unpredictable. When users can’t build reliable mental models, they hesitate, lose trust, or get stuck.
AI introduces fluidity—but it also increases the need for scaffolding. The structure may no longer be visible as sidebars or tabs, but its functions—orientation, intent, and continuity—are more important than ever.
Because here’s the irony:
The more a system anticipates, the more it needs to show its work.
The more it adapts, the more users need something stable to hold onto.
What’s Needed Now
AI may dissolve the visual structures of traditional navigation, but the core user needs remain the same. Regardless of how adaptive or conversational a system becomes, users still need:
Orientation – A clear sense of where they are and what they’re working on.
Intent expression – A way to declare what they want, even when it can’t be precisely articulated.
Discovery – Visibility into what’s possible without having to rely solely on system suggestions.
Continuity – The ability to return to previous tasks or contexts without starting over.
Predictability – Consistent patterns that help users form mental models, even in adaptive environments.
To support these needs, we can’t just port old navigation models into new interfaces.
We need new patterns that serve the same functional purposes—scaffolding intent, maintaining continuity, and making complexity feel manageable.
These fall into two main categories:
Structures — Persistent containers that organise ongoing work and give users a sense of place.
Intent Signals — Lightweight, contextual mechanisms for users to express focus and shape what happens next.
Structures
Persistent containers that provide place, memory, and momentum in adaptive interfaces.
When static navigation disappears, users still need something to return to—something that says, you’re still here, your work still matters, and we remember where you left off. Structures give AI-powered systems continuity, while giving users confidence and clarity.
1. Threads
Think of it as: a conversation that remembers everything.
What it feels like: Picking up a chat with a teammate who already knows the context. You scroll back, see past questions, decisions, and suggestions. Everything’s still there—waiting for you.
Example: You start a thread to explore a marketing budget. The AI helps pull reports, summarise vendor spend, and suggest optimisations. A week later, you return—and the thread still holds all those past insights, with new suggestions to prompt the conversation further.
Emotional replacement: From “Where was that tool again?” to “Let’s pick up where we left off.”
Great when used for: Ongoing problem-solving, multi-step decision-making, or complex tasks that unfold over time.
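Mechanically, a thread can be sketched as a small persistent container: a topic plus an append-only log of entries, with a way to surface recent context on re-entry. A minimal illustration in Python—every name here (`Thread`, `resume_summary`, the example strings) is hypothetical, not taken from any particular product:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Entry:
    role: str      # "user" or "assistant"
    content: str
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class Thread:
    topic: str
    entries: list[Entry] = field(default_factory=list)

    def add(self, role: str, content: str) -> None:
        self.entries.append(Entry(role, content))

    def resume_summary(self, last_n: int = 3) -> list[str]:
        # Surface the most recent context so a returning user can pick up
        # where they left off, rather than starting cold.
        return [e.content for e in self.entries[-last_n:]]

thread = Thread("Marketing budget review")
thread.add("user", "Pull Q2 vendor spend")
thread.add("assistant", "Here is a summary of vendor spend.")
thread.add("user", "Suggest optimisations")
```

The point is less the data structure than the contract: nothing in the log expires, so returning a week later is as cheap as returning a minute later.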
2. Workspaces
Think of it as: a desk with your notes and tools for a specific topic.
What it feels like: A canvas that grows with your thinking. You lay out numbers, notes, feedback, tools. There’s structure, but it’s yours—maybe you started from scratch, or maybe from a template or suggested structure. It doesn’t expire when you switch tabs or refresh.
Example: You open your “Hiring Strategy” workspace. On the left: a talent funnel chart and interview feedback thread. In the center: a draft offer letter and a timeline view. It’s everything you need to move forward—all living together.
Emotional replacement: From jumping between five tools to feeling everything is in one, evolving place.
Great when used for: Domain-specific planning, cross-functional collaboration, or keeping all context in one place.
3. Maps
Think of it as: a bird’s-eye view of what you’re trying to figure out.
What it feels like: Zooming out to see all the paths ahead—each node clickable, expandable, changeable. It’s not about ticking boxes—it’s about seeing options and direction. It could be a mind-map, a timeline, a decision tree, or a flow of creative ideas.
Example: You’re crafting a new product strategy. A “Vision Map” lays out goals, user needs, market risks, and experiment ideas—like a living mind map that guides as much as it reveals. Meanwhile, a separate panel consolidates it all into a written strategy document.
Emotional replacement: From “What step comes next?” to “Now I see the whole picture.”
Great when used for: Planning, exploration, or open-ended problem spaces with branching paths.
4. Playlists
Think of it as: a guided walk through unfamiliar terrain.
What it feels like: A smart checklist that unfolds as you progress. Instead of throwing you into a dashboard, it gently nudges: Do this next. Now try this. Great—here’s the next step.
Example: On your first week in a new role, a “First 30 Days” playlist walks you through connecting tools, reviewing past reports, and meeting key collaborators—tailored to your role and pace.
Emotional replacement: From “Where do I start?” to “I know exactly what to do next.”
Great when used for: Onboarding, training, or structured workflows with a clear progression of steps.
5. Collections
Think of it as: a digital drawer you actually want to open.
What it feels like: A flexible bundle of insights, files, tools, and conversations, all grouped by meaning—not format. You drag things in, pin them for later, revisit them with fresh context.
Example: While prepping for a product launch, you pull together competitor screenshots, a pricing model, and a few saved agent chats into a “Launch Toolkit.” It’s personal, shareable, and grows as you do.
Emotional replacement: From “I saved it, but where?” to “Everything I need is right here.”
Great when used for: Gathering relevant materials, organising across sessions, or sharing a curated set of resources.
These structures don’t replicate old navigation—they replace what it gave us emotionally: context, clarity, and continuity. They’re not about menus. They’re about memory.
Intent Signals
Lightweight, expressive mechanisms that help users guide adaptive systems in the moment—without rigid paths.
These are not just shortcuts or UI flourishes. They are how users participate in the flow of a system that is constantly adapting around them. They help users take control without needing a map.
1. Toolbars / Palettes
Think of it as: a persistent panel or on-demand palette of composable tools.
What it feels like: You don’t go hunting for functions—they’re always one click or keystroke away. Need to summarise a timeline? Drop in a summariser widget. Want to filter a dataset? Drag a tool onto the view. Tools are portable, purposeful, and outcome-oriented.
Example: Like a Photoshop toolbar—but instead of brushes, you have tools like “Summarise,” “Compare,” or “Forecast.” Drag one onto any view, and it works contextually.
Emotional replacement: From “How do I find that feature?” to “This tool is always within reach.”
Great when used for: Making core tools accessible without clutter or switching focus, enabling contextual actions directly on-screen.
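One way to picture a palette is as a registry of small, portable functions that operate on whatever data is currently in view, rather than living behind a fixed menu. A minimal sketch with invented tool names and a toy timeline:

```python
# Each tool is a small, portable function that can be "dropped onto" a view's data.
def summarise(rows: list[dict]) -> dict:
    # Reduce the rows in view to a headline summary.
    return {"count": len(rows), "total": sum(r["value"] for r in rows)}

def compare(rows: list[dict]) -> dict:
    # Compare the newest entry in view against the oldest.
    return {"change": rows[-1]["value"] - rows[0]["value"]}

# The palette maps tool names to functions; any view can invoke any tool.
PALETTE = {"Summarise": summarise, "Compare": compare}

timeline = [{"value": 10}, {"value": 30}, {"value": 25}]
summary = PALETTE["Summarise"](timeline)   # {'count': 3, 'total': 65}
delta = PALETTE["Compare"](timeline)       # {'change': 15}
```

Because the tools only assume a shape of data, not a location in the app, the same palette works on any view that exposes compatible rows.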
2. Expert Agents
Think of it as: an intelligent collaborator with access to all your tools.
What it feels like: You describe a problem, and the agent not only understands—it takes action. It knows what tools are available and how to apply them to achieve your outcome. It’s like bringing in a strategist, operator, and analyst all at once.
Example: You tell the “Growth Advisor” agent, “Help me figure out why Q2 revenue dipped.” It analyses trends, pulls a report, runs comparisons, and suggests next steps—all without you navigating to anything.
Emotional replacement: From “I need to piece this together” to “Someone’s already working on it.”
Great when used for: Tasks that span multiple steps, tools, or domains. Delegated analysis, or requests that require domain-specific expertise.
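Under the hood, an expert agent is essentially a dispatch loop: interpret the request, pick the relevant tools, run them, and assemble the results. A deliberately toy sketch—real agents would use a language model for routing, not keyword matching, and all names here are invented:

```python
from typing import Callable

class Agent:
    """A toy agent that routes a request to registered tools by keyword."""

    def __init__(self) -> None:
        self.tools: dict[str, Callable[[str], str]] = {}

    def register(self, keyword: str, tool: Callable[[str], str]) -> None:
        self.tools[keyword] = tool

    def handle(self, request: str) -> list[str]:
        # Run every tool whose trigger keyword appears in the request.
        return [tool(request) for kw, tool in self.tools.items()
                if kw in request.lower()]

agent = Agent()
agent.register("revenue", lambda r: "Pulled revenue report")
agent.register("compare", lambda r: "Ran quarter-over-quarter comparison")

results = agent.handle("Help me figure out why Q2 revenue dipped")
# results == ["Pulled revenue report"]
```

The user never navigates to a “Reports” section; the agent’s tool registry plays the role the sitemap used to.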
3. Narratives
Think of it as: a structured journal for shaping goals and outcomes.
What it feels like: You write in natural language—like setting a vision, drafting a plan, or capturing a future state. The system interprets that and turns it into suggested actions, decisions, or structures. It’s expressive input with operational output.
Example: You write: “I want to build a predictable client onboarding process that scales.” The system turns that into a draft checklist, a metrics dashboard, and links to automate key steps.
Emotional replacement: From “I’ll start by organising my thoughts” to “I said what I needed—and now it’s taking shape.”
Great when used for: Turning goals or high-level thoughts into action, especially in planning, reflection, or creative workflows.
4. Focus Zones
Think of it as: an ambient memory of what matters to you right now.
What it feels like: The interface remembers what you’ve been working on—and gently brings it back to the surface. You’re never hunting through history or retracing steps. Instead, the system prioritises recent or unfinished work so you can pick up with flow.
Example: You’ve been working on vendor costs. Without asking, your home view shows a “Recent Activity” card linking to that thread, plus a suggested new insight based on updated data.
Emotional replacement: From “Where was that thing again?” to “Right, I was in the middle of this.”
Great when used for: Maintaining continuity, supporting re-entry, or softly prioritising what matters most right now.
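Stripped down, a focus zone is a ranking policy over recent work: for instance, unfinished items outrank finished ones, and newer items outrank older. A minimal sketch with hypothetical fields (`touched`, `done`); a real system would weigh far richer signals:

```python
from datetime import datetime, timedelta, timezone

def focus_zone(items: list[dict], limit: int = 3) -> list[dict]:
    # Rank unfinished work first (False sorts before True), then most
    # recently touched first, and surface only the top few.
    return sorted(items, key=lambda i: (i["done"], -i["touched"].timestamp()))[:limit]

now = datetime.now(timezone.utc)
items = [
    {"title": "Vendor costs thread", "touched": now - timedelta(hours=2), "done": False},
    {"title": "Q1 retro",            "touched": now - timedelta(days=30), "done": True},
    {"title": "Hiring workspace",    "touched": now - timedelta(days=1),  "done": False},
]
surfaced = focus_zone(items)
# surfaced[0]["title"] == "Vendor costs thread"
```

The design choice worth noting: the user never asks for this list. The policy quietly decides what “right now” means and keeps it one glance away.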
5. Soft Starts
Think of it as: a helpful nudge toward meaningful action.
What it feels like: You’re not dropped into a blank state—you’re given a lightly pre-filled starting point. It doesn’t assume too much. It just lowers the barrier to entry and gives you something to react to.
Example: You log in and see a card: “Start Q3 Planning”—with a draft workspace partially built based on recent goals, past work, and team inputs. You can edit, accept, or ignore it entirely.
Emotional replacement: From “I don’t know how to begin” to “This gives me a head start.”
Great when used for: Reducing friction in getting started, suggesting next steps when continuing, or seeding templates or task flows to get unstuck.
6. Fast Pathways
Think of it as: autocomplete for actions—refining intent as you go.
What it feels like: You start by typing or selecting a vague goal. The system progressively narrows options, like a conversation. It might look like a navigation menu, but it’s structured around user intent, not application layout.
Example: You begin with “Manage finances,” then choose “Optimise spending,” then “Compare vendors.” In seconds, you’re exactly where you need to be—no sitemap required.
Emotional replacement: From “I’ll figure out where to go” to “The system helps me get there faster.”
Great when used for: Offering menu-like guidance based on goals, allowing vague inputs to sharpen into specific flows, or shortcutting to outcomes when users can’t articulate their needs specifically.
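Structurally, a fast pathway is a tree of intents rather than a tree of screens: each choice narrows the options until a concrete destination remains. A small sketch with made-up goals (a real system would generate and rank these branches dynamically):

```python
# Each vague intent narrows to more specific options; None marks a concrete
# destination with nothing left to refine.
PATHWAYS = {
    "Manage finances": {
        "Optimise spending": {
            "Compare vendors": None,
            "Review subscriptions": None,
        },
        "Forecast cash flow": None,
    },
}

def narrow(tree: dict, *choices: str) -> list[str]:
    """Walk the intent tree and return the options at the current step."""
    node = tree
    for choice in choices:
        node = node[choice]
    return sorted(node) if node else []  # leaf reached: nothing left to narrow

step1 = narrow(PATHWAYS, "Manage finances")
# ['Forecast cash flow', 'Optimise spending']
step2 = narrow(PATHWAYS, "Manage finances", "Optimise spending")
# ['Compare vendors', 'Review subscriptions']
```

It can render exactly like a menu, but the hierarchy mirrors what the user is trying to do, not where the features happen to live.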
7. Return Point
Think of it as: your personal home base.
What it feels like: No matter how complex the system gets, there’s always one familiar place to return to—your most recent context, focus, or thread. It’s persistent, visible, and centred on continuity.
Example: You reopen the product and see your top thread, recent workspaces, and a “Since you left” card with anything the system processed while you were away. You’re not lost—you’re already back.
Emotional replacement: From “I have to retrace my steps” to “Let’s continue right here.”
Great when used for: Providing a consistent starting place, reinforcing memory, or restoring user flow after time away.
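At minimum, a return point is a persisted snapshot of the user’s last context that the next session restores, with a neutral home state as the fallback when nothing was saved. A minimal sketch using a local JSON file; the schema (`view`, `thread`) is invented for illustration:

```python
import json
import tempfile
from pathlib import Path

def save_return_point(path: Path, context: dict) -> None:
    # Persist the user's last focus so the next session can restore it.
    path.write_text(json.dumps(context))

def load_return_point(path: Path) -> dict:
    # Fall back to a neutral home state if nothing was ever saved.
    if not path.exists():
        return {"view": "home", "thread": None}
    return json.loads(path.read_text())

state_file = Path(tempfile.gettempdir()) / "return_point_demo.json"
save_return_point(state_file, {"view": "thread", "thread": "Marketing budget"})
restored = load_return_point(state_file)
```

However adaptive the rest of the interface is, this one contract stays fixed: opening the product always lands you back in your own context.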
Together, these signals replace the role navigation used to play: not by sending users somewhere, but by helping them move forward—with clarity, confidence, and control.
Bringing It Together
Structures and Intent Signals do different jobs—but they’re deeply intertwined.
Structures provide place. They help users stay anchored across time, tasks, and context. They’re the rooms, the surfaces, the containers of work.
Intent Signals provide direction. They help users express what matters right now. They’re the gestures, the prompts, the moments that move things forward.
In a well-designed AI-native product, these two layers work together seamlessly:
A workspace gives you a persistent home for your Q3 planning.
A soft start nudges you into motion with a draft outline.
You add a thread to explore a new campaign idea.
A palette offers quick access to tools—compare, forecast, tag.
You invite an agent to model budget scenarios.
The return point remembers exactly where you left off next time you come back.
It doesn’t look like traditional navigation. But it gives users the same confidence:
I know where I am. I know what I can do. I know how to move forward.
Wrapping Up
We’re not just redesigning navigation—we’re rethinking how people find their way through software.
AI-native products change the shape of the interface. They adapt, predict, respond. But no matter how fluid the system becomes, users still need to feel oriented. They still need a way to express intent, return to their work, and see what’s possible.
The old scaffolding—tabs, menus, sidebars—gave us structure. But its greatest value wasn’t in the way it laid out spaces. It was in the way it helped users say, “Here’s what I want”, and “Here’s where I am”.
As we move into more adaptive, dynamic products, we don’t need to preserve the visual forms of traditional navigation. But we do need to carry forward its purpose:
To anchor users when surrounded by complexity
To give them control over their journey
To make the system feel like a place—not just a stream of moments
That said, some form of navigation will always be needed.
There will always be quick tasks that benefit from clear paths: updating your profile, changing a password, checking notification preferences. There will always be users who are new, and need a stable surface to begin. Even the most dynamic, AI-native systems will need some kind of backbone—an entry point, a rhythm, a frame. And hey, there will still be moments where users throw up their hands and say “Just give me an escape hatch”.
We’re not replacing all of navigation.
We’re evolving most of it.
And what emerges can be better: more contextual, more expressive, more attuned to the way people think and act in the moment. With new patterns—structures that provide continuity, and signals that guide intent—we can build experiences that are responsive and trustworthy. Fluid and grounded.
Because the goal isn’t just to make software that reacts.
It’s to make software that listens.
That supports exploration.
That helps people feel at home—even when there’s no map.