
Data-driven off a cliff: why dashboards are dead
Dashboards used to be the command centre for product teams. The playbook was simple: instrument events, define funnels, build dashboards, watch the numbers.
Something going up and to the right? You're cheering.
Something dips? You investigate.
But the way product teams access their key metrics and insights has always been riddled with problems.
- Insights live with the wrong people. Most dashboards are built for analysts, not the PMs and designers who are actually making product decisions. They rely on event tracking rather than visual context, which means the people closest to the product are left second-guessing whether they're even looking at the right thing.
- Tracking can't keep up with shipping. Configuring a dashboard means setting up event tracking, which means going through engineering or data analysts. So as your product ships faster, the gaps widen. Events go untracked. Features go unmonitored. Key issues and opportunities slip through the cracks quietly, with nothing in your tooling to flag them.
- Getting answers is painfully slow. You have questions about how users are experiencing your product but no clear path to answers. Dashboards are fragmented across tools, often reporting conflicting numbers on the same part of the product. And if you want to dig deeper, you write a ticket for the data team. It sits in the queue for two weeks. By the time the answer comes back, the context is gone and the decision has already been made.
And this is the tip of the iceberg.
Yet teams are still making their biggest product decisions in this outdated way.
Dashboard delay
Dashboards are retrospective by nature.
By the time a metric dips enough for you to notice, the problem has already been impacting thousands, sometimes millions, of users. The data you're looking at is an aggregate. It smooths out the edges.
So that conversion drop you spotted on Monday?
It probably started two weeks ago with a small subset of users hitting a broken state you didn't instrument. It grew quietly until the numbers were bad enough to show up in your weekly review. Dashboards don't surface problems. They confirm them, long after the damage is done.
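The smoothing effect is easy to see with toy numbers. In this sketch (the segments, traffic split, and rates are all hypothetical), mobile conversion collapses by more than half, but because mobile is only 10% of traffic, the headline rate barely moves:

```python
def conversion(signups, visits):
    return signups / visits

def overall(segments):
    # Aggregate across segments: total sign-ups over total visits.
    signups = sum(s for s, _ in segments.values())
    visits = sum(v for _, v in segments.values())
    return signups / visits

# Hypothetical weekly numbers as (signups, visits) per segment.
week1 = {"desktop": (900, 9000), "mobile": (100, 1000)}
# Mobile breaks: conversion there falls from 10% to 4%.
week2 = {"desktop": (900, 9000), "mobile": (40, 1000)}

print(f"overall week 1: {overall(week1):.1%}")                 # 10.0%
print(f"overall week 2: {overall(week2):.1%}")                 # 9.4%
print(f"mobile week 2:  {conversion(*week2['mobile']):.1%}")   # 4.0%
```

A 60% drop in one segment shows up as a 0.6-point dip in the aggregate, which is exactly the kind of wobble a weekly review shrugs off as noise.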
That dashboard you set up is only as good as your foresight. You only measure what you thought would matter when you first built the feature. That's a lot of pressure, and most teams get it wrong, missing vital insights.
Dashboards show the symptom, not the cause
A dashboard is really good at telling you something is wrong.
Conversion dropped. Retention softened. Activation is down. Revenue per user dipped. Your funnel is leaking.
Cool. Now what?
We all know the drill. Your manager or founder messages you on Slack: "What's going on here?" You and your team go on a wild goose chase trying to get to the bottom of it. If you're lucky, you'll figure it out the same day.
But in most cases you'll spend days digging through data, writing queries, and pulling in engineers before you find the root cause.
Dashboards make this diagnosis particularly hard because there's no product context. Nothing showing you the exact part of the product where the issue happened, or which users were affected.
But if you had a visual of the impacted sessions, like a recording of what users actually experienced, you might spot the issue straight away. Something as simple as a cookie banner sitting over the sign-up button on mobile.
A picture tells a thousand words. A dashboard gives you a number.
Dashboards lack visual context
Product builders are visual people.
Events and charts are a foreign language used to describe something inherently visual.
There are product UI quirks, bugs, and localisation issues hiding behind every metric you're trying to make sense of. It's impossible to get the full picture from a chart, let alone a true understanding of what your users actually experienced.
That language breakdown creates information silos and decision-making complexity. You and your team can all walk away with a different idea of which part of the product the dashboard is even referencing. Is "upgrade_sidebar_btn" on the settings page or the billing modal? Depends who you ask.
The classic version of this: a metric drops and the team huddles around a chart. They slice it by device, region, cohort, day of week. They debate seasonality. Someone pulls a comparison to last quarter. An hour in, nobody has a definitive answer. Then someone loads up the product in a test account and sees it immediately. A recent release broke a core flow in a way that was obvious the second you looked at it. The chart could never have told you that.
Data reporting hasn't kept up with how product building has evolved, and it's costing teams.
The local maxima trap
Optimising what is measurable can look like progress while the experience quietly worsens.
Your activation rate ticks up. Retention looks stable and the team feels good about the numbers. But somewhere in the product, users are hitting a flow that's confusing, clunky, or slightly broken in a way that never surfaces cleanly in a graph. You've optimised yourself into a corner. The chart says you're winning. The product tells a different story.
You keep improving the things you can see, while the things you can't see quietly compound. And by the time the metrics catch up to the real experience, you've spent months moving in the wrong direction.
AI means an answer-first world
There’s a behavior shift happening that analytics tools are completely ignoring.
People no longer want to navigate information.
They want to ask questions.
AI has rewired expectations for everyone. People want to ask:
- “Why did activation drop for new users in Germany last week?”
- “Which screen is causing the biggest drop-off on iOS?”
- “What changed after we shipped the new pricing flow?”
- “Who is most impacted by this regression?”
And they want an answer quickly.
They don’t want to click through dashboards, drill into filters, memorize event taxonomies, and reverse-engineer the product like it’s a spreadsheet.
Getting an answer from a dashboard means navigating to the right chart, interpreting what you're looking at, forming a hypothesis, and then opening three more dashboards to test it.
The modern expectation is: question → answer.
But the shift goes further than just asking questions.
With AI, you don't even have to ask. Agents can run 24/7, continuously surfacing anomalies, spotting issues, and flagging moments that need attention before anyone thought to look.
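The "flagging before anyone thought to look" part doesn't require anything exotic. A minimal sketch of the idea (not Adora's implementation; the metric series and thresholds here are made up) is a rolling baseline check that raises a flag whenever today's number sits far outside the recent norm:

```python
import statistics

def flag_anomalies(series, window=7, threshold=3.0):
    """Return indices of points that sit more than `threshold`
    standard deviations from the mean of the preceding `window` points."""
    flags = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline)
        if stdev and abs(series[i] - mean) > threshold * stdev:
            flags.append(i)
    return flags

# A daily activation rate that quietly breaks on day 10.
daily = [0.31, 0.30, 0.32, 0.31, 0.30, 0.31, 0.32, 0.31, 0.30, 0.30, 0.22]
print(flag_anomalies(daily))  # [10]
```

Real systems layer on seasonality, segmentation, and alert routing, but the shift is the same: the tool watches the numbers so a human doesn't have to remember to.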
The best teams aren't digging for insights anymore. They're being alerted to them. Analytics data no longer has to sit dusty in a dashboard waiting for someone to remember to check it.
Speed changed the game
Three years ago, writing code was something reserved for engineers.
Now it's something anyone can pick up. PMs, designers, founders, and marketers are all starting to ship directly into production.
That's mostly a good thing. Shipping velocity has never been higher.
But codebases are accumulating bugs, broken flows, and untracked events faster than teams can keep up.
People who were never trained to think about instrumentation are merging code that affects user journeys nobody is measuring. Event tracking can't keep pace with the rate of change.
Teams fly blind on metrics that used to be reliable, and the gaps compound quietly with every release.
Traditional analytics tools haven't caught up to the new way of work.
Manual instrumentation made sense when a small, disciplined engineering team controlled every line of code. It doesn't make sense anymore.
Product teams need analytics that automatically detect new screens, flows, and UI changes as they ship, because the pace of modern product development has made manual tracking structurally difficult.
The future of product building
This is exactly why we built Adora.
We kept running into the same problem. The data was there, but the shared understanding wasn't. You'd stare at a chart, know something was off, and then spend days trying to connect the dots between a metric and the actual product experience.
We thought, what if you could just see it? Your live product experience, mapped out with real screenshots of your product, overlaid with analytics and insights on what's actually happening.
A visual map of how users move through your product. Where they get stuck. Where they drop off. And why. With the ability to prompt and ask questions about your product experience and user behaviour, right there in the moment.
Dashboards are dead. Visual context is the future.