If you've ever built what you thought was the perfect dashboard only to have users immediately ask "Can I export this to Excel?", you know the pain of legacy BI.
During the Nightmare on BI Street webinar, Steffin Harris, Field Chief Data and AI Officer at ThoughtSpot, and Andrew Turner, Senior Manager of Process Insights at Ecolab, shared real stories of BI horror.
What emerged from their conversation wasn't just another vendor pitch, but a practical roadmap for escaping the legacy BI trap. Here's what business leaders, data leaders, and analysts need to know about modernizing analytics without the nightmares.
The Real Cost of Legacy BI Nightmares
Ecolab's journey from legacy BI chaos to modern analytics provides a concrete example of what transformation looks like in practice. As a global sustainability leader serving customers in more than 40 industries, Ecolab needed analytics that could scale across diverse business units and geographies.
Legacy BI systems like Tableau and Looker create a vicious cycle that many organizations know all too well.
As Andrew Turner from Ecolab explained, their primary legacy BI tool simply "couldn't keep up" with data size and volume, forcing them to split semantic models into multiple regional and ERP variants. This fragmentation led to inconsistent metric definitions across regions and endless time spent reconciling numbers instead of generating insights.
The human cost was equally devastating. Business users who weren't familiar with desktop BI tools had to rely on analysts for even simple changes like switching a field filter or adjusting colors. This dependency created what Turner described as "a backlog of support needs that just continues and never really seems to end."
"You can only do so much if, a, trust in the data isn't there, and then, b, that the solution getting built isn't flexible enough to handle the next question that the users are gonna have," Turner noted, capturing the fundamental flaw in traditional BI approaches.
The symptoms are universal across organizations:
Analytics teams spending more time on change requests than on strategic analysis
Users reverting to Excel because dashboards can't answer follow-up questions
Multiple versions of "truth" across different systems and regions
Weeks or months required to build new analytical models and train users
Executive frustration with inconsistent KPIs and delayed insights
These aren't just technical problems—they're business problems that prevent organizations from becoming truly data-driven. The solution requires rethinking how analytics platforms should work.
Building Trust Through Transparent, Governed Analytics
Trust forms the foundation of any successful self-service analytics initiative, yet it's often the first casualty of legacy BI implementations. As Steffin Harris emphasizes:
"When thinking through what matters most, trust in the data should be top priority. Accuracy of that data comes along with the trust, but also thinking through how do we actually deliver an experience, in a way that offers that flexibility to your end users."
Modern analytics platforms address trust through transparent governance and semantic layers that eliminate the "AI black box" problem. Unlike traditional BI tools that hide their logic, agentic analytics platforms like ThoughtSpot's Spotter provide explainable insights backed by governed data models.
The key is establishing what Turner calls "a solid foundation" through proper data modeling. While this might seem like slowing down initially, it enables organizations to scale analytics across multiple datasets and geographies without losing consistency.
As Turner put it: "Sometimes you need to slow down a little bit to ultimately end up speeding up."
For Ecolab, this meant consolidating their most important KPIs, metrics, and SLAs into their data warehouse and modeling them for broader business access. The result was unprecedented visibility across multiple ERPs and geographies without losing the granularity of order-level and transaction-level detail.
Practical steps for building trust include:
Implementing unified semantic models that ensure consistent metric definitions across all systems
Establishing clear data lineage and governance policies that users can understand and verify
Creating transparent AI explanations that show how insights are generated
Providing audit trails and SOX compliance capabilities for regulated industries
Enabling business users to validate results against source systems easily
Trust isn't built overnight, but organizations that invest in transparent, governed analytics see dramatically higher adoption rates and more confident decision-making across all levels.
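To make the "unified semantic model" idea concrete, here is a minimal sketch of the pattern: every metric is defined once in a shared registry, and every report resolves metrics through it, so "net revenue" cannot drift between regions. The metric names and formulas are illustrative assumptions, not ThoughtSpot's or Ecolab's actual definitions.

```python
# Hypothetical semantic-layer sketch: one governed formula per metric,
# reused by every dashboard and region. Formulas here are illustrative.
METRICS = {
    "net_revenue": lambda row: row["gross_sales"] - row["returns"] - row["discounts"],
    "return_rate": lambda row: row["returns"] / row["gross_sales"],
}

def compute(metric_name, row):
    """All reports resolve metrics through this single registry,
    so a definition change happens in exactly one place."""
    return METRICS[metric_name](row)

# Two regions, one definition -- no per-region Excel variants to reconcile.
emea_order = {"gross_sales": 1000.0, "returns": 50.0, "discounts": 20.0}
na_order = {"gross_sales": 800.0, "returns": 40.0, "discounts": 0.0}
print(compute("net_revenue", emea_order))  # 930.0
print(compute("net_revenue", na_order))    # 760.0
```

The point of the pattern, whatever tool implements it, is that consistency comes from a single definition, not from analysts reconciling copies after the fact.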
Empowering Self-Service Without the Chaos
True self-service analytics means empowering business users to get answers without creating chaos for data teams. The challenge lies in balancing flexibility with governance—giving users the freedom to explore while maintaining data quality and consistency.
Turner's experience at Ecolab illustrates this balance perfectly. Before ThoughtSpot, their analytics landscape "looked a little bit more like the wild west" with Access databases, PowerPoint results, and what he called "the daily change, Frankenbeast of Excel documents." Everything was manual, and there was constant confusion about which numbers were correct.
The transformation came through empowering users with governed self-service capabilities.
Modern analytics platforms achieve this through natural language search capabilities that let users ask questions in plain English while maintaining governance guardrails. Spotter's search-based tokens, semantic layer, and ranking algorithms ensure accurate results with zero hallucinations.
The business impact is immediate and measurable:
Reduced BI backlogs as users can modify dashboards and explore data independently
Faster decision-making with insights available in seconds rather than days
Higher user adoption rates when analytics tools feel intuitive and responsive
Freed-up analyst time for strategic work rather than routine report modifications
Improved data literacy as more users engage directly with data
The key is ensuring that self-service doesn't mean self-destruction. Users need the freedom to explore within governed boundaries that maintain data quality and organizational standards.
Ecolab's Transformation: From Manual Chaos to Automated Insights
Ecolab's transformation focused on critical business processes, particularly the month-end close cycle that every finance organization knows well. Before the shift to automation, the month-end close was a manual grind: Excel-heavy workflows, disconnected data, and constant back-and-forth to surface key metrics on time.
The new approach automated much of this manual work. Key measurements that previously required "hunting down someone for a KPI" and getting information "into a deck" by day five are now available through automated alerts and notifications. Users receive metrics directly in their inbox and notification bar, eliminating the chase for critical financial data.
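The "metrics in your inbox" pattern described above can be sketched in a few lines: after a data refresh, compare each governed KPI to its alert threshold and notify the owner on a breach, rather than someone chasing down numbers for a deck. The KPI names, thresholds, and notify hook below are hypothetical stand-ins, not ThoughtSpot APIs.

```python
# Hypothetical alerting sketch: push breached KPIs to their owners
# after each refresh. Metric names and thresholds are illustrative.
def check_kpis(kpis, thresholds, notify):
    """Return the KPIs that fell below target, notifying for each breach."""
    breaches = []
    for name, value in kpis.items():
        limit = thresholds.get(name)
        if limit is not None and value < limit:
            notify(f"ALERT: {name} is {value}, below target {limit}")
            breaches.append(name)
    return breaches

sent = []
breached = check_kpis(
    kpis={"days_sales_outstanding": 52, "on_time_billing_pct": 91.5},
    thresholds={"on_time_billing_pct": 95.0},
    notify=sent.append,  # stand-in for an email/notification service
)
print(breached)  # ['on_time_billing_pct']
```

The design choice worth noting is the inversion: instead of users pulling metrics on day five, the system pushes only the exceptions that need attention.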
One particularly impressive example involved complex billing analysis for finance users. With legacy processes, the model building, visual creation, training, and rollout would have taken weeks, if not months. With ThoughtSpot, they completed the same analysis much faster while also meeting SOX compliance requirements—a critical consideration for public companies.
The results speak for themselves:
Eliminated manual month-end reporting processes that previously took days
Consolidated multiple ERP systems into unified analytics views
Reduced complex billing analysis from months to weeks
Achieved SOX compliance while accelerating business insights
Gained executive adoption and support through improved visibility
Created a foundation for scaling analytics across global operations
"By leveraging the data in this way, we've seen some of the additional executive adoption and usage improving behind that," Turner noted. This executive buy-in became crucial for driving broader organizational change and encouraging other business units to adopt similar approaches.
The Power of Spotter: AI-Native Analytics That Actually Work
While many vendors have bolted AI onto existing BI platforms, ThoughtSpot built Spotter as an AI-native agent from the ground up. This architectural difference matters significantly for organizations seeking reliable, trustworthy AI-powered analytics.
The platform's "no dead ends" philosophy means users can launch conversations from dashboards, mobile devices, or direct queries, then seamlessly move from insight to action. For example, if analysis reveals an impending stock-out, users can immediately trigger transfers between locations or send alerts—all within the same interface.
Spotter delivers fast time-to-value with minimal training. The platform quickly learns your business terminology and context, so users don’t need to memorize exact field names or data structures. That makes it easy for them to start asking real questions in their own words—driving faster adoption, more accurate results, and greater confidence in the insights they uncover.
Key capabilities that differentiate AI-native analytics:
Zero hallucinations through comprehensive guardrails and semantic understanding
Natural language queries that understand business context and terminology
Seamless transitions from questions to insights to actions within workflows
Real-time compilation and query execution for immediate results
Integration with existing data stacks, including Databricks, Snowflake, and dbt
Mobile-first design for analytics anywhere, anytime
"Being able to start a Spotter conversation from an answer on an already verified liveboard adds a lot more confidence into the responses that you're getting," Turner explained. This approach bridges the gap between traditional dashboards and conversational AI, giving users the best of both worlds.
Change Management: From Skeptics to Superfans
Even the best technology fails without proper change management. Turner's experience at Ecolab provides valuable lessons for organizations planning their own analytics transformation.
Initial training efforts focused on generalized approaches, but Turner found these lacked "stickiness" with users.
The breakthrough came through tailored, function-specific training using actual business data, and hands-on practice proved essential. Rather than just watching demonstrations, users needed to perform tasks themselves during training sessions.
Turner emphasized: "Actually doing it themselves and kind of going through the process, and having to think about it, I think really helped a lot with that as well."
Successful change management strategies include:
Function-specific training using real business scenarios and data
Hands-on practice sessions where users complete actual tasks
Identification and support of power users who become peer advocates
Demonstration of how new tools solve existing pain points
Executive sponsorship and visible usage of analytics tools
Patience with the learning curve while maintaining momentum
Building Your Escape Route from BI Nightmares
The path from legacy BI nightmares to modern, agentic analytics isn't just about technology—it's about transforming how your organization thinks about and uses data. The experiences shared during Nightmare on BI Street provide a clear roadmap for this transformation.
The difference between truly agentic analytics and traditional BI with AI add-ons will only become more pronounced as organizations demand faster, more reliable insights.
Ready to escape your own BI nightmares and build analytics that actually work? Book your demo today.