
Founding ThoughtSpot - Why We Picked The BI Market and The Data Access Problem

In my ten years in the Valley, I have learned that a fairly reasonable order in which to approach things when starting a company is:

  1. Team first
  2. Market second
  3. Problem third
  4. Solution (aka idea) last

There are many paths to success and there is no one “right way” of doing things, but this sequence has worked well for me. In 2012, we (me, our co-founder/CTO Amit Prakash, and our co-founding team) spent a good part of the year forming the founding team, studying the business intelligence market, and learning about the problems that plagued it. Even though we had recently raised our $3.5M seed round from my second-time partner-in-crime Ravi Mhatre at Lightspeed (Ravi also funded my first startup, Nutanix, which went on to become the largest tech IPO of 2016), we did not start building our product until almost nine months later. We wanted to pick the right market and the right problem before we started building a solution.

I was exposed to the analytics market during my time at Oracle and then even more deeply at Aster Data, one of the first “big data” companies. According to industry analysts, the world spends over 60 billion dollars every year on BI software and implementation services. If we expand this to also include ETL and data warehousing, which represent the full pipeline required to turn business transactions into analytical insight, the world has spent CLOSE TO A TRILLION DOLLARS over the past decade trying to make this pipeline work. This includes about $20B per year for BI and $80B per year for other data management technologies such as data integration, data migration, and data warehousing—and that doesn’t even count the services required to implement and manage complex BI software, or the hardware those technologies run on! The last company to make a meaningful dent in this market, Tableau, was started in 2003, based on the PhD thesis of Chris Stolte, who was at Stanford from 1997 to 2003. These two facts alone (a lot of money being spent, and the last big company being a decade old) told us there was an opportunity to build something big and disruptive in this market. I have the utmost respect for the impact Tableau has had on the analytics market. Unlike the plethora of copycat BI players I see in the market, they had original thoughts and turned them into a company that has made a big impact on the world.

Digging into the Pain

As we dug deeper, we only became more convinced. Despite the big dollars being spent, getting access to data was still a pain—it was still taking people days, if not weeks, to get a report or a dashboard out of their BI tools. A ton of money was being spent on BI products and services, the technology was becoming increasingly outdated, and the industry had a very pervasive problem: the end consumers of data (aka the “humans”) were still very frustrated because they were dependent on experts (analysts).

The third thing that convinced us to go after the BI market, and specifically the problem of data accessibility, was the magnitude of change since Tableau was founded. One change was the scale of data—much ink has been spilled on the scale of data being generated, so I’m going to skip that. But a much bigger change happened between 2003 and 2012 in the consumer technology space: everyone, including the “humans” needing data at work, was consuming information through Google, Facebook, LinkedIn, Kayak, and other consumer services, with ZERO dependency on experts. They could go to their favorite website and just find what they were looking for. If their first search did not yield exactly the coffee shop or the shoes they wanted, they would simply change their question. The cost of changing the question was zero. We saw this change in human expectations of technology as an opportunity—why couldn’t BI be like that? What if we could make data consumption as frictionless as the consumer web? So we fully committed ourselves to this market and set out to define how we’d approach the market and the problem.

Why ThoughtSpot: Data As A “Human Scale” Problem

For the last decade, the broader data industry has primarily focused on data scale as the problem to solve, as if just being able to manage a lot of data is going to create value by itself. This has resulted in a lot of “big data” projects failing and organizations being underwhelmed with the ROI of these projects. 

At ThoughtSpot, we have a different take on data. We believe that data is actually a human-scale problem: how does the world get data into the hands of almost a billion knowledge workers right when they need it, and not two weeks later?

Imagine you are planning a trip to Dallas and need to find a nice restaurant where you can meet your client. Now imagine filling out a form to find the restaurant and submitting it to Google with all the details you can think of (neighborhood, preferred cuisine, etc.). Then someone from Google gets in touch with you a week later: “Thank you for submitting your request to search for a restaurant. I was wondering if you’d like formal dining or something more casual?” You’d be done at that point. You would stop asking Google.

This is where the BI world is today: the humans (marketing, HR, finance, operations, and sales professionals who are not data experts) have to depend on an expert to craft the dashboard that will answer their business question. We believe that to solve the problem fundamentally, it has to be seen as a human-scale problem and not as a data-scale problem alone.

Don’t get me wrong—solving for human-scale data access implies solving for data scale by definition, but data is only a means to an end; the end is to help humans do their jobs better. As we approached the problem from this other end, we realized that the solution would end up being quite different from what the industry was trying to do—creating big data lakes, throwing a bunch of data in, connecting existing BI tools to them, and hoping for the best. Being able to store a lot of data in a data lake does not matter by itself. You might have a petabyte of data, or maybe even two, but if only a handful of experts know how to access it, it will have a very limited impact on your business. On the other hand, if you gave true data access to a million human beings—whether your employees, customers, or partners—even if it was only a hundred gigabytes, it would have a huge impact on your business.

The Platform Starts Taking Shape

In the summer of 2012, we set out to build a company that would solve this very problem: instant access to data at human scale. In the six months that followed, we fleshed out our solution concept and realized that there was a problem—we would have to build three startups inside one to really do it well. And we did want to do it well. In my next post, I will share what we learned in those six months and how we took a “platform up” approach to building our product.