Four Steps for Deploying a New Analytics Architecture

Using data to understand your clients has been the holy grail of analytics teams for decades. Until recently, however, the inability to collect and reconcile data across all the various touchpoints clients have with your brand has impeded these efforts. In our increasingly digital world, everything has changed. Data from nearly every interaction can be collected and catalogued, thrown into a data lake, and analyzed to understand even the most minute behaviors.

But for every solution this new world of Big Data proffers, it creates a brand-new set of problems for analytics teams and the businesses they serve. From multiple executives running disparate Big Data projects to siloed data never making it into the data lake, Big Data projects are not easy. The biggest challenge, though, comes from figuring out, and then actually deploying, a new analytics architecture. I should know: I experienced some of these problems firsthand when we launched our own Big Data initiative at a Fortune 500 financial services organization. The goal was deceptively simple: leverage Big Data so that everyone, regardless of where they sit in the organization, has a single, cohesive view of any client. Easier said than done.

So we set out, first by creating a data lake that would store all of our client data - from our CRM system, web usage stats, transaction records, client surveys, events, phone calls, and more - in one spot. We knew we needed a centralized analytical store that could fulfill our Big Data needs in an enterprise context. After evaluating various products, we decided to use Hadoop.
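To make that first consolidation step concrete, here is a minimal sketch in PySpark. The HDFS paths, file formats, and source names are hypothetical placeholders, not our actual systems; the point is simply that every raw source lands in one governed location.

```python
# Minimal PySpark sketch: land disparate client data sources in one
# central store. Paths and formats below are hypothetical examples.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("client-data-lake").getOrCreate()

# Read each source in whatever raw format it arrives in.
crm = spark.read.option("header", True).csv("hdfs:///raw/crm/")
web = spark.read.json("hdfs:///raw/web_usage/")
txns = spark.read.parquet("hdfs:///raw/transactions/")

# Write everything to one governed location, organized by source,
# so every downstream tool reads from the same place.
for name, df in [("crm", crm), ("web_usage", web), ("transactions", txns)]:
    df.write.mode("overwrite").parquet(f"hdfs:///lake/clients/{name}/")
```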

As we began rolling out the project, the technological tectonic plates shifted again. This time, it was the move to the cloud. As the economics and ease of use of the cloud became apparent, we realized storing everything locally wouldn’t work. So began the second leg of our analytics architecture adventure: moving our data to Amazon Web Services.
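In practice, a migration like this means copying on-premises extracts up to cloud storage. Here is a minimal, hypothetical sketch using boto3, with placeholder bucket and file names; for bulk Hadoop-to-S3 migration, a tool like Hadoop’s distcp writing to an s3a:// target does the same job at scale.

```python
# Minimal boto3 sketch: copy one extracted file from the on-premises
# lake up to S3. Bucket, key, and file names are hypothetical.
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    Filename="/exports/clients/crm/part-00000.parquet",  # local extract
    Bucket="example-client-lake",                        # placeholder bucket
    Key="lake/clients/crm/part-00000.parquet",
)
```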

It wasn’t easy, but now, three years in, we’re finding tremendous value in the project while continuing to optimize for ROI. As we progressed on this journey, I found four key lessons that can make any new analytics architecture initiative not only more successful, but also smoother and more cost-effective.

Step One: Educate Your Business

Undertaking any major project in a large company requires educating key stakeholders and executives to ensure the project runs as smoothly as possible. Deploying a new analytics architecture requires the same alignment, or you risk catastrophe. 

In my own experience, not getting every business unit leader on the same page led to different groups blindly copying data from various repositories into their own Hadoop clusters. Everyone struck out on their own, seeking to reap the benefits of a Big Data project without understanding the complexity of such an initiative. This meant that when leaders went to analyze the data, different platforms gave different answers, decisions rested on inconsistent data, and the whole project was jeopardized. As we moved to AWS, teams across different business units repeated the same mistakes, compounding the problem.

By educating business leaders about the security risks and data redundancies that result when teams take unilateral action, and by getting them to agree that these problems must be solved (ideally before they crop up), you can mitigate many of the issues I faced in rolling out a new analytics architecture.

Step Two: Build a Central Infrastructure

If you’re planning to deploy a new analytics architecture, building a central infrastructure sets you up for success. In our case, we had to decide between running Hadoop on premises and going to the cloud. Determining what works best for your business requires careful consideration, but don’t be fooled into thinking the perceived economics of a distributed infrastructure offset the impact it will have on your analytics initiative. You’ll end up with multiple answers to the same question, data inconsistencies, and security risks - basically, a mess no one wants to keep cleaning up.

Instead, begin with a centralized repository for all your client data. This could be Hadoop, a cloud provider like AWS, or an enterprise data warehouse on premises. Once you’ve created this repository, you can think about overlaying different tools, such as BI reporting or advanced analytics platforms, on top of it. Only then do you want to expose the data - and by extension, the insights - to various business units and individuals. This centralized infrastructure is the backbone upon which everything else is built.
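One way to picture “tools on top of one repository” is to register the central store once and let every tool query the same table. A minimal sketch in Spark SQL, where the path, table name, and client_id column are all hypothetical:

```python
# Minimal sketch: one canonical table definition for the whole company.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("central-store").getOrCreate()

# Define "clients_crm" once, pointing at the central lake location.
spark.sql("""
    CREATE TABLE IF NOT EXISTS clients_crm
    USING PARQUET
    LOCATION 'hdfs:///lake/clients/crm/'
""")

# A BI dashboard, a data scientist's notebook, and an ad hoc query
# all hit the same table, so they all get the same answer.
spark.sql("SELECT COUNT(DISTINCT client_id) FROM clients_crm").show()
```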

Step Three: Select Your Solution

Once you’ve educated your company and established a centralized infrastructure, you’re ready to select the solution best suited to your business to layer on top of your data repository. Begin by identifying vendor solutions, and seek out those that fill gaps specifically tied to business value. While the ability to whip up a slick data model or leverage predictive analytics might seem impressive, a solution that doesn’t squarely address gaps in business performance won’t impress the rest of your company in the long run.

A warning here: avoid vendors that provide black-box solutions. If someone says ‘just hand over your data - we’ll provide the answers’ without giving you context or clarity on how those answers were generated, run! At that point, you’re not selecting a solution but a service. There are exceptions, such as specialized point solutions that require third-party expertise, like householding, but in general, you need to know how a solution arrived at an answer if you want to run the rest of your analytics project on top of it.

Step Four: Bring it to Business Units

If you’ve educated your company, built a centralized infrastructure, and identified the right solution to run on top of it, you’re nearly ready to reap the benefits. The last step is exposing the data and insights to business units. Educating leaders was critical in the first step, and it’s equally important in this last leg of rolling out a new analytics architecture. This time, you need to educate business units about the new way of working. Education has a number of benefits: users will feel more comfortable with the solution, they’ll get value from it more quickly, and you’ll get immediate feedback.

I can’t overstate how critical democratizing data is during this phase. While you want a centralized infrastructure and centralized data governance, if you end up with a centralized team of analysts and report creators, you’re severely shortchanging your program. Front-line employees should have access to the data they need, through tools they can use with confidence.
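One simple pattern for balancing governance with democratized access is the governed view: the central team defines what each business unit can see, and front-line users query it directly. A minimal sketch, reusing the hypothetical clients_crm table and column names from earlier:

```python
# Minimal sketch: expose a business-unit view containing only the
# columns that unit needs, so users query data directly without
# touching sensitive fields. View and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("governed-access").getOrCreate()

# Central governance defines the view once; the business unit's BI
# tool queries the view, not the raw table.
spark.sql("""
    CREATE OR REPLACE VIEW clients_for_marketing AS
    SELECT client_id, segment, last_contact_date
    FROM clients_crm
    WHERE consent_to_contact = true
""")
```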

If you follow these four steps - educate your business leaders, centralize your infrastructure, select a solution you can trust, and then expose the data and tools to your business units - you’ll have a new analytics architecture that can propel your business for years to come.