Consider any data analytics project and the focus naturally falls on its success. But before all the cool, headline-grabbing stuff can happen, you need to lay the foundations: the technology you’re going to use. Each part of your data analytics platform needs to work together to create the bigger picture.
There is a myriad of choices when it comes to your core analytics platform, each with different features, and picking between them can be bewildering. To start, make sure you ask yourself the following:
You cannot do any analysis without data, and that requires a well-thought-out ingestion process. It forms the building blocks of your analytics platform. When your analytics platform is increasingly called on, a solid ingestion process lets you churn through requests quickly and effectively. In other words, you get your analytics done much faster.
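To make that concrete, a hand-rolled ingestion step can be very small. The sketch below is purely illustrative, assuming a CSV export from a source system, hypothetical column names and a local SQLite store as the target:

```python
# Minimal, hypothetical ingestion step: extract a CSV export, apply a simple
# transformation, and load it into a queryable store. The file name, columns
# and SQLite target are illustrative, not a recommendation.
import csv
import sqlite3

def ingest(csv_path="daily_sales.csv", db_path="analytics.db"):
    # Extract: read the raw export from a source system.
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: normalise types and drop obviously incomplete records.
    cleaned = [
        (row["order_id"], row["customer_id"], float(row["amount"]))
        for row in rows
        if row.get("order_id") and row.get("amount")
    ]

    # Load: write into the platform's store so analysts can query it.
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (order_id TEXT, customer_id TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", cleaned)
    conn.commit()
    conn.close()

if __name__ == "__main__":
    ingest()
```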
Good ingestion requires the right tool. This can be created in-house, or you can buy an off-the-shelf ETL (extract, transform, load) tool. Whatever you decide, your data ingestion needs to do a couple of things:
Ingesting and storing your data is all well and good, but if people can’t get to it then it’s not worth much. You must invest in a range of access tools, from standard reporting through to raw JDBC access for custom projects. This allows everyone, whatever their skill level, to use the analytics platform and see its value. The sketch below illustrates the raw-access end of that spectrum.
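As a hypothetical example, a custom project might query the platform directly over JDBC, here via the jaydebeapi bridge from Python. The driver class, connection URL, credentials, jar path and table name are all placeholders for your own warehouse’s details:

```python
# Hypothetical example of raw programmatic access over JDBC via jaydebeapi.
# Every connection detail below is a placeholder for your own warehouse.
import jaydebeapi

conn = jaydebeapi.connect(
    "org.postgresql.Driver",                       # JDBC driver class (placeholder)
    "jdbc:postgresql://warehouse:5432/analytics",  # connection URL (placeholder)
    ["analyst", "secret"],                         # credentials (placeholder)
    "/opt/drivers/postgresql.jar",                 # path to the driver jar (placeholder)
)
try:
    curs = conn.cursor()
    curs.execute(
        "SELECT customer_id, total_spend FROM customer_summary LIMIT 10"
    )
    for row in curs.fetchall():
        print(row)
finally:
    conn.close()
```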
Alongside access, you also need to govern and administer your platform properly. This is where management tools come in.
It’s not all about ensuring you ingest your data properly: you have to build methods to get it back out too. A consistent approach across all implementations is called for, and usually that means either batch extracts or an API.
Fail to do this and all the interesting work you’ve done on your platform won’t get shared, and it needs to be shared with other systems to be of value: providing recommendations based on customer purchases on an e-commerce site, for example, or dynamically changing ticket prices according to demand. APIs are a popular method, especially if you have to do something in real time.
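As a rough illustration of the API route, here is a minimal, hypothetical Flask service that exposes precomputed recommendations to other systems; the endpoint, customer IDs and recommendation data are all made up:

```python
# A minimal sketch of serving analytics output over an API. The endpoint,
# lookup table and identifiers are illustrative, not a real design.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for recommendations produced by the analytics platform,
# e.g. refreshed by a batch job or updated in near real time.
RECOMMENDATIONS = {
    "customer-123": ["sku-42", "sku-7", "sku-19"],
}

@app.route("/recommendations/<customer_id>")
def recommendations(customer_id):
    items = RECOMMENDATIONS.get(customer_id, [])
    return jsonify({"customer_id": customer_id, "recommended": items})

if __name__ == "__main__":
    app.run(port=8080)
```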
When setting up your analytics platform, it’s worth knowing about two complementary environments: the lab and the factory.
The lab
In the lab, you can trial your use cases before moving them over to the factory. It’s an area where people can upload their own files to combine with your existing data or create their own tables.
This area should be self-managed, offering people the freedom to innovate. By having a lab environment, you avoid departmental or team silos but still give each individual team an area to test new ideas, all whilst maintaining the same set of standards and practices.
If a use case is proven in the lab, then it can move to the factory. The lab essentially allows you to invent and test things first, without spending a lot of time and resources implementing them in the more complex factory.
The factory
The factory is the place in your analytics platform where things go live when they are ready to be rolled out across the company. It comprises the raw, base and analytics layers. Think of it as the central data store for your entire analytics function.
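One way to picture those layers, purely as an assumption about how they might be laid out, is as separate zones in the central store: raw data exactly as ingested, a cleaned and conformed base layer, and business-ready analytics tables. The table names below are hypothetical:

```python
# Illustrative sketch of a layered factory store using SQLite as a stand-in.
# Real platforms would use warehouse schemas; all names here are hypothetical.
import sqlite3

conn = sqlite3.connect("factory.db")
conn.executescript("""
    -- raw layer: data exactly as it arrived from source systems
    CREATE TABLE IF NOT EXISTS raw_orders (payload TEXT, loaded_at TEXT);

    -- base layer: cleaned, typed and de-duplicated records
    CREATE TABLE IF NOT EXISTS base_orders (
        order_id TEXT PRIMARY KEY, customer_id TEXT, amount REAL, order_date TEXT
    );

    -- analytics layer: business-ready aggregates built from the base layer
    CREATE TABLE IF NOT EXISTS analytics_customer_spend (
        customer_id TEXT PRIMARY KEY, total_spend REAL
    );
""")
conn.close()
```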
Your data analytics platform is fundamental to getting value from your data. Don’t expect to choose a platform immediately: there are many factors to consider, and your choice could make or break your data strategy. Don’t be afraid to seek advice and recommendations, or to test a few different options first to see whether they suit your organisation and set-up. You’ll want your analytics platform to work for you for a long time, so take your time when picking a solution.
If you found this article valuable, you might be interested in our next webinar: Considerations for a Successful Data Platform. This hour-long session is led by James Lupton, coaching leaders in business, data and tech on how to build a data platform.