In part 1 of this three-part series, we looked at why so many organisations are currently rethinking what to do with their data as they modernise their SAP systems and start to plan for a cloud-based, AI- and machine-learning-enabled future.
In this blog, we look at how to leverage your SAP data for the benefit of your wider business.
Should you choose a SAP data warehouse solution if you use SAP ERP?
If you are modernising your SAP deployment and moving it to the cloud, you will also have to decide where the data you want to analyse should reside. SAP offers different warehousing options, including its traditional data warehouse solutions BW and BW/4HANA (for structured enterprise data and analytics) and its newer SAP Datasphere (for integration and orchestration of data across different SAP and non-SAP data sources and platforms).
Just because you use SAP as your ERP system, however, doesn’t mean you have to use a SAP data warehouse solution. SAP BW/4HANA won’t deliver the elasticity, variety and scale of compute that cloud-native solutions offer, and it doesn’t connect to data outside of SAP systems. While SAP Datasphere does offer connections to non-SAP data sources, it needs to be used in combination with other SAP products such as BW, HANA Cloud and SAP HANA Data Lake. This adds to the complexity, cost and rigidity of the system, and to the risk of vendor lock-in. Neither SAP offering provides secure cloud-based data sharing across both internal organisational boundaries and external partners, suppliers and customers.
Many organisations are already moving data from SAP BW into a data warehouse or data lake because it is easier than getting non-SAP data into SAP. And users who add non-SAP data into SAP BW may not be able to export it again if they don’t have a SAP NetWeaver OpenHub license.
A better alternative
Catalyst has found a better solution: using cloud-based data platforms like Snowflake. Here’s how it’s done:
1. Data Extraction
Data is extracted from the SAP ERP system, whether SAP ECC on premises or S/4HANA in the cloud. We believe this is best done using an ELT approach (Extract, Load, Transform), where raw data is loaded directly into the target data warehouse. Data is moved unchanged, which makes ingestion fast and keeps compute time to a minimum. In contrast, more traditional ETL (Extract, Transform, Load) uses a set of business rules to process data from several sources on a secondary server before centralised integration. This slows the process, uses far more compute resource, and throws away data which may prove useful in the future. The advent of cheap, scalable cloud compute and storage means there’s no need to exclude any potentially valuable data from your analytics platform.
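The ELT pattern described above can be sketched in a few lines. This is purely illustrative: it uses Python’s built-in sqlite3 as a stand-in for the cloud warehouse, and the table and column names are invented, not real SAP structures. The point is the order of operations - land the raw data unchanged, then transform inside the warehouse.

```python
import sqlite3

# In-memory database standing in for the cloud data warehouse.
wh = sqlite3.connect(":memory:")

# 1. Extract: raw rows pulled from the source ERP system (illustrative sample).
raw_rows = [
    ("1000", "2024-01-05", "EUR", 1250.00),
    ("1001", "2024-01-06", "GBP", 980.50),
]

# 2. Load: land the data unchanged in a raw staging table -- no business
#    rules applied yet, so nothing potentially useful is thrown away.
wh.execute("CREATE TABLE raw_sales (doc_no TEXT, doc_date TEXT, currency TEXT, amount REAL)")
wh.executemany("INSERT INTO raw_sales VALUES (?, ?, ?, ?)", raw_rows)

# 3. Transform: apply business rules inside the warehouse itself, using its
#    cheap, elastic compute rather than a secondary ETL server.
wh.execute("""
    CREATE VIEW sales_gbp AS
    SELECT doc_no,
           doc_date,
           CASE currency WHEN 'EUR' THEN amount * 0.85 ELSE amount END AS amount_gbp
    FROM raw_sales
""")

result = [tuple(r) for r in wh.execute(
    "SELECT doc_no, ROUND(amount_gbp, 2) FROM sales_gbp ORDER BY doc_no")]
print(result)
```

Because the raw table is kept, new business rules can later be applied to the full history without re-extracting anything from the source system.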
Tools like Qlik Cloud Data Integration (QCDI) extract relevant data from SAP, ingest it into Snowflake in real time and transform it into analytics-ready data. As many organisations have complex, customised data sets, we use accelerators - such as Qlik’s Compose or Replicate with their out-of-the-box functionality, or our own purpose-built accelerators - to put pre-existing code in place that aids the extraction of customised data sets. This automates the design, implementation and updating of data models while minimising the manual, error-prone processes of data modelling, ETL coding and scripting, massively accelerating delivery and the transformation of the complex table structures within SAP.
2. House your data in Snowflake
Catalyst can work with you to move your data from SAP into Snowflake, starting with a proof of concept, testing the system with a subset of data.
Snowflake is a fully managed SaaS solution that provides a comprehensive, single platform for data warehousing and analytics, data science, data application development, and secure sharing of real-time data. It runs on all three major clouds - AWS, Azure and Google Cloud (or a combination of all three) - avoiding cloud vendor lock-in. And unlike so many analytics platforms, it was built from scratch to run in the cloud.
As an elastically scalable data warehouse, it automatically scales compute resources up or down as needed. This means you can run any number of workloads across multiple users at the same time without competing for computing resources, so performance is not affected. Storage is also managed automatically, eliminating the need to build indexes or do housekeeping. It also has virtually zero downtime, unlike the standard weekly SAP shutdowns needed to increase or decrease compute or to run upgrades.
"Choosing the Snowflake and Qlik data integration solutions was a no-brainer as they perfectly aligned with our business objectives. Now, armed with a solid foundation and a clear direction, INEOS Automotive is ready to harness the power of data and analytics to fuel innovation and achieve remarkable success."
Sailash Patel - Head of Data and Analytics, INEOS Automotive
Snowflake can work with data from SAP systems - both on-premises and cloud - as well as third-party systems and sensor data for machine learning, regardless of whether it’s in structured or semi-structured formats, and even if the data structure in the files changes. It also offers data security, as all data is encrypted as a built-in feature. Snowflake’s warehouse also carries a lower operational cost than SAP’s warehouse, with efficient internal technology that’s continually being improved by Snowflake’s own engineers. Snowflake warehouses can be automatically shut down and rapidly restarted as needed, with billing only while they’re in use, down to the nearest second, significantly helping to reduce costs.
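To see why auto-suspend plus per-second billing matters, consider a rough back-of-the-envelope comparison. The credit rate and price below are hypothetical placeholders, not Snowflake list prices; the arithmetic, not the figures, is the point.

```python
# Hypothetical figures -- illustrative only, not Snowflake list prices.
CREDITS_PER_HOUR = 1.0   # a small warehouse consuming 1 credit per hour while running
PRICE_PER_CREDIT = 3.0   # illustrative price in dollars per credit

def monthly_cost(seconds_running_per_day: int, days: int = 30) -> float:
    """Cost when compute is billed per second, only while the warehouse runs."""
    hours = seconds_running_per_day * days / 3600
    return hours * CREDITS_PER_HOUR * PRICE_PER_CREDIT

# An always-on warehouse vs one that auto-suspends outside a
# two-hour daily batch-and-reporting window.
always_on = monthly_cost(24 * 3600)
auto_suspended = monthly_cost(2 * 3600)

print(f"always on:      ${always_on:.2f}/month")
print(f"auto-suspended: ${auto_suspended:.2f}/month")
```

Under these assumptions the auto-suspended warehouse costs a twelfth of the always-on one, because you only pay for the seconds it actually runs.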
With its ability to draw data from any source, whether structured or unstructured, Snowflake enables you to build 360-degree views of customers, products, and the supply chain. This includes interweaving data sets from business partners and data from the Snowflake Marketplace. The Marketplace is Snowflake’s ‘shop’ for data, providing a wealth of authoritative data to complement your own. Later, you might even prepare your own datasets, masking and obfuscating certain fields, for sale through Snowflake’s Marketplace as a way to monetise your organisation’s own data.
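Before offering a dataset to partners or a marketplace, sensitive fields need masking, as mentioned above. A minimal sketch of one common approach - a one-way salted hash that hides identities while keeping values consistent for joins - is shown below. The field names and salt are invented for illustration, and a real deployment would typically use the platform’s own masking features instead.

```python
import hashlib

# Illustrative rows; 'customer_id' must not leave the organisation in the clear.
rows = [
    {"customer_id": "CUST-0042", "region": "EMEA", "net_sales": 1250.0},
    {"customer_id": "CUST-0077", "region": "APAC", "net_sales": 980.5},
]

def mask(value: str, salt: str = "org-secret-salt") -> str:
    """One-way, salted hash: the same input always maps to the same token,
    so joins across shared tables still work, but identities do not leak."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:16]

# Build the shareable copy with the sensitive column replaced by tokens.
shared = [{**r, "customer_id": mask(r["customer_id"])} for r in rows]
print(shared)
```

The non-sensitive analytics columns pass through untouched, so the shared dataset stays useful while the raw identifiers never leave your account.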
3. Use analytics to get real-time insights to make better business decisions
Once your SAP and non-SAP data is in Snowflake, use a data visualisation tool like Qlik Sense or Power BI to interrogate, analyse and visually represent your real-time data. This creates meaningful insights so that colleagues across your organisation can make better business decisions.
Catalyst and Qlik offer accelerators for different departments such as procurement, finance, sales and after-sales, so that they can self-serve, building their own reports. We also have knowledge we can draw on from many specific industry sectors, such as manufacturing, healthcare and legal services, to help make visualisations more relevant and “in tune” with the way business users work. This relieves the burden on the data team, who can focus on more complex, bespoke analytics for key corporate initiatives.
For example, in retail or hospitality, combining sales data direct from tills with external geographic, demographic or weather data can help identify whether outlets are well placed and presenting the ideal proposition to consumers. Logistics organisations can track stock levels and customer demand to help tune production and adjust labour levels at depots. Predictive analytics can flag issues likely to arise within a fleet of vehicles, production lines or stock levels, or even predict customer churn. Insights can even be shared with partners, without any copying of data, to facilitate collaboration on, for example, joint marketing exercises.
The possibilities are endless, and it’s important to realise that whether you’re looking for these insights or not, some of your competitors almost certainly already are.
4. Retire old architecture
Reduce spend by shrinking your SAP BW footprint to only what is required, retire your HANA sidecars (secondary databases needed purely to replicate data), and archive any historical ERP data from ECC/S4 to Snowflake to free up storage and enable historical analysis in Snowflake.
5. Lay the foundation for the future of AI
Organisations that win in the future will be those with well-organised data to form the foundation of AI and machine learning initiatives. While most companies use data scientists to analyse structured data to make forecasts, AI - particularly GenAI - can access both unstructured and structured data. What’s more, companies that want a holistic data view need access not only to their own data, but also to external data from partners and suppliers, service providers and data marketplaces.
Organisations also need to be able to uphold essential data security, governance and regulatory requirements. To achieve this, data needs to be unified in a comprehensive repository so that different workgroups can access it easily and securely. The best way to achieve this is a cloud data platform that integrates closely with your existing corporate security systems, tightly coupling governance of the data with the ability to integrate third-party data, while managing an array of data formats.
Snowflake’s architecture - which supports both unstructured and semi-structured data - enables easy data preparation for machine learning model building. Data can be structured and tabular (like SAP data); semi-structured, from IoT devices, web logs and other sources; or unstructured, such as images and PDF documents.
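The value of semi-structured support for ML preparation can be sketched simply: records with differing shapes can be flattened into consistent, analytics-ready rows without a schema declared up front, analogous to querying nested JSON in a warehouse. The device and field names below are invented for illustration.

```python
import json

# Two IoT readings with different shapes -- the second carries a field the
# first lacks, and neither was declared in a schema up front.
payloads = [
    '{"device": "pump-01", "readings": {"temp_c": 61.5}}',
    '{"device": "pump-02", "readings": {"temp_c": 58.0, "vibration_mm_s": 2.4}}',
]

def flatten(payload: str) -> dict:
    """Flatten one nested JSON record into a flat, analytics-ready row.
    Fields missing from a record become None instead of breaking the load."""
    doc = json.loads(payload)
    readings = doc.get("readings", {})
    return {
        "device": doc.get("device"),
        "temp_c": readings.get("temp_c"),
        "vibration_mm_s": readings.get("vibration_mm_s"),
    }

rows = [flatten(p) for p in payloads]
print(rows)
```

Because missing fields degrade gracefully to nulls, the feed keeps loading even as device firmware adds new measurements over time.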
Conclusion
Choosing where you house the SAP data you want to do analytics on has implications for your entire organisation. It will affect the speed at which you can modernise, the costs you incur, the ability to scale, and the democratisation of data so that all business units can make informed business decisions. You may want to federate your analytics, with a self-service model that reduces reliance on a central analytics team. And you will certainly need reliable security, strong governance, scalable processing speeds and storage, and a future ability to implement AI and machine learning initiatives.
Maintaining your data in a SAP system may seem like the obvious choice. But we hope that this blog has shown you how you can disrupt the status quo, unlock your SAP data and transform your business in the process.
In the final part of this blog series, we will look at how INEOS Automotive used Snowflake and Qlik to house its SAP data and the business benefits they are already receiving – as well as their plans for the future.
If you would like to speak to an expert from Catalyst on the best solution for your SAP data, please get in touch.