
SAP S/4HANA migration needs careful data management

Dealing with data is one of the most challenging aspects of an S/4HANA migration, as customers must decide what data to move and how to migrate it.

An S/4HANA migration can be a daunting prospect for customers. They may not know how to handle migrating data from the old system to the new one.

Getting that part right is critical. Moving data that is obsolete, incorrect or redundant into the new system can pose serious risks to the S/4HANA implementation, according to Ben McGrail, managing director of Xmateria.

Therefore, the move from legacy SAP systems to S/4HANA hinges on establishing what SAP calls "a clean core" -- a simplified system that doesn't require extensive customizations. This will then allow for the promised S/4HANA innovations, including new capabilities for AI and analytics. But because SAP environments typically include massive amounts of data in a variety of types, customers will likely encounter data migration challenges, according to McGrail.

In this Q&A, McGrail discusses some of the issues involved with data in an SAP S/4HANA migration. Xmateria is a London-based firm that provides S/4HANA migration services and products, including the recently released Pioneer, a data discovery platform that gives companies a view of the amount and types of data across their SAP landscape.

Editor's note: The following was edited for length and clarity.

Can you describe some of the main data challenges in an S/4HANA migration?

Ben McGrail, managing director, Xmateria

Ben McGrail: There are two parts to the data challenge that customers are facing with the move to S/4HANA. First, there's the migration itself, and second, they must rework their data for a new clean core world. SAP is pushing hard for this back-to-standards approach, which involves a clean core digital ERP system -- S/4HANA -- and the use of SAP BTP [Business Technology Platform] to interact with other cloud services. That means there must be some changes to how data is stored, used and managed.

How do you define the clean core?

McGrail: SAP is saying that many customers are being held back by a mountain of custom code that they've developed. For example, you've got a 20-year-old SAP system that you customized heavily at the start and then, over the years, added to, bolted on or built something else. It's out of control. If SAP's long-term objective is to get people onto the public cloud, which is like a canned model that has little flexibility or customizability, you need to get rid of that custom code. They're encouraging customers to have a much simpler S/4HANA system by losing this custom code to either get back to a standard process or replace those customized processes with something that sits outside of SAP. Once you've got customers working in a much simpler, standardized way, it's an easier journey to get onto public cloud. Then when everyone's on the public cloud, customers can access [new capabilities] on a monthly or quarterly basis as they would with something like Salesforce or HubSpot.

What are some of the complicated factors in making the system simpler?

McGrail: [A simplified system] means there's less data sitting within SAP, but it also means data may need to be managed in different environments. At the moment, a customer might run 80% to 90% of its processes on SAP. But if you [simplify the core], then you've got multiple places where you're working. You've got your hub, and you've got various other applications that sit around the edge that you're either accessing through the BTP or through one of the cloud hyperscaler connectivities. It's a different [strategy] from the way that data is organized and managed now. There's a general data management challenge as customers move to a more standardized, simplified core with BTP.

How complex is a typical SAP data environment?

McGrail: A customer that only runs SAP financials knows what data they've got. They've got customers, vendors, payables, receivables, balances, and profit and loss. But when you get into large manufacturing enterprises that are global in nature and have hundreds of companies, it's much more complex. Some of those companies may have been sold; some of them may have been closed. It's just a monster. If you are looking at moving that to S/4HANA, you must decide what to take, what not to take, what's active and what's not active.

If you've got multiple SAP systems and you're looking to bring them into one and want to compare the master data across them, you've got a data quality challenge. You might be using the same supplier 15 times across four systems, but you only want to take one of those across to the new system. In those more complex cases, there's certainly a big data discovery challenge.
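McGrail's example -- the same supplier carried 15 times across four systems -- is essentially a deduplication exercise. The Python sketch below illustrates the idea with made-up record fields (system, vendor_id, name and tax_id are hypothetical, not actual SAP table fields); a real master data consolidation would run against vendor master extracts and use far more robust matching rules.

from collections import defaultdict

# Hypothetical vendor master extract pulled from several source systems.
vendor_records = [
    {"system": "ECC_EU", "vendor_id": "100234", "name": "Acme Supplies Ltd", "tax_id": "GB123456789"},
    {"system": "ECC_US", "vendor_id": "587112", "name": "ACME SUPPLIES LTD.", "tax_id": "GB123456789"},
    {"system": "ECC_APAC", "vendor_id": "330871", "name": "Acme Supplies", "tax_id": "GB123456789"},
]

def normalize(name: str) -> str:
    """Crude normalization so near-identical names group together."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

# Group candidate duplicates by tax ID, falling back to a normalized name,
# so a single "golden" record can be chosen per group before migration.
groups = defaultdict(list)
for rec in vendor_records:
    key = rec["tax_id"] or normalize(rec["name"])
    groups[key].append(rec)

for key, recs in groups.items():
    if len(recs) > 1:
        systems = ", ".join(r["system"] for r in recs)
        print(f"Potential duplicate vendor (key {key}) found in: {systems}")

Grouping on a strong identifier such as a tax ID first, and only falling back to fuzzier name matching, keeps false merges down when the records are consolidated into the target system.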

How important is it to manage data quality, especially as AI becomes more prevalent in SAP applications?

McGrail: All in all, whether it's for AI or analytics, if the underlying data is wrong or incomplete, it's going to affect the outputs that you get. When they talk about the clean core, it's more around simplifying how you use your system rather than actual data quality. But if you're moving from ECC to S/4HANA, it's a once-in-a-generation chance to clean your data. It's much harder to do it in-system. People like to use a moving-houses analogy, which is a bit facile. But at the same time, you can understand that if you're moving houses, it's a good time to clean up and not take some stuff with you. Once you've moved, you're not going to look at it again for a couple of years.
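As a rough illustration of the pre-migration cleansing McGrail describes, the following Python sketch flags records that are incomplete or long inactive so they can be corrected or left behind rather than carried into S/4HANA. The fields, cutoff date and rules here are assumptions for the example, not prescribed SAP checks.

from datetime import date

# Hypothetical customer master records; field names are illustrative only.
customers = [
    {"id": "C001", "name": "Globex GmbH", "country": "DE", "last_posting": date(2024, 11, 3)},
    {"id": "C002", "name": "", "country": "DE", "last_posting": date(2016, 5, 20)},
    {"id": "C003", "name": "Initech SA", "country": None, "last_posting": None},
]

REQUIRED_FIELDS = ("name", "country")
ACTIVITY_CUTOFF = date(2020, 1, 1)  # assumed threshold; records older than this are candidates to leave behind

for rec in customers:
    missing = [f for f in REQUIRED_FIELDS if not rec.get(f)]
    stale = rec.get("last_posting") is None or rec["last_posting"] < ACTIVITY_CUTOFF
    if missing or stale:
        print(f"{rec['id']}: missing fields={missing or 'none'}, stale={stale}")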

Jim O'Donnell is a senior news writer who covers ERP and other enterprise applications for TechTarget Editorial.
