Survey says: ERP changes, more human-machine interactions coming by 2030
A Dell survey finds strong belief that humans and machines will work as integrated teams within five years. ERP systems, especially inventory management, will be a top automation target.
By 2030, a major portion of ERP-related work may be handled by machines. These systems will grow more capable as the amount of data grows and as AI advances. Human-machine interactions will play a major role in business well before then.
Respondents to research by Dell Technologies and the Institute for the Future ranked the importance of human-machine interactions to business very high. The report is based on a survey of nearly 4,000 business leaders. More than eight in 10 (82%) agreed that they "expect humans and machines will work as integrated teams within their organization inside of five years."
Further out, by 2030, smart machines will play an important role in ERP. Three of the top four functions that will be offloaded to machines, the survey found, are ERP-related: inventory management, financial administration -- invoicing, purchase orders and the like -- and, in fourth place, logistics. Troubleshooting, the one non-ERP function, ranked third.
But overall, there is a lot of uncertainty about the technological future.
When asked if "automated systems will free up our time," the response was split down the middle, with half agreeing and the other half disagreeing.
The answers also indicate doubts about capabilities. For instance, respondents were asked whether "technology will connect the right person to the right task, at the right time." Only 41% agreed; the rest disagreed. Respondents were also evenly divided over this statement: "Not sure what the next 10-15 years will look like for our industry, let alone our employees."
In an interview, Danny Cobb, Dell Technologies corporate fellow and vice president of global technology strategy, discussed human-machine interactions and other survey findings.
Cobb sees a wide range of qualitative and quantitative processes and technologies -- AI, context and pattern recognition, voice and image recognition -- gaining enterprise use. His responses were excerpted and edited.
In 2030, more and more tasks will be offloaded to machines. Three of the top four are ERP-related: inventory management, financial administration and logistics. What does this mean?
Danny Cobb: It's hard to imagine that that's the first thing someone thinks about [inventory management, financial administration] in their digital transformation agenda, but it also paints a picture: We're not as far along or sophisticated as we may think we are if those are still some of the topics that come up.
Does this mean that things like inventory management will be more automated? That something like image recognition might be used to track product as it moves through the supply chain?
Cobb: That's right. You see image recognition, drone technology and robotic technology assisting with that function. You might also see more global logistics functions operating in a hybrid cloud or multi-cloud way that gives broader insight into all the inventory and material capability of an enterprise, 24/7 and around the globe.
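As a rough illustration of the kind of automation Cobb describes -- a hypothetical sketch, not anything described in the survey or by Dell -- recognition output from warehouse cameras or drones could feed directly into inventory counts. The classify_frame stub below stands in for whatever trained model an enterprise would actually deploy.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class Frame:
    """A single image captured by a warehouse camera or drone."""
    camera_id: str
    pixels: bytes  # raw image data; format depends on the capture device


def classify_frame(frame: Frame) -> list[str]:
    """Hypothetical stand-in for an image recognition model.

    A real deployment would run a trained object detector here and
    return the SKUs it recognizes in the frame.
    """
    return []  # placeholder only; no model is bundled with this sketch


def update_inventory(frames: list[Frame]) -> Counter:
    """Tally recognized SKUs across a batch of frames."""
    counts: Counter = Counter()
    for frame in frames:
        for sku in classify_frame(frame):
            counts[sku] += 1
    return counts


if __name__ == "__main__":
    # With a real classifier plugged in, this would print per-SKU counts
    # observed during the latest pass through the warehouse.
    print(update_inventory([Frame("dock-cam-1", b"")]))
```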
ERP systems will be handling a lot more data from a much wider range of sources. What do those systems begin to look like in the future?
Cobb: At the edge of an enterprise -- the edge being wherever the first unit of intelligence begins to exist -- there might be a stream of telemetry. It might be all this inventory data. It might be all the input from these drones, or from a global logistics system, or from multiple systems because of supplier-to-supplier linkages. These systems now need to be much more intricately linked than ever before. There is an opportunity for an entirely new platform to come into existence -- the intelligent edge of the enterprise that handles this telemetry, that handles any of the immediate compute or storage needs. It takes that information and shares it appropriately with a core data center that might contain additional intelligence from the rest of the enterprise. The edge technologies do the first stage of work, and then, those migrate upstream to a set of core technologies that are responsible for further analysis, long-term storage or broader distribution.
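To make the edge-to-core flow Cobb outlines a bit more concrete, here is a minimal, hypothetical sketch: an edge node does the first stage of work (reducing raw telemetry to a compact summary) and forwards only that summary upstream. The send_to_core function and the record fields are assumptions for illustration, not part of any Dell platform.

```python
import json
from statistics import mean


def summarize(readings: list[dict]) -> dict:
    """First-stage work at the edge: reduce raw telemetry to a compact summary."""
    values = [r["value"] for r in readings]
    return {
        "source": readings[0]["source"] if readings else None,
        "count": len(values),
        "mean": mean(values) if values else None,
        "max": max(values, default=None),
    }


def send_to_core(summary: dict) -> None:
    """Hypothetical uplink to the core data center.

    A real system might post this over HTTPS or publish it to a message
    queue; here we simply print the JSON payload.
    """
    print(json.dumps(summary))


if __name__ == "__main__":
    # Simulated telemetry from one edge device, e.g., a drone or sensor.
    raw = [{"source": "drone-7", "value": v} for v in (98.1, 97.6, 99.3)]
    send_to_core(summarize(raw))
```

The point of the split is the one Cobb makes: the edge absorbs the firehose and handles immediate compute and storage needs, while the core receives only what is worth further analysis, long-term storage or broader distribution.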
How much data will we be getting from these alternative sources, and what are the challenges to processing it?
Cobb: Artificial intelligence, machine learning sorts of capabilities are going mainstream because the amount of compute that we have has caught up or is catching up with the amount of useful data that's there to be analyzed. Other instrumented systems [such as autonomous vehicles, building automation systems, jet engines] are throwing off a tremendous amount of data, and we can now afford to process it as it's being generated. We can now embed processing in just about anything.
A high percentage of those surveyed for this report expect to see more human-machine interactions by 2030. What does that mean?
Cobb: It may not be strictly a physical presence -- a personal robot sitting in the room with me -- but artificial intelligence itself will complement the team's function and provide useful value. It's that sort of digital partnership.
How useful is it to think about the world 15 years or so from today?
Cobb: They [customers, users] need to start getting a blueprint that helps them address some of these opportunities or manage some of these risks. What research like this does is to give customers a vehicle for thinking about this. What are the new roles that are going to be created? What are the skill sets that need to come into existence? How might that impact job satisfaction?
They realize the pace of change is accelerating, and they're worried about being left behind.