

In the News:

Information Age

Is RPA overhyped, a scalable solution or a bandaid? Are decision engines the future?

You hear it often enough: RPA is overhyped. It is like a bandaid. Now Information Age has spoken to one business consulting expert who argues that decision engines, rather than RPA, are the future. Is he right?

For Albert Rees, who heads business consulting for North America at EPAM, RPA is more like a bandaid.

Rees sees two issues with RPA.

  • Implementation costs, which are often much higher than organisations anticipate.
  • Maintenance costs. “You’re moving data between existing systems, and if those systems don’t ever change, all is good in the world. But that’s not really the case. So as systems evolve, things at the back end have to change as well.”

Rees says: “You have to get to a pretty low level of detail to really understand where automation is going to work and what it’s going to do and how it’s going to interact with systems.” Getting to that low level of detail carries a cost too.

He adds: “We see people copying in and out of spreadsheets, you go to call centres, you see the three-screen scenario where they’re moving data from screen A to screen B to screen C and it’s just three internal systems that they’re moving information around and one of them might be Excel. If anything changes on any of those and suddenly you’re having to go back in and implement an update. Most organisations can’t do that themselves so they’re back spending money again.”

Decision engines

Instead, he sees decision engines as the key. But what are they? He said: “I think of Decision Engines as tools used to automate some portion of human decision-making that can be defined by a set of rules. They work by applying predefined business rules (often referred to as ‘decision trees’) to data sets. Decision Engines are not typically a replacement for RPA, however they can enhance RPA when a process requires a decision be made when evaluating complex rules applied to complex data sets.”
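Rees’s description maps onto a simple pattern: an ordered list of predicate rules evaluated against a data record, with the first matching rule determining the outcome and anything unmatched escalated to a human. A minimal sketch in Python; all field names, thresholds and outcomes here are hypothetical, invented purely for illustration, not taken from any vendor’s product:

```python
# Illustrative rule-based decision engine (invented example, not a real product).
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]  # predicate evaluated against a data record
    outcome: str                       # decision returned if the rule fires

def decide(record: dict, rules: list[Rule], default: str = "refer-to-human") -> str:
    """Apply rules in priority order; the first match wins, else escalate to a person."""
    for rule in rules:
        if rule.condition(record):
            return rule.outcome
    return default

# Hypothetical loan-screening rules, ordered by priority.
rules = [
    Rule("low-credit-reject",
         lambda r: r["credit_score"] < 500, "reject"),
    Rule("small-loan-approve",
         lambda r: r["amount"] <= 5000 and r["credit_score"] >= 650, "approve"),
]

print(decide({"credit_score": 700, "amount": 3000}, rules))  # approve
print(decide({"credit_score": 480, "amount": 3000}, rules))  # reject
print(decide({"credit_score": 600, "amount": 9000}, rules))  # refer-to-human
```

The “first match wins” ordering is what makes such a rule set behave like a decision tree: each record walks the conditions in priority order until one resolves it. In Rees’s framing, an RPA bot would call something like `decide()` whenever a process step requires a judgment that can be captured as rules.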

The debate

So, is that right? Recently, Information Age spoke to Bruno Ferreira, Managing Director UK & Ireland at UiPath; not surprisingly, he has a different perspective.

The hype argument, he suggests, makes no sense. “Our renewal rate is more than 90%.”

He also refers to UiPath’s growth: $10 million in 2016, $200 million in 2018, suggesting that the hard numbers indicate a ferocious appetite for RPA. And yet, he suggests only a small number of organisations are using RPA, implying massive potential for growth.

Sarah Burnett, executive vice president and distinguished analyst at Everest Group and a guru on automation technologies, also pushes back on the hype argument: “We have actually done studies that show companies achieve about 30% cost savings.” Beyond cutting process costs by 30%, she points to another, less obvious benefit, and it lies with creating data we can trust.

The RPA bandaid?

Then there is the issue of RPA being like a bandaid. It boils down to legacy systems. “They’re written in COBOL, they’re sitting on mainframes, and that’s one use case where we’re unable to do anything with those legacy systems any more,” says Rees. He then argues that many of the systems are not well integrated — “we get multiple versions of the truth, and that means an awful lot of mundane manual work, keying in data, for example, that already exists in another system.”

He suggests a number of companies are using RPA to help with this, but that “in a lot of ways, RPA is a bandaid for things that could exist, capabilities that could exist within ERPs and other best-of-breed type solutions today.”

“The question in my mind,” he says, “is how long RPA really stays a viable solution in the marketplace. And that I don’t know. I’m speculating somewhere between three and five years, as what we are really seeing is customers moving towards intelligent automation and cloud solutions, which are really the next evolution of RPA.”

Here again, there are those who strongly disagree. For example, UiPath has famously stated its aim to achieve a robot for every desk, an aim which seems to have long-term planning implicit in it.

Bruno Ferreira cites as an example UiPath’s target of one robot per scientist. “Imagine that,” he says, “every day a scientist goes to the lab, but has to spend an hour a day in preparation.” With RPA, he says, “estimates suggest a saving of nine hours a month, which would be amazing.”

According to Ollie O’Donoghue, research VP, IT Services at HFS Research, when non-IT departments start introducing RPA to their enterprise, they tend not to integrate it with the wider IT infrastructure. In many cases, they’re just extending the life of an old legacy system, which means they’re, essentially, just buying themselves time. In other words, RPA introduced under these conditions will act as a sticking plaster.

Sensitive data

Rees also has concerns about unattended robots, which could automate a process end to end. Part of the issue lies with regulation and the danger that unattended robots could be repeating an error that operators, divorced from day-to-day operations, are unaware of.

“Everybody wants to be agile,” he says, “and agile is great from a development perspective when you can afford to make mistakes and learn from the mistakes and correct. But you can’t do that with sensitive data.

“So, when you’ve got robots moving data from system A to system B sitting behind a firewall, unattended works great. If something goes wrong, guess what? We throw on another sprint, fix it, and we’re back in unattended again. Typically it gets caught before it ever gets into anybody’s hands, where it’s going to create a problem.”

He sees a problem, however, with sensitive data: for example, if a customer receives the “bank statements for someone else’s checking account. In these situations unattended becomes much, much riskier.”

Sarah Burnett, by contrast, sees data accuracy as one of RPA’s strengths.

As they say, to err is human; RPA, she argues, can eliminate error.

“If a person enters a bit of data, they can so easily accidentally switch some numbers. The further down the process that error travels, the more expensive it becomes to fix. You could be having people, whose time is very expensive, chasing this error and trying to fix it, a long way down the process swim lane. Robots, if they’re developed correctly and are maintained and run smoothly, will not make those mistakes. We are hearing from organisations that say they kept testing their robot apps and they never found a mistake once they’d tested it; it was 100% accurate… always.”

The original article appeared in Information Age.