
5 Reasons Why Process Intelligence Is Replacing Process Mining—Our Conversation with Doculabs


by Scott Opitz, Chief Product and Technology Officer



This September, ABBYY was joined by leading developers, innovators, and IT managers in hosting the industry’s inaugural Intelligent Automation Month. This collaborative effort between automation leaders sought to educate organizations on the accelerated value and operational excellence enabled by AI-enhanced solutions like intelligent document processing and process intelligence. Intelligent Automation Month took a deep dive into these technologies, featuring webinars that explored crucial aspects of successful automation initiatives. One such aspect is the need for visibility into an organization’s processes, which prevents implementations from failing due to unclear priorities or goals.

I was joined by Marty Pavlik, Generative AI and Intelligent Automation Leader at Doculabs, as well as ABBYY Senior Vice President of Product Marketing Bruce Orcutt for a conversation on the tool that’s giving IT leaders and enterprises unmatched visibility into their operations: process intelligence.

ABBYY’s strategic partnership with Doculabs is driven by the rapid evolution of the process intelligence marketplace. According to Pavlik, process intelligence projects have tripled in recent months, and a tool like ABBYY Timeline is likely to be “part of legacies” for organizations as perceptions of the technology evolve.

Gartner’s 2023 Magic Quadrant for Process Mining Tools also revealed an acceleration in the need for process mining and process intelligence, indicating a visible progression towards maturity. Still, many organizations might be unclear on the difference between these two technologies—process mining and process intelligence.

In the timeline of process analysis and optimization technology, process mining is where it all started. Despite its relative recency, process mining is well along in its maturity: it has been widely adopted by organizations and is now recognized as a key category. Process intelligence is defined more broadly, encompassing additional process capabilities such as task mining, simulation, monitoring, and prediction. These additional capabilities are what make process intelligence actionable, allowing the concept of a “digital twin” to reflect how things will behave in real-world settings.

While many process and task mining tools claim to offer comprehensive solutions, they often fall short in addressing the complexities and nuances of real-world processes. Here are five reasons why organizations are expanding process improvement projects beyond process mining and embracing the broader capabilities that process intelligence solutions offer.

1. To accelerate time to value

Low-code / no-code enables quicker time to value with process intelligence solutions. Building solutions that are usable by analysts rather than just developers makes a crucial difference in adoption: it circumvents shortages in coding and development talent, enables those with extensive business knowledge to make meaningful contributions with the tool, and accelerates time to value overall by eliminating laborious coding.

For IT professionals, low-code / no-code is music to the ears. It gives them more time for the traditional aspects of their work while bringing crucial business knowledge to the table, allowing those who work outside of development to generate value.

Solving the accessibility issue allows enterprises to start seeing value almost immediately. From ABBYY’s perspective, once we have access to data, we’re disappointed if we don’t see someone start to gain insights in the first day or two. Nobody is an expert on day one—you evolve to use the tool better and interpret results better. Avoiding the coding challenge makes the learning curve more approachable, so you should see results within a few days. If you aren’t, something is probably wrong.

2. To monitor processes in real time

Not all processes can be watched around the clock by people, even if that’s what organizations want—most professionals simply don’t have the bandwidth.

This is where latency comes into play. In the wake of the 2008 recession, ABBYY worked with a bank that had to prove to regulators that they were handling investment portfolios with appropriate risk management. This process was very laborious, and even then, it couldn’t be 100% checked. Process monitoring, however, yielded immediate notifications. Monitoring is just another form of automation—it enforces a set of rules.
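
To make that “set of rules” concrete, here is a minimal sketch of rule-based monitoring in Python. The event records, step names, and four-hour SLA window are hypothetical illustrations, not how ABBYY Timeline actually models monitoring:

```python
from datetime import datetime, timedelta

# Hypothetical event log: (case_id, step, timestamp).
events = [
    ("case-001", "portfolio_review_started", datetime(2024, 1, 2, 9, 0)),
    ("case-001", "risk_check_completed",     datetime(2024, 1, 2, 17, 30)),
    ("case-002", "portfolio_review_started", datetime(2024, 1, 2, 10, 0)),
]

# The "rule" the monitor enforces: a risk check must complete within
# a fixed window after a review starts (an invented SLA for illustration).
SLA = timedelta(hours=4)

def check_sla(events):
    started, alerts = {}, []
    for case_id, step, ts in sorted(events, key=lambda e: e[2]):
        if step == "portfolio_review_started":
            started[case_id] = ts
        elif step == "risk_check_completed" and case_id in started:
            if ts - started.pop(case_id) > SLA:
                alerts.append(f"{case_id}: risk check exceeded SLA")
    # Cases still open are flagged too, rather than silently missed.
    alerts += [f"{case_id}: risk check still pending" for case_id in started]
    return alerts

print(check_sla(events))  # flags case-001 (too slow) and case-002 (pending)
```

Because every event is checked against every rule, coverage is 100 percent—something the manual review in the banking example above could never achieve.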

3. To predict and anticipate outcomes

There is one caveat to process monitoring: by the time you’re notified of a variation, it’s already history. To have a real opportunity to improve execution, you must know that something is likely to happen before it actually does.

Process prediction gives IT professionals this insight, letting them know early in a case’s life cycle that a deadline will be missed or an outcome is at risk. If they know early enough, the problem can sometimes be prevented; when it can’t, its consequences can be mitigated.
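
As a rough illustration of the idea (not how any particular product implements it), a predictor can project a case’s completion time from the historical durations of its remaining steps and raise a flag while there is still time to act. The step names and durations below are invented:

```python
from datetime import datetime, timedelta
from statistics import median

# Hypothetical historical step durations (hours) from completed cases;
# a real platform would learn far richer models from mined event data.
historical_hours = {
    "document_review": [4, 6, 5, 8],
    "approval":        [24, 30, 18, 40],
}

def will_miss_deadline(now, deadline, remaining_steps):
    # Project completion by summing the median historical duration
    # of each step still ahead of this case.
    projected = sum(median(historical_hours[s]) for s in remaining_steps)
    return now + timedelta(hours=projected) > deadline

now = datetime(2024, 1, 2, 9, 0)
deadline = now + timedelta(hours=24)
# Flagged early: ~5.5 h of review plus ~27 h of approval already
# exceeds the 24-hour window, so intervention can start now.
print(will_miss_deadline(now, deadline, ["document_review", "approval"]))  # True
```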

Customers might not immediately perceive the potential value of process prediction, but once additional analytics and predictions are layered in, its use tends to grow at scale within organizations.

4. To use simulation to inform optimization strategies

Before implementing change, it’s important to recognize and understand how it will affect the execution of complex processes across multiple instances. Without some way of simulating the change, there is no reliable way to account for variability.

Process simulation makes the digital twin actionable and mitigates risk in implementing change. If you’re building a jet engine, for example, you can’t afford to blow one up testing it. You need a good likelihood that it’ll work. If you can build a model that represents the true physics of this engine, you can test it under a variety of circumstances.
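
One simple way to “test under a variety of circumstances” is Monte Carlo simulation: run thousands of randomized instances of the current and proposed process and compare the outcomes. The sketch below uses invented step-duration ranges; a real simulation would be parameterized from mined process data:

```python
import random

# Hypothetical step-duration ranges (hours) for the process as-is
# and with a proposed change that speeds up review.
current  = {"intake": (1, 3), "review": (4, 12), "approval": (8, 48)}
proposed = {"intake": (1, 3), "review": (2, 6),  "approval": (8, 48)}

def simulate(process, runs=10_000):
    """Average end-to-end time across many randomized instances."""
    total = 0.0
    for _ in range(runs):
        total += sum(random.uniform(lo, hi) for lo, hi in process.values())
    return total / runs

random.seed(42)  # reproducible runs
print(f"current:  {simulate(current):.1f} h")
print(f"proposed: {simulate(proposed):.1f} h")
```

Comparing outcomes across many randomized runs, rather than a single best-case number, is what lets a change be evaluated against real-world variability before anything is deployed.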

Simulation offers a competitive advantage in designing processes because you don’t have to wait to see outcomes. The tool is enabled by neural networks, which continue to deliver advanced capabilities with the same ease of use as basic analysis tools. While the sophistication happens under the hood, an easily understood interface on the surface ensures that process intelligence and simulation tools can be set up efficiently.

5. To better manage ad-hoc processes

Growing interest shows that adoption of tools like process prediction and simulation is accelerating. These tools enable a capability that is particularly crucial: handling ad hoc case management.

“Ad hoc is a part of life. Day-to-day business is ad hoc to most companies—distribution center and warehouse processes might be automated, but why aren’t customer-facing processes? For many organizations, automating these ad hoc processes is the next step. The need to automate ad hoc processes is driven by the modern expectation of ’now!’”

Marty Pavlik, Doculabs

Think about the many interactions on any customer journey, especially with multiple steps, contact points, and of course, documents. These cases are overwhelmingly common and can be considered case management by nature. Some organizations seek a mythical “happy path,” or an approach to a process that is universally efficient and consistently optimal. Unfortunately, this simplicity is impractical; enterprises have to handle a great degree of variability while still delivering the necessary insights.

As customers’ expectations for speed of service continue to rise, timeliness becomes more imperative. Even thirty minutes of latency can feel like an eternity, prompting enterprises to push toward real time. Time to value is critical.


Accelerating process intelligence with generative AI

There is one more elephant in the room when it comes to process intelligence, and our Q&A session with webinar attendees ensured it didn’t go unacknowledged.

Generative AI has obviously swelled in popularity and intrigue, even finding its way into cocktail parties and informal gatherings as a common discussion topic. Now, when decision makers are considering adopting AI-driven solutions, a natural and important question to ask is, “How can generative AI enhance this?” Our webinar attendees were no exception.

As solution providers, we’re seeing demand in two main areas: enterprises want to serve their employees better, and they want to serve their customers better. As generative AI matures, incorporating it into process intelligence platforms is going to drastically accelerate these outcomes.

Combining these technologies marries structured and unstructured data, expanding the utility of the digital twin model. The ability of generative AI tools such as large language models (LLMs) to navigate vast quantities of unstructured data reinforces the digital twin’s ability to understand and contextualize the insights it yields. In theory, it will be able to answer more specific questions and find patterns in data that aren’t always obvious or digestible, even in intuitive visual formats.

Generative AI will apply to varying degrees in each project, and decision makers should be wary of excessive hype—but when the value becomes undeniable, the last thing you should do is let your organization fall behind.

Access the recording of this session from ABBYY’s Intelligent Automation Month here.


Scott Opitz

Chief Product and Technology Officer at ABBYY

Scott Opitz is Chief Product and Technology Officer, responsible for global product strategy and execution. Prior to this, he held the position of Chief Marketing Officer, driving global marketing strategies. He joined ABBYY with the acquisition of TimelinePI, of which he was co-founder, President, and CEO from its inception. In this role, he oversaw the integration of TimelinePI’s process intelligence products (now ABBYY Timeline) into ABBYY’s worldwide sales and distribution channels.

A 30-year veteran of the computer industry, Scott has founded and built companies in the application integration, business process management, and business intelligence spaces. Scott founded and served as President and Chief Executive Officer of Altosoft Corporation, a business intelligence software company that was ultimately acquired by Kofax. Following the acquisition of Altosoft, Scott served as Senior Vice President & General Manager for Analytics.

In previous positions, Scott served as Senior Vice President, Worldwide Marketing & Business Development for webMethods (now Software AG), where he was responsible for webMethods’ marketing, business development, and strategic product planning. He joined webMethods as a result of its acquisition of IntelliFrame Corporation, a provider of data integration products and the InVista integrated workflow and BPM platform that he co-founded. Scott has also held numerous executive-level positions in technology, marketing and business development roles for public and private companies.
