Decision Management Drives Disruption

As processes, apps and resources get smarter, the way in which we make decisions, and manage decision-making, becomes ever more important. This is especially true of industries where the fundamental processes change relatively little, yet the business policies and regulations that affect those processes change much more rapidly. The need for greater agility has led to a resurgence in the search for methods that help businesses first visualise, and then execute, their ideas. In parallel, the standards related to business processes have taken a big step forward with Decision Model and Notation (DMN), which has wide implications for the new era of business automation. A battle is shaping up between established incumbents and a new wave of execution tools enabled by the DMN standard and its underlying expression language.

Top takeaways

Managing decisions, from visualisation to execution

Decision modelling, and the relatively new decision modelling standard DMN, have the potential to drive radical change in how you visualise, manage, control and automate aspects of your business.

With decision modelling, you now have a way of visualising the future – using models to express strategies, goals, processes, policies, rules and constraints – and then driving those visualisations all the way to execution. As we will see, this is already happening. However, there are significant challenges ahead.

DMN’s impact is potentially very significant, but FEEL is key

The DMN standard sets out to give business people a way of describing, and interacting with, the rules and constraints that they want implemented in systems. It also enables the rapid development of reusable components for that world.

What’s fundamentally different about the DMN standard is that the standardisation and accessibility of the notation is enabling a new wave of innovation – a wave that has managed to overcome many of the limitations of previous attempts to move toward model-driven applications.

However, that promise only becomes reality with full support for FEEL (DMN conformance level 3). An open source execution engine at conformance level 3 is available as part of Red Hat’s DMN implementation. While FEEL support may appear challenging to implement, it is certainly possible with a few months of programmer time.

Vendors are squaring up for a decision management battle

As with all standards, there are different interpretations of the way ahead. Incumbent vendors want to shape the narrative around the standard to favour their strategies – to disadvantage key competitors, or block the opportunities for new disruptive products to emerge. Each organisation looks at the opportunity (and threat) posed by the DMN standard through its own perspective.

The potential to drive radical change

Decision modelling, and the relatively new decision modelling standard DMN, have the potential to drive radical change in how you visualise, manage, control and automate aspects of your business. This change may take many forms including:

  • Decision modelling that shapes strategic thinking, for example:
    • As an engagement framework to align digital transformation efforts.
    • To create optimisation frameworks that resolve intractable/wicked problems.
    • To clarify, integrate and complement government regulations.
    • To help shape governance practices within the firm.
    • Shaping business analytics, metrics and the structure of business dashboards.
  • Decisions-as-a-Service (DaaS) that:
    • Package knowledge for sale, and provide on-the-fly analysis of data.
    • Animate “what if” scenarios against partial sets of input data.
  • Decision Services, as an enabling component of automated process execution, that deliver:
    • Rapid reconfiguration of outcomes in line with changing policies and regulations.
    • More effective and efficient use of process automation and RPA technology.
    • Optimisation using analytics and the definition of constraints, leading to more effective Next Best Action and Prescriptive Analytics.
    • Intelligent monitoring of IoT device data, leading to better predictive maintenance; even shaping how IoT devices generate and communicate that data.
  • Support for software bots and AI/ML technologies by:
    • Providing frameworks that help create training/learning mechanisms.
    • Defining constraints within which these environments operate.
    • Creating after-the-fact explanations for opaque AI recommendations/decisions.

Business now has a way of visualising the future – using models to express strategies, goals, processes, policies, rules and constraints – and then driving those visualisations all the way to execution. As we will see, this is already happening. However, there are significant challenges ahead. In the end, advances in decision management mean dramatic changes to:

  • How the business develops its strategies and tactics.
  • The vision for digital transformation and the design of appropriate systems architectures.
  • The roles and responsibilities of organisational governance.
  • The process architecture used to drive work, and “How things get done around here.”
  • How applications and systems are developed and maintained; indeed, the role of IT.

The reality is that many organisations are ill prepared to handle these changes.

A brief introduction to decision modelling and DMN

There’s nothing especially new about the desire, when designing systems, to separate processes from decision logic, data management and user interface definitions. Savvy application designers have always sought to reduce the amount of coding required. We’ve long understood the need for this separation of concerns – the difference is that we can now separately manage the lifecycles of these distinct types of artefacts.

Driven by the maturation of process modelling notations such as BPMN, vendors and consultants came together under the auspices of the Object Management Group (OMG) to create a standard decision-modelling notation. The “Decision Model and Notation” (DMN) was first published in September 2015, with the current version, DMN 1.1, released in June 2016. Most importantly, the standard incorporates conformance levels to facilitate interoperation and transport of definitions between products. Work is currently ongoing to finalise version 1.2, with several vendors already talking of support.

The DMN standard builds on a wide body of research and experience in building effective business rules environments. It emerged from a collaboration of key vendors and consultants active in the space – particularly FICO, Decision Management Solutions, Trisotech and Sapiens DECISION.

“The primary goal of DMN is to provide a common notation that is readily understandable by all business users, from the business analysts needing to create initial decision requirements and then more detailed decision models, to the technical developers responsible for automating the decisions in processes, and finally, to the business people who will manage and monitor those decisions.”

DMN sets out to bridge the gap between business decision design and decision implementation. It’s designed to work alongside BPMN and ensure decision models are interchangeable via XML. The specification defines a set of elements (types) and their connection rules, and a corresponding notation: the Decision Requirements Diagram (DRD). For decision logic modelling, it provides a language called FEEL (Friendly Enough Expression Language) for defining and assembling decision tables, calculations, if/then/else logic, simple data structures, and reuse of externally defined logic from Java and PMML. DMN also provides a notation for decision logic – Boxed Expressions – allowing the graphical representation of decision logic in “Decision Tables” (DTs) and associated with elements in a DRD.[i]
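To make this concrete, the sketch below shows roughly how an interchanged DMN XML model can be loaded and evaluated by an engine that executes DMN directly. It assumes the Java API of the open source kie-dmn project (part of the Drools/Red Hat stack discussed later); the namespace, model name and input names are invented for illustration, and the bootstrap calls may differ between versions.

    import org.kie.api.KieServices;
    import org.kie.api.runtime.KieContainer;
    import org.kie.dmn.api.core.DMNContext;
    import org.kie.dmn.api.core.DMNModel;
    import org.kie.dmn.api.core.DMNResult;
    import org.kie.dmn.api.core.DMNRuntime;

    public class DmnEvaluationSketch {
        public static void main(String[] args) {
            // Build a container from a project that packages the .dmn file on the classpath.
            KieServices ks = KieServices.Factory.get();
            KieContainer container = ks.getKieClasspathContainer();
            DMNRuntime runtime = container.newKieSession().getKieRuntime(DMNRuntime.class);

            // Namespace and model name come from the DMN XML file (invented here).
            DMNModel model = runtime.getModel("http://example.org/dmn", "Loan Approval");

            // Input data elements of the DRD are supplied through a context.
            DMNContext context = runtime.newContext();
            context.set("Applicant Age", 35);
            context.set("Credit Score", 720);

            // Evaluate every decision in the model and print each result.
            DMNResult result = runtime.evaluateAll(model, context);
            result.getDecisionResults().forEach(dr ->
                    System.out.println(dr.getDecisionName() + " = " + dr.getResult()));
        }
    }

The point is not the particular API, but that the same XML model a business analyst validated in a modelling tool is what the engine executes – there is no intermediate translation step.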

The important concepts in DMN are:

  • DRDs and DTs are graphical ways of expressing decision logic based on defined input data. Robust DMN models are more than just requirements specifications – they are the real thing! Anything you can model and validate, you can execute. The standard sets out precise definitions for both modelling and execution levels.
  • Boxed Expressions provide tabular structures and a graphical format. This core functionality replaces a traditional programming language with a tabular form and an accessible expression language – Friendly Enough Expression Language (FEEL). S-FEEL is a simple subset of FEEL.
  • Business Knowledge Models (BKMs) support reuse through encapsulated logic. Effectively, they become linking mechanisms for DRDs; BKMs allow the modeller to aggregate models together to handle and hide complexity, simplifying a DRD to contain the essential elements of the business domain. In other words, BKMs provide a container for knowledge and decision logic. What’s more, with a simple DRD to access it, you can immediately turn a BKM into an executable and published service!

From the ground up, FEEL and Boxed Expressions were designed to work with BKMs. FEEL provides a standardised way of hiding the underlying complexity. Without full FEEL support, there’s little prospect of a truly portable and executable language designed for non-programmers.[ii]
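To give a flavour of the expression language itself, the sketch below evaluates a standalone FEEL expression of the kind a business analyst might write in a boxed expression. It assumes the standalone FEEL API of the open source kie-dmn-feel module; the expression and input names are invented for illustration.

    import java.util.Map;
    import org.kie.dmn.feel.FEEL;

    public class FeelSketch {
        public static void main(String[] args) {
            FEEL feel = FEEL.newInstance();

            // An Excel-like, if/then/else expression with a named input value.
            Object discount = feel.evaluate(
                    "if total > 1000 then 0.10 else 0.05",
                    Map.of("total", 1250));

            System.out.println(discount); // expected: 0.10
        }
    }

Expressions at this level of abstraction – readable, yet precise enough to execute – are what make FEEL central to DMN’s promise of portability for non-programmers.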

Clear decision models drive alignment

Although many see the opportunity for decision modelling principally as a way to radically improve automation, decision models exist at many levels and serve different purposes – addressing strategy, tactics, business operations (several forms), governance, regulations and compliance, and even change programmes themselves, to name just a few. Visualising those different use cases is challenging, with most businesses still relying on “methodless” PowerPoint or Visio diagrams to create alignment and clarify a common language. Moreover, when organisations seek radically different operating models, they often struggle to see the wood for the trees. Decision modelling can also help tease out better governance practices.

While there are other aspects of organisational alignment, during our research we uncovered several instructive examples where decision modelling helped clarify the need for data and/or was used to build internal collaboration and cohesion:

  • The New Zealand tax authorities used decision modelling as a transformation enabler. Traditionally, they had focused on product-centric risk management strategies. They wanted to become more customer-centric, intelligence-led and analytical. They wanted to move Goods and Services Tax (GST) assessments from a more reactive risk identification and risk measurement focus, toward one based on profiling types of risks and customers. This would have wide implications for other forms of taxation and require cross-government collaboration. By adopting a decision-modelling framework based on DMN, they improved clarity, understanding and thinking. They also improved communication, documentation and overall agility.
  • Liquidity risk profile reporting project within a financial services organisation. The team, split across four countries, was trying to interpret a 450-page specification from the regulator. Although they all worked for the same organisation, team members lacked a common language. Within a month, a careful examination of the specification using the DMN decision modelling standard had enabled all team members to engage – collaborating and even enjoying the process – as they clarified the detailed data requirements. It’s interesting to note that they found that virtually all the existing solutions to the problem, and the requirements, were expressed in complex Excel spreadsheets.
  • A simple business rules and decision framework helped create a common language and approach to optimise traffic flows in the Netherlands. Dealing with traffic congestion in densely populated areas is challenging, with the highways authority, municipalities and provinces all having their own approaches, ideas and language. The underlying problem is relatively complex because distances between junctions and interchanges are short and speed limits are not uniform. Through a series of workshops, representatives of the different parties identified a common vocabulary and approach for network-based traffic management. The approach disentangled problem detection, problem solution and conflict handling in the road network. Each element in the road network follows the same generic rules and defers traffic problems onto other roads with lower priority. In this way, traffic is distributed across the road network and congestion eased.[iii] The approach is a good example of how explicit and simple decision tables have helped organisations to collaborate more effectively. (A simplified sketch of this kind of deferral rule follows this list.)
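The sketch below is a highly simplified illustration, in code, of the kind of generic deferral rule described above: a congested element looks for a connected alternative with a lower agreed priority and defers traffic onto it. The data structure and rule are invented for illustration and do not reflect the actual Dutch implementation, where this logic lives in shared decision tables rather than code.

    import java.util.Comparator;
    import java.util.List;
    import java.util.Optional;

    // Hypothetical road segment with a priority agreed between the road authorities.
    record Segment(String name, int priority, boolean congested, List<Segment> alternatives) {}

    class TrafficDeferralSketch {
        // Generic rule (simplified): when a segment is congested, defer traffic onto a
        // connected, uncongested alternative with a priority below the segment's own.
        static Optional<Segment> deferTarget(Segment segment) {
            if (!segment.congested()) {
                return Optional.empty();
            }
            return segment.alternatives().stream()
                    .filter(alt -> !alt.congested())
                    .filter(alt -> alt.priority() < segment.priority())
                    .min(Comparator.comparingInt(Segment::priority));
        }
    }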

Based on these examples, it’s clear that change management teams should consider adding decision modelling as a key technique for change initiatives. Decision modelling helps create alignment through a neutral framework that:

  • Encourages alignment between disparate groups.
  • Clarifies data handling needs.
  • Promotes the development of a common language.

Ensuring traceability between business intent and execution

Clarifying decision models simplifies and elucidates business intent, which in turn, offers the chance to ensure fidelity in execution and many opportunities for optimization in business operations.

Separation of concerns between upstairs and downstairs

One could think of the clarification process as occurring “above the floor,” while the automation and optimisation action is happening “below the floor.” Historically, developing meaningful systems (the wiring and plumbing below the floor), involved:

  • IT analysts who interviewed business people about how they did things. They wrote specifications and user stories, with varying degrees of clarity, to reflect planned changes in working practices and the new vision for serving customers better.
  • A black art called “programming”, through which the IT function later translated these specifications into applications that provide the wiring and plumbing to support the work of the organisation.[iv]
  • Specialist human resources to both develop the systems, and to plan and govern changes to them. Heaven forbid that the knowledge of what’s in these systems should retire or leave the firm … er, hold on.

And that’s not the full extent of the challenges. In most modern organisations, the traditional levers provided to business people to flex their operations in near real time have been, until now, relatively rudimentary. Moreover, look below the floor and you will find “spaghetti junction.” Nobody dares change anything because of the apparent complexity. For any change, it’s back to IT for modifications.[v]

Bridging the abstraction gap

Business people need the ability to take things apart and put them back together. What has changed is that organisations now have accessible methods for describing business intent above the floor that enable them to design and govern how work happens below the floor. The underlying challenge here is a human one: the large majority of people just cannot think abstractly. Another analogy demonstrates the point. Most people understand Excel at a usage level and can construct simple spreadsheets. Over time, most business people work out how to use a few Excel functions. Yet very few ever get to grips with developing Excel macros, which is, quite clearly, programming.

Dealing with the sophisticated capabilities of modern process modelling tools is a similar story. Even with training, many business people simply glaze over when asked to consider the detailed process models needed for meaningful process execution. This is where decision management – or more specifically, decision modelling – comes in. Apart from complementing process modelling, decision modelling sets out to:

  • Facilitate the development of better business requirements. When business people can better describe their needs – developing thinking, clarifying language, and formalising definitions of business functionality – then the organisation derives significant benefits.
  • Provide the equivalent of Excel functions. Think of it as a mid-point in the abstraction scale – something that is sophisticated enough to enable people to achieve their goals in an elegant way, yet simple enough that, with very little training, business people can understand and use the features to reflect their needs.
  • Enable some business people to span the above-below floor challenge. It would allow them to express their goals and decision-making logic in a precise and unambiguous way, such that computers could execute that logic without direct translation and reinterpretation by programmers.
  • Deliver a way of teasing apart and encapsulating the underlying complexity. It helps clarify the need for supporting data, providing an opportunity to simplify complex inter-related issues, and deal with the spaghetti below the floor. Indeed, with appropriate methods and models, it becomes possible for business people to not only package discrete business capabilities, but also control what is happening in the execution domain – without resorting to programming.[vi]

Treating decisions as first-class citizens enables better service architectures

Another important aspect of robust decision models is the ability to support better and more effective execution solutions. It’s here that big opportunities exist to create organisational value. With clarity of purpose, it’s possible to connect and integrate disparate parts of the organisation, leading to better experiences for consumers and citizens. Ultimately, it also creates a more agile organisation.

To comprehend how this is playing out, one only needs to look at the challenges faced by governments – whether at the national (federal), county (state) or city levels. In these scenarios, there are many situations where complex and rapidly evolving policies overlap. For example:

  • The government of a major European economy is in the process of integrating all the regulations related to the environment and water management. Think sewage, water treatment, canals and dikes, roads, construction, agriculture … and you get the idea. This department is merging 26 distinct sets of regulations into one. A key goal is to provide citizens and businesses with a comprehensive set of guidance and advice through a central portal. It will also provide the infrastructure for all the water boards and municipalities to create their own implementations of the regulations to meet the specialised needs of their customers. That portal will almost certainly leverage open source servers such as those provided by Red Hat or Camunda.
  • Benefits eligibility must factor in multiple overlapping policies.[vii] In this working example, decision management models integrate with a modern BPM suite and Robotic Process Automation (RPA) technology to support the adjudication process for tens of millions of citizens each year. The system, developed by a major system integrator to support the adjudication process of the US Affordable Care Act (otherwise known as Obamacare), receives applications that trigger the creation of long-running cases in the BPM suite. To gather supplementary information about the claimant, decision logic modelled in a separate environment identifies which legacy systems to access. RPA software accesses those systems, with the results assessed again by another set of decision logic to determine both eligibility and the identification of the appropriate caseworkers to handle the claim. The BPM system then surfaces the work item to the relevant caseworker along with a recommended determination and probable outcome; supplemented with summary information listing the applicable policies and their precedence, along with an “audit package” containing all related decision logic. All the relevant policies and exceptions are modelled independently of the processes. A service-level interface makes these models available to the BPM suite, which now follows a relatively simple process. When policies governing eligibility change, business administrators tweak decision models rather than rewriting code, or rerouting processes.

Decision management services, and the models that support them, are the enabling components of these two examples. The key takeaway here is that decision management services have to interoperate with other common components. When that happens, a certain degree of disruption occurs within the prevailing systems architecture of major corporations. Instead of endlessly programming changes, business people will just reconfigure decision models and processes.
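As a concrete illustration of this interoperation, the sketch below shows how a process or RPA layer might call a deployed decision service over HTTP to obtain an eligibility determination, in the spirit of the benefits example above. The endpoint, payload and field names are entirely hypothetical – this is not any specific vendor’s API.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class EligibilityCallSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical JSON inputs for a deployed "benefit eligibility" decision model.
            String inputs = "{ \"Applicant Income\": 18000, \"Household Size\": 3, \"Recent Hardship\": true }";

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://decisions.example.org/services/benefit-eligibility"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(inputs))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());

            // In the scenario described above, the response would carry the recommended
            // determination plus the audit package of policies and decision logic applied.
            System.out.println(response.body());
        }
    }

When policies change, only the decision model behind that endpoint changes; the calling process keeps making the same request.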

However, for these sorts of architectures to become reality, the mindset of the enterprise/systems architecture community needs to change. As shown in Figure 1, “decision management as a service” fits neatly into the automation architecture required for digital transformation.

Although we have not reviewed products that are currently integrating decision management with AI or machine learning (ML), the opportunity is clear. Decision management has the potential to provide frameworks that help create training/learning mechanisms for AI/ML. There’s also a possibility of using decision management to set constraints within which these environments operate. Finally, where an “opaque” AI has made a recommendation, decision modelling provides a way of clarifying how that decision was made, citing the correlations, conclusions and policies that led to the recommendation.

Similarly, decision management has the capability to transform the Internet of Things, providing model-driven ways of articulating the behaviour of systems. It’s also possible to imagine decision management capabilities built into the design environments that create the sensors themselves, allowing more intelligence at the edge.

 Figure 1: Decision management acts as a key component

Source: MWD Advisors

Vendors square up for the battle of decision management

As with all standards, there are different interpretations of the way ahead. Incumbent vendors want to shape the narrative around the standard to favour their strategies – to disadvantage key competitors, or block the opportunities for new disruptive products to emerge. Each organisation looks at the opportunity (and threat) posed by the DMN standard through its own perspective. Some see:

  • The creation of new markets supported by a broad ecosystem of collaborations. We now have multiple tools available for modelling DMN, with varying degrees of support for the standard. We also have open source support for the direct execution of these models.
  • A sales funnel for their established offerings. What once required programming is now accessible to the sophisticated business user through the addition of a DMN modelling tool and/or appropriate import interfaces. This increases the opportunities to utilise existing tooling built into current offerings – effectively growing the product footprint.
  • A way of enabling a market for knowledge content and intellectual property. Rather than just the technology vendors, all sorts of organisations now have the opportunity to componentise their IP and offer it for sale via standard, executable hooks.[viii]
  • The need to rebrand or reframe existing functionality to claim compliance with the standard. By cherry picking which bits of the standard they support, a vendor can talk up their support without any attempt to embrace model portability. Think of it as the ability to put a tick in the box on RFP documents.[ix]

The three core elements of a decision management solution are modelling, repository management and execution. Vendors have taken different stances in each area.

Modelling tool vendors are embracing DMN

At the time of writing, we have recently published detailed reviews of the decision modelling capabilities of both Trisotech and Signavio. Both of these vendors have integrated DMN with BPMN modelling, and Trisotech has integrated it with its CMMN modeller.[x] Other vendors also provide decision modelling within their suites (covered below). Sapiens DECISION also provides modelling capabilities, although at the time of writing these are limited to its TDM approach (we understand a DMN modeller will soon be available). We are also aware of DMN modelling tools from Sparkling Logic and ACTICO.

Although not technically a part of the standard, virtually all the products provide some degree of model repository capability. Quite apart from the ability to store and control access to model definitions, key elements of the repository include:

  • Dictionary/glossary functionality. Used to ensure that the “language” is consistent across models; the functionality required here covers quite a range. Some tools take a relatively simplistic, unidimensional view of the dictionary, while others set out to keep track of the subtle differences in meaning across business units or geographies. Some organisations will want to ensure that these semantic definitions are synchronised with other environments such as enterprise architecture modelling repositories and master data services.
  • Versioning of model objects and the elements within them. Clearly, process and decision models can have quite different lifecycles. This can get very complex; it’s often important to track which decision model version was used on a specific case. Associated with this is the need for governance and control over those models and related artefacts, ensuring that the correct models are deployed together, and that the right roles have signed off changes. Organisations will probably want the tooling to support their specific requirements rather than rely on a single generic authorisation process.
  • Service level interfaces for external applications. At the simplest level, export of models might suffice, but most organisations will want to access decision models as a service, with the result served up to third-party applications. With a robust decision model interface, it’s possible to allow the same policies to govern both process routing and user interface elements within an application. It’s here that conformance to the DMN standard comes into play. While proprietary modelling extensions to the standard might deliver some short-term advantage, longer-term portability between environments becomes the dominant factor. (A minimal sketch of such a service-level interface follows this list.)
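To illustrate what a service-level interface over the repository might look like, here is a minimal, hypothetical Java interface combining evaluation with version tracking. It does not reflect any specific product; it simply makes the versioning and service-access points discussed above concrete.

    import java.util.Map;

    /** Hypothetical service-level interface onto a decision model repository. */
    public interface DecisionModelService {

        /** Evaluate a named decision model at a specific version against a set of inputs. */
        Map<String, Object> evaluate(String modelName, String modelVersion, Map<String, Object> inputs);

        /** Record which model version was used for a given case, to support later audit. */
        void recordUsage(String caseId, String modelName, String modelVersion);

        /** Export the model as DMN XML so it stays portable between environments. */
        String exportAsDmnXml(String modelName, String modelVersion);
    }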

Execution tools represent the key battleground

It’s within the execution arena that the real battle for decision management is shaping up. On the one hand are the vendors that recognise that, almost by definition, robust decision models become executable without further translation. This is the ecosystem approach. Vendors that have embraced that vision from the start include:

  • The open source community. By relying on external modelling tools with full compliance to the standard (such as Trisotech), these vendors can dramatically increase the footprint of their typical users. They’re no longer constrained to only supporting programmers, and can now provide direct support to ‘citizen developers’ and sophisticated business users. What’s interesting here is the speed with which open source offerings have developed in the decision management arena. Today, open source vendor Red Hat already offers direct execution of DMN models at conformance level 3 (full FEEL support) within its Community Edition of the DROOLS product set.[xi] We understand that Red Hat intends to roll out support for DMN within its Enterprise Edition by the end of the year.[xii] This will really put the cat amongst the pigeons.[xiii] Open source vendors Camunda and Edorasware (Flowable) also offer DMN decision table modellers as a part of their BPMN based workflow suites. Edorasware also has an API enabled DMN execution engine, with a DRD modeller due in a few months.[xiv] Given that these two vendors typically target the software development community, it’s not surprising that they have yet to implement FEEL within their engines.
  • Oracle Cloud Apps suite also supports full FEEL and conformance to DMN level 3. The DMN tooling works seamlessly with Oracle’s BPMN 2.0 modelling tool to enable the configuration of Oracle’s range of business applications. We understand that this suite will soon fully support both import, and perhaps more importantly, export of DMN XML models. The only real downside is that execution is limited to Oracle’s Cloud. Customers will need to embrace the fuller Oracle offering for this to be economically viable. We would have loved to see DMN and BPMN execution offered separately from the rest of the Oracle Cloud Apps suite, at a much lower price point.

At the other end of the spectrum are those that are happy to import DMN models and then translate them into their own proprietary execution languages. For them, DMN is all about better-defined requirements.[xv] These include:

  • IBM with Decision Composer. We understand that IBM has developed a tool called Decision Composer, which is currently offered as an “experimental service” on Bluemix and accessible from the Business Rules service of IBM Cloud. At the time of writing, it’s not clear how IBM will take its decision management capabilities to market. As far as we are aware, IBM has not implemented FEEL support, instead importing directly into its Operational Decision Manager (ODM) product.
  • FICO offers a free standalone DMN modeller that can then import into the FICO suite. The FICO Decision Management Suite comprises authoring components (including analytics, decisions, and optimisation) with application development and productivity tools. FICO, best known for its FICO Score (credit scores), provides a strong example of the ability to package knowledge, building a sustainable business model as a result.
  • Sparkling Logic and ACTICO also seem to have broader decision management suites. As far as we can tell, they also import DMN models into their proprietary suites. We have not reviewed or explored either product in any meaningful way.

A few other vendors have implemented support for DMN decision tables within their product set. One example is Appian, which already had a useful, but proprietary, expression development and execution capability.

With all of these different flavours of specification implementation, one might ask, “What’s the point of a standard if everyone implements it differently?”[xvi] This has led to an important effort to resolve and clarify conformance to the specification – the Technology Compatibility Kit (TCK).[xvii] The DMN TCK is an open source effort to encourage vendors to implement the DMN standard consistently. The group has developed a number of vendor-neutral tests that conform to the standard. Vendors can use these tests to check conformance and publish their results.
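In spirit, a TCK-style check is simple: run a published model against published inputs and compare the engine’s answer with the published expected result. The sketch below is a generic illustration of that idea only – it is not the actual TCK file format or runner, and all names are invented.

    import java.util.Map;

    public class ConformanceCheckSketch {

        /** Minimal abstraction over whichever DMN engine is being tested. */
        interface DmnEngine {
            Object evaluateDecision(String modelFile, String decisionName, Map<String, Object> inputs);
        }

        /** Returns true if the engine reproduces the expected result for one test case. */
        static boolean passes(DmnEngine engine) {
            Object actual = engine.evaluateDecision(
                    "example-test-case.dmn", "Approval Status",
                    Map.of("Applicant Age", 17));
            return "Declined".equals(actual);
        }
    }

The value of the real TCK is that the test cases are shared and the results are published, so claims of conformance can be compared like for like.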

What it means for business

The effective fusion of business control and automated execution has all sorts of implications for business – especially around governance, innovation and agility. If a business is able to understand decision management “above the floor”, and effectively facilitate it in the infrastructure foundation “below the floor”, then:

  • Rapid evolution and rollout of business functionality becomes the norm. We no longer need highly controlled application rollout procedures, with endless regression testing to ensure that applications work appropriately. Putting it another way, instead of relying on finely crafted and tested programs – there’s no such thing as a small change in software – firms can deploy a robust decision service that executes the logic that was modelled. Models now evolve without requiring a change in software.
  • Governance now sits more firmly in the business domain. Business people can develop and validate the policies that make sense to them. Business people clarify their business intent through models rather than incomprehensible (to them) specifications. Of course, this means the development of sophisticated governance practices on the business side.
  • The cost of ownership of business applications falls dramatically. As the business takes direct control, the costs of deploying effective technology drop. Fewer cycles are required to achieve the same result. More subtly, because these models execute without translation, the need for expensive and tightly integrated software stacks disappears.
  • Innovation capabilities are enhanced. With a relatively direct line of sight between business intent and execution, the ability for the organisation to pivot and transform itself is also radically improved. The overall velocity of the enterprise is profoundly changed.
  • Businesses can protect their investments in developing decision policies. With a portable standard, the investments made by organisations become far more future-proof than today’s approach. Moreover, the training of people becomes consistent and reusable.

Challenges on the road ahead

However, achieving these goals implies that:

  • Firms need to develop new methods. Just as developing robust process architectures requires the interaction of multiple process components, so decision models need to work together to handle repetitive actions, lists, and recursive behaviour. More subtly, firms also need to develop new competencies to leverage the power of the notation and automated execution.
  • Software architects must follow a different pattern, and adopt a different attitude, when designing systems. Rather than relying on the static configuration of standard amorphous applications, decision management offers the chance to control systems based on models – i.e. changing the behaviour of a system means changing the models that control it.
  • Those who normally govern how work happens ‘below the floor’ cede control to the business over certain aspects of functionality and governance. It also means developing the right skills and governance capabilities within the business itself. Incidentally, DMN itself can help firms unpick the control challenge – providing a way of modelling how to make those governance decisions.

Notes

[i] These two paragraphs are largely culled from the early sections of the DMN specification itself. http://www.omg.org/spec/DMN/1.1/

[ii] See Bruce Silver’s excellent introduction to DMN and FEEL http://methodandstyle.com/dmn-decision-modeling-language/ (as of July 2016), as well as his book “DMN Method and Style: The Practitioner’s Guide to Decision Modeling with Business Rules” https://www.amazon.co.uk/d/Books/DMN-Method-Style-Practitioners-Decision/0982368151

[iii] Presentation by Silvie Spreeuwenberg at Decision Camp 17 and paper available at http://ceur-ws.org/Vol-1875/paper8.pdf

[iv] Agile software development hasn’t changed that overall approach much, apart from enabling it to happen faster and more iteratively.

[v] For the purposes of our analogy, it is irrelevant whether these systems run in house, or in the cloud.

[vi] Think of this as finally delivering on the promise of “Unbundling the Corporation” as predicted by John Hagel and Marc Singer in 1999. https://hbr.org/1999/03/unbundling-the-corporation.

[vii] These policies tend to change suddenly – in line with changes in the political landscape, public opinion, media interest and the finances available. Moreover, the complexity associated with the precedence of one policy over another makes it virtually impossible for any human to maintain an up-to-date view of the logic – no matter how experienced. For example, while your income might suggest that you are ineligible for a housing assistance benefit, your recent circumstances – say, your house has just burnt down – may override that initial determination of ineligibility. The sheer complexity of these sorts of regulations and exceptions often leads to the use of subjective judgement, rather than the fair implementation of the law.

[viii] For example, one insurance organisation with robust execution capabilities in claims processing has opened up its operations to brokers, using DMN to enable them to design and price products, leveraging the componentized infrastructure of the firm. Similar things are happening in the telco industry where new virtual network operators are emerging to leverage the infrastructure of a larger player. Or think of Amazon – by industrialising and service enabling its technology infrastructure, Amazon now offers this capability to the market and acts as a customer itself, enabling a new rapidly-growing, and profitable, revenue stream.

[ix] This is one of the reasons that BPMN had such mixed success – it was never truly portable before BPMN 2.0.

[x] Both products have integrated other modelling approaches also, but here we focus especially on the decision modelling capabilities.

[xi] See the end-to-end DMN demonstration first shown at bpmNEXT in April 2017. This excellent short primer introduces several important aspects of the standard: https://www.youtube.com/watch?v=JZp8EgKoXzg. The execution aspects start at around 15 minutes in.

[xii] For other vendors interested in implementing DMN level 3 (which includes FEEL support), the Red Hat implementation contains all the code necessary. As a result, it should be possible for third-party vendors to implement FEEL support without too much difficulty.

[xiii] While all organisations are looking for ways of saving money, large enterprise users want to hear that support will also be available. For Red Hat, this increases their ability to challenge existing rules vendors such as IBM.

[xiv] See the way in which DMN has increasingly featured in the release history of Flowable (the process engine of Edorasware). https://github.com/flowable/flowable-engine/releases

[xv] Similar to the way in which BPMN developed, incumbent vendors – with established markets for their proprietary approaches – tend to endorse the standard but stop short of embracing the full promise of the specification. In the case of DMN, that means modelling Decision Tables but stopping short of the FEEL language, preferring their own proprietary execution languages.

[xvi] See Keith Swenson’s introduction at the bpmNEXT event https://www.youtube.com/watch?v=M8goCq72lbo

[xvii] https://github.com/dmn-tck/tck
