Enabling Operational Excellence

TURNING OPERATIONAL KNOWLEDGE & COMPLIANCE INTO A COMPETITIVE EDGE

We systemize tacit knowledge into explicit knowledge


The Procedural Paradigm Won’t Scale: We Need Configuration Agility!

It’s been said that I claim the procedural paradigm won’t scale anymore. Guilty as charged! Let me explain.

Procedural vs. Declarative

In the big scheme of things, you have two basic choices for conceptualization, and ultimately implementation, of business capabilities: procedural vs. declarative. Let’s make sure we agree on what these terms mean. I’ll draw directly on Merriam-Webster Unabridged so we’re on the same page. If the terms don’t mean what they’re supposed to mean, all bets are off. But I guess that goes without saying, doesn’t it?

procedure: 1a: a particular way of doing or of going about the accomplishment of something 1b (1): a particular course of action (2): a particular step adopted for doing or accomplishing something (3): a series of steps followed in a regular orderly definite way

You can spot the seeds of the scalability problem right away in the repeated use of the word “particular” and in the phrase “regular orderly definite way.” Given the degree of product/service customization desired today, and the accelerating rate of change, how much business activity still occurs in a particular and regular orderly definite way? The answer, of course, is less and less all the time. ‘Exceptions’ have become the rule.

The essential characteristic of procedures is that they flow. The flow comprises the steps by which a thing is intended to become a different thing (or the same thing in a different state). The essence of ‘procedure’ is therefore that something will, hopefully, be transformed. For sure, that’s a very basic, very important, very necessary part of any business capability. The problem arises in taking procedure beyond that point.

Something declarative, in contrast, doesn’t flow. It just states something that must (or should) be true.

declarative: 2: having the characteristics of or making a declaration : ASSERTIVE;  specifically : constituting a statement that can be either true or false

Business rules are that way; they simply give guidance. They don’t do anything. They don’t flow anywhere. They can’t be anything other than true or false. In short, business rules are fundamentally different from procedures.

Big-P Process

The traditional procedural paradigm (I’ll call it Big-P Process) embeds business rules in procedures (and in process models and in procedural code). What happens when you treat things that could be declarative in a procedural way? You get bloat. You lose business intent. You produce needless complexity. And you also get what I call configuration stagnation. As you scale, these problems grow exponentially.

How many business rules are we talking about? Any given business capability easily has hundreds, sometimes thousands, of business rules – especially when you begin to factor in the know-how needed to make smart operational business decisions. And don’t our businesses increasingly depend on ever more complex know-how? Is there any end to that trend in sight?

At the scale of today’s business, the Big-P Process paradigm simply doesn’t work. It results in ungovernable business operations and unretainable know-how. Big-P solutions are like setting the business in concrete. It’s all so unnecessary and so counterproductive. It’s just not smart.

Configuration Agility

The key question for agile business capabilities is how the business is configured (and quickly reconfigured) for operation at any given point in time. In the Big-P paradigm, the building blocks become thoroughly entangled with flow (procedure). The result is essentially a semantic dead zone. Because things that could be expressed declaratively aren’t, the opportunity is lost to use logic to evaluate business rules (read ‘business practices’) automatically for conflicts, anomalies, and other logical defects. The future clearly does not lie in that direction.
Instead, it lies with granular, declarative, semantically rich specification of business configurations in building-block fashion. It lies with the paradigm that can produce optimal configuration agility. In addition to procedures, smart configuration models will feature at least these other building blocks for business capabilities, all specified at the business level:
  • business rules 
  • operational business decisions 
  • structured business vocabularies (concept models, also known as fact models) 
  • business goals and risks 
  • business events
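As a minimal sketch of what declarative, building-block specification of business rules might look like in code, consider a toy order-handling capability. All names, rules, and thresholds below are illustrative assumptions, not taken from the article:

```python
# Business rules expressed declaratively: each is just a named statement
# that evaluates to true or false for a given case. Nothing here flows;
# the rules only give guidance. All names and thresholds are made up.

RULES = {
    "discount must not exceed 20%":
        lambda order: order["discount"] <= 0.20,
    "a rush order must include a contact phone":
        lambda order: not order["rush"] or bool(order.get("phone")),
}

def violations(order):
    """Return the name of every rule the order violates."""
    return [name for name, holds in RULES.items() if not holds(order)]

order = {"discount": 0.25, "rush": True, "phone": ""}
print(violations(order))   # both rules are violated
```

Because the rules are data rather than control flow, they can be listed, audited, and checked against one another for conflicts without tracing any procedure.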
From an engineering perspective, the secret to agile configuration is ‘late binding’ – that is, bringing all the pieces together for execution (i.e., performance of procedures) as late as possible. That way, performance can be as up-to-date and as flexible as possible.

Smart configuration models should be the new mantra for enterprise architecture. In a world of constant and accelerating change, I simply see no alternative. Doing more of the same is simply not going to work anymore (and hasn’t been for a good, long while).

[Warning, plug coming]: Smart configuration schemes also address business governance and compliance – essential in a world of constant change – and just-in-time (JIT) delivery of know-how for operational workers. In our new book, Building Business Capabilities (see http://www.brsolutions.com/b_building_business_solutions.php), we call systems built using smart configuration models business operation systems (as opposed to ‘information systems’).
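The late-binding idea can be sketched as a procedure that looks up the current rule set only at the moment it executes. The registry, rule names, and limits below are illustrative assumptions:

```python
# Late binding: the procedure binds to whatever rules are registered at
# execution time, so reconfiguring the registry changes behavior without
# editing the flow itself. Names and limits are illustrative.

RULE_REGISTRY = {
    "credit limit": lambda order: order["total"] <= 5000,
}

def execute_order(order):
    # Binding happens here, as late as possible.
    for name, rule in RULE_REGISTRY.items():
        if not rule(order):
            return f"rejected: {name}"
    return "accepted"

print(execute_order({"total": 6000}))        # rejected: credit limit

# Reconfigure the capability without touching the procedure:
RULE_REGISTRY["credit limit"] = lambda order: order["total"] <= 10000
print(execute_order({"total": 6000}))        # accepted
```

The same procedure yields up-to-date behavior the moment the configuration changes; nothing is compiled into the flow.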


Ronald G. Ross

Ron Ross, Principal and Co-Founder of Business Rules Solutions, LLC, is internationally acknowledged as the “father of business rules.” Recognizing early on the importance of independently managed business rules for business operations and architecture, he has pioneered innovative techniques and standards since the mid-1980s. He wrote the industry’s first book on business rules in 1994.

Comments (2)

  • Stephen F. Heffner


    Late binding (delayed evaluation) — there are several scenarios in which this
    comes into play.

    a) As Ron mentions in the article, late binding allows the use of the most
    up-to-date data available. This is basically a “just in time” scenario.

    b) Business rule B may need information that rule A generates, so B can’t be
    evaluated (fired) until A is, so evaluation (binding) of B must wait on A.
    This is a simple prerequisite scenario.

    c) Evaluation of rule A must precede rule B, but A generates algorithmic or
    sub-rule based results that can’t be fully bound until information is available
    from the evaluation of B. So A passes unbound results in symbolic form to B,
    which forces their evaluation. This is the most complicated scenario, because
    it involves partial evaluation of A, with the remainder being passed to B for
    it to evaluate after it has generated the information needed to do so.
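    Scenarios (b) and (c) can be sketched with deferred evaluation. The fact names and values below are invented for illustration:

```python
# Scenario (b): rule B needs a fact that rule A generates, so B forces
# A first (a simple prerequisite). Scenario (c), in miniature: a result
# is passed around unevaluated, in symbolic (callable) form, and forced
# only once the needed information exists. All names/values are made up.

facts = {}

def rule_a():
    facts["risk_score"] = 42          # A generates information

def rule_b():
    if "risk_score" not in facts:     # B must wait on A
        rule_a()
    return facts["risk_score"] > 40

# An "unbound" result: not evaluated until explicitly forced by a call.
doubled_risk = lambda: facts["risk_score"] * 2

print(rule_b())         # True  (forces rule_a as a prerequisite)
print(doubled_risk())   # 84    (forced only after the fact exists)
```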

    I’m intimately familiar with late binding, because the rules language for our
    computer language expert system provides a lot of control over when things are
    bound (evaluated), including very granular provisions for partial evaluation,
    protection from evaluation, and forcing of evaluation. This turns out to be
    very useful for scenarios like c) above, which occur pretty frequently when
    symbolically processing computer language content.

    Procedural vs. declarative — the flaw in Ron’s argument against procedural
    specification of rules is in his statement that “business rules…can’t be
    anything other than true or false”. Some business processes are, by definition,
    a sequence of steps, requiring a _procedure_ (sequence of steps) to be
    performed in order to execute them. I agree that when the declarative approach
    can be used, it’s advantageous to do so. But I also think a procedural
    approach, when it is necessary, can scale.

    Ron’s call for “just in time” and event-driven business rules is nothing new.
    Real-time systems have had to handle this for a long time, including complex
    “business rules” (procedures) that are event driven. Such systems have scaled
    to large quantities of data very successfully, with careful tuning. Examples
    can be found in the realm of operating systems.

    Another way to look at this is to contrast procedure-driven vs. data-driven
    algorithms. Data-driven algorithms, where possible, are almost always more
    efficient and easier to maintain than procedure-driven ones (although they
    demand a higher level of skill and experience to deal with them). But there
    is a grey area here: a table may itself specify a procedure (series of
    steps) when there is inherent sequentiality in its definition.
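    The contrast between procedure-driven and data-driven algorithms can be shown with a dispatch table. The shipping categories and fees below are made-up examples:

```python
# Procedure-driven: behavior is encoded as control flow.
def fee_procedural(kind):
    if kind == "standard":
        return 5
    elif kind == "express":
        return 15
    elif kind == "overnight":
        return 30
    raise KeyError(kind)

# Data-driven: the same behavior as a table. Adding a category means
# editing data, not code (though, as noted above, a table can still
# encode a sequence if its definition is inherently ordered).
FEES = {"standard": 5, "express": 15, "overnight": 30}

def fee_data_driven(kind):
    return FEES[kind]

print(fee_procedural("express"), fee_data_driven("express"))   # 15 15
```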

    I think the most promising theme in Ron’s article is the idea of
    self-configuring business rules, which can adapt to change and effectively
    optimize themselves in response. But I don’t see that a procedural approach,
    when it’s appropriate, is an impediment to that approach. In fact, the
    adaptation and optimization logic itself is likely to be quite procedural. The
    saving grace is that if it’s sufficiently abstracted, its procedurality need
    not imply either a failure to scale or difficulty in maintaining and enhancing
    it.

    I had to think pretty carefully about the trade-off between the procedural and
    declarative approaches when I designed the rules language for our computer
    language expert system. The result is a language that’s a hybrid of both,
    which turns out to work very well. Of course, the rules written in that
    language aren’t exactly “business rules”, since they direct the manipulation of
    computer languages rather than the more straightforward data a typical
    enterprise must manipulate, but I’m not convinced that’s enough of a
    distinction to invalidate the success of the design I chose. And it scales
    very well, handling large quantities of computer code in reasonable time, even
    when using complex rules.
