Ontologies as Productive Assets

Preamble

An often overlooked benefit of artificial intelligence has been a renewed interest in seminal philosophical and cognitive topics, with ontologies at the top of the list.

The Thinker Monkey, Breviary of Mary of Savoy

Yet that interest has often been led astray by misguided perspectives, in particular:

  • Universality: one-size-fits-all approaches are pointless, if not self-defeating, considering that ontologies are meant to target specific domains of concern.
  • Implementation: the focus is usually put on representation schemes (e.g. the Resource Description Framework, or RDF), instead of the nature of the targeted knowledge and the associated cognitive capabilities.

Those misconceptions, often combined, may explain the limited practical inroads of ontologies. Conversely, they also point to ontologies’ wherewithal for enterprises immersed in boundless and fluctuating knowledge-driven business environments.

Ontologies as Assets

Whatever the name given to the matter (data, information, or knowledge), there isn’t much argument about its primacy for business competitiveness; insofar as enterprises are concerned, knowledge is recognized as a key asset, as valuable as financial ones if not more, and should be managed accordingly. Pushing the comparison further, data would be likened to liquidity, information to fixed-income investment, and knowledge to venture capital. To summarize, assets, whatever their nature, lose value when left asleep and bear fruit when kept awake; that’s doubly the case for data and information:

  • Digitized business flows accelerate data obsolescence and make it continuous.
  • Shifting and porous enterprise boundaries and market segments call for constant updates and adjustments of enterprise information models.

But assessing the business value of knowledge has always been a matter of intuition rather than accounting, even when it can be patented; and most knowledge takes shape well beyond regulatory reach. Nonetheless, knowledge is not manna from heaven but the outcome of information processing, so assessing the capabilities of such processes could help.

Admittedly, traditional modeling methods are too stringent for that purpose, and looser schemes are needed to accommodate the open range of business contexts and concerns; as already expounded, that’s precisely what ontologies are meant to do, e.g:

  • Systems modeling,  with a focus on integration, e.g Zachman Framework.
  • Classifications, with a focus on range, e.g Dewey Decimal System.
  • Conceptual models, with a focus on understanding, e.g legislation.
  • Knowledge management, with a focus on reasoning, e.g semantic web.

And ontologies can do more than bring the whole of enterprise knowledge representations under a single roof: they can also be used to nurture and crossbreed symbolic assets and to develop innovative ones.

Ontologies Benefits

Knowledge is best understood as information put to use; accounting rules may be disputed but there is no argument about the benefits of a canny combination of information, circumstances, and purpose. Nonetheless, assessing knowledge returns is hampered by the lack of traceability: if a part of knowledge is explicit and subject to symbolic representation, another is implicit and manifests itself only through actual behaviors. At the philosophical level it’s the line drawn by Wittgenstein: “The limits of my language mean the limits of my world”; at the technical level it’s AI’s two-lane approach: symbolic rule-based engines vs non-symbolic neural networks; at the corporate level implicit knowledge is seen as some unaccounted-for aspect of intangible assets, when not simply blended into corporate culture. With knowledge becoming a primary success factor, a more reasoned approach to its processing is clearly needed.

To begin with, symbolic knowledge can be plied by logic, which, quoting Wittgenstein again, “takes care of itself; all we have to do is to look and see how it does it.” That would be true on two conditions:

  • Domains are to be well circumscribed. 
  • A water-tight partition must be secured between the logic of representations and the semantics of domains.

That could be achieved with modular and specific ontologies built on a clear distinction between a common representation syntax and specific domain semantics.
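To make the distinction tangible, here is a minimal sketch (in Python, with invented names) of a common, truth-preserving representation syntax shared by modular domain ontologies, each carrying its own semantics:

```python
# Minimal sketch (illustrative names): a common representation layer (plain
# triples) kept apart from domain-specific semantics (controlled vocabularies).

from dataclasses import dataclass

@dataclass(frozen=True)
class Triple:
    subject: str
    predicate: str
    object: str

class DomainOntology:
    """Domain semantics: a controlled vocabulary plus the facts stated with it."""
    def __init__(self, name, predicates):
        self.name = name
        self.predicates = set(predicates)   # domain-specific semantics
        self.facts = []                     # domain-agnostic representation

    def assert_fact(self, subject, predicate, obj):
        if predicate not in self.predicates:
            raise ValueError(f"'{predicate}' is not defined in domain '{self.name}'")
        self.facts.append(Triple(subject, predicate, obj))

# Two modular, well-circumscribed domains sharing the same representation syntax:
hr = DomainOntology("HR", {"employs", "reportsTo"})
sales = DomainOntology("Sales", {"sells", "suppliedBy"})

hr.assert_fact("Acme", "employs", "J.Doe")
sales.assert_fact("Acme", "sells", "Widgets")
```

The representation layer knows nothing about employment or sales; each domain only contributes its vocabulary and its facts, which is the water-tight partition called for above.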

As for non-symbolic knowledge, its processing has long been overshadowed by the preeminence of symbolic rule-based schemes, that is until neural networks got the edge and deep learning overturned the playground. In a few years’ time, practically unlimited access to raw data and the exponential growth in computing power have opened the door to massive sources of unexplored knowledge, which is paradoxically both directly relevant yet devoid of immediate meaning:

  • Relevance: mined raw data is supposed to reflect the geology and dynamics of targeted markets.
  • Meaning: the main value of that knowledge rests on its implicit nature; applying existing semantics would add little to existing knowledge.

Assuming that deep learning can transmute raw base metals into knowledge gold, enterprises would need to understand, assess, and improve the refining machinery. That could be done with ontological frames.

A Proof of Concept

Compared to tangible assets, knowledge may appear very elusive; yet, and contrary to intangible ones, knowledge is best understood as the outcome of processes that can be properly designed, assessed, and improved. And that can be achieved with profiled ontologies.

As a Proof of Concept, an ontological kernel has been developed along two principles:

  • A clear-cut distinction between truth-preserving representation and domain specific semantics.
  • Profiled ontologies designed according to the nature of contents (concepts, documents, or artifacts), layers (environment, enterprise, systems, platforms), and contexts (institutional, professional, corporate, social).

That provides for a seamless integration of information processing, from data mining to knowledge management and decision making:

  • Data is first captured through aspects.
  • Categories are used to process data into information on one hand, design production systems on the other hand.
  • Concepts serve as bridges to knowledgeable information.
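As an illustration of that integration, here is a minimal sketch (invented names and thresholds) of the path from raw data to knowledgeable information:

```python
# Illustrative sketch of the layering described above (hypothetical names):
# aspects capture raw data, categories turn data into information, concepts
# bridge information to knowledge (information put to use).

raw_data = [{"sku": "A1", "qty": 12, "ts": "2019-10-01"},
            {"sku": "A1", "qty": 7,  "ts": "2019-10-02"},
            {"sku": "B2", "qty": 3,  "ts": "2019-10-02"}]

# 1. Data is first captured through aspects (which observable facets are retained).
aspects = ["sku", "qty"]
captured = [{a: record[a] for a in aspects} for record in raw_data]

# 2. Categories process data into information (here, aggregation per SKU).
inventory_info = {}
for record in captured:
    inventory_info[record["sku"]] = inventory_info.get(record["sku"], 0) + record["qty"]

# 3. Concepts bridge information to knowledge used in decision-making.
REORDER_THRESHOLD = 10   # assumed business concept
to_reorder = [sku for sku, qty in inventory_info.items() if qty < REORDER_THRESHOLD]
print(to_reorder)        # -> ['B2']
```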


A beta version is available for comments on the Stanford/Protégé portal with the link: Caminao Ontological Kernel (CaKe).

Further Reading

External Links

Focus: Business Analyst Booklet

Objective

Business analysts stand between unbounded and moving business landscapes on one hand, distinctive and steady enterprise organization and culture on the other hand.

How to align enterprise resources and business opportunities (Patrick Zachmann)

Assuming that BAs’ primary concern is to keep ahead of the competition, framing business undertakings into universal guidelines could be counterproductive. By contrast, harnessing together versatile business processes and reliable systems architectures will clearly enhance business agility; hence the benefits of lining up enterprise architects’ and business analysts’ conceptual toolboxes:

  1. Concepts: eight exclusive and unambiguous definitions provide the conceptual building blocks.
  2. Models: how the concepts are used to consolidate business requirements and convey them to enterprise architects and software engineers.
  3. Processes: how to harness organization and business objectives and align applications with business value.
  4. Architectures: how to contrive along time the continuity and consistency of business concepts and objectives, and their congruence with systems capabilities.
  5. Governance: assessment of business value and risks.

On that basis, the objective here is not to detail BAs’ tasks or methods but to focus on core issues to be addressed by business analysts.

Concepts

Whereas systems architecture is not their primary concern, business analysts should nonetheless share the same modeling paradigm:

  • Analysis models for business environments and objectives.
  • Design models for the architecture of systems and the specification of components.
Business objects and processes must be consistently identified (#) across business and system realms.

It is worth recalling that the distinction between descriptive (aka analysis) and prescriptive (aka design) models is not arbitrary but based on logic principles: the former are extensional as they classify actual instances of business objects and activities; in contrast, the latter are intensional as they define the features and behaviors of required system artifacts.

The distinction also brings organizational benefits as it tallies with BAs’ responsibility regarding the consistency and continuity of identities and semantics of actual objects and processes (business extensions) and their symbolic counterparts (system intensions):

Relevant categories at architecture level can be neatly and unambiguously defined, as sketched after the list below:
  • Actual containers represent address spaces or time frames; symbolic ones represent authorities governing symbolic representations. Systems are actual realizations of symbolic containers managing symbolic artifacts.
  • Actual objects (passive or active) have physical identities; symbolic objects have social identities; messages are symbolic objects identified within communications. Power-types (²) are used to partition objects.
  • Roles (aka actors) are parts played by active entities (people, devices, or other systems) in activities (BPM), or, if it’s the case, when interacting with systems (UML’s actors). Not to be confused with agents, which are meant to be identified independently of their behavior.
  • Events are changes in the state of business objects, processes, or expectations.
  • Activities are symbolic descriptions of operations and flows (data and control) independently of supporting systems; execution states (aka modes) are operational descriptions of activities with regard to processes’ control and execution. Power-types (²) are used to partition execution paths.
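
A minimal sketch of those categories as plain Python types (names and fields are illustrative, not a prescribed metamodel):

```python
# Sketch (assumed names) of the categories listed above; the point is the
# distinction between actual and symbolic individuals, not any implementation.

from dataclasses import dataclass

@dataclass
class Container:            # actual: address space / time frame; symbolic: authority
    name: str
    actual: bool

@dataclass
class BusinessObject:       # actual objects have physical identities,
    identity: str           # symbolic ones have social identities
    symbolic: bool

@dataclass
class Role:                 # part played by an active entity in an activity
    name: str

@dataclass
class Event:                # change in the state of an object, process, or expectation
    name: str
    changes: str

@dataclass
class Activity:             # symbolic description of operations and flows
    name: str
    flows: list

claim_handling = Activity("Handle Claim", flows=["claim data", "decision"])
```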


While business analysts should only be tasked with the continuous and consistent mapping of business individuals to their system surrogates, and not with their implementations, that cannot be achieved without a full and unambiguous specification of the variants and abstractions for the business objects and processes to be represented.

Languages & Models

Being in charge of requirements, business analysts can be seen as the gate-keepers of the whole engineering process. To begin with, and depending on the nature of domains, BAs can capture requirements using formal (e.g for scientific domains), specific, or natural languages. Then, requirements analysis can be carried out:

  • Iteratively in unison with development and in collaboration with software engineers (agile approach). In that case models are not necessary as requirements are expressed in natural language (users’ stories), possibly combined with domain specific languages (DSLs) for development.
  • As phased undertakings carried out independently, using a dedicated modeling language (e.g BPMN).
  • As phased undertakings carried out jointly with system analysts using a general purpose modeling language (e.g UML).
Three ways to deal with requirements analysis: business oriented and phased (BPMN), system oriented and phased (use cases), or business driven and iterative (users’ stories).

These schemes are therefore best understood as tools whose uses may overlap or be combined:

  • BPMN and UML activity diagrams have much in common.
  • Class diagrams can complement BPMN for business objects, and state diagrams for process control.
  • Use cases can be seen as describing the part of users’ stories to be supported by systems.

How BAs will employ them depends on business processes and projects’ objectives.

Business & Development Processes

The responsibility of BAs is about business processes, the choice of development model being left to project managers; hence the need for business analysts to be familiar with basic options:

  • Agile: business analysts collaborate with software engineers in project teams and share responsibilities from requirements to delivery.
  • Phased: roles and responsibilities are defined specifically with regard to development tasks.
With agile schemes BAs share roles and responsibilities all along, with phased ones roles and responsibilities are defined with regard to tasks.

Agile or phased, the contribution of business analysts can be defined around three core issues, corresponding to three typical modus operandi:

  • Concepts associated to business objects and activities that are to be represented. Assuming that conceptual models are meant to be stable and shared across processes, they should be under the responsibility of business analysts independently of applications.
  • Actors (users, devices, or systems) and activities. Insofar as the impact on organization and system functional features can be localized (user interfaces) or circumscribed (business rules), business analysts can collaborate and share responsibility with software engineers all along an iterative process. Otherwise (changes in organization or business functions) business analysts will have to consolidate their work with enterprise architects.
  • Processes execution. Often labelled as non-functional capabilities, these essentially deal with the different aspects of users’ experience and the synchronization of changes in business environments and supporting systems. For that purpose business analysts will have to check requirements against systems capabilities.
Business analysts core concerns and MO: conceptual model, activities, and processes.

While these issues are often interwoven, sorting them out can help to match development models with projects’ objectives and scope: agile for projects facing business users, phased for the ones dealing with architectures; that will also help to characterize the role of BAs depending on focus: business processes (BPM, use cases, users’ stories), functional architecture (services, conceptual models), or quality of service.

Business Analysis & Systems Architectures

When considering business opportunities, business analysts have to define requirements’ footprint with regard to system capabilities:

  • Confined: applications can be developed in collaboration with software engineers from users’ stories to code, without modeling. Assuming agile conditions about shared ownership and continuous delivery are met, that would be the default option.
  • Distributed: some modeling is needed for communication and consolidation purposes. But business process modeling languages like BPMN make no distinction between processes’ details and the shared features of supporting systems. That imposes a challenging burden on business analysts (complexity, ambiguity) with limited benefits (no easy mapping to system functions).

A primary concern for business analysts should therefore be to frame projects accordingly: self-contained and business driven on one hand, shared and architecture driven on the other hand, with use cases set in between if and when necessary. For that purpose shared concerns will have to be clearly identified; taking BPMN for example:

Separation of concerns: architecture backbone and processes’ details
  • Containers for physical (locations) and logical (organizations and domains) objects have no BPMN explicit equivalents.
  • Active objects have no BPMN explicit equivalent.
  • Swimlanes and pools tally with roles (aka actors).
  • Data stores tally with entities (persistent representation of business objects).
  • Tasks, transactions, and sub-processes can be translated as activity descriptions and process execution.

Given backbones shared with enterprise architects, the next step is to flesh them out with specific details. Depending on methods and tools, that can be done using a domain specific language (DSL) with direct implementation, or through a generic subset of BPMN that could be unambiguously mapped to design constructs, for instance:

  • Anchors (#): instances (objects or activities) directly and consistently identified across business and system realms.
  • Collections (*): sets of individuals with shared features.
  • Features: attributes or operations without identity of their own.
  • Structures (diamond): composition (black) for individual components (objects or activities) whose life-cycle is bound to their owner, i.e they have no identity of their own; aggregation (white) for components identified independently but used in the context of their owner.
  • Connectors: associate individuals; their semantics is set by context: communication channel, reference, data or control flow, transition. They can bear identification (#).
  • Power-types (²): define subsets of individual objects or activities. Depending on context and modeling language, power-types correspond to classifications, extension points, gateways, branches and joins, etc.
  • Inheritance (triangle): contrary to structure and functional connectors, which deal with instances, inheritance connectors are used to describe relationships between descriptors. Strong inheritance (black) is the counterpart of composition (inheritance of structural features), and weak inheritance (white) the counterpart of aggregation (inheritance of functional features).
Separation of concerns: architecture backbone and anchors details
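
A sketch of such a generic subset (illustrative Python stand-ins, not a standard metamodel):

```python
# Minimal sketch of the generic constructs listed above, enough to describe
# both objects and behaviors; names and notations are illustrative only.

from dataclasses import dataclass, field
from enum import Enum

class Structure(Enum):
    COMPOSITION = "black diamond"   # component bound to its owner's life-cycle
    AGGREGATION = "white diamond"   # component identified independently

class Inheritance(Enum):
    STRONG = "black triangle"       # structural features (counterpart of composition)
    WEAK = "white triangle"         # functional features (counterpart of aggregation)

@dataclass
class Anchor:                       # '#': instance identified across business and system
    name: str
    features: list = field(default_factory=list)   # attributes/operations, no identity

@dataclass
class Collection:                   # '*': set of individuals with shared features
    name: str
    members: list = field(default_factory=list)

@dataclass
class Connector:                    # association whose semantics is set by context
    source: str
    target: str
    semantics: str                  # e.g. reference, data flow, transition

@dataclass
class PowerType:                    # '²': partitions a set of individuals
    name: str
    subsets: list = field(default_factory=list)

claim = Anchor("Claim", features=["amount", "assess()"])
claims = Collection("Claims", members=[claim])
```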

Using the same set of well accepted and unambiguous logical constructs for both objects and behaviors can greatly enhance the consistency of analysis models as well as their traceability to designs.

Business Analysis & Knowledge Management

As noted above, while business analysts may have to consolidate functional requirements or check the feasibility of non functional ones with enterprise architects, they should take responsibility for conceptual models, and more generally for enterprise knowledge architecture. Taking a leaf from Davis, Shrobe, and Szolovits, that will cover:

  1. Surrogates: descriptions of symbolic counterparts of actual objects, events, and relationships.
  2. Ontological commitments: statements about the categories of things that may exist in the domain under consideration.
  3. Fragmentary theory of intelligent reasoning: model of what the things can do or can be done with.
  4. Medium for efficient computation: knowledge understandable by computers.
  5. Medium for human expression: communication between specific domain experts on one hand, generic knowledge managers on the other hand.

Setting aside user interfaces (point 5), two typical approaches can be considered:

  • Domain Driven Design (DDD), which deals with domains representation and computation from a system perspective (point 4).
  • Ontologies, which put the focus on knowledge oriented languages independently of computation (points 1-3).

Besides their simplex orientation, both fall short of business analysts’ needs, the former being too technical, the latter too open-ended. Instead, a conceptual framework should combine bounded domains with a compact and unambiguous knowledge-oriented language.

As it happens, mapping the symbolic footprint of business domains and knowledge into systems may be dictated by the generalization of networked environments and digital business flows. Along that reasoning, BAs will have to deal with knowledge from domain as well as process perspectives.

With regard to domains, a distinction should be maintained between institutional (external, statutory), business specific (external, agreed), and enterprise specific (internal).

A conceptual approach to domain layers: institutional, business specific (e.g HR management) and enterprise specific (e.g supply, sales).

With regard to processes, knowledge must be understood as the dynamic and multi-faceted outcome of data analytics, production systems, and decision-making. Taking a (revised) leaf from Zachman’s framework, business and operational objectives would be reset so as to cross architecture layers instead of being aligned with them. Using a pentagonal representation of enterprise architecture, Zachman’s sixth column (“Why”) would be rounded as an outer range.


Along that perspective embedding IT systems in business processes is to become a key success factor, which is to bring business intelligence up on the list of business analysts’ concerns.

Business Intelligence

If business intelligence is to take into account the ubiquity of digitized business processes and the integration of enterprises with their environments, a seamless integration of data analytics and decision-making is to be a primary concern for BAs.

Data analytics (sometimes known as data mining) is best understood as a refining activity whose purpose is to process raw data into meaningful information:

  • Data understanding gives form and semantics to raw material.
  • Business understanding charts business contexts and concerns in terms of objects and processes descriptions.
  • Modeling consolidates data and business understanding into descriptive, predictive, or operational models.
  • Evaluation assesses and improves accuracy and effectiveness with regard to objectives and decision-making.

Decision-making processes in open and digitized environments are best described with the well-established OODA (Observation, Orientation, Decision, Action) loop:

  1. Observation: understanding of changes in business environments (aka territories).
  2. Orientation: assessment of the reliability and shelf-life of pertaining information (aka maps) with regard to current positions and operations.
  3. Decision: weighting of options with regard to enterprise capabilities and broader objectives.
  4. Action: carrying out of decisions within the relevant time-frame.

The integration of data analytics and decision-making would be a key benefit of enterprise architecture.
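
As a rough illustration of that integration, here is a sketch (assumed functions and figures) in which each pass of the decision loop feeds on the analytics pipeline:

```python
# Sketch only: the four analytics steps above feed the OODA phases; names,
# figures, and the demand model are invented for illustration.

def data_understanding(raw):          # give form and semantics to raw material
    return [r for r in raw if r is not None]

def business_understanding(data):     # chart business context and concerns
    return {"observations": data, "concern": "inventory"}

def modeling(info):                   # descriptive/predictive model
    n = max(len(info["observations"]), 1)
    return {"expected_demand": sum(info["observations"]) / n}

def decide(model, target_stock):      # weight options against objectives
    return "hold" if model["expected_demand"] <= target_stock else "reorder"

def ooda_pass(raw, target_stock):
    data = data_understanding(raw)            # Observation
    info = business_understanding(data)       # Orientation (maps vs territories)
    model = modeling(info)
    decision = decide(model, target_stock)    # Decision
    return f"action: {decision}"              # Action within the relevant time-frame

print(ooda_pass([12, 7, None, 9], target_stock=10))
```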

Seamless integration of data analytics and decision-making.

On a broader perspective data analytics and decision-making can be seen as the front-offices of business intelligence, and  knowledge management as its back-office. That organization can be reinforced with ontologies set with regard to governance and stability of contexts:

  • Institutional: Regulatory authority, steady, changes subject to established procedures.
  • Professional: Agreed upon between parties, steady, changes subject to accords.
  • Corporate: Defined by enterprises, changes subject to internal decision-making.
  • Social: Defined by usage, volatile, continuous and informal changes.
  • Personal: Customary, defined by named individuals (e.g research paper).

Ontologies set along that taxonomy of contexts could also be refined so as to be aligned with enterprise architecture layers: enterprise, systems, platforms, e.g:

Ontologies, capabilities (Who, What, How, Where, When), and architectures (enterprise, systems, platforms).

Crossing business taxonomies with enterprise architecture capabilities could significantly improve the integration of business processes and supporting systems.
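
A sketch of such a crossing (domains and labels are invented for illustration):

```python
# Sketch of ontologies profiled by context and architecture layer, as suggested
# above; the portfolio entries are hypothetical examples.

from enum import Enum
from dataclasses import dataclass

class Context(Enum):
    INSTITUTIONAL = "regulatory authority, steady"
    PROFESSIONAL = "agreed between parties"
    CORPORATE = "defined by the enterprise"
    SOCIAL = "defined by usage, volatile"
    PERSONAL = "customary, named individuals"

class Layer(Enum):
    ENTERPRISE = "enterprise"
    SYSTEMS = "systems"
    PLATFORMS = "platforms"

@dataclass
class ProfiledOntology:
    domain: str
    context: Context
    layer: Layer

portfolio = [
    ProfiledOntology("Data privacy compliance", Context.INSTITUTIONAL, Layer.ENTERPRISE),
    ProfiledOntology("Payment services", Context.PROFESSIONAL, Layer.SYSTEMS),
    ProfiledOntology("Sales reporting", Context.CORPORATE, Layer.PLATFORMS),
]
```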

Governance: Metrics, Quality, & Risks

As gate-keepers, business analysts have to rank projects with regard to business value, risks, and return on investment. Assuming that business value is set independently of supporting systems, projects’ assessment and ranking should be set according to the nature of problems:

  • Intrinsic business size and complexity: requirements can be estimated from individuals (objects and activities), features, relationships, and partitions.
  • Supporting systems functionalities: intrinsic business metrics are to be combined with what is expected from supporting systems: processes and transactions, triggering events, users and devices interfaces, etc.
  • Business and functional measurements can then be weighted by non-functional (aka Quality of Service) requirements.
Assessment should be aligned with problems: business, supporting systems, operations.
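
A rough sizing sketch along those three levels; the weights are placeholders, not a calibrated metric:

```python
# Assumption-laden sketch: intrinsic business size, combined with expectations
# on supporting systems, then weighted by non-functional (QoS) requirements.

def business_size(individuals, features, relationships, partitions):
    return 4 * individuals + features + 2 * relationships + 3 * partitions

def functional_size(biz_size, transactions, events, interfaces):
    return biz_size + 5 * transactions + 2 * events + 3 * interfaces

def weighted_size(func_size, qos_factor):
    # qos_factor > 1.0 for demanding non-functional requirements
    return func_size * qos_factor

estimate = weighted_size(functional_size(business_size(12, 40, 18, 5), 6, 9, 4), 1.25)
print(round(estimate, 1))
```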

If returns on investment (ROI) and risks are to be assessed consistently and decision-making carried out accordingly, value, costs, quality, and hazards have to be set within the same framework, in particular for quality and risks management:

  • Business environment: risks are external and quality is to check for timely and relevant analysis models.
  • Engineering:  risks are internal and quality is to focus on processes maturity.
  • Technologies: risks are external and quality is to address versatility, plasticity, and effectiveness of solutions.

To conclude, whereas business risks remain the primary concern of business analysts, the fusion of business and systems processes means that they can no longer ignore engineering pitfalls and the importance of quality for risks management.

Further Reading

Focus: Business Cases for Use Cases

Preamble

As originally defined by Ivar Jacobson, use cases (UCs) are focused on the interactions between users and systems. The question is how to associate UC requirements, by nature local, concrete, and changing, with broader business objectives set along different time-frames.

Cases, Kites, and Clouds (Sigmar Polke)

Backing Use Cases

On the system side UCs can be neatly traced through the other UML diagrams for classes, activities, sequence, and states. The task is more challenging on the business side due to the diversity of concerns to be defined with other languages like Business Process Modeling Notation (BPMN).

Use cases at the hub of UML diagrams
Use Cases contexts

Broadly speaking, tracing use cases to their business environments has been undertaken with two approaches:

  • Differentiated use cases, as epitomized by Alistair Cockburn’s seminal book (Readings).
  • Business use cases, to be introduced beside standard (often renamed as “system”) use cases.

As it appears, whereas Cockburn stays with UCs as defined by Jacobson but refines them to deal specifically with generalization, scaling, and extension, the second approach introduces a somewhat ill-defined concept without setting apart the different concerns.

Differentiated Use Cases

Being neatly defined by purposes (aka goals), Cockburn’s levels provide a good starting point:

  • Users: sea level (blue).
  • Summary: sky, cloud and kite (white).
  • Functions: underwater, fish and clam (indigo).

As such they can be associated with specific concerns:

Cockburn’s differentiated use cases
  • Blue level UCs are concrete; that’s where interactions are identified with regard to actual agents, place, and time.
  • White level UCs are abstract and cannot be instantiated; cloud ones are shared across business processes, kite ones are specific.
  • Indigo level UCs are concrete but not necessarily the primary source of instantiation; fish ones may or may not be associated with business functions supported by systems (grey), e.g services; clam ones are supposed to be directly implemented by system operations.

As illustrated by the example below, use cases set at enterprise or business unit level can also be concrete:

Example with actors for users and legacy systems (bold arrows for primary interactions)

UC abstraction connectors can then be used to define higher business objectives.

Business “Use” Cases

Compared to Cockburn’s efficient (no new concept) and clear (qualitative distinctions) scheme, the business use case alternative adds to the complexity with a fuzzy new concept based on quantitative distinctions like abstraction levels (lower for use cases, higher for business use cases) or granularity (respectively fine- and coarse-grained).

At first sight, using scales instead of concepts may allow a seamless modeling with the same notations and tools; but arguing for unified modeling goes against the introduction of a new concept. More critically, that seamless approach seems to overlook the semantic gap between business and system modeling languages. Instead of three-lane blacktops set along differentiated use cases, the alignment of business and system concerns is meant to be achieved through a medley of stereotypes, templates, and profiles supporting the transformation of BPMN models into UML ones.

But as far as business use cases are concerned, transformation schemes would come with serious drawbacks because the objective would not be to generate use cases from their business parent but to dynamically maintain and align business and users concerns. That brings back the question of the purpose of business use cases:

  • Are BUCs targeting business logic? That would be redundant because mapping business rules with applications can already be achieved through UML or BPMN diagrams.
  • Are BUCs targeting business objectives? But without a conceptual definition of “high levels”, BUCs are to remain nondescript practices. As for the “lower levels” of business objectives, users’ stories already offer a better defined and accepted solution.

If that makes the concept of BUC irrelevant as well as confusing, the underlying issue of anchoring UCs to broader business objectives still remains.

Conclusion: Business Case for Use Cases

With the purposes clearly identified, the debate about BUC appears as a diversion: the key issue is to set apart stable long-term business objectives from short-term opportunistic users’ stories or use cases. So, instead of blurring the semantics of interactions by adding a business qualifier to the concept of use case, “business cases” would be better documented with the standard UC constructs for abstraction. Taking Cockburn’s example:

Abstract use cases: no actor (19), no trigger (20), no execution (21)

Different levels of abstraction can be combined, e.g:

  • Business rules at enterprise level: “Handle Claim” (19) is focused on claims independently of actual use cases.
  • Interactions at process level: “Handle Claim” (21) is focused on interactions with Customer independently of claims’ details.

Broader enterprise and business considerations can then be documented depending on scope.

Further Reading

External Links

Business Agility & the OODA Loop

Preamble

The OODA (Observation, Orientation, Decision, Action) loop is a real-time decision-making paradigm developed in the sixties by Colonel John Boyd from his experience as fighter pilot and military strategist.

How to get inside the opponent’s loop (László Moholy-Nagy)

The relevance of OODA for today’s operational decision-making comes from the seamless integration of IT systems with business operations and the resulting merits of agile development processes.

Business: End of Discrete Time-Frames

Business governance used to be phased: analyze the market, select opportunities, build capabilities, launch operations. No more. With the melting of the fences between actual and symbolic realms, periodic transitional events have lost most of their relevancy. Deprived of discrete and robust time-frames, the weaving of observed facts with business plans has to be managed on the fly. Success now comes from continuous readiness, quicker tempo, and the ability to operate inside adversaries’ time-scales, for defense (force competitors out of favorable positions) as well as offense (get a competitive edge). Hence the reference to dogfights.

Dogfights & Agile Primacy

John Boyd’s train of thought started with the observation that, despite the apparent superiority of the Soviet MiG-15 over the US F-86 during the Korean War, US fighters stood their ground. From that factual observation it took Boyd’s comprehensive engineering work to demonstrate that, as far as dogfights were concerned, fast transients between maneuvers (aka agility) mattered more than technical capabilities. Pushed up the Pentagon’s reluctant ladders by Boyd’s sturdy determination, that conclusion has had wide-ranging consequences for the design of USAF fighters and the training of pilots for the following generations. Its influence also spread to management, even if theories’ turnover is much faster there, and shelf-life much shorter.

Nowadays, with the accelerated integration of business processes with IT systems, agility is making a comeback from the software engineering corner. Reflecting business and IT convergence, principles like iterative development, just-in-time delivery, and lean processes, all epitomized by the agile software development model, are progressively mingling into business practices with strong resemblances to dogfights; and the resemblances are not only symbolic.

IT Systems & Business Competition

While some similarities between dogfights and business competition may seem metaphorical, one critical aspect is all too real, namely the increasing importance of supporting machines, IT systems or fighter jets.

Basically, IT systems, like fighters’ electronics, are tasked to observe environments, analyse changes in relation to position and objectives, and support decision-making. But today’s systems go further with two qualitative leaps:

  • The seamless integration of physical and symbolic flows lets systems manage some overlap between supporting decisions and carrying out actions.
  • Due to their artificial intelligence capabilities, systems can learn on the job and improve their performance in real-time feedback loops.

When combined, these two trends have drastic impact on the way machines can support human activities in real-time competitive situations. More to the point, they bring new light on business agility.

Business Agility

As illustrated by the radical transformation of fighter cockpits, the merging of analog and digital flows leaves little room for human mediation: data must be processed into information and presented instantly along two critical dimensions, one for decision-making, the other for information life-cycle:

  • Man/machine interfaces have to materialize the merging of actual and symbolic realms so as to support just-in-time decision-making.
  • The replacement of phased selected updates of environment data by continuous changes in raw and massive data means that the status of information has to be incorporated with the information itself, yet without impairing decision-making.

Beyond obvious differences between dogfights and business competition, that twofold exigency is what characterizes business agility:

  1. Observation: understanding the nature, origin, and time-frame of changes in business environments (aka territories).
  2. Orientation: assessment of the reliability and shelf-life of pertaining information (aka maps) with regard to stakes and current positions and operations.
  3. Decision: weighting of options with regard to enterprise stakes and capabilities.
  4. Action: carrying out of decisions according to stakes and time-frames.

That understanding of business agility is to be compared with its development and architecture cousins. Yet it doesn’t seem to add much to data analytics and operational decision-making. That is until the concepts of observation and orientation are reassessed with regard to EA maps and territories.

Using OODA blueprint to integrate business intelligence and operational decision-making into enterprise architecture.

Such integration is to become a primary success factor for enterprises immersed in digital environments.

Agility & Orientation: Task vs Tack

To begin with basics, the concept of Orientation comes with a twofold meaning, actual and symbolic:

  • Actual: a position with regard to external (e.g spatial) coordinates, possibly qualified with abilities to observe, move, or act.
  • Symbolic: a position with regard to internal (e.g beliefs or aims) references, possibly mixed with known or presumed orientation of other agents, opponents or associates.

When business is considered, data analytics is supposed to deal comprehensively and accurately with markets’ actual orientations. But the symbolic facet is left largely unexplored.

Boyd’s contribution is to bring together both aspects and combine them into actual practice, namely how to foretell the tack of your opponents from their actual tracks as well as their surmised plans, while fooling them about your own moves, actual or planned.

Such ambitions, once out of reach, can now be fulfilled due to the combination of big data, artificial intelligence, and the exponential growth in computing power.

Further Readings

Business Problems shouldn’t sleep with IT Solutions

Preamble

The often mentioned distinction between problem and solution levels may make sense from an analyst’s particular point of view, whether business or system. But blending problems and solutions independently of their nature becomes a serious oversimplification for enterprise architects, considering that one of their prime responsibilities is to keep business problems apart from IT solutions.

Functional problem with technical solution (Mircea Cantor)

That issue is relevant from an engineering as well as a business perspective.

Engineering View: Problem Levels & Architecture Layers

As long as computers are used to solve problems the only concern is to find the best solution, and the only architecture of concern is software’s.

But enterprise architects have to deal with systems, not computers, namely how to best serve business objectives with corporate resources, across business units and along business cycles. For that purpose resources (financial, human, technical) and their use are to be layered according to the nature of problems and solutions: business processes (enterprise), supporting functionalities (systems), and technologies (platforms).

From an engineering perspective, the intended congruence between problems levels and architecture layers can be illustrated with the OMG’s model driven architecture (MDA) framework:

  • Computation independent models (CIMs) deal with business process solutions, to be translated into functional problems for supporting systems.
  • Platform independent models (PIMs) deal with functional solutions, to be translated into technical problems for supporting platforms.
  • Platform specific models (PSMs) deal with technical solutions, to be implemented as code.

MDA layers can be mapped to a clear hierarchy of problems and solutions
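
The layering can be sketched as follows (schematic names, MDA used only as an illustration):

```python
# Sketch of the problem/solution layering described above; each layer's
# solution sets the next layer's problem, and keeping the pairs in their
# respective swim-lanes is the enterprise architect's responsibility.

from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    problems: str
    solutions: str

mda = [
    Layer("CIM (enterprise)", "business problems", "business processes"),
    Layer("PIM (systems)", "functional problems", "system functionalities"),
    Layer("PSM (platforms)", "technical problems", "platform implementations"),
]

for upper, lower in zip(mda, mda[1:]):
    print(f"{upper.solutions} -> {lower.problems}")
```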

Along that understanding, architectures can be seen as solutions, and the primary responsibility of enterprise architects is to see that problem/solution pairs remain in their respective swim-lanes.

Business View: Business Value & Enterprise Assets

Whereas the engineering perspective may appear technical or specific to a model based approach, the same issue is all the more significant when expressed with regard to business concerns and corporate governance. In that case the critical distinction is between business value and assets:

  • Business value: Problems are set by business opportunities, and solutions by processes and applications. The critical factor is reactivity and time-to-market.
  • Assets: Problems are set by business objectives and strategy, and solutions are to be supported by organization and systems capabilities. The critical factor is reuse and ROI.

Decision-making must distinguish between business opportunities and enterprise governance

If opportunities are to be seized and operations managed on the fly, yet still tally with strategic decisions, the respective problems and solutions should be kept apart. Juggling with their dynamic alignment is at the core of enterprise architects’ job description.

Enterprise Architects & Governance

Engineering and business perspectives are not to be seen as the terms of an alternative to be picked by enterprise architects. As a matter of fact they must be crossed and governance policies selected depending on the point of view:

  • Looking at EA from an engineering perspective, the business side will focus on systems governance and assets management, as epitomized by model based systems engineering schemes.
  • Looking at EA from a business perspective, the engineering side will focus on lean and just-in-time solutions, as epitomized by agile development models.

Insofar as the governance of large and complex corporate entities, supposedly EA’s primary target, must deal with tactical, operational, and strategic concerns, the nexus between business and engineering perspectives is where enterprise architects are to stand.

 

 

Focus: Business Processes & Abstraction

Preamble

Abstraction, and its corollary inheritance, is primarily understood in terms of objects. Yet, since business processes are meant to focus on activities, semantics may have to be refined when abstraction and inheritance are directly applied to behaviors.

How to apply abstraction to processes? (E. Gimenez Velilla)

Considering that the primary purpose of abstractions is to tackle business variants with regard to supporting systems, their representation with use cases provides a good starting point.

Business Variants: Use case’s <extend> & <include>

Taking use cases as a modeling nexus between business and systems realms, <extend> and <include> appear as the default candidates for the initial description of behaviors’ specialization and generalization.

  • <include>: to be compared to composition semantics, with the included behaviors performed by instances identified (#) by the owner UC (a).
  • <extend>: to be compared to aggregation semantics, with the extending behaviors performed by separate instances with reference to the owner ones (b).

Included UCs are meant to be triggered by owners (a); that cannot be clearly established for abstract use cases and generalization (c).

Abstract use cases and generalization have also been mentioned by UML before being curiously overlooked in following versions. Since none has been explicitly discarded, some confusion remains about hypothetical semantics. Notionally, abstract UCs would represent behaviors never to be performed on their own (c). Compared to inclusion, used for variants of operations along execution paths, abstract use cases would describe the generic mechanisms to be applied to triggering events at UC inception independently of actual business operations carried out along execution paths.

Nonetheless, and more importantly, the mix-up surrounding the generalization of use cases points to a critical fault-line running under UML concepts: since both use cases and classes are defined as qualifiers, they are supposed to be similarly subject to generalization and specialization. That is misguided because use cases describe the business behaviors to be supported by systems, not to be confused with the software components that will do the job. The mapping between the former and the latter is to be set by design, and there is no reason to assume a full and direct correspondence between functional requirements and functional architecture.

Use Cases Distilled

As far as use cases are concerned, mapping business behaviors to supporting systems functionalities can be carried out at two levels:

  • Objects: UCs being identified by triggering agents, events, and goals, they are to be matched with corresponding user interfaces and controllers, the former for the description of I/O flows, the latter for the continuity and integrity of interactions.
  • Methods: As it’s safe to assume that use cases are underpinned by shared business functions and system features, a significant part of their operations are to be realized by methods of shared business entities or services.

Setting apart UIs and controllers, no direct mapping should be assumed between use cases and functional qualifiers.

The business variants distilled into objects’ or services’ methods can be generalized and specialized according to OOD principles; and the same principles can be applied to specific users’ interfaces. But since purely behavioral aspects of UCs can neither be distilled into objects’ methods, nor directly translated into controller objects, their abstraction semantics have to be reconsidered.

Inheritance Semantics: Structural vs Functional

As far as software artifacts are concerned, abstraction semantics are set by programming languages, and while they may differ, the object-oriented (OO) paradigm provides some good enough consolidation. Along that perspective, inheritance emerges as a critical issue due to its direct impact on the validity of programs.

Generally speaking, inheritance describes how structural or behavioral traits are passed from ancestors to descendants, either at individual or type level. OO design is more specific and puts the focus on the intrinsic features (attributes and operations) supported by types or classes, which means that behaviors are not considered as such but through the objects’ methods that realize them:

  • Structural inheritance deals with attributes and operations set for the whole life-cycle of instances. As a consequence the corresponding inheritance is bound to identities (#) and multiple ascendants (i.e identities) are ruled out.
  • Functional inheritance deals with objects’ behaviors, which may or may not be frozen for whole life-cycles. Features can therefore be inherited from multiple ascendants, as sketched below.
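
A sketch of that distinction using Python stand-ins (not UML semantics): a single structural ascendant bound to identity, several functional ones mixed in:

```python
# Illustrative sketch: structural inheritance carries identity and structural
# features for the whole life-cycle (single ascendant); functional inheritance
# mixes in behaviors without identity of their own (multiple ascendants).

class BusinessEntity:                 # structural ascendant
    def __init__(self, identity):
        self.identity = identity      # '#': identity set for the whole life-cycle

class Printable:                      # functional trait, no identity of its own
    def print_summary(self):
        print(f"{type(self).__name__} {self.identity}")

class Auditable:                      # another functional trait
    def audit(self):
        return f"audited:{self.identity}"

class Claim(BusinessEntity, Printable, Auditable):   # one structural ascendant,
    pass                                             # several functional ones

Claim("C-123").print_summary()
```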

That structural vs functional distinction matches the one between composition and aggregation used to characterize the links between objects and parts, which, as noted above, can also be applied to use cases.

Use Cases & Abstraction

Assuming that the structural/functional distinction defined for objects can also be applied to behaviors, use cases provide a modeling path from variants in business processes to OOD of controllers:

  • Behaviors included by UCs (a) are to be set along the execution paths triggered by UC primary events (#). Inheritance is structural, from UCs base controllers to corresponding (local) ones, and covers features (e.g views on business objects) and associated states (e.g authorizations) defined by use case triggering circumstances.
  • Behaviors extending UCs (b) are triggered by secondary events generated along execution paths. Inheritance is functional, from extending UCs (e.g text messaging) to UCs primary controllers.

Yet this dual scheme may not be fully satisfactory as it suffers from two limitations:

  • It only considers the relationships between UCs, not the characteristics of the use cases themselves.
  • It ignores the critical difference between the variants of business logic and the variants of triggering conditions.

Both flaws can be patched up if abstract use cases are specifically introduced to factor out triggering circumstances (c):

Use cases provide a principled modeling path from variants in business processes to the OOD of corresponding controllers.

  • Leaving triggering circumstances undefined is the only way to characterize abstraction independently of what happens along execution paths.
  • Abstract use cases can then be used to specify inception mechanisms to be inherited by concrete use cases.

That understanding of abstract use cases comes with clear benefits with regard to security and confidentiality.

What is at Stake

Abstraction can significantly reinforce the bridging role of use cases between business and UML models.

On one side, specialized use cases can be associated with operations and functions directly implemented, e.g by factoring out authentication and authorization:

One standard solution is to define a common use case controlling accesses for all users, provided they can be identified before being subsequently (i.e during UC execution) qualified and authorized. Apparently, that could be done with <<include>> (a) or <<extend>> (b) connectors.


But the second option would not be possible with the semantic distinction suggested above for UC patterns, which specifies that use cases can only be extended from existing sessions.

A more generic approach (possibly with patterns) could try to “abstract” Open Session UC, e.g to cover a broader range of actors and identification mechanisms.

Understanding UC abstraction in terms of a partial specification to be <<included>> and run by the current thread would be inconsistent because there would be no concrete actor for the identification mechanisms (c).

By contrast, since inheritance connectors apply to types and not to instances (i.e execution threads), abstracted identification mechanisms are meant to be part of Manage Session and can be applied to triggering actors (d).

Such a clear distinction between the specification of threads (using connectors) and activities (using inheritance) should provide the basis of architecture-based UC patterns.
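
A sketch of that distinction with hypothetical class names: the abstract session controller factors out identification, which is inherited by concrete use case controllers rather than run as a separate thread:

```python
# Sketch only (invented names): an abstract UC factors out inception
# mechanisms; concrete UCs inherit them instead of delegating to a
# separate execution thread.

from abc import ABC, abstractmethod

class ManageSession(ABC):
    """Abstract UC: inception mechanisms, never executed on its own."""
    def open_session(self, credentials):
        user = self.identify(credentials)       # inherited mechanism
        return {"user": user, "authorized": True}

    @abstractmethod
    def identify(self, credentials):
        ...

class HandleClaim(ManageSession):
    """Concrete UC: triggered by an actual actor, inherits the inception."""
    def identify(self, credentials):
        return credentials.get("login", "anonymous")

    def run(self, credentials):
        session = self.open_session(credentials)
        return f"claim handled for {session['user']}"

print(HandleClaim().run({"login": "j.doe"}))
```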

All in all, that will greatly help to align business cases, business opportunities, and functional architectures.

Further Reading

 

Business Stories: Stakeholders’ Plots & Users’ Narratives

Preamble

As Aristotle noted some time ago, plots are the backbone of any story as they uphold the causal sequence of events and actions: they provide the “why” of what happens, compared to narratives, which tell “how” what happened is being told.

Only shadows will tell: as far as stories are concerned, possibilities remain unknown until their realization.

So, in principle, plots deal with possibilities and narratives with realizations. But in fact plots remain unknown until being narrated; in other words fictions are like Schrödinger’s cat: there is no way to set possibilities and realizations apart.

That literary conundrum may convey some useful clues for business analysis, with stakeholders’ objectives seen as plots, and users’ stories as narratives.

Stakeholders’ Plots vs Users’ Narratives

With regard to the functionalities of supporting systems, a key issue for business analysts is to accommodate specific and/or short-term opportunities identified by business units with broader and long-standing objectives defined at corporate level.

Using the fictional metaphor, business expectations can be charted in terms of plots and narratives:

  • Business objectives (as plots) are meant to apply continuously and consistently to different agents, different concerns, and different contexts. As such they are best defined as rules and constraints (declarative schemes).
  • Users’ stories (as narratives) are supposed to translate as soon as possible into business transactions. As such they are best defined as sequences of operations governed by users’ choices (procedural schemes).

Then, just like narratives are meant to carry out the plots, users’ stories are supposed to follow the paths set by business objectives. But if confusion is to be avoided between strategic orientations, regulatory directives, and opportunist moves, the walk of business objectives and the talk of users’ stories should be termed differently.

Business Objectives (Plots): Symbolic & Allochronic

The definition of business objectives has to find its terms between the Charybdis of abstractions and the Scylla of specific business processes, the former to be avoided because they are by nature detached from reality and only make sense with regard to models, the latter because they would be too specific and restrictive. In-between, business objectives would be best defined through:

  • Strategic and financial objectives expressed using symbolic categories applied to environments, products, and resources.
  • Modal time-frames identified in reference to events and qualified by assumptions with regard to symbolic categories.
  • Business functions to be optimized given a set of constraints.

These could be comprehensively and consistently expressed with declarative languages.

Users’ Stories (Narratives): Actual & Contemporaneous

Users’ stories are at their best when tied to specific circumstances and purposes without being led away by modeling concerns. As narratives they should stick to agents, triggering events, and scripted sequences of options, operations, and outcomes:

  • Compared to the symbolic categories used for business objectives, users’ stories should refer to actual subsets of objects and events defined on contexts.
  • Contrary to the modal time-frames of business objectives, the scripts of users’ stories must be fully timed with regard to their triggering events.

That can only be expressed as procedures.
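
The contrast can be sketched as follows (rule and script are invented for illustration): the business objective is a declarative constraint, the user's story a procedural script that follows the path it sets.

```python
# Sketch only: a business objective (plot) as a declarative rule over symbolic
# categories, a user's story (narrative) as a procedural script tied to a
# triggering event; figures and names are hypothetical.

# Business objective (plot): declarative constraint.
def premium_rule(customer):
    return customer["orders_last_year"] >= 10 and customer["defaults"] == 0

# User's story (narrative): scripted sequence of options, operations, outcomes.
def place_order(customer, cart):
    if not cart:
        return "nothing to order"
    discount = 0.1 if premium_rule(customer) else 0.0
    total = sum(cart.values()) * (1 - discount)
    return f"order confirmed, total {total:.2f}"

print(place_order({"orders_last_year": 12, "defaults": 0}, {"widget": 40.0}))
```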

From Fiction to Artifacts: Aligning Business Objectives & Enterprise Architectures

Likening business analysis to its distant literary kin goes beyond the metaphor as it points to a practical organization of business objectives and users’ stories.

And the benefits of the distinction between declarative (for business plots) and procedural (for users’ narratives) blueprints are not limited to business analysis but can be extended to systems architecture (as plots) and software design (as narratives). On that basis declarative schemes could be applied to business functions and architecture capabilities, and procedural ones to users’ stories (or use cases) and software design.


On a broader perspective that approach can be used to frame enterprise architectures and business objectives.

Further Reading

External Links

Agile Business Analysis: From Wonders to Logic

Time and again new recruits will ask about the role of business analysts. Considering that such a question is seldom heard from software engineers, are BAs more curious about their job, or are they standing on more tentative grounds? If that’s the case, agility would help them flip-flop between business quicksands and systems’ hard rocks.

How to make sense of business wonders (Hieronymus Bosch)

Holding the fort vs scouting outskirts

Systems architects and software engineers may have to meet esoteric business requirements, but their responsibility is first and foremost to guarantee the functional and economic sustainability of systems. On that account they are given licence to build solid walls and secure gateways, and to enforce their own languages and rules upon well vetted parties.

Business analysts don’t get such a free hand: while being straitened by software engineers’ constructs and constraints, their primary undertaking is to explore business wilds, reconnoitre competitors, trace new tracks, and learn the dialects of any nicknamed natives ready to trade.

No wonder the qualms of new business analysts.

Great businesses make their own rules

The best rules in business are the ones still unbeknownst, as success is most often brought by disruptive initiatives taking advantage of previously undiscovered opportunities. It ensues that at its core, BAs’ job description is to relentlessly look across the frontier for still uncharted businesses, and bring them back to the digitized world of shipshape business domains and processes.

For that purpose BAs will have to juggle with the fuzzy idiosyncrasies of new business openings until they can be aligned with the functionalities of “legacy” systems.

BA’s Agility

While usually presented as a software engineering hallmark, agility may be equally useful for business analysts as they have to balance two crossing perspectives:

  • Analysis: sorting detailed activities into business processes.
  • Synthesis: factoring out business functions and mapping them to systems capabilities.

That could be a challenging achievement if carried out sequentially: crossing back and forth between changing scope and steady capabilities could generate unsettling alternatives and unbounded complexity.

The agile development model is meant to tackle the difficulties through iterations and collaboration without being too specific about the kind of agility required from business analysts and software engineers.

Yet the apparent symmetry between the parties may be misleading: whereas software engineers don’t have (and shouldn’t even try) to second guess business analysts, business analysts shouldn’t forget that at the end of the day business expectations, however exotic or esoteric, will have to feed very conformist logical beasts.

Further Readings

Operational Intelligence & Decision Making

Preamble

According to a leading tools provider operational intelligence (OI) is the ability to “discover and analyze relationships between business events and corresponding IT events”.

Operational Decision-making (Sigmar Polke)

From a marketing perspective, the moniker suggests some kind of cross-breeding between operational research, artificial intelligence, and real-time analytics. Yet, behind vendor dressing, problems and policies remain the ones traditionally dealt with by decision-making and knowledge management, and as far as marketing is concerned, pitches will hardly affect the assessment of field professionals.

Nevertheless, functional pitches may have a deeper influence if they try to outline the aims of operational intelligence to the people directly involved, affecting the way problems are understood and dealt with. That may be the case if business and system events are deemed to be on a par: overlooking the directed dependency between actual events and their systems counterparts can critically hamper the very capabilities of systems decision-making.

Facts, Data, & Information

The new connected world of human brains and smart things has scaled down space and time by orders of magnitude, up to the point that events seem to come out as soon as they happen, wherever that may be. Facts and updates, which once came in as discrete and manageable batches of information, are now bursting continuously and massively as seamless streams of data that have to be processed on-the-fly into information lest they be cannibalized by ambient noise. That new configuration blurs the distinction between operational data (pushed, shallow, transient) and underlying information (pulled, deep, persistent), making it unworkable, if not meaningless altogether.

Taking inventory decisions as an example, traditional schemes rely on periodic readings of actual inventories and sales crossed with market foresight. Now, with on-line sales and the internet of things, real-time data can be used to build on-the-fly indicators whose biases and inaccuracies would be dynamically readjusted on the basis of information built on hindsight. At any given time (t), decision-makers will be presented with actual observations (a), initial estimations of previous observations (b1, b2), and revised estimations of previous observations (c).

At any given time (t), decision-makers are presented with actual observations (a), initial estimations of previous observations (b1, b2), and revised estimations of previous observations (c).
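
By way of illustration, such a scheme could be sketched as a rolling indicator whose bias is readjusted whenever hindsight becomes available; the Python snippet below is a minimal sketch with hypothetical names and figures, not a prescription:

  # Minimal sketch (hypothetical names): provisional indicators built on the fly from
  # raw readings (a), later revised (c) when consolidated information becomes available.
  from dataclasses import dataclass, field

  @dataclass
  class RollingIndicator:
      bias: float = 0.0                            # learned gap between raw and revised figures
      history: dict = field(default_factory=dict)  # time -> [initial estimate, revised estimate]

      def initial_estimate(self, t, raw_reading):
          """Initial estimate (b) from a raw reading (a), corrected by the current bias."""
          estimate = raw_reading - self.bias
          self.history[t] = [estimate, None]
          return estimate

      def revise(self, t, consolidated_value, smoothing=0.3):
          """Revised estimate (c) built on hindsight; readjust the bias accordingly."""
          initial = self.history[t][0]
          self.history[t][1] = consolidated_value
          self.bias += smoothing * (initial - consolidated_value)
          return consolidated_value

  indicator = RollingIndicator()
  indicator.initial_estimate(1, raw_reading=120.0)   # first pass, uncorrected
  indicator.revise(1, consolidated_value=110.0)      # hindsight shifts the bias
  indicator.initial_estimate(2, raw_reading=118.0)   # second pass, bias-adjusted (115.0)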

Set along this framework, the debate about big data can be misleading as it puts the focus on the quantity of data feeding the processes, overlooking the process itself and the distinction between data, information, and knowledge.

Information, Knowledge, & Decision-making

Generally speaking, the distinction between data and information can be set with reference to time and context, data being instant and standalone, and information associated with a shelf life and domain. With regard to decision-making, it would mean that data can be directly used within the context of the current activities and circumstances; e.g. whereas on-line sales data may (or may not) be directly (i.e despite inaccuracies and biases) used to allocate inventories across depots, it has to be “mined” into consolidated information before being used in the broader perspective of inventories planning.

Compared to the transition between data and information, which is carried out by adding time and context, the one between information and knowledge is best understood in terms of decision-making.

Information is obtained by anchoring data to time-frames and contexts, knowledge is acquired by putting information to use.

Decisions are best defined as commitments set against some unknown circumstances: somebody, somewhere, or sometime. First, it ensues that decision-making calls for specific and timed information that has to be maintained up-to-date until decisions are taken. Then, taking decisions introduces some irreversible change in the state of affairs or expectations, potentially making all relevant information obsolete. So it may be argued that decisions are what transform information into knowledge.

Operational Intelligence: Objectives & Tools

Assuming decisions mark the nexus between information and knowledge, operational intelligence could be defined as the ability to put information to use, that ability being supported by the analysis of the relationships between business events and corresponding IT events.

Far from being academic, that distinction is essentially pragmatic as it marks the boundary between OI objectives and tools capabilities:

  • The aim of OI is to make sense (and profit) from the dynamic relationship between business (aka external) events on one hand, business objectives and enterprise capabilities on the other hand.
  • The role of supporting tools is to define and manage IT (aka internal) events used to reflect external ones and analyze them.

Whereas business events (red) represent change in the state of affairs, IT events (blue) only represent changes in associated information.

Since IT events are artifacts built on purpose there isn’t much to discover or analyze about them; not to mention that confusing business events with their IT shadows is bound to undermine the whole decision-making process. So what is at stake for OI is how to design IT events so as to timely and accurately trail the relevant business events.
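
That directed dependency could be made explicit in the design itself; the Python sketch below (hypothetical names) simply records business events through IT events and keeps the lag between occurrence and recording measurable:

  # Sketch of the directed dependency (hypothetical names): IT events are designed
  # to trail business events, never the other way around.
  from dataclasses import dataclass
  from datetime import datetime, timedelta

  @dataclass(frozen=True)
  class BusinessEvent:                  # change in the actual state of affairs
      what: str
      occurred_at: datetime

  @dataclass(frozen=True)
  class ITEvent:                        # change in the associated information
      source: BusinessEvent
      recorded_at: datetime

      @property
      def lag(self) -> timedelta:       # design target: keep this small and known
          return self.recorded_at - self.source.occurred_at

  sale = BusinessEvent("item shipped from depot", datetime(2024, 5, 2, 10, 0))
  record = ITEvent(source=sale, recorded_at=datetime(2024, 5, 2, 10, 4))
  assert record.lag == timedelta(minutes=4)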

Operational Intelligence & Actual Knowledge

As already noted, operational intelligence (OI) is about decision-making, which entails changing the state of objects, processes, or expectations. Compared to knowledge management (KM) which may or may not be time-related, OI is inherently bound to the actual state of affairs: on one hand it relies on specific and timed information, on the other hand it renders that information obsolete when it triggers decisions.

At the risk of oversimplification, operational intelligence can first be understood as a combination of traditional disciplines (a sketch follows the list):

  • Data mining filters facts and events, captures data, and analyzes it into information.
  • Knowledge management charts information with regard to business objectives and enterprise capabilities.
  • Decision-making manages time-stamps and plans commitments subject to accuracy and likelihood.
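
A bare-bones sketch (Python, hypothetical names and thresholds) of how the three functions could be chained; an illustration, not a reference implementation:

  # Bare-bones chaining of the three functions (hypothetical names and thresholds).
  def mine(raw_events):
      """Data mining: filter facts and events and turn raw data into information items."""
      return [event for event in raw_events if event.get("relevant")]

  def chart(information, objectives):
      """Knowledge management: weight information against business objectives."""
      return [(item, objectives.get(item["topic"], 0.0)) for item in information]

  def decide(charted, threshold=0.5):
      """Decision-making: commit only where the weight warrants it."""
      return [item for item, weight in charted if weight >= threshold]

  raw = [{"topic": "inventory", "relevant": True}, {"topic": "noise", "relevant": False}]
  decisions = decide(chart(mine(raw), objectives={"inventory": 0.8}))
  print(decisions)   # [{'topic': 'inventory', 'relevant': True}]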

But the specificity of operational intelligence is to be found in the way these functions are intertwined and cross-fed by operational concerns.

To begin with, data mining can be dynamically adjusted depending on what is needed for decision-making, and when. As a corollary, with data thus prepared in advance, some decisions can be taken directly, bypassing the mediation (and delays) of information processing. From a cognitive point of view that would be the equivalent of non symbolic (aka implicit) knowledge processed by neural networks.

Parceling out OI objectives
Decision-making and differentiated knowledge management

Conversely, information processing could benefit from operational feedback so that knowledge management would be driven by business value, and the supporting information weighted by timing and shelf-life considerations. Whereas part of it could be done through implicit connections, it would be more comprehensively and explicitly achieved through symbolic representations.

Operational Intelligence: Signals vs Symbols

Assuming that intelligence is the ability to figure out situations and solve problems, one may conclude that it is inherently operational. Along the same reasoning, if knowledge is information put to use, it may be implicit as well as explicit.

Nonetheless, the merit of operational intelligence is to bring symbolic and non symbolic knowledge under a single functional roof: the former explicit, relying on the mediation of semantic constructs, and used to weight information and support managed decisions; the latter implicit, relying on direct associations between actual objects or phenomena, and supporting automated decisions.


Data Mining & Requirements Analysis

Preamble

Data mining explores business opportunities and competitive advantage, requirements analysis considers supporting applications. Both use models, the former’s are predictive and ephemeral, the latter’s descriptive (or prescriptive) and perennial.

Data mining: sorting business wheat from world chaff (Andreas Gursky)

As the generalization of digitized environments calls for more integration of business and software engineering processes, understanding the relationship between data mining and requirements analysis could significantly improve process maturity and agility.

Data vs Requirements Analysis

Nowadays the success of a wide range of enterprises critically depends on two achievements:

  1. Mapping business models to changing environments by sorting through facts, capturing the relevant data, and processing the whole into meaningful and up to date information. That can be achieved through analysis models mapping business expectations to supporting systems.
  2. Putting that information into effective use through business processes and supporting systems. That is done through systems architecture and design models meant to prescribe how to build software artifacts.

Those challenges are converging: under the pressure of market forces and technological advances most of the traditional fences between business channels and IT systems are crumbling, putting the focus on the functional integration between data mining and production systems. That’s where predictive models can help, by anchoring descriptive models to moving markets and by cross-feeding analysis and operations. How that can be achieved has been the bread and butter of good corporate governance for some time, but there has been less interest in the third branch, namely how data analysis (predictive models) could “inform” business requirements (descriptive models).

From Data to Information

Facts are not given but must be captured through a symbolic description of actual observations. That entails some observer set on the task, using a mix of conceptual and technical apparatus. Data mining and requirements analysis are practical realizations of that process:

  • Data mining relies on analytic tools to extract revealing information that could be used to chart opportunities along business models.
  • Requirements analysis relies on business processes and users’ practice to extract symbolic descriptions that will be used to build models of supporting applications.

If both walk the path from data to information, their objectives are different: the former’s is to improve business decisions by making sense of actual observations; the latter’s is to build system surrogates from the symbolic descriptions of actual business objects and activities.

Anchors & Structures: Plasticity of Business Entities

Perhaps paradoxically, business agility calls for terra firma because nimble trades must be rooted in corporate identity and business continuity. As a consequence, the first step of requirements analysis should be to associate individual business objects or activities with stable and consistent identification mechanisms, and to group them with regard to that mechanism (a sketch follows the figure):

  • External entities with natural (person) or designed identity (car).
  • Symbolic entities for roles (customer) or commitments (maintenance contract).
  • Actual activities (promotion campaign) and events (sale) or business logic (promotion).

Anchors
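
A minimal sketch of such anchoring (Python, hypothetical names): symbolic entities are tied to externally identified objects, so that profiles may later be reshuffled without losing track of identities:

  # Sketch of identification anchors (hypothetical names): symbolic entities (roles,
  # commitments) are tied to externally identified objects, not the other way around.
  from dataclasses import dataclass

  @dataclass(frozen=True)
  class ExternalEntity:          # natural or designed identity (person, car)
      external_id: str

  @dataclass
  class SymbolicEntity:          # role or commitment anchored to an external identity
      role: str
      anchor: ExternalEntity

  person = ExternalEntity(external_id="person:123")
  customer = SymbolicEntity(role="customer", anchor=person)
  contract = SymbolicEntity(role="maintenance contract", anchor=person)
  # profiles and groupings may be reshuffled, but the anchor keeps track of identity
  assert customer.anchor is contract.anchor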

Conversely, as the aim of data analysis is to explore every business angle, individual observations are supposed to be moved across groups; yet, since the units identified by data analysis will have to be aligned with the ones described by requirements analysis, moves must also keep track of identities. That dilemma between continuity of identified structures on one side, plasticity of functional aspects on the other side, can be illustrated by banks which, in response to marketing requirements, had to shift from account (internal identification) to customer (external identification) based systems.

From account (left) to customer (right) centered systems
It’s easier to market insurance from customer centered systems (right) than from account centered ones (left)

That challenge can be overcome by linking the identification of symbolic entities to external anchors.

Profiles & Features: Versatility of Business Opportunities

As noted above, requirements and data analysis are set on the same road but driven by different forces: the former tries to group individuals with regard to identification mechanisms before fleshing them out with relevant features; the latter tries to group individuals with given identities according to features and opportunity profiles. Yet, what could appear as collision courses may become a meeting of minds if both courses are charted with regard to variants analysis.

From the requirements perspective the primary concern is to distinguish between structural and functional variants (a sketch follows the list):

  • Structural variants are bound to identities, i.e set up-front for the respective life-cycle of individual business objects or transactions. As a consequence they cannot be changed without undermining business continuity. Moreover, being part and parcel of descriptors (e.g types and use cases) their change will affect engineering processes.
  • Functional variants may vary during the respective life-cycle of individual business objects or transactions. As a consequence they can be changed without undermining business continuity, and changes in descriptors (e.g partitions and scenarii) can be managed without affecting engineering processes.
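
The distinction could be sketched as follows (Python, hypothetical names): structural variants are frozen with identity, functional ones remain free to change over the life-cycle:

  # Sketch (hypothetical names): structural variants are set with identity and frozen,
  # functional variants may change during the life-cycle without breaking continuity.
  from dataclasses import dataclass

  @dataclass(frozen=True)
  class CarStructure:            # structural: bound to identity, set up-front
      vin: str
      model: str

  @dataclass
  class CarUsage:                # functional: may vary over the life-cycle
      structure: CarStructure
      mileage: int = 0
      insurance_option: str = "basic"

  car = CarUsage(structure=CarStructure(vin="VF1-0001", model="sedan"))
  car.insurance_option = "premium"    # functional change: allowed
  # car.structure.model = "coupe"     # structural change: would raise FrozenInstanceError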

From the data mining perspective the objective is to improve the benefits of information systems for decision-making processes:

  • Static: how to classify individuals so as to reduce the uncertainty of predictions.
  • Dynamic: how to classify business options so as to reduce the uncertainty of decisions.

Since those objectives are set for individuals, constraints on continuity and consistency can be dealt with independently of the description of symbolic surrogates.

Identified individuals with profiles for customers (a), their behaviors (b), and promotional gestures (c)

It ensues that perspectives can be adjusted by factoring out the constraints of continuity and consistency for business objects (e.g cars), agents (e.g customer) and processes (e.g repair). Profiles for agents (a), behaviors (b), and business options (c) could then be freely explored and tailored with regard to changes in business environment and objectives.

Applying Data Analysis to Requirements

Not surprisingly, data analysis techniques can be used to adjust perspectives. For that purpose a sample of individuals (business objects and operations) representing the population targeted by requirements would have to be submitted to basic mining routines. Borrowing a catalog from F. Provost & T. Fawcett (a sketch follows the list):

  1. Classification: estimates the probability for each individual (objects or operations) to belong to a set of classes; can be used to assess the closeness of the variants (respectively power-types or execution paths) identified by requirements analysis.
  2. Regression: reverse classification; estimates how much of individual features valuations can be explained by the proposed classifications.
  3. Similarity: a shallow version of classification; can be used to assess the distance between variants and consolidate the proposed classifications.
  4. Clustering: a deep version of classification; can be used to distinguish between shallow and natural classifications.
  5. Co-occurrence: deals with behavioral variants; can be used to distinguish between functional and structural classifications.
  6. Profiling: reverse of co-occurrence; can be used to consolidate functional and structural classifications.
  7. Links prediction: can be used to define relationships.
  8. Data reduction: eliminates redundant individuals; can be used to consolidate requirements and refine test scenarii.
  9. Causal modeling: brings together business logic (events and rules) and users’ decisions; should provide the backbone of test scenarii.
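
As an illustration of routines 3 and 4, the closeness of requirements variants could be assessed with off-the-shelf tools; the sketch below assumes scikit-learn is available and uses made-up feature vectors describing use-case variants:

  # Rough sketch: assessing the closeness of requirements variants with off-the-shelf
  # routines (scikit-learn assumed available; feature vectors are made up).
  import numpy as np
  from sklearn.cluster import KMeans
  from sklearn.metrics.pairwise import cosine_similarity

  # each row describes one use-case variant by binary features (steps, rules, actors)
  variants = np.array([
      [1, 1, 0, 0, 1],   # standard sale
      [1, 1, 1, 0, 1],   # sale with promotion
      [0, 0, 1, 1, 0],   # repair
  ])

  similarity = cosine_similarity(variants)      # routine 3: distance between variants
  clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(variants)

  print(similarity.round(2))   # variants scoring close may be candidates for consolidation
  print(clusters)              # routine 4: candidate natural groupings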

Besides the direct benefits for requirements, such procedures may help to bridge the gap between data and requirements analysis and significantly improve processes’ capability and maturity level.

Business Objectives & Enterprise Architecture Capabilities

Data mining being first and foremost about competitive edge, it relies on a timely and effective coupling between enterprise capabilities and business opportunities. But the dilemma between continuity and plasticity described above for business objects and processes reappears at enterprise level: how to reconcile architecture, by nature perennial, with the agility needed to make the best of changing and competitive environments?

As an architectural big bang is arguably a last-resort option, answers to that question must be progressive and local: if changes are to be swift and pertinent they must be both circumscribed and leveraged to the relevant parts of the architecture. Taking an (amended) leaf out of the Zachman framework, its sixth column (“Why”) could be reset as a line for business and operational objectives that would cross the original five columns instead of the architecture layers. Using a pentagonal representation of enterprise architecture, that line would be set as circling the outer range.

Enterprise Architecture and the loci of change

It is worth noting that setting objectives on a line crossing the columns of capabilities, instead of a column crossing the lines of layers, means that objectives are set at enterprise level and their cascading impact traced and managed through layers.

Conceptual Models & Business Contexts

But even that updated framework doesn’t take into account the fundamental changes in business environments. Once secure behind organizational and technical fences, enterprises must now navigate through open digitized business environments and markets. For business processes it means a seamless integration with supporting applications; for corporate governance it means keeping track of heterogeneous and changing business contexts and concerns while assessing the capability of organizations and systems to cope, adjust, and improve.

As long as environments were a hotchpotch of actual and symbolic artifacts the pros and cons of integration could be balanced. But the generalization of digital flows and transactions has upended the balance: there is no more room or time for latency and enterprises must bring all symbolic representations (business, organization, and systems) under a common conceptual roof:

Conceptual models as bridges between business processes and systems.

A canonical approach would be to introduce a conceptual indexing scheme open to extensions but with its footprint defined by business processes and systems functionalities. That would ensure a better integration of processes and supporting engineering but will do nothing for the modeling gap between enterprise architecture and external contexts. That could be achieved with ontologies.
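
What such a conceptual indexing scheme could look like, as a minimal sketch (Python, hypothetical names), with its footprint set by processes and functionalities but open to extensions:

  # Minimal sketch of a conceptual index (hypothetical names): business terms mapped to
  # the processes and system functionalities that define the scheme's footprint.
  concept_index = {
      "customer": {"processes": ["sales", "support"], "functions": ["CRM.account"]},
      "maintenance contract": {"processes": ["after-sales"], "functions": ["billing.recurring"]},
  }

  def register(term, processes=(), functions=()):
      """Extensions stay open, but every entry keeps its footprint explicit."""
      entry = concept_index.setdefault(term, {"processes": [], "functions": []})
      entry["processes"].extend(processes)
      entry["functions"].extend(functions)

  register("connected car", processes=["telematics"], functions=["IoT.gateway"])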

Conceptual Loop: Ontologies & Business Intelligence

As far as data mining is concerned, three kinds of operations are to be considered:

  • Data understanding gives form and semantics to raw material.
  • Business understanding charts business contexts and concerns in terms of objects and processes descriptions.
  • Modeling consolidates data and business understanding into descriptive, predictive, and operational models.

The aim of data mining is to refine raw data into meaningful information

While traditional approaches fall short when tasked with iterative modeling of unstructured data, ontologies may fare better because their explicit aim is only to describe what could exist in a domain of discourse:

  1. They are made of categories of things, beings, or phenomena; as such they may range from simple catalogs to philosophical doctrines.
  2. They are driven by cognitive (i.e non empirical) purposes, namely the validity and consistency of symbolic representations.
  3. They are meant to be directed at specific domains of concerns, whatever they can be: politics, religion, business, astrology, etc.

With regard to models, only the second point puts ontologies apart: contrary to models, ontologies are about understanding and are not supposed to be driven by empirical purposes. It ensues that ontologies can be understood as conceptual (aka canonical) models, used as such for business analysis and extended with purposes for systems analysis and design.

In addition to the integration with enterprise architectures, ontologies benefits for business intelligence would be twofold.

On one side, and whatever their use, ontologies could be aligned with the nature of contexts and their impact on business and enterprise governance, e.g.:

  • Institutional: mandatory semantics sanctioned by regulatory authority, steady, changes subject to established procedures.
  • Professional: agreed upon semantics between parties, steady, changes subject to established procedures.
  • Corporate: enterprise defined semantics, changes subject to internal decision-making.
  • Social: pragmatic semantics, no authority, volatile, continuous and informal changes.
  • Personal: customary semantics defined by named individuals.

Ontologies, capabilities (Who, What, How, Where, When), and architectures (enterprise, systems, platforms).

On the other side ontologies could be defined according to the nature of targeted items, namely terms, documents, symbolic representations, or actual objects and phenomena. That would outline four basic concerns that may or may not be combined:

  • Thesaurus: ontologies covering terms and concepts.
  • Document Management: ontologies covering documents with regard to topics.
  • Organization and Business: ontologies pertaining to enterprise organization, objects and activities.
  • Engineering: ontologies pertaining to the symbolic representation of products and services.

Ontologies: Purposes & Targets
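
Bringing the two dimensions together, each ontology could be tagged with its governance context (how its semantics may change) and its target (what kind of items it covers); a minimal sketch with hypothetical names:

  # Sketch (hypothetical names): each ontology tagged with its governance context
  # (how its semantics may change) and its target (what kind of items it covers).
  from dataclasses import dataclass
  from enum import Enum

  class Context(Enum):
      INSTITUTIONAL = "regulatory authority, changes subject to established procedures"
      PROFESSIONAL = "agreed between parties, changes subject to established procedures"
      CORPORATE = "enterprise defined, changes subject to internal decision-making"
      SOCIAL = "no authority, continuous and informal changes"
      PERSONAL = "customary, defined by named individuals"

  class Target(Enum):
      THESAURUS = "terms and concepts"
      DOCUMENTS = "documents and topics"
      BUSINESS = "organization, objects, and activities"
      ENGINEERING = "symbolic representations of products and services"

  @dataclass
  class Ontology:
      name: str
      context: Context
      target: Target

  catalog = Ontology("product catalog", Context.CORPORATE, Target.BUSINESS)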

On a broader perspective, ontologies could be influential in spanning the gap between explicit and implicit knowledge. Within a few years, practically unlimited access to raw data and the exponential growth in computing power have opened the door to massive sources of unexplored knowledge, paradoxically both directly relevant yet devoid of immediate meaning:

  • Relevance: mined raw data is supposed to reflect the geology and dynamics of targeted markets.
  • Meaning: the main value of that knowledge rests on its implicit nature; applying existing semantics would add little to existing knowledge.

Assuming that deep learning can transmute raw base metals into knowledge gold, ontologies would be decisive in framing the understanding, assessment, and improvement of the processes.

Operational Loop: Business Intelligence & Decision-making

Once carried out separately and periodically, decision-making is now to be carried out iteratively at operational, tactical, and strategic levels; while each level is to be set along its own time-frame, all are to rely on data mining, with cycles following the same pattern (a sketch follows the list):

  1. Observation: understanding of changes in business opportunities.
  2. Orientation: assessment of the reliability and shelf-life of pertaining information with regard to current positions and operations.
  3. Decision: weighting of options with regard to enterprise capabilities and broader objectives.
  4. Action: carrying out of decisions within the relevant time-frame.
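
A skeletal sketch of such a cycle (Python, hypothetical names); each level would run the same pattern on its own time-frame:

  # Skeletal sketch of the cycle (hypothetical names); each level (operational, tactical,
  # strategic) would run the same pattern on its own time-frame.
  import time

  def run_cycle(observe, orient, decide, act, period_seconds, rounds=3):
      for _ in range(rounds):
          changes = observe()              # 1. changes in business opportunities
          assessment = orient(changes)     # 2. reliability and shelf-life of information
          commitments = decide(assessment) # 3. options vs capabilities and objectives
          act(commitments)                 # 4. carried out within the relevant time-frame
          time.sleep(period_seconds)

  # e.g. operational level: run_cycle(sense_orders, score, pick_options, dispatch, 60)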

Given the new business playground, decision-making processes have to weave together material and digitized flows, actual contexts (aka territories) and symbolic descriptions (maps), and overlapping time-frames (operational, tactical, strategic). That operational loop could then be coupled with the broader one of business intelligence:

Integration of operational analytics, business intelligence, and decision-making.

Selected Readings