Knowledge Driven Test Cases

Preamble

The way tests are designed and executed is being doubly affected by development methods and AI technologies. On one hand, well-founded approaches (e.g., test-driven development) are often confined to faith-based niches; on the other hand, automated schemes and agile methods push many testers out of their comfort zone.

Reality check (Kader Attia)

Resetting the issue within a knowledge-based enterprise architecture would pave the way for sound methods and open new doors for their users.

Outline

Tests can be understood in terms of preventive and predictive purposes, the former with regard to actual products, the latter with regard to their forthcoming use. On that account policies are to distinguish between (a code sketch follows the list):

  • Test plans, to be derived from requirements.
  • Test cases, to be collected from environments.
  • Test execution, to be run and monitored in simulated environments.
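
A minimal sketch of that three-way distinction carried as data structures; names and fields are hypothetical, chosen for illustration only:

    from dataclasses import dataclass

    @dataclass
    class TestPlan:
        requirement_id: str   # derived from business/functional requirements
        actions: list[str]    # sequence of actions to be performed

    @dataclass
    class TestCase:
        plan: TestPlan
        instances: dict[str, object]  # entities collected from environments

    @dataclass
    class TestRun:
        case: TestCase
        environment: str              # simulated environment identifier
        outcome: str | None = None    # filled in by monitored execution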

The objective is to cross these pursuits with knowledge architecture layers.

Test plans are derived from business requirements, either on their own at process level (e.g., as user stories or activities), or combined with functional requirements at application level (e.g., as use cases). Both describe sequences of actions meant to be performed by organizational entities identified at enterprise level. Circumstances are then specified with regard to quality of service and technical requirements.

Mapping tests to knowledge architecture layers.

Test cases’ backbones are built from business scenarii, fleshed out with instances mimicking identified entities from the business environment and with hypothetical decisions taken by entitled users. The generation of actual instances and decisions could be automated, depending on the thoroughness and consistency of business requirements.

To be actually tested, business scenarii have to be embedded into functional ones, yet the distinction must be maintained between what pertains to business logic and what pertains to the part played by supporting systems.

By contrast, despite being built from functional scenarii, integration and acceptance ones are meant to be blind to business or functional contents, and cases can therefore be generated independently.

Unit and component tests are the building blocks of all test cases, the former rooted in business requirements, the latter in functional ones. As such they can be used for a built-in integration of tests and development.

TDD in the loop

Whatever its merits for phased projects, the development V-model suffers from a structural bias, because flaws rooted in requirements, arguably the most damaging, tend to be diagnosed only after designs are encoded. Test-driven development (TDD) takes a somewhat opposite approach, as code specifications are governed by testability. But reversing priorities may also introduce reverse issues: where the V-model’s verification and validation come too late and too wide, TDD’s may prove hasty and blinkered, with local issues masking global ones. Applying the OODA (Observation, Orientation, Decision, Action) loop to test cases offers a way out of the dilemma (a code sketch follows the list):

  • Observation (West): test and assessment for component, integration, and acceptance test cases.
  • Orientation (North): assessment in the broader context of requirements space (business, functional, Quality of Service), or in the local context of application (East).
  • Decision (East): confirm or adjust the development paths with regard to functional scenarii, development backlog, or integration constraints.
  • Action (South): develop code at unit, component, or process levels.
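
A minimal sketch of one pass through that loop; the station logic is injected as callables so the sketch stays independent of any actual testing framework, and all names are hypothetical:

    def ooda_iteration(artefact, run_tests, requirements, backlog, develop):
        # Observation (West): run component, integration, acceptance tests
        failures = run_tests(artefact)
        # Orientation (North): assess failures within the requirements space
        assessment = [f for f in failures if f in requirements]
        # Decision (East): confirm or adjust the development path
        for issue in assessment:
            if issue not in backlog:
                backlog.append(issue)
        # Action (South): develop code at unit, component, or process level
        return develop(artefact, backlog)
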
From V to O: How to iterate across layers

As each station is meant to deal with business, functional, and operational test cases, the challenge is to ensure a seamless integration and reuse across iterations and layers.

Managing test cases

Whatever the method, test plans are meant to mirror requirements’ scope and structure. For architecture-oriented projects, tests should be directly aligned with the targeted capabilities of architecture layers:

For architecture-oriented projects test plans should be set with regard to capabilities.

For business-driven projects, test plans should be set along business scenarii, with development units and associated test cases defined with regard to activities. When use cases, which cover the subset of activities supported by systems, are introduced upfront for both business and functional requirements, test plans should keep the distinction between business and functional requirements.

Stories (bottom left), activities (right), use cases (top left).

All things considered, test cases are to be comprehensively and consistently run against requirements that differ in goals (business vs architecture), layers (business, functions, platforms), or formalism (text, stories, use cases, …).

In contrast, test cases are by nature homogeneous, as they are made of instances of objects, events, and behaviors; ontologies can therefore be used to define and manage these instances directly from models. The example below makes use of instances for types (propulsion, body), a car model (Renault Clio), and a car (58642).
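
As a rough illustration, assuming the Python rdflib library and hypothetical names, such instances could be asserted and queried directly from model elements:

    from rdflib import Graph, Namespace, RDF

    EX = Namespace("http://example.org/testcases#")  # hypothetical namespace
    g = Graph()
    g.bind("ex", EX)

    # Instances for types (propulsion, body), car model, and car
    g.add((EX.propulsion, RDF.type, EX.PartType))
    g.add((EX.body, RDF.type, EX.PartType))
    g.add((EX.RenaultClio, RDF.type, EX.CarModel))
    g.add((EX.car_58642, RDF.type, EX.Car))
    g.add((EX.car_58642, EX.isModeledBy, EX.RenaultClio))

    # Instances can then be managed directly, e.g., listed by category
    for car in g.subjects(RDF.type, EX.Car):
        print(car)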

Ontologies can then be used to manage test cases as instances of activities (development tests) or processes (integration and acceptance tests):

Test cases as instances of processes (integration) and activities (development)

The primary and direct benefit of representing test cases as instances in ontologies is to ensure a seamless integration and reuse of development, integration, and acceptance test cases, independently of requirements context.

But the ontological approach has broader and deeper consequences: by defining test cases as instances in line with environment data, it opens the door to their enrichment through deep learning.

Knowledgeable test cases

Names may vary but tests are meant to serve a two-facet objective: on one hand to verify the intrinsic qualities of artefacts as they are, independently of context and usage; on the other hand to validate their features with regard to extrinsic circumstances, present or in a foreseeable future.

That duality has logical implications for test cases (a code sketch follows the list):

  • The verification of intrinsic properties can be circumscribed and can therefore be carried out based on available information, e.g., design, programming language syntax and semantics, systems configurations, etc.
  • The validation of functional features and behaviors is by nature open-ended with regard to scope and time-frame; it ensues that test cases have to rely on incomplete or uncertain information.
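
A minimal sketch of that logical difference, with hypothetical checks: verification as a closed predicate over available information, validation as an open-ended, statistical judgment over sampled environment data:

    def verify_component(component: dict) -> bool:
        # Verification: circumscribed, decidable from available information
        # (e.g., syntax, configuration); the keys are illustrative only
        return component["syntax_ok"] and component["config_ok"]

    def validate_feature(feature, environment_sample, threshold=0.95) -> bool:
        # Validation: open-ended, judged against incomplete or uncertain
        # samples from the environment rather than a fixed specification
        outcomes = [feature(case) for case in environment_sample]
        return sum(outcomes) / len(outcomes) >= threshold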

Without practical applications that distinction has been of little consequence; not anymore: while the digital transformation removes the barriers between test cases and environment data, the spread of machine learning technologies multiplies the possibilities of exchange.

In the traditional approach, test cases rely on three basic sources of information:

  • Syntax and semantics of programming languages are used to check software components (a).
  • Logical and functional models (including patterns) are used to check applications designs (b).
  • Requirements are used to check applications compliance (c).
Traditional (a,b,c) and knowledge driven (d,e,f) test cases

With barriers removed, test cases as instances can be directly aligned with environment data, opening the door to their enrichment, e.g. (a code sketch of point (d) follows the list):

  • Random data samples can be mined from environments and used to deal with instinctive or erratic human behaviors. By nature knee-jerk or latent behaviors cannot be checked with reasoned test cases, yet they don’t occur in a void but within known operational or functional circumstances; data analytics can be used to identify these quirks (d).
  • Systems being designed artifacts, components are meant to tally with models for structures as well as behaviors. Crossing operational data with design models will help to refine and hone integration and acceptance test cases (e).
  • Whereas integration tests put the focus on models and code, acceptance tests also involve the mapping of models to business and organizational concepts. As a corollary, test cases are to rely on a broader range of knowledge: external regulations, mined from environments, or embedded in organization through individual and collective skills (f).
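
A rough sketch of point (d), assuming a hypothetical event log of (user, action, context) records mined from operational environments:

    import random

    def mine_erratic_cases(event_log, scripted_actions, size=100, seed=42):
        # Random sampling of operational events; knee-jerk or latent
        # behaviors show up as deviations from the scripted scenarios
        rng = random.Random(seed)
        sample = rng.sample(event_log, min(size, len(event_log)))
        # Keep only the quirks, i.e., actions outside reasoned test cases
        return [event for event in sample if event[1] not in scripted_actions]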

Given the immersion of enterprises in digital environments, and assuming test cases are represented as ontological instances, these are already practical opportunities. But the real benefits of knowledge-based test cases are to come from leveraging machine learning technologies across enterprise and knowledge architectures.


Business Capabilities as Philosophers’ Stone?

Preamble

To hear the buzz around business capabilities one would expect some consensus about basic principles as well as an established track record. Since there is little to be found on either account, should the notion be seen as a modern philosophers’ stone?

People, System, Environment

Clear evidence can be found by asking two questions: what should be looked for, and why it can’t be found.

What is looked for

Business capabilities can be understood as a modern avatar of the medieval philosophers’ stone, an alchemical substance capable of turning base metals into gold.

In the context of corporate governance, that would mean a combination of assets blueprints and managing principles paving the way to success.

But such a quest is to err between the sands of businesses specificities and the clouds of accounting generalities.

At ground level enterprises have to mark their territory and keep it safe from competitors. Whatever the means they use (niche segments, effective organization, monopolistic situations, …), success comes from making a difference.

With hindsight, revealing singularities may be discovered among the idiosyncrasies of business successes, but that can only be done from accounting heights, where capabilities are clouded by numbers.

Why it can’t be found

The fallacy of a notion can also be established a contrario if, assuming the existence of a philosophers’ stone, the same logic would also demonstrate the futility of the quest.

Such reasoning appears more like a truism when applied to business capabilities: insofar as business competition is concerned, success is exclusive and cutting edges are not shared. It ensues that, were business capabilities to be found, they would by the same move become obsolete and be instantly dissolved.

What should be looked for

As far as business environments are concerned, success is by nature singular and transient, and it consequently depends on sustaining a balancing act between assets and opportunities. To that end business capabilities should be understood as a hub between value chains and enterprise architectures.


Strategies: Augmented vs Virtual Realities

Preamble

The immersion of enterprises into digital environments and the merging of business and engineering processes are bound to impact traditional approaches to strategic planning.

Outlines & Time-frames (Annette Messager)

A detour into virtual and augmented reality may help to understand how the digital transformation is to affect the ways actual and future circumstances can be assessed and acted upon.

Horizons, Spaces, Time

Strategic planning can be summarily defined as the alignment of two kinds of horizons:

  • External horizons are set on markets, competition, and technologies, with views and anticipations coming through data analytics and game theory.
  • Internal horizons are set on enterprise architectures, with business models and policies on one hand, organization and systems on the other. Both are set on purposes, the former aiming at specificity and edge, the latter at sharing and stability.

How to manage the alignment of controlled enterprise horizons with anticipated external ones

As far as the strategic alignment of means (internal horizons) and ends (external horizons) is concerned, time is of the essence; that’s where the fault line lies: whatever the enterprise’s planned time-frames, they can only be mapped to fuzzy or unreliable ones outside. As a corollary, alignments have to be carried out either as a continuum, or through discrete and periodic adjustments to predictive models.

The future begins now

As suggested by Lewis Carroll’s Red Queen, the survival of enterprises in their evolutionary arms race doesn’t depend on their absolute speed set against some universal time-frame but on the relative one set by markets and competitors. Taking into account the role of a reliable and up-to-date basis for decision-making, strategic schemes can be characterized in terms of virtual and augmented reality:

  • Strategies set along predefined time-frames are by construct loosely tied to business environments, since discrepancies are bound to develop during the intervals between anticipations (virtual reality) and realizations (actual reality). Hence the need for overall and planned adjustments.
  • By contrast, strategies set dynamically through OODA (observations, orientation, decision, action) loops can be carried out without overall anticipations; like augmented reality (AR) they mix fine-grained data observations and business intelligence with additional layers of information deemed to be relevant.
A seamless integration of OODA & Economic Intelligence: operational analytics, business intelligence, and decision-making.

The first category has for long been a cornerstone of established strategic planning, with its implicit assumption of a conceptual gap between ‘Here and Now’ and future visions. But the induced latencies and discrepancies may become liabilities if enterprises have to continuously adjust representations and policies.

From Governance Dizziness to Strategic Missteps

As self-conscious organisms immersed in competitive environments, enterprises depend for their survival on their awareness of changes.

For mammals such awareness is a biological capability: any discrepancy between perceived movements and their counterpart in the vestibular system is to generate motion sickness or vertigo.

Enterprises have no such built-in mechanism yet their awareness is nonetheless contingent on the alignment of representations with observations and orientations. As a corollary, out-of-kilter time-frames and outdated schemes may remain unnoticed, introducing governance dizziness and strategic missteps.

Obsolete representations (grey) are bound to generate dizziness and missteps

Such drawbacks can be limited with decision-making set along differentiated time-frames, e.g:

  • Operational decisions, to be carried out within processes’ time-frames (a).
  • Architecture decisions, for changes in assets affecting the design of business processes and supporting platforms (b).
  • Organizational decisions, for changes affecting roles and processes (c).
Weaving fine-grained threads across enterprise architectures and consistent time-frames.

That would allow the weaving of fine-grained policies across enterprise architectures and along consistent time-frames, mixing actual observations, representations of current and planned assets and resources, and anticipations of market changes and competitors’ moves.

On that basis strategic alignments would not have to be set at fixed times for overall models, but could be managed continuously, with decision-making finely tuned for targets and timing:

  • External horizons: business decisions could balance the need to know against the reliability of available information.
  • Internal horizons: decisions about organization and systems could be taken at the “last responsible moment”, i.e., up to the point where not taking sides would foreclose possible options.

Strategic planning could then be defined by crossing these rationales at the relevant granularity.



Digital Strategy

Preamble

Enterprise governance has to face combined changes in the way business times and spaces are to be taken into account. On one hand, social networks put well-thought-out market segments and well-planned campaigns at the mercy of consumers’ weekly whims. On the other hand, traditional fences between environments and IT systems are crumbling under combined market and technological waves.

The Strategic Objective is to Weave Systems and Knowledge Architectures (Wayne Thiebaud)

To overcome these challenges, enterprise strategies should focus on four pillars:

  • Governance: the immersion of enterprises in digital environments and the crumbling of traditional fences require in-depth changes in information modeling and knowledge management schemes.
  • Data and Information: massive and continuous inflows of data call for a seamless integration of data analytics (perception), information models (reasoning), and knowledge (decision-making).
  • Security & Confidentiality: new regulatory environments and the costs of privacy breaches call for a clear distinction between data tied to identified individuals and information associated with designed categories.
  • Innovation: digital environments induce a new order of magnitude in the pace of technological change. Making opportunities from changes can only be achieved through collaboration mechanisms harnessing enterprise knowledge management to environment intakes.


Squared Outline: Business Capabilities

Despite (or because of) their ubiquity across PowerPoint presentations, business capabilities appear under a wide range of guises. Putting them in perspective could clarify the issue.

To begin with, business capabilities should not be confused with systems ones, as the former are supposed to be driven by changing environments and opportunities, in contrast to continuity, integrity, and returns on investment. Were it not for the need to juggle both, there would be no need of chief “whatever” officers.

Then, if they are to be assessed across changing contexts and concerns, business capabilities should not be tied to specific ones but focus on enterprise wherewithal:

  • Material: capacity and maturity of platforms with regard to changes in business processes and environments.
  • Functional: capacity and maturity of supporting systems with regard to changes in business processes.
  • Organizational: versatility and plasticity of roles and processes in dealing with changes in business opportunities.
  • Intelligence: information assets and ability of people to use them.

These could then be adjusted with regard to finance and time:

  • Strategic: assets, to be deployed and paid for across a number of business exercises.
  • Tactical: resources, to be deployed and paid for within single business exercises.
  • Particular: combined assets and resources needed to support specific value chains.

The role of enterprise architects would then be to plan, assess, and manage the dynamic alignment of business and architecture capabilities.


Focus: Data vs Information

Preamble

Distinctions must serve a purpose and be assessed accordingly. On that account, what would be the point of setting apart data and information, and on what basis could that be done?


From Data Stripes to Information Structure (Victor Vasarely)

Until recently the two terms seem to have been used interchangeably; until, that is, the digital revolution. But the generalization of digital surroundings and the tumbling down of the traditional barriers surrounding enterprises have upturned the playground as well as the rules of the game.

Previously, with data analytics, information modeling, and knowledge management mostly carried out as separate threads, there wasn’t much concern about semantic overlaps; no more. Lest they fall behind, enterprises have to combine observation (data), reasoning (information), and judgment (knowledge) in a continuous process. But such integration implies in return more transparency and traceability with regard to resources (e.g., external or internal) and objectives (e.g., operational or strategic); that’s when a distinction between data and information becomes necessary.

Economics: Resources vs Assets

Understood as a whole or separately, there is little doubt that data and information have become a key success factor, calling for more selective and effective management schemes.

Being immersed in digital environments, enterprises first depend on accurate, reliable, and timely observations of their business surroundings. But in the new digital world the flows of data are so massive and so transient that mining meaningful and reliable pieces is by itself a decisive success factor. Next, assuming data flows are duly processed, part of the outcome has to be consolidated into models, to be managed on a persistent basis (e.g., customer records or banking transactions), the rest being put on temporary shelves for customary uses, or immediately thrown away (e.g., personal data subject to privacy regulations). Such a transition constitutes a pivotal inflexion point for systems architectures and governance, as it sorts out data resources with limited lifespan from information assets with strategic relevance. Not to mention the sensitivity of regulatory compliance to data management.

Processes: Operations vs Intelligence

Making sense of data is pointless without putting the resulting information to use, which in digital environments implies a tight integration of data and information processing. Yet, as already noted, tighter integration of processes calls for greater traceability and transparency, in particular with regard to the origin and scope: external (enterprise business and organization) or internal (systems). The purposes of data and information processing can be squared accordingly:

  • The top left corner is where business models and strategies are meant to be defined.
  • The top right corner corresponds to traditional data or information models derived from business objectives, organization, and requirement analysis.
  • The bottom line corresponds to analytic models for business (left) and operations (right).

Squaring the purposes of Data & Information Processing

That view illustrates the paradigm shift induced by the digital transformation. Previously, most mappings would be set along straight lines:

  • Horizontally (same nature), e.g., requirement analysis (a) or configuration management (b). With source and destination at the same level, the terms employed (data or information) have no practical consequence.
  • Vertically (same scope), e.g., systems logical to physical models (c) or business intelligence (d). With source and destination set in the same semantic context, the distinction (data or information) can be ignored.

The digital transformation makes room for diagonal transitions set across heterogeneous targets, e.g., mapping data analytics with conceptual or logical models (e).

That double mix of levels and scopes constitutes the nexus of decision-making processes; their transparency is contingent on a conceptual distinction between data and information.

At the operational level, the benefits of the distinction are best expressed through what is commonly known as the OODA (Observation, Orientation, Decision, Action) loop (a code sketch follows the list):

  • Data is used to align operations (systems) with observations (territories).
  • Information is used to align categories (maps) with objectives.
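
A minimal sketch of those two alignments, with hypothetical types: data as observed instances (territories), information as the symbolic categories (maps) they are matched against:

    from dataclasses import dataclass

    @dataclass
    class Observation:     # data: an actual instance, no semantics yet
        source: str
        payload: dict

    @dataclass
    class Category:        # information: a symbolic description
        name: str
        required_keys: frozenset

    def orient(obs, categories):
        # Align an observed instance (territory) with a category (map)
        for cat in categories:
            if cat.required_keys <= set(obs.payload):
                return cat
        return None        # unmapped observation: no map for the territory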

Roles of Data (red) & Information (blue) in integrated decision-making processes

Then, the conceptual distinction between data and information is instrumental for the integration of operational and strategic decision-making processes:

  • Data analytics feeding business intelligence.
  • Information modeling supporting operational assessment.

Not by chance, these distinctions can be aligned with architecture layers.

Architectures: Instances vs Categories

Blending data with information overlooks a difference of nature, the former being associated with actual instances (external observation or systems operations), the latter with symbolic descriptions (categories or types). That intrinsic difference can be aligned with architecture layers (resources are consumed, assets are managed), and decision-making processes (operations deal with instances, strategies with categories).

With regard to architectures, the relationship between instances (data) and categories (information) can be neatly aligned with capability layers, as represented by the Pagoda blueprint:

  • The platform layer deals with data reflecting observations (external facts) and actions (system operations).
  • The functional layer deals with information, i.e the symbolic representation of business and organization categories.
  • The business and organization layer defines the business and organization categories.

It must also be noted that setting apart what pertains to individual data, independently of the information managed by systems, clearly supports compliance with privacy regulations.


Architectures & Decision-making

With regard to decision-making processes, business intelligence uses the distinction to integrate levels, from operations to strategic planning, the former dealing with observations and operations (data), the latter with concepts and categories (information and knowledge).

Representations: Knowledge Architecture

As noted above, the distinction between data and information is a necessary counterpart of the integration of operational and intelligence processes; that implies in return bringing data, information, and knowledge under a common conceptual roof, respectively as resources, assets, and services (a code sketch follows the list):

  1. Resources: data is captured through continuous and heterogeneous flows from a wide range of sources.
  2. Assets: information is built by adding identity, structure, and semantics to data.
  3. Services: knowledge is information put to use through decision-making.
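
A minimal sketch of that three-step scheme, with hypothetical names and rules: identity, structure, and semantics are added to raw data records, and knowledge is obtained by putting the resulting information to use:

    import uuid

    def to_information(raw_record: dict, category: str) -> dict:
        # Assets: add identity, structure, and semantics to a data record
        return {
            "id": str(uuid.uuid4()),         # identity
            "category": category,            # semantics: instantiated category
            "attributes": dict(raw_record),  # structure: normalized contents
        }

    def to_knowledge(information: list, decide) -> list:
        # Services: information put to use through a decision-making callable
        return [decide(item) for item in information]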

Ontologies, which are meant to encompass any and every kind of knowledge, are ideally suited for the management of whatever pertains to enterprise architecture: thesauruses, models, heuristics, etc.


That approach has been tested with the Caminao ontological kernel using OWL2; a beta version is available for comments on the Stanford/Protégé portal with the link: Caminao Ontological Kernel (CaKe_).

Conclusion: From Metadata to Machine Learning

The significance of the distinction between data and information shows up at the two ends of the spectrum:

On one hand, it straightens the meaning of metadata, to be understood as attributes of observations independent of semantics, a dimension that plays a critical role in machine learning.

On the other hand, enshrining the distinction between what can be known of individual facts or phenomena and what can be abstracted into categories enables an open and dynamic knowledge management, also a critical requisite for machine learning.



Squared Outline: Models As Currency

Like every artifact, models can be defined by nature and function. With regard to nature, models are symbolic representations, descriptive (categories of actual instances) or prescriptive (blueprints of artifacts). With regard to function, models can be likened to currency, as they serve as means of exchange, instruments of measure, or repositories.

Along that understanding, models can be neatly characterized by their intent:

  1. No use of models: direct exchange (barter) can be achieved between business analysts and software engineers.
  2. Models are needed as medium supporting exchange between organizational units with different business or technical concerns.
  3. Models are used to assess contents with regard to size, complexity, quality, …
  4. Models are kept and maintained for subsequent use or reuse.

Depending on organizations, providers and customers could then be identified, as well as modeling languages.
