vrijdag 20 mei 2016

Afterthoughts on Data Governance for BI

Why Business Intelligence needs a specific approach to data governance


During my talk at the Data Governance Conference, at least one member of the audience was paying close attention and asked me a pertinent question: “Why should you need a separate approach for data governance in Business Intelligence?”

My first reaction was: “Oops, I’ve skipped a few steps in my introduction…” So here’s an opportunity to set things right.

Some theory, from the presentation


At the conference, I took some time to explain the matrix below, which maps the relevance of data for decision making.
Data portfolio management as presented at the 2016 Data Governance Conference in London

If you analyse the nature of the data present in any organisation, you can discern four major types.
Let’s take a walk through the matrix using the example of an ice cream producer.
Strategic Data: this is critical to future strategy development; both forming and executing strategy are supported by these data. Almost by definition, strategic data are not found in your process data, or are at best integrated data objects built from process data and/or external data. A simple example: (internal) ice cream consumption per vending machine, matched with (external) weather data and an (external) count of competing vending machines and other competing outlets, creates a market penetration index which in turn has predictive value for future trends.
Turnaround Data: critical to future business success because today’s operations do not support it; new operations will be needed to execute. E.g.: new insulation methods and materials make ice cream fit for e-commerce. The company needs to assess the potential of this new channel as well as the potentially cannibalising effect of the substitute product. In case the company decides not to compete in this segment, what are the countermeasures to ward off the competition? Market research will produce the qualitative and quantitative data that need to be mapped onto the existing customer base and the present outlets.
Factory Data: this is critical to existing business operations. Think of the classical reports, dashboards and scorecards. For example: sales per outlet type in value and volume, inventory turnover… all sorts of KPIs marketing, operations and finance want every week on their desk.
Support Data: these data are valuable but not critical to success. For instance reference data for vending locations, ice cream types and packaging types for logistics and any other attribute that may cause a nuisance if it’s not well managed.
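To make the strategic-data example above concrete, here is a minimal sketch of how a market penetration index might be computed per vending machine. The field names, the 20 °C baseline and the weighting are illustrative assumptions, not part of any standard:

```python
# Hypothetical sketch: a market penetration index per vending machine,
# combining internal sales with external weather and competitor counts.
# The 20 degree baseline and 5%-per-degree demand lift are invented weightings.

def penetration_index(units_sold, avg_temp_c, competing_outlets):
    """Sales normalised for weather-driven demand, divided across the
    competing outlets plus our own machine."""
    # Warmer weather lifts expected ice cream demand; clamp the factor
    # so very cold periods do not produce a negative denominator.
    demand_factor = max(0.5, 1 + 0.05 * (avg_temp_c - 20))
    normalised_sales = units_sold / demand_factor
    return normalised_sales / (competing_outlets + 1)

machines = [
    {"id": "VM-001", "units_sold": 420, "avg_temp_c": 24, "competing_outlets": 3},
    {"id": "VM-002", "units_sold": 150, "avg_temp_c": 16, "competing_outlets": 0},
]
for m in machines:
    m["index"] = penetration_index(
        m["units_sold"], m["avg_temp_c"], m["competing_outlets"]
    )
```

Note how the machine with the lower raw sales can show the higher penetration once weather and competition are factored in; that is exactly the kind of insight process data alone cannot deliver.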
If you look at the process data as the object of study in data governance, they fall entirely into the last two quadrants.

They contribute to decision making in operational, tactical and strategic areas, but they do not deliver the complete picture, as the examples clearly illustrate. There are a few other reasons why data governance in BI needs special attention. If you want to discuss this further, drop me a line via the Lingua Franca contact form.

dinsdag 29 maart 2016

Data Governance in Business Intelligence, a Sense of Urgency is Needed

The Wikipedia article on data governance gives a good definition and an overview of the related topics. But although you may find a few hints on how data governance impacts the business intelligence and analytics practice, the article is living proof that the link between data governance and BI and analytics is not really on the agenda of many organisations.

Sure, DAMA and the like reserve space in their Body of Knowledge for governance, but it remains at the operational level, and data governance for analytics is considered a derived result of data governance for online transaction processing (OLTP). I submit to you that it should be the other way around. Data governance should start from a clear vision of which data, with what degree of consistency, accuracy and general quality, are needed to support the quality of the decision-making process. In a second iteration, this vision should be translated into a governance process for the source data in the OLTP systems. Once this vision is in place, the lineage from source to target becomes transparent, trustworthy and managed for change. The derived result is then compliance with data protection, data security and auditability requirements imposed by legislation like Sarbanes-Oxley or the imminent EU directives on data privacy.

Two observations to make my point

Depending on the source, between 30 and 80 percent of all Business Intelligence projects fail. The reasons for this failure are manifold: setting expectations too high may be a cause, but the root cause that emerges after thorough research is distrust of the data themselves, or of the way the data are presented in context and defined for use by the decision maker. Take the simple example of the object “Customer”. If marketing and finance do not use the same perspective on this object, conflicts are not far away. If finance considers anyone who has received an invoice in the past ten years a customer, marketing may take issue with that when 90% of all customers renew their subscription or reorder books within 18 months. Only clear data governance rules, supported by a data architecture that facilitates both views on the object “Customer”, will avoid conflicts.
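The “Customer” conflict is easy to make tangible: the same records, two governed definitions. The ten-year and 18-month windows come from the example above; the record layout and dates below are invented for illustration.

```python
from datetime import date, timedelta

# Illustrative only: three invoice records and two departmental definitions
# of "Customer". The reference date and the records are made up.
TODAY = date(2016, 5, 20)

invoices = [
    {"customer_id": 1, "last_invoice": date(2015, 11, 1)},
    {"customer_id": 2, "last_invoice": date(2009, 3, 15)},
    {"customer_id": 3, "last_invoice": date(2014, 12, 24)},
]

def finance_customers(records):
    """Finance view: anyone invoiced in the past ten years."""
    cutoff = TODAY - timedelta(days=10 * 365)
    return {r["customer_id"] for r in records if r["last_invoice"] >= cutoff}

def marketing_customers(records):
    """Marketing view: active buyers, invoiced in the past 18 months."""
    cutoff = TODAY - timedelta(days=548)  # roughly 18 months
    return {r["customer_id"] for r in records if r["last_invoice"] >= cutoff}
```

On these records, finance counts all three as customers while marketing counts only two; a governance rule set, plus an architecture that serves both views, keeps that difference explicit instead of conflictual.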
Another approach: only 15 to 25 percent of decision making is based on BI deliverables. On the plus side, this may mean that the remaining 75 to 85 percent of decision making is focused on managing uncertainty or nonsystematic risk, which can be fine. But often it is rather the opposite: the organisation lacks scenario-based decision making to deal with uncertainty and uses “gut feeling” and “experience” to take decisions that could have been fact-based, had the facts been made available in a trusted setting.

Let’s spread the awareness for data governance in BI


Many thanks in advance!

vrijdag 11 maart 2016

May I have three minutes of your time?



But I need your help...

To assess the present state of the art in Data Governance and Analytics: how are data definitions, formats, locations, security, privacy and other aspects governed for analytical purposes? And most of all: why are you governing data, and what is the level of data governance in your organisation?

Get Lingua Franca’s Presentation at the Data Governance Conference Europe 2016

“How Data Governance Works with BI”

But before we send you the proceedings of the conference, we ask you for a favour in return.
Fill in four answers on a questionnaire you can find here. We expect about 400 answers from all industries in the EU and the Americas. A high-level report will be integrated in our presentation, but you will get the full report if you tick the box on the form. And rest assured: you will not be spammed with offers or other unwanted solicitations!

Many thanks in advance!

Bert



zondag 22 november 2015

Book Review: Business Analysis

3rd Edition, edited by Debra Paul, James Cadle and Donald Yeates

Preamble: the island and the continental species

When BCS, the Chartered Institute for IT, deems a book worth publishing, it is certainly worth reviewing from a continental point of view. Why? Because experience shows that the UK’s business analyst does not have exactly the same profile as the variety on the mainland.
On the British Isles, a business analyst covers a much wider scope: “One of the most important aspects of a business analysis project is to decide what the focus is and which areas need to be investigated. For example, on some projects the focus may be to explore possible improvements on how part of the organization works. In this case, we might begin by examining all of the current working practices, including the staffing and job roles, and the work may focus on analysing and evaluating the options for the future business system. Another project may focus on the IT system needs and whilst understanding the situation and all of the stakeholder perspectives is important, the potential for the use of IT to improve the business system will dominate the analysis.” (p. 59)
Clearly, the island species covers a far broader scope than the continental one. Of the hundreds of business analysts I have met on projects, in training courses and seminars, ninety percent come from an IT background. In the application or OLTP world, I have met dozens of ex-developers who became functional analysts and expanded their horizon towards business analysis. In the OLAP or analytics world, there is a dominant share of DBAs who became business analysts. Suddenly I realise that I am more of an island species myself, as I evolved from sales, marketing and finance into business analysis and studied computer science to make sure I can communicate with the designers and developers.

A comprehensive introduction

The editors take you on a journey through the analysis practice, defining the concept, the competencies and introducing strategy analysis, business analysis as a process, touching the investigation techniques and introducing stakeholder analysis. After modelling the business process, defining the solution and making the business and financial case, the requirements are discussed as well as a brief introduction to modelling the requirements and delivering the requirements and the business solution.
Delivering this body of knowledge in fourteen chapters on 280 pages indicates this book is a foundation for practitioners.

Models, models and… models


The 280-page book is packed with models: 112 of them are illustrated, explained and integrated into a logical process flow of the business analysis practice.
In that sense, the foreword by the president of the IIBA UK Chapter, Adrian Reed, hits the spot when he calls it “an extremely useful resource that will be referenced by new and experienced practitioners alike”.
Novice analysts can use this book as an introduction to the business analysis practice in the broadest sense, while experienced business analysts will consider it a valuable placeholder for useful frameworks, concepts and material for further study. The Reference and Further Reading sections at the end of each chapter contain extremely useful material. With regard to “further reading” there is a caveat I need to share with you. It’s not about the book itself but about models in general.

A caveat about models

Let me tell you a little story from my marketing practice to illustrate my point.
A very familiar model in portfolio management is the Boston Consulting Group’s growth-share matrix. It is used on a strategic level to analyse business units, and in the marketing practice the product portfolio is often represented and analysed via this model.
For those not familiar with the model, here’s a little reference to the theory: https://en.wikipedia.org/wiki/Growth%E2%80%93share_matrix
When I worked for a multinational FMCG company, I discovered what I called “Cinderella brands”. These were brands with a small market share and low growth, considered a dead-end street for the marketer’s career. You could find product managers with little ambition in that position, fixing up and manoeuvring to keep the brand afloat while people higher up in the organization were waiting for the right moment to axe the brand. I managed to convince the people with the axe that an appropriate marketing approach could not just save the brand but grow it into a profitable niche product, sometimes contributing more than their so-called cash cows. We built the business case on processed cheese with a shoestring budget and proved our point that a model can never take over from thorough analysis and critical thinking. After that, nobody mentioned “dogs” anymore; “Cinderella” became the household name for forgotten brands with unrealized potential. (And we got much more business from the multinational.)
The illustration below from an academic author shows exactly what can go wrong when models take over from scrutiny and  critical thinking.

These are the questions to ask when you look at a growth-share matrix:
• Who says cash cows don’t need substantial investment to maintain their dominant market share and keep up with market growth? Ask Nokia if you doubt it.
• Who says dogs need to have a negative cash flow? Sure, if your marketing spend is based on the same mental models as those for stars and cows you will be right, but guerrilla marketing techniques may prove the opposite.
• Who says stars’ growth will continue for eternity? Ever read “Crossing the Chasm” by Geoffrey Moore? Especially in high-tech marketing, novelties may appeal only to the techies and never reach the mainstream market…
In fact, question marks occupy the only quadrant in the above model where some form of nuance can be observed… Notice the expression “analyse … whether…”
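The quadrant logic these questions poke holes in is, in fact, trivially simple, which is exactly the point. A minimal sketch, using the textbook thresholds of 1.0 for relative market share and 10% for market growth:

```python
# Minimal sketch of the BCG growth-share quadrants. The 1.0 share and
# 10% growth thresholds are the textbook defaults; the "Cinderella" remark
# reflects the nuance argued for in the story above.

def bcg_quadrant(relative_share, market_growth_pct):
    high_share = relative_share >= 1.0
    high_growth = market_growth_pct >= 10.0
    if high_share and high_growth:
        return "star"
    if high_share:
        return "cash cow"
    if high_growth:
        return "question mark"
    return "dog"  # or, with the right marketing, a Cinderella brand
```

Four comparisons and the model is done; everything that matters (investment policy, cash flow assumptions, growth horizon) lives outside the code, which is why scrutiny and critical thinking must take over where the model stops.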
In conclusion: follow the editors’ further reading advice. It will help you become a mature business analyst, providing your customers not only the “know what” and some of the “know how” described in the book, but also the “know why”. Wisdom may be harder to quantify, but its value is beyond doubt in the business analysis practice. By the way, from the same editors, I recommend “Business Analysis Techniques” to increase your know-how.

Regular updates needed

The business analysis practice evolves rapidly, and the only criticism I can come up with is the lack of an accompanying website with updates and extra reference material. Let me add at least two candidates: the benefits map and the business model canvas are very much part of the business analysis practice today.
To conclude, all you continental business analysts out there, buy the book and increase your knowledge by an order of magnitude.
Available at http://shop.bcs.org, paperback ISBN: 978-1-78017-277-4

vrijdag 24 juli 2015

The Future of Information Systems: Design from the Data

This third post in a series of three on BI programme management looks at a new way of designing systems for both transaction  and decision support to improve the organisation’s effectiveness further. I will examine the concept of BI architecture further and give hints of how BI programme management can evolve towards an ideal architecture which merges transaction and decision support systems in a powerful ensemble, ready for the new economic challenges.
I propose an “Idealtyp”, knowing that no existing organisation can achieve this in less than a decade, for reasons like sunk cost fallacies, the dialectics of progress and simple resistance to change.

But new organisations and innovators who can make the change will notice that the rewards of this approach are immense. They will combine architectural rigour with business agility and improve their competitive power by an order of magnitude.

Why a BI Architecture is Necessary


I am a fan of Max Weber’s definition of “Idealtyp”[i], which has direct links with architecture in information technology. BI architecture is an abstraction of reality, and as such an instrument to better understand a complex organisation of hardware, network topologies, software, data objects, business processes, key people and organisational units. All these components interact in what appears to outsiders to be a chaotic way. An architectural framework brings order to the chaos and provides meaning to all the contributors to the system.
Architecture is used as a benchmark, a to-be situation against which the present state can be measured. It is a crisper and more manageable concept than CMM-like models, which sometimes express maturity in rather esoteric terms. For a quick scan those will do, but for in-depth management of the above-mentioned BI assets, an architectural framework is the better fit for BI environments.


CMM Level: Initial
BI symptoms: A serious case of “spreadsheetitis”: every decision maker has his own set of spreadsheet files to support him in his battles with the other owners of spreadsheets. Everyday tugs of war over who has the correct figures.
Principal risks: Your project may never take off because of political infighting, and if it does, there will be a pressing need for change management of the highest quality, and huge efforts will have to be invested in adoption tracks.

CMM Level: Repeatable
BI symptoms: The organisation uses some form of project management, in most cases inherited from, or even a carbon copy of, systems or application development.
Principal risks: The project management method may be totally inadequate for a BI project, leading to expensive rework and potential project failure if everybody holds his position.

CMM Level: Defined
BI symptoms: The organisation has a standard procedure for the production of certified reports. These can connect with one or more source systems in a standardised way: direct connection to the source tables, import of flat files, or some form of a data warehouse.
Principal risks: Resistance to change. This depends on the way the organisation has implemented the data warehouse concept and how reversible the previous efforts are in a migration scenario.

CMM Level: Managed
BI symptoms: The development processes are standardised and monitored using key performance indicators and a PDCA cycle.
Principal risks: The iterative and explorative approach of BI project management may frighten the waterfall and RAD fans in the organisation. Make sure you communicate well about the specifics of a BI development track.

CMM Level: Optimising
BI symptoms: The development processes only need fine-tuning.
Principal risks: Analysis paralysis and infighting over details may hamper the project’s progress.

Table 2: Example of the BI version of the Capability Maturity Model, as described in Business Analysis for Business Intelligence (p. 202), where it is positioned as a tool to help the business analyst identify broad project management issues.

Why this "Idealtyp" is not Easy to Achieve


Proposing an ideal BI architecture is one thing, achieving it, another. I will only mention three serious roadblocks on the path towards this ideal BI architecture that unifies transaction systems and decision support systems: the sunk cost fallacy, the dialectics of progress and resistance to change.

The sunk cost fallacy is a powerful driver in maintaining the status quo; organisations suffering from this irrational behaviour consider that they have invested so much effort, money, hardware, training, user acceptance and other irretrievable costs that they should continue to throw good money after bad. And sometimes the problem is compounded when the costs were spent on technology from market leaders.
No one ever got fired for buying… (fill in any market leader’s name)

No matter what industry you look at, market leaders fulfil their basic marketing promise: they provide stability, predictable behaviour and a very high degree of CYA (google it) to the buyer. But that doesn’t mean the purchase decision is the best possible decision for future use. Market leaders in IT are also very keen on “providing” vendor lock-in, preventing the client from adapting to changing requirements.
As a footnote: today, buyers are more looking at the market cap or the private equity of the Big Data technology providers than at their actual technical performance and their fit with the organisation’s requirements. Yes, people keep making the same mistakes over and over…

At the other end of the spectrum are the dialectics of progress: this law was described by the Dutch historian Jan Romein, who noticed that gas lights were still used in London when other European capitals already used electricity. This law suggests (and I quote an article on Wikipedia) that making progress in a particular area often creates circumstances in which stimuli are lacking to strive for further progress. This results in the individual or group that started out ahead eventually being overtaken by others. In the terminology of the law, the head start, initially an advantage, subsequently becomes a handicap.
An explanation for why the phenomenon occurs is that when a society dedicates itself to certain standards, and those standards change, it is harder for them to adapt. Conversely, a society that has not committed itself yet will not have this problem. Thus, a society that at one point has a head start over other societies, may, at a later time, be stuck with obsolete technology or ideas that get in the way of further progress. One consequence of this is that what is considered to be the state of the art in a certain field can be seen as "jumping" from place to place, as each leader soon becomes a victim of the handicap. 
(From:  https://en.wikipedia.org/wiki/Law_of_the_handicap_of_a_head_start)

As always, resistance to change plays its role. New tools and new architectures require new skills to be learned and new ways of working to be adopted, and if one human species has trouble adapting to new technologies, it is… the tech people. I can produce COBOL programmers who will explain to you that COBOL is good enough for object-oriented programming, or IMS specialists who see nothing new in the Big Data phenomenon…


What is BI Architecture?

Here’s architecture explained in an image. Imagine Christopher Wren had had modern building technologies at his disposal. Then either the cathedral, based on the “as is” architecture, would have looked completely different, with higher arches, bigger windows, and so on, or the architecture itself would have evolved as modern technology influenced Wren’s vision of buildings.
This is exactly what happens in BI architecture and BI programme management.

Figure 5 On the left: architecture; on the right: a realisation of that architecture, as illustrated by Wren’s Saint Paul’s Cathedral


Architecture descriptions are formal descriptions of an information system, organised in a way that supports reasoning about the structural and behavioural properties of the system and its evolution. These descriptions define the components or building blocks that make up the overall information system, and they provide a plan from which products can be procured and subsystems developed that will work together to implement the overall system. Architecture thus enables you to manage your overall IT investment in a way that meets the needs of your business.
It is also the interaction between structure, which is requirements based, and principles applicable to any component of the structure.

What is the Function of BI Architecture? 

BI architecture should reflect how the BI requirements are realized by services, processes and software applications in the day-to-day operations. Therefore, the quality of the architecture is largely determined by the ability to capture and analyse the relevant goals and requirements, the extent to which they can be realized by the architecture, and the ease with which goals and requirements can be changed.

Figure 6 The Open Group Architecture Framework puts requirements management at the centre of the lifecycle management. The connection with business analysis for business intelligence is obvious. 


Reality Check: the Two Worlds of Doing and Thinking

Now that we have established a common view on BI architecture and programme management, it is time to address the murky reality of everyday practice.
Although Frederick Taylor’s and Henri Fayol’s ideas about the separation between doing and thinking have been proven inadequate for modern organisations, our information systems still reflect these early 20th-century paradigms. You have the transaction systems, where the scope is simply: execute one step after another in one business process and make sure you comply with the requirements of the system. This is the world of doing and not thinking. Separated from the world of doing is the world of thinking and not doing: decision support systems. The business looks at reports, cubes and analytical results extracted from transaction and external data and then makes decisions which the doers can execute.
What if the new economy were changing all this at a rapid pace? What if doing and thinking came together in one flow? That’s exactly what the Internet is creating, and I am afraid the majority of organisations are simply not ready for this (r)evolution. As early as 1999, Bill Gates and Collins Hemingway[ii] wrote about empowering people in the digital age when they gave us the following business lessons:
  • The more line workers understand the inner workings of production systems, the more intelligently they can run those systems.
  • Real-time data on production systems enables you to schedule maintenance before something breaks.
  • Tying compensation to improved quality will work only with real-time feedback of quality problems.
  • Task workers will go away. Their jobs will be automated or combined into bigger tasks requiring knowledge work.
  • Look into how portable devices and wireless networks can extend your information systems into the factory, warehouse and other areas.

I am afraid this advice still needs implementation in many organisations. The good news is that contemporary technologies can support the integration of doing and thinking. But it will require new architectures, new organisational and technological skills to reap maximum benefits from the technology.

The major and most relevant BI programme management decision criterion will be the answer to the question: “Which quality data yield the highest return in terms of competitive advantage?”


Bringing IT Together: Design from the Data


What if we considered business processes as something that can change in 24 hours if the customer or the supplier wants it? Or if competitive pressure forces us to change the process? What if information systems had no problem supporting changing business processes, because the true cornerstone, surviving any business process, is data? This could be a real game changer for industries that still consider data a by-product of a business process instead of the objective of that process.
The schema below describes a generic architecture integrating transaction and decision support systems in one architectural vision. Let’s read it from left to right.
Any organisation has a number of business drivers, for example as described by Michael Porter’s generic strategies: be the cost leader, differentiate from the competition or focus on a niche. Parallel with the business drivers are decision-making motives such as “I want complete customer and product insight” and, finally, the less concrete but very present knowledge discovery driver, which makes sure organisations are always on the lookout for unpredictable changes in the competitive environment. These three drivers define a number of business objects, both static and dynamic. These entities can be endogenous to the organisation (like customer, channel, product, etc.) or external, like weather data, currency data, etc. These business objects need to be translated into data objects suitable for both transaction and decision support.
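As a purely illustrative sketch of the design-from-the-data idea, here is how business objects defined independently of any process might be mapped to the systems that share them; all names, attributes and the selection rule are assumptions for the sake of the example.

```python
# Hypothetical sketch: business objects are catalogued first, independent of
# any business process, and both transactional and decision-support systems
# draw on the same catalogue. Names and attributes are illustrative only.

business_objects = {
    # endogenous to the organisation
    "customer": {"origin": "internal", "kind": "static"},
    "sale":     {"origin": "internal", "kind": "dynamic"},
    # exogenous objects
    "weather":  {"origin": "external", "kind": "dynamic"},
    "currency": {"origin": "external", "kind": "dynamic"},
}

def data_objects_for(catalogue, use):
    """Pick the objects a given use draws on. In this sketch, transactions
    touch only internal objects, while decision support may combine
    internal and external ones."""
    if use == "transaction":
        return [name for name, obj in catalogue.items()
                if obj["origin"] == "internal"]
    return list(catalogue)
```

The point of the sketch is the direction of dependency: a process change redefines which objects a flow touches, but the data objects themselves, the real cornerstone, stay put.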

Figure 7 This is the (condensed) target architecture of an integrated “Big Data Warehouse”: combining batch and stream processing, using low-latency data for operational intelligence and aggregated data for tactical and strategic decision making. Built from the ground up using data instead of business processes as the analytic cornerstone.

Conclusion: an integrated view of transactions and decision making, supported by this architectural vision, will improve BI programme management. The major and most relevant BI programme management decision criterion will be the answer to the question: “Which quality data yield the highest return in terms of competitive advantage?” And thus: which project, whether on the transaction or the decision support side, gets the highest priority in the allocation of resources?



[i] According to the excellent website http://plato.stanford.edu/entries/weber/  this is the best description of Max Weber’s definition:
“The methodology of “ideal type” (Idealtypus) is another testimony to such a broadly ethical intention of Weber. According to Weber's definition, “an ideal type is formed by the one-sided accentuation of one or more points of view” according to which “concrete individual phenomena … are arranged into a unified analytical construct” (Gedankenbild); in its purely fictional nature, it is a methodological “utopia [that] cannot be found empirically anywhere in reality”. Keenly aware of its fictional nature, the ideal type never seeks to claim its validity in terms of a reproduction of or a correspondence with reality. Its validity can be ascertained only in terms of adequacy, which is too conveniently ignored by the proponents of positivism. This does not mean, however, that objectivity, limited as it is, can be gained by “weighing the various evaluations against one another and making a ‘statesman-like’ compromise among them”, which is often proposed as a solution by those sharing Weber's kind of methodological perspectivism. Such a practice, which Weber calls “syncretism,” is not only impossible but also unethical, for it avoids “the practical duty to stand up for our own ideals”.”

What is less known is that Weber also used the concept in decision-making theory when he analysed the outcome of the Battle of Königgrätz, where Von Moltke defeated the Austrian-Bavarian coalition against Prussia and its allies in 1866, an important phase in the unification of Germany.


[ii] “Business at the Speed of Thought”, Bill Gates and Collins Hemingway, Penguin Books, London, 1999, pp. 293-294