Monday, March 2, 2015

What Is Business Intelligence 3.0?

Interactive Visual Analytics represents what Gartner Research calls the consumerization of Business Intelligence.  It’s a good example of a disruptive innovation or, as Qlik’s Donald Farmer calls it, a market changer.

According to Tableau Software, "visual analysis is not a graphical depiction of data. Virtually any software application can produce a chart, gauge or dashboard. Visual analytics offers something much more profound. Visual analytics is the process of analytical reasoning facilitated by interactive visual interfaces."

This new generation of BI tools is so intuitive to the regular user that little or no training is necessary to explore data. This is a great feature since, according to Gartner, BI users do not want to read manuals. They demand intuitive BI interfaces, in line with the internet experience they are accustomed to, like Google searches or smartphone apps.

The current low BI utilization rate of about 5% does not provide many companies with an acceptable return on their Business Intelligence investment. The new user-friendly and intuitive visual analytics tools are helping companies exploit the treasure of business trends, patterns and opportunities hidden in their oceans of data by increasing the number of employees who participate in the data discovery process. The technology enables casual users to transition into power users, and power users into app developers, in a matter of weeks.

Depending on the vendor, the new software class is known by different names: Data Discovery, Advanced Visualization, Visual Analytics, Business Discovery, Self-Serve Business Intelligence or Business Intelligence 3.0.

Advanced data visualization is based on the fact that about 70% of the human sensory receptors are dedicated to vision, while the other four senses share the remaining 30%. In addition, our brains are much more effective at recognizing shapes, trends, patterns and colors than at analyzing spreadsheets or tables full of numbers. Visual data analysis principles are based on the work of Edward Tufte and Stephen Few.

The data visualization market began to grow during the last decade as business users started purchasing these applications for departmental use, mainly without IT consent. The reason was simple: business analysts needed the capability to analyze all sorts of data rapidly, beyond the scope of the data warehouse, a request that most IT organizations were not prepared to fulfill. Today, as data discovery tools have become more popular and scalable, IT organizations are more involved in the purchasing process.
Today more than two dozen applications from companies around the globe fall into this category. The leading ones, Tableau and QlikView, have gained acceptance at a very fast rate. Ten visual applications have made it to Gartner Research's 2015 Magic Quadrant for Business Intelligence and Analytics, and this number will grow in the future as more BI solutions continue to add visual analysis functionality. Gartner estimates that more than half of net new purchasing is data-discovery-driven.
"For years, data discovery vendors — such as QlikTech, Salient Management Company, Tableau Software and Tibco Spotfire — received more positive feedback than vendors offering OLAP cube and semantic-layer-based architectures. In 2012, the market responded:
- MicroStrategy significantly improved Visual Insight.
- SAP launched Visual Intelligence.
- SAS launched Visual Analytics.
- Microsoft bolstered PowerPivot with Power View.
- IBM launched Cognos Insight.
- Oracle acquired Endeca.
- Actuate acquired Quiterian."

To be clear, self-serve BI does not mean "IT-free," as a strong IT-business partnership is always helpful to ensure data quality through proper governance and to maintain the proverbial single version of the truth. Self-serve refers to the user's ability to perform data exploration and discovery simply by clicking or tapping into interactive dashboards and reports, without having to ask IT to create specific data marts, build OLAP cubes or predefine reports, steps that would delay the data analysis process.

Also, it's important to highlight that there are two kinds of self-serve BI users:
  1. Analytics Power Users, who create visual apps from multiple data sources, both internal and external.
  2. Regular Users, who can fully explore the visual apps created by power users or IT.

In addition to the typical functionality of multidimensional analysis (drill-down, drill-through, roll-up, sort, group, filter and calculations), some visual tools offer "what-if" scenario analysis, data animation, integration with the R statistical language and mobile capability.
A great feature of this new generation of BI software is its data blending functionality. These applications can connect simultaneously to disparate types of databases or tables, whether in a data warehouse, data marts, spreadsheets, text files, Microsoft Access, websites and, in the case of Tableau, OLAP cubes.
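Under the hood, data blending amounts to an on-the-fly join across sources. As a rough illustration (this is not any vendor's actual engine, and the field names are invented), the idea can be sketched in plain Python:

```python
# Minimal sketch of data blending: left-join rows from two hypothetical
# sources (a warehouse extract and a departmental spreadsheet) on a
# shared key. All field names here are illustrative.

warehouse_sales = [                      # rows pulled from the warehouse
    {"customer_id": 1, "revenue": 1200},
    {"customer_id": 2, "revenue": 800},
]

crm_spreadsheet = [                      # rows exported from a spreadsheet
    {"customer_id": 1, "segment": "Enterprise"},
    {"customer_id": 2, "segment": "SMB"},
]

def blend(left, right, key):
    """Left-join two row lists on a shared key, merging their columns."""
    index = {row[key]: row for row in right}
    return [{**row, **index.get(row[key], {})} for row in left]

blended = blend(warehouse_sales, crm_spreadsheet, "customer_id")
# Each blended row now carries both revenue and segment.
```

The real tools do this at scale and in memory, but the analytical payoff is the same: one combined view without waiting for IT to build a new data mart.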

The analytic results are instantaneous since the process takes place in-memory (RAM). Additionally, the visual interface, when used proficiently, makes it possible to digest huge amounts of information and visualize trends and patterns in seconds. This enables what many call "analysis at the speed of thought," meaning that the answers to business questions can be found fast enough not to interrupt the "train of thought" that leads to the next layer of questions, seeking the root cause of issues or opportunities.

Visual analytics packages are not replacing traditional BI in large organizations but complementing it. They provide fast analytical capabilities to more people who need to gain a competitive edge amid today's fast-changing market dynamics.

For small and medium size companies that haven’t yet invested in BI, Self-serve, Visual Data Discovery is a cost effective solution that can be deployed very fast.

This new generation of BI software takes descriptive analytics to a whole new level. That's the reason for the fast growth this $4.5 billion market segment has experienced during the last few years.

It only takes a few minutes to download the free versions most vendors offer for testing purposes.

Below: Dr. Hans Rosling's video is a few years old but still illustrates the power of data visualization techniques that make 120,000 data points tell a compelling story in a way that's very easy to understand.

Note: The original version of this article was published here by Bill Cabiró in 2011.

Image: GraphicStock

Wednesday, February 11, 2015

Are You a Business Intelligence Avoider?

From six different studies we can conclude that only about 5% of employees use BI tools to perform analytics effectively.

I think this is in part related to the fact that, except for folks with backgrounds in science, engineering, economics or finance, many people across the company do not feel too comfortable around numbers, math or logic functions. I've observed this during years of training corporate employees on the strategic use of Business Intelligence and Analytics.

This is how the numbers work: close to 50% of employees have access to BI tools, about 20% of those with access actually use BI (10% of all employees), and about half of those users (5% of all employees) fall in the analytical / power-user categories.

Despite the inroads made by the more user-friendly BI visualization tools of the last few years, the BI Scorecard's annual survey of users, administrators, and directors reports a flat utilization rate since 2006.

If you were to plot a histogram showing only those 20% of actual BI users in an organization, it would probably approach a normal distribution (bell-shaped curve) consisting of the following categories, where close to two thirds of the users would fall in buckets 3 and 4.

1) Non-Users: Run canned reports once a quarter or less frequently. These are people who are either math-averse, do not like computers, or are executives who have their assistants run and print static reports for them.

2) Infrequent Report Users: Run canned reports about once a month.

3) Frequent Report Users: Run canned reports on a weekly or daily basis.

4) Analytic Users: Modify static reports and OLAP cube views by grouping, sorting, formatting as well as changing some dimensions and measures. These folks save their new customized reports for future use.

5) Power BI Users: Create new ad-hoc reports from scratch, applying multiple dimensions and measures and using grouping, sorting and filters. These users travel interactive dashboards and OLAP cubes from corner to corner using drill-down, drill-through and most of the available custom features in search of the root causes of both problems and opportunities. Power users create and share reports, dashboards and visualizations with folks in the same department.

6) Expert Analysts: Search for, find and provide new databases, blend disparate data, design and build customized cubes, pivot tables and dashboards, perform statistical, financial or marketing analyses, and usually export results to MS Excel to complete the last analytical mile. Expert analysts create and share reports, dashboards, visualizations and analyses with management across the organization.

Buckets 4, 5 and 6 represent the 5% of employees who use BI tools to perform analytics effectively. When it comes to Advanced Analytics (statistics, predictive modeling, data mining, etc.), data scientists or statisticians probably represent close to 10% of that number, or just about 0.5%.

To become a true analytic competitor, the company has to change the culture so everybody, not just the experts, thinks and acts based on facts and understands the drivers that support strategy and sustainable profitability. 

While this isn't easy, it’s possible.  I've seen it quite a few times.  It requires long term commitment from top management, adequate training, data structured to be business-intuitive and an interactive visual analytic software tool configured in an extremely user-friendly manner so all types of users can perform analytics with virtually no help from IT, analysts or even spreadsheets. 

In my experience, the organization improves its financial results through this implementation as people gradually advance to the next level, leaving buckets 1, 2 and 3 practically empty. 

Which buckets is your organization using most?

Wednesday, February 4, 2015

When Data Destroys Value

A survey by Gartner Research found that poor data quality costs companies an average of $8 million per year. In a different study, published by The Data Warehousing Institute, more than 50% of companies reported customer dissatisfaction and cost problems due to poor data.

According to Gartner, about 80% of Business Intelligence implementations fail, while an Accenture survey of managers in Fortune 500 companies found that 59% cannot find valuable information they need to do their jobs, 42% accidentally use the wrong information about once a week, and 53% believe the information they receive is not valuable to them.

The main reason for the failure to meet Business Intelligence expectations is most likely data quality, reinforcing the proverbial GIGO (garbage in, garbage out) principle.

This is only getting worse considering that Aberdeen Group estimates the amount of corporate data companies need to digest is growing at a 56% annual rate, coming from an average of 15 unique data sources.

It’s clear that today’s Business Intelligence & Analytics software capability is light-years ahead of the semantic quality and structure of the data. After years and millions of dollars spent in BI deployment, folks in areas like Marketing, Sales or Strategic Management feel like they are drowning in an ocean of data and yet thirsty for the Strategic Knowledge they need to grow the business.

It’s common to see state of the art Business Intelligence applications that cannot deliver strategic analysis or direction until multiple analysts download the data into spreadsheets, manipulate, clean, fix and structure the data manually, for hours or days at a time.

Many folks recognize that the transaction data has plenty of errors, at least at the customer, product line, brand, market and segment level, but the errors never get fixed and, even worse, they flow into the BI systems.

This is a classic case of cultural miscommunication. Marketing and Sales people think this is an IT problem and expect the data warehouse analysts to fix it, while IT thinks it's a business problem and expects Sales or Marketing to take action. The result is that data seldom gets corrected, people give up, and running raw or bad data through sophisticated BI systems and dashboards becomes the norm.

After a while, Sales, Marketing and Management won’t trust the data, will not see the value of BI, will quit using the system and continue making decisions based on intuition.

When analysts or power users need a quick answer about market share, profitability or growth for a particular segment, product line, customer or competitor, it takes days to go through the analyst's manual process: running the right queries, exporting the results to Microsoft Excel, manually cleansing data errors, adding look-ups from external data sources, and finally creating pivot tables to find the right answers.
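The pivot-table step at the end of that manual process is, at its core, a group-and-sum over a dimension. A minimal sketch, with invented field names, shows how little computation the days-long process actually boils down to once the data is clean:

```python
from collections import defaultdict

# Hypothetical cleansed rows, as they might look after the manual
# Excel cleanup described above. Field names are illustrative.
rows = [
    {"segment": "Enterprise", "revenue": 1200},
    {"segment": "SMB",        "revenue": 800},
    {"segment": "Enterprise", "revenue": 300},
]

def pivot_sum(rows, dimension, measure):
    """Group rows by a dimension and total a measure (a one-axis pivot)."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[dimension]] += row[measure]
    return dict(totals)

by_segment = pivot_sum(rows, "segment", "revenue")
# {'Enterprise': 1500.0, 'SMB': 800.0}
```

The expensive part is never the pivot itself; it is the repeated manual cleansing that precedes it.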

Even worse, they have to repeat the entire process every time they need a progress update, whether the following week or at month-end, quarter-end or year-end, for each one of the business units and markets they serve. Not only is this inefficient, but fixing data based on analysts' personal assumptions leads to undesirable silos and multiple versions of the truth.

As a result, it's common for people to spend a large portion of team meetings arguing over whose data is correct instead of focusing on the critical issues. The numbers generated by Finance do not agree with the analysis performed by Marketing or the explanations provided by Sales. They need a single version of the truth, but different folks run different queries, make different data cleansing assumptions and customize their spreadsheets based on different metrics.

A Simple Solution that Works:

While IT may own the physical aspects of the data (storage, security, format, connectivity, etc.), the Business (e.g. Marketing & Sales) must own its strategic meaning and be committed to auditing and maintaining its high quality.

Proficient analysts, who today cleanse the data over and over after downloading it to spreadsheets every time they need a report, need to change and become Data Stewards who cleanse and structure the transaction data, using external market intelligence, BEFORE it enters the Data Warehouse or BI systems. They then maintain, update, format and upload their data corrections on a weekly or monthly basis.
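As a rough sketch of that steward step (the correction table, values and field names are all invented for illustration), the idea is to apply a single maintained mapping of known bad values to every raw transaction before it is loaded, so the fix happens once, upstream, instead of in everyone's spreadsheet:

```python
# Correction table maintained by the business side, e.g. fixing a
# misspelled customer name and segment label. Illustrative values only.
corrections = {
    ("customer", "Acme Inc"):  "Acme Corporation",
    ("segment",  "Entrprise"): "Enterprise",
}

raw_transactions = [
    {"customer": "Acme Inc", "segment": "Entrprise", "amount": 500},
    {"customer": "Beta LLC", "segment": "SMB",       "amount": 300},
]

def apply_corrections(rows, corrections):
    """Return cleansed copies of rows, replacing known bad field values."""
    return [
        {field: corrections.get((field, value), value)
         for field, value in row.items()}
        for row in rows
    ]

clean = apply_corrections(raw_transactions, corrections)
# clean rows now carry "Acme Corporation" / "Enterprise" and can be
# loaded into the warehouse, so every downstream report agrees.
```

Because the correction table lives in one place, updating it weekly or monthly propagates the same single version of the truth to every dashboard.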

This small process change requires a committed partnership between IT and the Business Units but truly makes a difference because it breaks the Business Intelligence vs. Data Quality vicious cycle.

When the data makes sense to BI users, they start to recognize their business models in their dashboards, develop trust in the data and become better users. They'll soon find additional data issues and take the initiative to get them promptly corrected. In a few weeks there will be no more serious data quality issues, as this process becomes a virtuous cycle in which the BI utilization rate grows, giving the company an analytical competitive edge.