It’s hard to believe that we are now over 30 years into data warehousing. In that time, we have seen major changes in the tools that help users report on and analyse data. Over the last twenty years, we have seen the evolution from reporting to ad hoc analysis and on to advanced analytics. Today, BI/analytics is a mature market, with self-service BI and visual analysis standard in most organisations and self-service data preparation also widely deployed. To many, most BI/analytics tools look the same, at least in vendor marketing collateral and sales presentations, with almost every vendor having adopted a land-and-expand strategy: selling self-service BI into business units and growing their footprint from there.
It is not surprising, therefore, to see price-performance rank high in the selection criteria of those looking to buy BI software, suggesting that this is becoming a commodity market. Look closely at the tools themselves, though, and it shouldn’t be. A good example is Qlik, with its differentiating associative engine, which has long offered context awareness based on something called logical inference. Every time a user clicks one or more values in a field, or performs a search, the tool immediately calculates which distinct values in all related tables qualify as relevant to the selection, i.e. it logically infers a new context. Imagine a crowded room where everyone is standing and holding a numbered ticket; selecting and reading out the first digit of a number is a bit like saying, “would everyone who does not have that digit at the start of their ticket please sit down.” It is about relevance. A unique benefit is that the tool also knows about data a user selected but which was then excluded by a subsequent selection. As a result, it can highlight insights the user may be totally unaware of, by letting them see data they did not select.
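To make the idea concrete, here is a minimal sketch of how logical inference across related tables might work. It is purely illustrative and not Qlik’s actual engine: the tables, field names and the infer_context function are invented for the example.

```python
# Illustrative sketch only -- not Qlik's engine. Two related tables
# (customers and orders) are joined on customer_id; selecting values in
# one field infers which values in related fields remain relevant.

customers = [
    {"customer_id": 1, "region": "North"},
    {"customer_id": 2, "region": "South"},
    {"customer_id": 3, "region": "North"},
]
orders = [
    {"order_id": 10, "customer_id": 1, "product": "Widget"},
    {"order_id": 11, "customer_id": 2, "product": "Gadget"},
    {"order_id": 12, "customer_id": 3, "product": "Widget"},
    {"order_id": 13, "customer_id": 1, "product": "Gadget"},
]

def infer_context(selected_regions):
    """Return the distinct related-field values that remain relevant after
    selecting one or more regions, plus the values that are excluded."""
    relevant_customers = {c["customer_id"] for c in customers
                          if c["region"] in selected_regions}
    relevant_products = {o["product"] for o in orders
                         if o["customer_id"] in relevant_customers}
    all_products = {o["product"] for o in orders}
    return {
        "selected": selected_regions,
        "relevant_products": relevant_products,
        "excluded_products": all_products - relevant_products,
    }

# Selecting "South" infers that only "Gadget" is relevant and "Widget" is
# excluded -- and because the excluded values are still tracked, insights
# can be surfaced from data the user did not select.
print(infer_context({"South"}))
```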
Nevertheless, in a commodity market such capabilities can easily be overlooked, so vendors are under pressure to keep innovating and ship new features that give them greater differentiation. In that sense, the arrival of artificial intelligence (AI) in the last few years could not have been more timely, opening up a relatively new era in BI/analytics often referred to as augmented intelligence. This is where AI helps:
- Accelerate data preparation through capabilities such as automated data profiling and transformation recommendations (see the sketch after this list)
- Enable automated insight and visualisation suggestions
- Increase the number of users leveraging self-service by enabling natural language search and conversational analytics
- Improve understanding via natural language generation to explain the business impact of insights
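As a rough illustration of the first point, automated data profiling typically means scanning each column for basic statistics and suggesting obvious transformations. The sketch below is an assumption about what such a step might look like, not any vendor’s implementation; the profile function and its simple rules are invented for the example.

```python
# Minimal sketch of automated data profiling (illustrative only): scan each
# column, gather basic statistics and suggest a transformation where one
# obviously applies.

import pandas as pd

def profile(df: pd.DataFrame) -> list[dict]:
    report = []
    for col in df.columns:
        series = df[col]
        stats = {
            "column": col,
            "dtype": str(series.dtype),
            "missing_pct": round(series.isna().mean() * 100, 1),
            "distinct": series.nunique(dropna=True),
        }
        # Deliberately simple recommendation rules, for illustration only.
        if stats["missing_pct"] > 20:
            stats["suggestion"] = "impute or drop rows with missing values"
        elif series.dtype == object and stats["distinct"] <= 10:
            stats["suggestion"] = "convert to a categorical dimension"
        else:
            stats["suggestion"] = "no change"
        report.append(stats)
    return report

sales = pd.DataFrame({
    "region": ["North", "South", None, "North"],
    "revenue": [120.0, 95.5, 80.0, None],
})
for row in profile(sales):
    print(row)
```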
Referring back to Qlik, its new Cognitive Engine, layered on top of the existing associative engine, enables all of this and a lot more. In particular, it can even automatically highlight the most relevant hidden insights, which is a very powerful and distinctive capability.